Commit 264d419: Update Layout
leestott committed Aug 14, 2024
1 parent 6012bcf
Showing 99 changed files with 1,670 additions and 1,842 deletions.
8 changes: 4 additions & 4 deletions README.md
@@ -64,18 +64,18 @@ This demo takes a PNG image and converts it to code using Phi-3 ONNX

- The opportunity of SLMs and LLMs

- [**DEMO- ONNXRuntime WebGL + AI PC** (5 min)](/src/01.InferencePhi3/)
+ [**DEMO- ONNXRuntime WebGL + AI PC** (5 min)](/src/02.ONNXRuntime/01.WebGPU%20Chat%20RAG/README.md)
- [iPhone](/src/01.InferencePhi3/02.ios/README.md)
- [AIPC Sample]()
- - [WebGPU](/src/01.InferencePhi3/03.chat/README.md)
+ - [WebGPU](/src/02.ONNXRuntime/

### Criteria for Choosing the Right Model (10 min)
- Task requirements and model capabilities
- Performance metrics and evaluation methods
- Iterative refinement and validation processes
- Fine-tuning options for model improvement

- - [**DEMO - Phi-3 Fine-tuning** (5 min)](/src/02.AIToolsSolutionE2E/Readme.md)
+ - [**DEMO - Phi-3 Fine-tuning** (5 min)](/src/03.AIToolsSolutionE2E/Readme.md)

Cloud-based fine-tuning using Azure AI Compute, and local fine-tuning using the AI Toolkit

@@ -88,7 +88,7 @@ Cloud Based FineTuning using Azure AI Compute and Local based Fine Tuning using
- Examples of successful model applications
- Lessons learned from model deployment and usage

- - [**DEMO - Phi-3 RAG using .NET Aspire** (5 min)](/src/03.CloudNativeRAG/Readme.md)
+ - [**DEMO - Phi-3 RAG using .NET Aspire** (5 min)](/src/04.CloudNativeRAG/Readme.md)

RAG Aspire demo (we deploy Phi-3 as a service and use .NET Aspire to create a cloud-native distributed application)

Submodule Phi-3-mini-4k-instruct-onnx-web deleted from 80a279
Submodule jina-embeddings-v2-base-en deleted from 6fd0fd
@@ -127,5 +127,4 @@ Click on the button that says “Choose File” to pick the document you want to
After selecting your file, click the “Upload” button to load your document for RAG (Retrieval-Augmented Generation).

### Start Your Chat:
Once the document is uploaded, you can start a chat session using RAG based on the content of your document.
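The upload-then-chat flow described above is the standard RAG pattern: the uploaded document is split into chunks, each chunk is embedded (this repo vendors jina-embeddings-v2-base-en for that), and at question time the chunks most similar to the question are retrieved and prepended to the prompt. A minimal pure-Python sketch of the retrieval step, with a toy bag-of-words stand-in for the real embedding model (the function names here are illustrative, not the demo's actual API):

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; the real demo uses a neural
    # embedding model (jina-embeddings-v2-base-en) instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(chunks, question, k=2):
    # Rank document chunks by similarity to the question, keep top k.
    scored = sorted(chunks, key=lambda c: cosine(embed(c), embed(question)),
                    reverse=True)
    return scored[:k]

def build_prompt(chunks, question):
    # Prepend the retrieved context so the model answers from the document.
    context = "\n".join(retrieve(chunks, question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

chunks = [
    "Phi-3 is a family of small language models from Microsoft.",
    "ONNX Runtime executes models in the browser via WebGPU.",
    "Redis is used as a cache in the Aspire sample.",
]
print(retrieve(chunks, "Which runtime executes models with WebGPU?", k=1)[0])
```

A real implementation swaps `embed` for the neural embedding model and stores vectors in a vector database (the Aspire sample wires up Qdrant), but the retrieve-then-prompt shape stays the same.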
8 files renamed without changes.
@@ -1,18 +1,18 @@
## Demo-02: Fine-tuning Phi-3 with AI Tools VS Code Extensions

Using Azure AI Tools & Microsoft VS Code

This demo guides you through fine-tuning the Phi-3 model with the AI Tools VS Code extensions, covering fine-tuning, inference, and deployment with Azure Machine Learning Service.

***Note*** Ensure your Azure subscription has access to an Azure Compute A100 GPU to complete this demo.

**Sample Code**

| Step | Description | Operation |
|-------------------|----------------------------------|-------------------|
|01.Installation| Follow this step to set up your environment|[Go](./qa_e2e/docs/01.Installation.md)|
|02.Prepare your QA datasets| Prepare and clean your QA datasets|[Go](./qa_e2e/docs/02.PrepareDatasets.md)|
|03.Use Microsoft Olive to architect SLMOps | Use Microsoft Olive to structure your SLMOps cycle|[Go](./qa_e2e/docs/03.E2E_LoRA&QLoRA_Config_With_Olive.md)|
|04.Inference your fine-tuned models| Run inference with your ONNX model after fine-tuning|[Go](./qa_e2e/docs/04.E2E_Inference_ORT.md)|
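Step 03 above configures LoRA/QLoRA through Microsoft Olive. The core idea behind LoRA is to freeze the base weight matrix (d_out x d_in) and train only a low-rank update dW = B @ A of rank r, shrinking trainable parameters from d_out*d_in to r*(d_in + d_out). A back-of-the-envelope sketch of that arithmetic (3072 matches Phi-3-mini's hidden size, but the actual trainable layout depends on your Olive config):

```python
def lora_trainable_params(d_in, d_out, r):
    # Full fine-tuning updates the whole (d_out x d_in) weight matrix.
    # LoRA trains only A (r x d_in) and B (d_out x r), with dW = B @ A.
    full = d_out * d_in
    lora = r * d_in + d_out * r
    return full, lora

# Toy example: one 3072x3072 projection at LoRA rank 16.
full, lora = lora_trainable_params(3072, 3072, 16)
print(f"full: {full:,} params; LoRA r=16: {lora:,} params "
      f"({100 * lora / full:.2f}% of full)")
```

QLoRA applies the same low-rank update on top of a 4-bit-quantized base model, cutting memory further; the bitsandbytes dependency in the requirements file exists for that.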
@@ -1,16 +1,16 @@
einops
accelerate
bitsandbytes
datasets
huggingface_hub
peft
scipy
neural-compressor
sentencepiece
torch>=2.2.0
transformers
onnxscript>=0.1.0.dev20240126
git+https://github.com/microsoft/Olive
--extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/
ort-nightly-gpu==1.18.0.dev20240307004
onnxruntime-genai
Binary file removed src/03.CloudNativeRAG/libs/libonnxruntime-genai.so
Binary file removed src/03.CloudNativeRAG/libs/libonnxruntime.so
Binary file removed src/03.CloudNativeRAG/libs/libonnxruntime.so.1
@@ -1,44 +1,44 @@
<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<IsAspireHost>true</IsAspireHost>
<UserSecretsId>35500357-d54a-4562-89ac-5a1324a55e40</UserSecretsId>
</PropertyGroup>



<ItemGroup>
<ProjectReference Include="..\Phi3.Aspire.ModelService\Phi3.Aspire.ModelService.csproj" />
<!-- <ProjectReference Include="..\Phi3.Aspire.Console\Phi3.Aspire.Console.csproj" /> -->
<ProjectReference Include="..\Phi3.Aspire.SK.API\Phi3.Aspire.SK.API.csproj" />
<!-- <ProjectReference Include="..\Phi3.Aspire.SK.RAG.API\Phi3.Aspire.SK.RAG.API.csproj" /> -->
<!-- <ProjectReference Include="..\Phi3.Aspire.FrontEnd\Phi3.Aspire.FrontEnd.csproj" /> -->
<!-- <ProjectReference Include="..\Phi3.Aspire.WebApp\Phi3.Aspire.WebApp.csproj" /> -->
<!-- <ProjectReference Include="..\Phi3.Aspire.WasmApp\Phi3.Aspire.WasmApp.csproj" /> -->
</ItemGroup>


<ItemGroup>
<PackageReference Include="Aspire.Hosting.AppHost" Version="8.0.2" />
<PackageReference Include="Aspire.Hosting.NodeJs" Version="8.0.2" />
<PackageReference Include="Aspire.Hosting.Qdrant" Version="8.0.2" />
<PackageReference Include="Aspire.Hosting.Redis" Version="8.0.2" />
</ItemGroup>



<Target Name="RestoreNpm" BeforeTargets="Build" Condition=" '$(DesignTimeBuild)' != 'true' ">
<ItemGroup>
<PackageJsons Include="..\*\package.json" />
</ItemGroup>

<!-- Install npm packages if node_modules is missing -->
<Message Importance="Normal" Text="Installing npm packages for %(PackageJsons.RelativeDir)" Condition="!Exists('%(PackageJsons.RootDir)%(PackageJsons.Directory)/node_modules')" />
<Exec Command="npm install" WorkingDirectory="%(PackageJsons.RootDir)%(PackageJsons.Directory)" Condition="!Exists('%(PackageJsons.RootDir)%(PackageJsons.Directory)/node_modules')" />
</Target>

</Project>
@@ -1,23 +1,23 @@
var builder = DistributedApplication.CreateBuilder(args);


var cache = builder.AddRedis("cache");


var phi3service = builder.AddProject<Projects.Phi3_Aspire_ModelService>("phi3service");



var skService = builder.AddProject<Projects.Phi3_Aspire_SK_API>("skservice")
.WithExternalHttpEndpoints()
.WithReference(phi3service);


builder.AddNpmApp("vue", "../Phi3.Aspire.Vue")
.WithReference(skService)
.WithHttpEndpoint(env: "PORT")
.WithExternalHttpEndpoints()
.PublishAsDockerFile();


builder.Build().Run();
@@ -1,29 +1,29 @@
{
"$schema": "https://json.schemastore.org/launchsettings.json",
"profiles": {
"https": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "https://localhost:17189;http://localhost:15147",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development",
"DOTNET_ENVIRONMENT": "Development",
"DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "https://localhost:21259",
"DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "https://localhost:22132"
}
},
"http": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "http://localhost:15147",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development",
"DOTNET_ENVIRONMENT": "Development",
"DOTNET_DASHBOARD_OTLP_ENDPOINT_URL": "http://localhost:19053",
"DOTNET_RESOURCE_SERVICE_ENDPOINT_URL": "http://localhost:20110"
}
}
}
}
@@ -1,8 +1,8 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
}
}
@@ -1,9 +1,9 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning",
"Aspire.Hosting.Dcp": "Warning"
}
}
}
Original file line number Diff line number Diff line change
@@ -1,19 +1,19 @@
<Project Sdk="Microsoft.NET.Sdk.Web">

<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>

<ItemGroup>
<PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.18.0" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI" Version="0.3.0" />
</ItemGroup>


<ItemGroup>
<ProjectReference Include="..\Phi3.Aspire.ServiceDefaults\Phi3.Aspire.ServiceDefaults.csproj" />
</ItemGroup>

</Project>