Wednesday, July 23, 2025

.NET Isolated Azure Durable Functions: Distributed Tracing

In this post, let's have a look at the power of Distributed Tracing in .NET Isolated Azure Durable Functions. This is one of the new features that went GA a few weeks ago.

First let's have a look at a simple Durable Function and see how it's logged in Application Insights.
public static class Function
{
    [Function(nameof(HttpStart))]
    public static async Task<HttpResponseData> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req,
        [DurableClient] DurableTaskClient client,
        FunctionContext executionContext)
    {
        string instanceId =
            await client.ScheduleNewOrchestrationInstanceAsync(nameof(RunOrchestrator));

        return await client.CreateCheckStatusResponseAsync(req, instanceId);
    }

    [Function(nameof(RunOrchestrator))]
    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] TaskOrchestrationContext context)
    {
        EntityInstanceId entityId = new(nameof(HelloHistoryEntity), "helloHistory");

        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Reset));

        string result = await context.CallActivityAsync<string>(nameof(SayHello), "Tokyo");
        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Add),
            result);

        result = await context.CallActivityAsync<string>(nameof(SayHello), "Seattle");
        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Add),
            result);

        result = await context.CallActivityAsync<string>(nameof(SayHello), "London");
        await context.Entities.CallEntityAsync<string>(entityId,
            nameof(HelloHistoryEntity.Add),
            result);

        List<string> outputs = await context.Entities.CallEntityAsync<List<string>>(entityId,
            nameof(HelloHistoryEntity.Get));
        return outputs;
    }

    [Function(nameof(SayHello))]
    public static string SayHello([ActivityTrigger] string name, FunctionContext executionContext)
    {
        return $"Hello {name}!";
    }
}

public class HelloHistoryEntity : TaskEntity<List<string>>
{
    public void Add(string message) => State.Add(message);

    public void Reset() => State = [];

    public List<string> Get() => State;

    [Function(nameof(HelloHistoryEntity))]
    public Task RunEntityAsync([EntityTrigger] TaskEntityDispatcher dispatcher)
    {
        return dispatcher.DispatchAsync(this);
    }
}
Here I have a single Orchestrator that gets triggered by an HTTP function, and the Orchestrator calls an Activity and an Entity a few times.

Once I trigger the HTTP function, the logs for the HTTP request look like below.
Without Distributed Tracing
And this isn't quite helpful.

Now let's enable Distributed Tracing. For .NET Isolated Durable Functions, Distributed Tracing V2 is supported with Microsoft.Azure.Functions.Worker.Extensions.DurableTask >= v1.4.0. Make sure to update your packages before doing the next step.

Now modify the host.json as follows.
{
  "version""2.0",
  "extensions": {
    "durableTask": {
      "tracing": {
        "distributedTracingEnabled"true,
        "version""V2"
      }
    }
  }
}
And that's about it.

Now if I trigger the HTTP function, the logs for the HTTP request look like below.
Distributed Tracing
Now we can see the full execution: the call to the Orchestrator and all related Activity and Entity calls.

Isn't it just nice?

Happy Coding.

Regards,
Jaliya

Tuesday, July 22, 2025

Azure Functions on Azure Container Apps: KEDA Scaling for Service Bus Functions When Using AZURE_CLIENT_ID

In a previous post, I wrote about creating Azure Container Apps using az containerapp create --kind functionapp (az functionapp create VS az containerapp create --kind functionapp).

One of the main advantages of this approach is that these function apps are preconfigured with auto scaling rules for triggers like Azure Service Bus, Azure Event Hubs etc.

However, in one of our Function Apps that was running on Azure Container Apps, we noticed the scaling rules weren't created for Service Bus Functions.

Only 1 rule for http-scale-rule
The function app was using a Managed Identity (AZURE_CLIENT_ID), and then we noticed that for the Service Bus connection, we were using a Service Bus Connection String (😢).

az containerapp update `
    --name $ContainerAppName `
    --resource-group $ResourceGroup `
    --image $Image `
    --set-env-vars `
        "AzureWebJobsStorage=<AzureWebJobsStorage>" `
        'AZURE_CLIENT_ID="<ManagedIdentityClientId>"' `
        'AzureWebJobsServiceBus="<ServiceBus_ConnectionString>"'

In order for the KEDA scaling rules to be configured, we need to be using Identity-based Connections instead of secrets.

Something like,

az containerapp update `
    --name $ContainerAppName `
    --resource-group $ResourceGroup `
    --image $Image `
    --set-env-vars `
        "AzureWebJobsStorage=<AzureWebJobsStorage>" `
        'AZURE_CLIENT_ID="<ManagedIdentityClientId>"' `
        'AzureWebJobsServiceBus__fullyQualifiedNamespace="<ServiceBusName>.servicebus.windows.net"'

Here for the identity-based connection, we don't need to set <CONNECTION_NAME_PREFIX>__clientId since AZURE_CLIENT_ID is declared and it will be used as the <CONNECTION_NAME_PREFIX>__clientId. However, we can explicitly set <CONNECTION_NAME_PREFIX>__clientId to override the default.
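As a point of reference, a minimal sketch of a Service Bus triggered function in the .NET isolated model that would use this connection could look like below; the queue name myqueue and the OrderProcessor names are just placeholders, and the Microsoft.Azure.Functions.Worker.Extensions.ServiceBus package is assumed.
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class OrderProcessor
{
    private readonly ILogger<OrderProcessor> _logger;

    public OrderProcessor(ILogger<OrderProcessor> logger) => _logger = logger;

    // No Connection property is specified, so the trigger falls back to the
    // default AzureWebJobsServiceBus connection, i.e. the
    // AzureWebJobsServiceBus__fullyQualifiedNamespace value set above.
    [Function(nameof(ProcessOrder))]
    public void ProcessOrder([ServiceBusTrigger("myqueue")] string message)
    {
        _logger.LogInformation("Received message: {Message}", message);
    }
}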

And now the rules are auto-configured as expected.

Expected Output
Scale Rules
More to read:
   Azure Functions on Azure Container Apps overview
   Azure Functions: Connection values
   Tutorial: Use identity-based connections instead of secrets with triggers and bindings

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, July 16, 2025

Expose Secondary Azure Document Intelligence Service through Azure Front Door

In the last couple of months, we had 2 incidents where the Azure Document Intelligence Service in the East US region had degraded performance. Because of that, we were getting a lot of 503s (Service Unavailable) while doing various operations and our retries didn't help. Microsoft acknowledged the service degradations.


In this post, let's see how we can expose secondary Azure Document Intelligence Services through Azure Front Door.

We can add another Origin to the Origin Group that contains the Document Intelligence Service. But then there is an important factor: from the consumer side we can't use the Ocp-Apim-Subscription-Key for authentication. That's because we won't know which origin the traffic will get routed to, and different Document Intelligence services will have different keys.

So we need a shared authentication mechanism for all our consumers, and it can be achieved by using Managed Identities. Using keys (Ocp-Apim-Subscription-Key) is not recommended anyway, and we should be using Managed Identities as much as possible.

We can implement the authentication at 2 places. Either the consumer authenticates the request, or we can have the AFD Origin Group do the authentication on behalf of the consumer before routing the request to an Origin.

For both these approaches, we need to have a managed identity created, and that identity needs to be given the Cognitive Services User role on both Document Intelligence services.

Consumer authenticating the request against Document Intelligence Services

Consumer authenticating the request against Document Intelligence Services
Here we are doing the authentication at the Consumer level using the Managed Identity. This is helpful when you are consuming the Document Intelligence service through an SDK.

For example, if you are using the Azure.AI.DocumentIntelligence package,
var documentIntelligenceClient = 
    new DocumentIntelligenceClient(new Uri("<ENDPOINT>"), new DefaultAzureCredential());
With this, ManagedIdentityCredential will be attempted and a token will get retrieved as long as you have the necessary environment variables set.
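If you'd rather not depend on the environment variables, a minimal sketch using ManagedIdentityCredential explicitly could look like below (the client ID placeholder is just illustrative).
using Azure.AI.DocumentIntelligence;
using Azure.Identity;

// Explicitly target the user-assigned managed identity instead of relying on
// DefaultAzureCredential picking it up from AZURE_CLIENT_ID.
var credential = new ManagedIdentityCredential("<MANAGED_IDENTITY_CLIENT_ID>");
var documentIntelligenceClient =
    new DocumentIntelligenceClient(new Uri("<ENDPOINT>"), credential);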

Azure Front Door authenticating the request against Document Intelligence Services

AFD authenticating the request against Document Intelligence Services
Here, we really don't care about how a consumer is making the request to AFD. From the Origin Group in AFD, we will be authenticating the request using the Managed Identity against the respective Document Intelligence service prior to routing the request.

For that, first we need to assign the identity to AFD.
AFD Identity
And then update the Origin Group, enabling Origin authentication.
Enabling Origin Group Authentication
Hope this helps.

Happy Coding.

Regards,
Jaliya

Saturday, July 12, 2025

Exposing Azure Document Intelligence Service through Azure Front Door

In this post, let's see how we can expose an Azure Document Intelligence (DI) Service through Azure Front Door (AFD) and consume it via a .NET Client Application that uses the Azure.AI.DocumentIntelligence package.

Say, we have a DI service,

https://{document-intelligence-service-name}.cognitiveservices.azure.com/

And we need this service to be consumed via,

https://{front-door-name}.azurefd.net/di-api/

For example,

POST https://{document-intelligence-service-name}.cognitiveservices.azure.com/documentintelligence/documentClassifiers/{modelId}:analyze
will now be consumed via,
POST https://{front-door-name}.azurefd.net/di-api/documentintelligence/documentClassifiers/{modelId}:analyze

To start off, following are already created.

  • An Azure Document Intelligence Service
  • Azure Front Door and CDN profiles (Azure Front Door Standard with Quick Create)
The first step is adding an Origin Group with Origins in AFD.
Add Origin Group
The added Origin is as follows:
Add Origin
Once these are added, next we need to configure how to route the requests to this Origin group.

We can do it in two ways.

1. Using default-route, with a Rule set to Override origin group
2. Creating a new route with Patterns to match and Origin path

Now let's see how we can configure both these ways.

1. Using default-route, with a Rule set to Override origin group

With this approach, first we need to create a Rule set as follows.
Rule set configuration
Now we need to associate this rule set to the default-route.
Update default route

2. Creating a new route with Patterns to match and Origin path

In this approach, we don't need to create a Rule set. Instead, we can create a new Route with Patterns to match and Origin path.
Add new route
Now I have a .NET Client Application that uses the Azure.AI.DocumentIntelligence package, which I was using to test the DI functionality via AFD.
using Azure;
using Azure.AI.DocumentIntelligence;

string endpoint = "https://{front-door-name}.azurefd.net/di-api/";
string apiKey = "{document-intelligence-service-api-key}";
DocumentIntelligenceClient documentIntelligenceClient = new(new Uri(endpoint), new AzureKeyCredential(apiKey));

string classifierId = "{some-classification-model-id}";
string testFilePath = @"path\to\test\file.pdf";

using FileStream fileStream = new FileStream(testFilePath, FileMode.Open, FileAccess.Read);
BinaryData binaryData = BinaryData.FromStream(fileStream);

ClassifyDocumentOptions classifyDocumentOptions = new(classifierId, binaryData);

Operation<AnalyzeResult> operation = 
    await documentIntelligenceClient.ClassifyDocumentAsync(WaitUntil.Completed, classifyDocumentOptions);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    Console.WriteLine($"Found a document of type: '{document.DocumentType}'");
}
And I can see this working with both approaches.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Friday, July 11, 2025

Received Microsoft MVP Award in Developer Technologies

I am humbled and honored once again to receive the precious Microsoft Most Valuable Professional (MVP) Award for the 12th consecutive year.

As always, looking forward to another great year on top of the Microsoft Development Stack.
Microsoft Most Valuable Professional (MVP)
Thank you Microsoft for your appreciation, and thank you everyone for your continuous support.

Happy Coding.

Regards,
Jaliya

Monday, June 30, 2025

Azure Container Apps: az functionapp create VS az containerapp create --kind functionapp

At Build 2025, Microsoft announced Announcing Native Azure Functions Support in Azure Container Apps.

We already can do this: that is, create a Function App in a Container Apps Environment using az functionapp create.

az functionapp create `
    --name ${APP_NAME} `
    --resource-group ${RESOURCE_GROUP} `
    --environment ${ENVIRONMENT} `
    --image ${IMAGE} `
    --registry-server ${REGISTRY_SERVER} `
    --registry-username ${REGISTRY_USERNAME} `
    --registry-password ${REGISTRY_PASSWORD} `
    --max-replicas 10 `
    --min-replicas 1 `
    --storage-account ${STORAGE_ACCOUNT}

With the new announcement, we can now do this.

az containerapp create `
    --name ${APP_NAME} `
    --resource-group ${RESOURCE_GROUP} `
    --environment ${ENVIRONMENT} `
    --kind functionapp ` # Create a Function App in Azure Container Apps
    --image ${IMAGE} `
    --registry-server ${REGISTRY_SERVER} `
    --registry-username ${REGISTRY_USERNAME} `
    --registry-password ${REGISTRY_PASSWORD} `
    --ingress external `
    --target-port 80

Notice the --kind functionapp.

I have created these two container apps for testing:

Function Apps in Container Apps Environment
So what's the difference?

When I look at fa-hello-world, it's the regular function app experience; I can see the functions etc.

When I look at ca-hello-world, it's the regular container app experience. I can't see the functions, which is kind of reduced functionality. However, this approach gives access to ACA-specific features such as multi-revision management & traffic splitting, preconfigured auto scaling rules, authentication/authorization, health probes, sidecars etc. (more information here), which were previously limited when using az functionapp create.

The future direction from Microsoft is to continue evolving and supporting the az containerapp create --kind functionapp approach. And hopefully we will be able to see the functions as well.

Hope this helps.

Happy Coding.

Regards,
Jaliya

Wednesday, June 25, 2025

ASP.NET Core in .NET 10 Preview 4: JSON Patch with System.Text.Json

From .NET 10 Preview 4 onwards, you can now use JsonPatch with System.Text.Json in ASP.NET Core Web API.

Currently, if you need to use JsonPatch, you need to rely on Newtonsoft.Json as described in this article: JsonPatch in ASP.NET Core web API.

Note: this isn't a complete drop-in replacement for the existing Newtonsoft.Json based implementation. In particular, the new implementation doesn't support dynamic types (like ExpandoObject).

Now let's see how this works.

First you need to install a new package, Microsoft.AspNetCore.JsonPatch.SystemTextJson. Note: it's still prerelease.

dotnet add package Microsoft.AspNetCore.JsonPatch.SystemTextJson --prerelease

Next, we can use the JsonPatchDocument<TModel> that is introduced in Microsoft.AspNetCore.JsonPatch.SystemTextJson.

using Microsoft.AspNetCore.JsonPatch.SystemTextJson;
using Microsoft.AspNetCore.Mvc;
namespace WebApplication1.Controllers;
[ApiController]
[Route("[controller]")]
public class EmployeesController : ControllerBase
{
    [HttpPatch]
    [Route("{employeeId}")]
    public Employee PatchEmployee([FromRoute] int employeeId,
        JsonPatchDocument<Employee> patchDocument)
    {
        Employee employee = new()
        {
            Id = employeeId,
            FirstName = "John",
            LastName = "Doe",
            Address = new Employee.AddressDto
            {
                Street = "123 Main St",
                City = "Redmond",
                State = "WA",
                ZipCode = "12345"
            }
        };
        patchDocument.ApplyTo(employee);
        return employee;
    }
}
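The Employee model isn't shown above; a minimal sketch inferred from the object initializer and the patch paths could look like this (the exact shape is an assumption).
// Minimal Employee model assumed from the controller code above.
public class Employee
{
    public int Id { get; set; }
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public AddressDto? Address { get; set; }

    public class AddressDto
    {
        public string? Street { get; set; }
        public string? City { get; set; }
        public string? State { get; set; }
        public string? ZipCode { get; set; }
    }
}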

You don't have to make any changes to the Program.cs.

WebApplicationBuilder builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();

WebApplication app = builder.Build();

app.MapControllers();

app.Run();

And invoke the endpoint as follows.

@WebApplication1_HostAddress = https://localhost:7070
PATCH {{WebApplication1_HostAddress}}/employees/1
Content-Type: application/json
[
    {
        "op": "replace",
        "path": "/LastName",
        "value": "Bloggs"
    },
    {
        "op": "replace",
        "path": "/Address/ZipCode",
        "value": "90100"
    }
]

PATCH
Hope this helps.

Happy Coding.

Regards,
Jaliya

Sunday, June 22, 2025

Super Simple .NET Run Program.cs

From .NET 10 Preview 4 onwards, we can run individual C# code files without any ceremony, just like you would do in Python etc.

For example, I can just create a .cs file (HelloWorld.cs) with some C# code.
Console.WriteLine("Hello World!");
And then do the following.
dotnet run HelloWorld.cs
dotnet run HelloWorld.cs
If we need to use external packages within our code, we can do something like this. Here we don't have a .csproj file, so we can do the package reference inline. For demo purposes, I am using the Humanizer package.
// Reference the Humanizer package
#:package Humanizer@2.*

// Using directive
using Humanizer;

// Using the package
Console.WriteLine("Hello World!".Humanize(LetterCasing.AllCaps));
dotnet run HelloWorld.cs
That's neat!

Happy Coding.

Regards,
Jaliya

Thursday, June 19, 2025

Running Python Code within .NET Projects

Back home now after what was meant to be a relaxing holiday - unfortunately, it took quite a turn with some unexpected hospital stays both overseas and after coming home.

Started catching up, and there have been some mind-blowing announcements in the world of Software Development with AI.

For ML and AI, I think we all agree that Python is the go-to programming language. That might change in the future, but at least currently that's the main one.

I recently had a requirement where we wanted functionality that is written in Python to be consumed by a .NET application. The approach I thought of was to expose a Python API (using Flask, FastAPI etc.), but then there was going to be a lot of data (floats) travelling over HTTP. Since I didn't have any other option, I started on that.

And then at Microsoft Build 2025, there was this cool session on Python Meets .NET with Anthony Shaw and Scott Hanselman. It's about a new project called CSnakes, where we can use Python from .NET, and most importantly that's not by emulating or simulating Python. It embeds Python into the .NET process; .NET and Python code share the same threads, same memory etc.
CSnakes
Let's have a look at a basic working example.

Here I have a basic Python function in a file called hello_world.py.
def say_hello(name: str) -> str:
    return f"Hello {name}!"
I have now created a Console Application that targets .NET 9 and installed the CSnakes.Runtime package.
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="CSnakes.Runtime" Version="1.0.34" />
  </ItemGroup>

</Project>
I am putting the hello_world.py inside the project and updating the .csproj file to copy the .py file to the output directory. And this is a very important step: this enables CSnakes to run the source generator over Python files.
<ItemGroup>
  <AdditionalFiles Include="hello_world.py">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </AdditionalFiles>
</ItemGroup>
And now updating the Program.cs as follows.
using CSnakes.Runtime;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

IHostBuilder builder = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // Path to your Python modules
        var home = Path.Join(Environment.CurrentDirectory, ".");
        services
            .WithPython()
            .WithHome(home)
            .FromRedistributable(); // Download Python 3.12 and store it locally
    });

IHost app = builder.Build();

IPythonEnvironment pythonEnvironment = app.Services.GetRequiredService<IPythonEnvironment>();

// IMPORTANT: Source generated by CSnakes
IHelloWorld helloWorld = pythonEnvironment.HelloWorld();
string result = helloWorld.SayHello("John Doe");
Console.WriteLine(result); // Hello John Doe!
Output
If you are wondering how IHelloWorld comes into the picture, it was generated at compile time. 
Generated Source
While it's the recommended approach, you can still use CSnakes without the source generator.
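To give a rough idea, a conceptual sketch of the shape of that generated code, based on the say_hello signature, could look like below (the actual CSnakes-generated code is more involved; treat this as an assumption).
// Conceptual sketch only: CSnakes generates a wrapper for hello_world.py
// roughly of this shape (the real generated code differs).
public interface IHelloWorld
{
    string SayHello(string name);
}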

Now let's see an example of how to use a Python file that uses packages. We can make use of Python pip install and requirements.txt.

First I am adding a requirements.txt file and enabling copying it to the output.
<ItemGroup>
<AdditionalFiles Include="requirements.txt">
   <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </AdditionalFiles>
</ItemGroup>
Here for demo purposes, I am adding a simple package, stringcase.
stringcase
Now I am modifying the hello_world.py file as follows.
import stringcase

def say_hello(name: str) -> str:
    return f"Hello {stringcase.titlecase(name)}!"
Now I am updating the Program.cs as follows.
using CSnakes.Runtime;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

IHostBuilder builder = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // Path to your Python modules
        var home = Path.Join(Environment.CurrentDirectory, ".");
        services
            .WithPython()
            .WithHome(home)
            .FromRedistributable() // Download Python 3.12 and store it locally
            .WithVirtualEnvironment(Path.Join(home, ".venv"))
            .WithPipInstaller(); // Optional, installs packages listed in requirements.txt on startup
    });

IHost app = builder.Build();

IPythonEnvironment pythonEnvironment =
    app.Services.GetRequiredService<IPythonEnvironment>();

// Source generated by CSnakes
IHelloWorld helloWorld = pythonEnvironment.HelloWorld();
string result = helloWorld.SayHello("JohnDoe");
Console.WriteLine(result); // Hello John Doe!
Output
This is pretty cool!

Watch the video:

Happy Coding.

Regards,
Jaliya