I have a multi-step, branching pipeline built with TPL Dataflow that I would like to hook up to Unified Logging (through Azure Application Insights).
My issue is that the APIs at my disposal (no pun intended) all seem to be based on IDisposable objects that represent scopes.
For instance, the System.Diagnostics approach looks like this:
void Foo()
{
    // Creating an Activity does not start it; Start() opens the scope and Dispose() closes it.
    using var a = new Activity("I'm doing some work!").Start();
    DoTheWork();
}
void Bar()
{
    // activitySource is an ActivitySource instance (e.g. new ActivitySource("MyPipeline"));
    // StartActivity returns null when nothing is listening, which "using var" tolerates.
    using var a = activitySource.StartActivity(name: "I'm doing some work!", kind: ActivityKind.Internal, tags: []);
    DoTheWork();
}
And the App Insights SDK uses:
void Baz()
{
    // telemetryClient is a TelemetryClient instance; disposing the operation holder stops and sends the telemetry.
    using var op = telemetryClient.StartOperation<RequestTelemetry>("I'm doing some work!");
    DoTheWork();
}
All the work that involves tracing metrics or emitting log messages happens in DoTheWork, which in the above examples runs while the IDisposable object a is still in scope and hasn't been disposed yet.
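For concreteness, DoTheWork is roughly like this (the _logger field is just a stand-in for whatever ILogger ends up exported to App Insights); everything it emits is correlated because Activity.Current, which is backed by an AsyncLocal, still points at the scope the caller opened:
void DoTheWork()
{
    // Still inside the caller's scope: Activity.Current is the activity/operation started
    // in Foo/Bar/Baz, so events and log messages emitted here get correlated with it.
    Activity.Current?.AddEvent(new ActivityEvent("doing the work"));
    _logger.LogInformation("Doing the work"); // stand-in ILogger field
}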
But when you use TPL Dataflow, the rest of the work is handled after you return from the function: posting a message to a block just queues it, and downstream blocks process it later, often on other threads. That means that if you want the Activity to still be in scope, you have to ignore the fact that it is IDisposable and somehow keep it around until all the work the pipeline might do for that specific activity is finished.
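To make the problem concrete, the best I can come up with is something like the sketch below (WorkItem, activitySource, and the block bodies are placeholders): the Activity is started when a message enters the pipeline, travels alongside the payload, and is only disposed in the terminal block.
using System.Diagnostics;
using System.Threading.Tasks.Dataflow;

var activitySource = new ActivitySource("MyPipeline"); // placeholder source name

var transform = new TransformBlock<WorkItem, WorkItem>(item =>
{
    // Activity.Current does not automatically point at this item's activity here,
    // so it has to be restored by hand (or passed around explicitly) before logging.
    Activity.Current = item.Activity;
    // ... do part of the work, emit logs/metrics ...
    return item;
});

var finish = new ActionBlock<WorkItem>(item =>
{
    // ... last step of the work ...
    item.Activity?.Dispose(); // only now is the scope really over
});

transform.LinkTo(finish, new DataflowLinkOptions { PropagateCompletion = true });

// Producer side: start the activity *without* "using" and hand it off to the pipeline.
var activity = activitySource.StartActivity("I'm doing some work!");
transform.Post(new WorkItem(activity, "payload"));

record WorkItem(Activity? Activity, string Payload);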
Are there any accepted design patterns or idioms used with platforms like TPL Dataflow that let them work nicely with App Insights, supporting logging, nesting of activities, etc.?