Response Caching & Pipeline Status
This presentation covers building the pipeline status endpoint, designing the analytics controller with response caching, and wiring everything together with proper DTOs and HTTP conventions.
Service Layer Implementation
The previous lesson covered the IAnalyticsService interface with two methods:
- GetParcelCountByStatusAsync() - Returns current parcel counts by status
- GetDeliveryPerformanceAsync(from, to) - Returns delivery statistics for a date range
These methods encapsulate the complex aggregation queries and keep the controller focused on HTTP concerns.
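As a reminder, the service contract looks roughly like this (the method names come from the previous lesson; the exact signatures shown here are an assumption based on the DTOs below):

```csharp
// Sketch of the analytics service contract, assuming Task-based async
// methods that return the DTOs defined in this lesson.
public interface IAnalyticsService
{
    // Live snapshot: current parcel counts grouped by status, no date filter
    Task<List<ParcelCountByStatusDto>> GetParcelCountByStatusAsync();

    // Delivery statistics aggregated over the given date range
    Task<DeliveryPerformanceDto> GetDeliveryPerformanceAsync(
        DateTimeOffset from, DateTimeOffset to);
}
```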
Response DTOs
The DTOs were defined in the previous lesson:
ParcelCountByStatusDto
```csharp
public class ParcelCountByStatusDto
{
    public string Status { get; set; } = string.Empty;
    public int Count { get; set; }
}
```
DeliveryPerformanceDto
```csharp
public class DeliveryPerformanceDto
{
    public DateTimeOffset From { get; set; }
    public DateTimeOffset To { get; set; }
    public int TotalParcels { get; set; }
    public int Delivered { get; set; }
    public int InTransit { get; set; }
    public int Exceptions { get; set; }
    public double AverageDeliveryTimeHours { get; set; }
    public double OnTimePercentage { get; set; }
}
```
The From and To fields echo back the date range that was used. This makes the response self-describing: the client can see exactly what time window the stats cover.
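For example, a response for a January window might look like this (all values are illustrative, not from the source):

```json
{
  "from": "2025-01-01T00:00:00+00:00",
  "to": "2025-01-31T23:59:59+00:00",
  "totalParcels": 1240,
  "delivered": 1115,
  "inTransit": 98,
  "exceptions": 27,
  "averageDeliveryTimeHours": 41.6,
  "onTimePercentage": 89.9
}
```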
Why Strings Instead of Enums?
The DTOs use string for Status instead of the enum type. This makes the JSON serialization predictable. The client receives "InTransit" instead of 2 (the integer enum value). If you use JsonStringEnumConverter globally, you could use the enum types directly, but string properties are safer across different serializer configurations.
The Analytics Controller
Wire up both endpoints in a single controller:
```csharp
[ApiController]
[Route("api/[controller]")]
public class AnalyticsController : ControllerBase
{
    private readonly IAnalyticsService _analytics;

    public AnalyticsController(IAnalyticsService analytics)
    {
        _analytics = analytics;
    }

    [HttpGet("parcel-count-by-status")]
    [ResponseCache(Duration = 60)]
    public async Task<ActionResult<List<ParcelCountByStatusDto>>> GetParcelCountByStatus()
    {
        var counts = await _analytics.GetParcelCountByStatusAsync();
        return Ok(counts);
    }

    [HttpGet("delivery-performance")]
    [ResponseCache(Duration = 300)]
    public async Task<ActionResult<DeliveryPerformanceDto>> GetDeliveryPerformance(
        [FromQuery] DateTimeOffset? from,
        [FromQuery] DateTimeOffset? to)
    {
        var fromDate = from ?? DateTimeOffset.UtcNow.AddDays(-30);
        var toDate = to ?? DateTimeOffset.UtcNow;

        var performance = await _analytics.GetDeliveryPerformanceAsync(fromDate, toDate);
        return Ok(performance);
    }
}
```
Default Date Range Logic
The delivery performance endpoint uses DateTimeOffset.UtcNow.AddDays(-30) as the default from and DateTimeOffset.UtcNow as the default to. This ensures the query always has bounds, so the aggregation is limited to a known time window instead of scanning the entire parcel history.
Route Structure
Both endpoints are under /api/analytics/:
```
GET /api/analytics/parcel-count-by-status
GET /api/analytics/delivery-performance?from=2025-01-01T00:00:00Z&to=2025-01-31T23:59:59Z
```
Response Caching Deep Dive
How [ResponseCache] Works
The [ResponseCache] attribute does not cache anything by itself. It sets the Cache-Control HTTP header on the response:
```http
Cache-Control: public, max-age=300
```
This tells three different layers to cache the response:
- The browser stores the response and reuses it for 300 seconds
- CDNs and reverse proxies (if present) cache the response for downstream clients
- ASP.NET Core response caching middleware (if configured) caches the response in server memory
Enabling Server-Side Caching
To enable the middleware that actually caches responses on the server:
```csharp
// In Program.cs
builder.Services.AddResponseCaching();

var app = builder.Build();

app.UseResponseCaching(); // Must be before endpoint mapping
app.MapControllers();
```
Without the middleware, [ResponseCache] only sets HTTP headers. The browser and proxies cache, but every request that reaches your server still executes the controller action.
VaryByQueryKeys
The delivery performance endpoint accepts from and to query parameters. The cache must treat different date ranges as different cache entries:
```csharp
[HttpGet("delivery-performance")]
[ResponseCache(Duration = 300, VaryByQueryKeys = new[] { "from", "to" })]
public async Task<ActionResult<DeliveryPerformanceDto>> GetDeliveryPerformance(
    [FromQuery] DateTimeOffset? from,
    [FromQuery] DateTimeOffset? to)
{
    // ...
}
```
Without VaryByQueryKeys, a cached response for January could be served when the client requests February data. The VaryByQueryKeys property tells the middleware to include the specified query parameters in the cache key.
Cache Profiles
If multiple endpoints share the same caching settings, define a cache profile to avoid repetition:
```csharp
builder.Services.AddControllers(options =>
{
    options.CacheProfiles.Add("Analytics", new CacheProfile
    {
        Duration = 600,
        Location = ResponseCacheLocation.Any
    });

    options.CacheProfiles.Add("RealTime", new CacheProfile
    {
        Duration = 60,
        Location = ResponseCacheLocation.Any
    });
});
```
Then reference the profile by name:
```csharp
[HttpGet("delivery-performance")]
[ResponseCache(CacheProfileName = "Analytics", VaryByQueryKeys = new[] { "from", "to" })]
public async Task<ActionResult<DeliveryPerformanceDto>> GetDeliveryPerformance(...)
```
```csharp
[HttpGet("parcel-count-by-status")]
[ResponseCache(CacheProfileName = "RealTime")]
public async Task<ActionResult<List<ParcelCountByStatusDto>>> GetParcelCountByStatus()
```
Cache Location Options
The Location property controls who can cache the response:
| Location | Cache-Control Header | Who Caches |
|---|---|---|
| Any | public | Browser, CDN, server middleware |
| Client | private | Browser only |
| None | no-cache | No one (forces revalidation) |
For analytics endpoints, Any (the default, which produces the public directive) is appropriate because the data is not user-specific. Every caller gets the same aggregated stats for a given date range.
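By contrast, a user-specific endpoint would restrict caching to the browser. A sketch (the endpoint name here is hypothetical, not part of the analytics feature):

```csharp
// Hypothetical per-user endpoint: cache in the browser only, never on
// shared proxies or the server. Emits Cache-Control: private, max-age=30.
[HttpGet("my-recent-parcels")]
[ResponseCache(Duration = 30, Location = ResponseCacheLocation.Client)]
public async Task<ActionResult<List<ParcelCountByStatusDto>>> GetMyRecentParcels()
{
    // ...
}
```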
When Not to Cache
Do not apply response caching to endpoints that:
- Return user-specific data (use Location = Client or do not cache)
- Accept POST, PUT, or DELETE requests (only GET and HEAD requests are cacheable)
- Return data that must be real-time accurate (use very short durations instead)
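For an endpoint that must never be cached at any layer, the attribute can also disable storage outright. A sketch (the endpoint name is hypothetical):

```csharp
// Never cache this response anywhere: NoStore with Location = None
// emits Cache-Control: no-store,no-cache.
[HttpGet("live-parcel-location")]
[ResponseCache(NoStore = true, Location = ResponseCacheLocation.None)]
public async Task<ActionResult<ParcelCountByStatusDto>> GetLiveParcelLocation()
{
    // ...
}
```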
Putting It All Together
Here is the complete Program.cs registration for the analytics feature:
```csharp
// Services
builder.Services.AddScoped<IAnalyticsService, AnalyticsService>();
builder.Services.AddResponseCaching();

builder.Services.AddControllers(options =>
{
    options.CacheProfiles.Add("Analytics", new CacheProfile
    {
        Duration = 600,
        Location = ResponseCacheLocation.Any
    });

    options.CacheProfiles.Add("RealTime", new CacheProfile
    {
        Duration = 60,
        Location = ResponseCacheLocation.Any
    });
});

var app = builder.Build();

// Middleware - order matters
app.UseResponseCaching();
app.MapControllers();
```
Testing the Cache Headers
Use curl -I to inspect response headers and verify caching is working:
```bash
curl -I https://localhost:5001/api/analytics/delivery-performance

HTTP/1.1 200 OK
Content-Type: application/json
Cache-Control: public, max-age=300
```
The Cache-Control header confirms the response is cacheable for 300 seconds.
Testing Cache Behavior
To verify server-side caching, watch the logs. The first request hits the controller and runs the database query. The second request (within the cache window) should return instantly without a log entry from the controller, because the middleware serves the cached response.
Error Handling for Analytics
Analytics endpoints should be resilient. If the database is empty or the date range returns no data, return a valid response with zero values rather than an error:
```csharp
// Instead of throwing when no data exists
if (totalParcels == 0)
{
    return new DeliveryPerformanceDto
    {
        From = from,
        To = to,
        TotalParcels = 0,
        Delivered = 0,
        InTransit = 0,
        Exceptions = 0,
        AverageDeliveryTimeHours = 0,
        OnTimePercentage = 0
    };
}
```
A dashboard that receives zeros can display "No data for this period." An error response forces the dashboard to handle failure states differently.
Key Takeaways
- The parcel count by status endpoint uses GroupBy(p => p.Status) with no date filter for a live snapshot
- Fill in missing enum values with zero counts so the front-end always gets a complete dataset
- [ResponseCache] sets Cache-Control headers; the response caching middleware performs server-side caching
- Use VaryByQueryKeys to cache different query parameter combinations separately
- Cache profiles centralize caching configuration for related endpoints
- Use ResponseCacheLocation.Any for non-user-specific analytics data
- Return zero-value DTOs instead of errors when the date range has no matching data
- The service layer encapsulates complex aggregation logic and keeps controllers focused on HTTP concerns