Response Caching & Pipeline Status

This presentation covers building the pipeline status endpoint, designing the analytics controller with response caching, and wiring everything together with proper DTOs and HTTP conventions.

Service Layer Implementation

The previous lesson covered the IAnalyticsService interface with two methods:

  • GetParcelCountByStatusAsync() - Returns current parcel counts by status
  • GetDeliveryPerformanceAsync(from, to) - Returns delivery statistics for a date range

These methods encapsulate the complex aggregation queries and keep the controller focused on HTTP concerns.

Response DTOs

The DTOs were defined in the previous lesson:

ParcelCountByStatusDto

```csharp
public class ParcelCountByStatusDto
{
    public string Status { get; set; } = string.Empty;
    public int Count { get; set; }
}
```

DeliveryPerformanceDto

```csharp
public class DeliveryPerformanceDto
{
    public DateTimeOffset From { get; set; }
    public DateTimeOffset To { get; set; }
    public int TotalParcels { get; set; }
    public int Delivered { get; set; }
    public int InTransit { get; set; }
    public int Exceptions { get; set; }
    public double AverageDeliveryTimeHours { get; set; }
    public double OnTimePercentage { get; set; }
}
```

The From and To fields echo back the date range that was used. This makes the response self-describing: the client can see exactly what time window the stats cover.
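With ASP.NET Core's default camelCase JSON serialization, a delivery performance response might look like this (the values are illustrative, not taken from the lesson's data):

```json
{
  "from": "2025-01-01T00:00:00+00:00",
  "to": "2025-01-31T23:59:59+00:00",
  "totalParcels": 1240,
  "delivered": 1100,
  "inTransit": 95,
  "exceptions": 45,
  "averageDeliveryTimeHours": 41.5,
  "onTimePercentage": 88.7
}
```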

Why Strings Instead of Enums?

The DTOs use string for Status instead of the enum type. This makes the JSON serialization predictable. The client receives "InTransit" instead of 2 (the integer enum value). If you use JsonStringEnumConverter globally, you could use the enum types directly, but string properties are safer across different serializer configurations.
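If you do want to expose the enum type directly, the JsonStringEnumConverter mentioned above can be registered globally so every enum serializes as its name instead of its integer value. A minimal Program.cs sketch:

```csharp
using System.Text.Json.Serialization;

// In Program.cs: serialize every enum as its string name ("InTransit" instead of 2).
builder.Services.AddControllers()
    .AddJsonOptions(options =>
        options.JsonSerializerOptions.Converters.Add(new JsonStringEnumConverter()));
```

The trade-off remains as stated above: this works only as long as every serializer in the pipeline is configured the same way, which is why string properties are the safer default.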

The Analytics Controller

Wire up both endpoints in a single controller:

```csharp
[ApiController]
[Route("api/[controller]")]
public class AnalyticsController : ControllerBase
{
    private readonly IAnalyticsService _analytics;

    public AnalyticsController(IAnalyticsService analytics)
    {
        _analytics = analytics;
    }

    [HttpGet("parcel-count-by-status")]
    [ResponseCache(Duration = 60)]
    public async Task<ActionResult<List<ParcelCountByStatusDto>>> GetParcelCountByStatus()
    {
        var counts = await _analytics.GetParcelCountByStatusAsync();
        return Ok(counts);
    }

    [HttpGet("delivery-performance")]
    [ResponseCache(Duration = 300)]
    public async Task<ActionResult<DeliveryPerformanceDto>> GetDeliveryPerformance(
        [FromQuery] DateTimeOffset? from,
        [FromQuery] DateTimeOffset? to)
    {
        var fromDate = from ?? DateTimeOffset.UtcNow.AddDays(-30);
        var toDate = to ?? DateTimeOffset.UtcNow;

        var performance = await _analytics.GetDeliveryPerformanceAsync(fromDate, toDate);
        return Ok(performance);
    }
}
```

Default Date Range Logic

The delivery performance endpoint uses DateTimeOffset.UtcNow.AddDays(-30) as the default from and DateTimeOffset.UtcNow as the default to. This ensures the query always has bounds, preventing full-table scans.

Route Structure

Both endpoints are under /api/analytics/:

```
GET /api/analytics/parcel-count-by-status
GET /api/analytics/delivery-performance?from=2025-01-01T00:00:00Z&to=2025-01-31T23:59:59Z
```

Response Caching Deep Dive

How [ResponseCache] Works

The [ResponseCache] attribute does not cache anything by itself. It sets the Cache-Control HTTP header on the response:

```http
Cache-Control: public, max-age=300
```

This tells three different layers to cache the response:

  1. The browser stores the response and reuses it for 300 seconds
  2. CDNs and reverse proxies (if present) cache the response for downstream clients
  3. ASP.NET Core response caching middleware (if configured) caches the response in server memory

Enabling Server-Side Caching

To enable the middleware that actually caches responses on the server:

```csharp
// In Program.cs
builder.Services.AddResponseCaching();

var app = builder.Build();

app.UseResponseCaching(); // Must run before endpoint mapping
app.MapControllers();
```

Without the middleware, [ResponseCache] only sets HTTP headers. The browser and proxies cache, but every request that reaches your server still executes the controller action.

VaryByQueryKeys

The delivery performance endpoint accepts from and to query parameters. The cache must treat different date ranges as different cache entries:

```csharp
[HttpGet("delivery-performance")]
[ResponseCache(Duration = 300, VaryByQueryKeys = new[] { "from", "to" })]
public async Task<ActionResult<DeliveryPerformanceDto>> GetDeliveryPerformance(
    [FromQuery] DateTimeOffset? from,
    [FromQuery] DateTimeOffset? to)
{
    // ...
}
```

Without VaryByQueryKeys, a cached response for January could be served when the client requests February data. The VaryByQueryKeys property tells the middleware to include the specified query parameters in the cache key. Note that VaryByQueryKeys works only with the response caching middleware: if UseResponseCaching is not in the pipeline, the attribute throws an InvalidOperationException at runtime.

Cache Profiles

If multiple endpoints share the same caching settings, define a cache profile to avoid repetition:

```csharp
builder.Services.AddControllers(options =>
{
    options.CacheProfiles.Add("Analytics", new CacheProfile
    {
        Duration = 600,
        Location = ResponseCacheLocation.Any
    });

    options.CacheProfiles.Add("RealTime", new CacheProfile
    {
        Duration = 60,
        Location = ResponseCacheLocation.Any
    });
});
```

Then reference the profile by name:

```csharp
[HttpGet("delivery-performance")]
[ResponseCache(CacheProfileName = "Analytics", VaryByQueryKeys = new[] { "from", "to" })]
public async Task<ActionResult<DeliveryPerformanceDto>> GetDeliveryPerformance(...)
```

```csharp
[HttpGet("parcel-count-by-status")]
[ResponseCache(CacheProfileName = "RealTime")]
public async Task<ActionResult<List<ParcelCountByStatusDto>>> GetParcelCountByStatus()
```

Cache Location Options

The Location property controls who can cache the response:

  • Any sets Cache-Control: public - the browser, CDNs, and server middleware can cache
  • Client sets Cache-Control: private - the browser only
  • None sets Cache-Control: no-cache - no one (forces revalidation)

For analytics endpoints, Any (the default) is appropriate: it maps to public, and the data is not user-specific. Every caller gets the same aggregated stats for a given date range.

When Not to Cache

Do not apply response caching to endpoints that:

  • Return user-specific data (use Location = Client or do not cache)
  • Accept POST, PUT, or DELETE requests (only GET and HEAD are cacheable)
  • Return data that must be real-time accurate (use very short durations instead)
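The user-specific and never-cache cases above can be sketched as attributes. These endpoint names are hypothetical and are not part of the analytics controller in this lesson:

```csharp
// User-specific data: only the caller's browser may cache (Cache-Control: private).
[HttpGet("my-parcels")]
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Client)]
public IActionResult GetMyParcels() => Ok(/* user-specific payload */);

// Data that must never be served stale: opt out of caching entirely
// (sends Cache-Control: no-store,no-cache).
[HttpGet("live-feed")]
[ResponseCache(NoStore = true, Location = ResponseCacheLocation.None)]
public IActionResult GetLiveFeed() => Ok(/* real-time payload */);
```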

Putting It All Together

Here is the complete Program.cs registration for the analytics feature:

```csharp
// Services
builder.Services.AddScoped<IAnalyticsService, AnalyticsService>();
builder.Services.AddResponseCaching();

builder.Services.AddControllers(options =>
{
    options.CacheProfiles.Add("Analytics", new CacheProfile
    {
        Duration = 600,
        Location = ResponseCacheLocation.Any
    });

    options.CacheProfiles.Add("RealTime", new CacheProfile
    {
        Duration = 60,
        Location = ResponseCacheLocation.Any
    });
});

var app = builder.Build();

// Middleware - order matters
app.UseResponseCaching();
app.MapControllers();
```

Testing the Cache Headers

Use curl -I to inspect response headers and verify caching is working:

```bash
curl -I https://localhost:5001/api/analytics/delivery-performance

HTTP/1.1 200 OK
Content-Type: application/json
Cache-Control: public, max-age=300
```

The Cache-Control header confirms the response is cacheable for 300 seconds.

Testing Cache Behavior

To verify server-side caching, watch the logs. The first request hits the controller and runs the database query. The second request (within the cache window) should return instantly without a log entry from the controller, because the middleware serves the cached response.
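A simple way to make the hit/miss distinction visible is a log line inside the action, assuming an ILogger&lt;AnalyticsController&gt; has been injected (the injection is not shown in the controller above):

```csharp
// This line appears in the logs only on a cache miss, when the action actually runs.
_logger.LogInformation(
    "Querying delivery performance from {From:u} to {To:u}", fromDate, toDate);
```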

Error Handling for Analytics

Analytics endpoints should be resilient. If the database is empty or the date range returns no data, return a valid response with zero values rather than an error:

```csharp
// Instead of throwing when no data exists
if (totalParcels == 0)
{
    return new DeliveryPerformanceDto
    {
        From = from,
        To = to,
        TotalParcels = 0,
        Delivered = 0,
        InTransit = 0,
        Exceptions = 0,
        AverageDeliveryTimeHours = 0,
        OnTimePercentage = 0
    };
}
```

A dashboard that receives zeros can display "No data for this period." An error response forces the dashboard to handle failure states differently.

Key Takeaways

  1. The parcel count by status endpoint uses GroupBy(p => p.Status) with no date filter for a live snapshot
  2. Fill in missing enum values with zero counts so the front-end always gets a complete dataset
  3. [ResponseCache] sets Cache-Control headers; the response caching middleware performs server-side caching
  4. Use VaryByQueryKeys to cache different query parameter combinations separately
  5. Cache profiles centralize caching configuration for related endpoints
  6. Use ResponseCacheLocation.Any for non-user-specific analytics data
  7. Return zero-value DTOs instead of errors when the date range has no matching data
  8. The service layer encapsulates complex aggregation logic and keeps controllers focused on HTTP concerns