
Maximize Your Application's Speed and Efficiency with ASP.NET Caching Tips

Application speed and efficiency are essential to providing a positive user experience. Slow loading times and poor performance can quickly turn users away, costing your business valuable traffic and revenue. Fortunately, ASP.NET offers a powerful caching system that can significantly improve your application's performance. In this article, we will show you how to maximize your application's speed and efficiency with these ASP.NET caching tips.


However, simply implementing caching is not enough to ensure optimal results. To get the most out of ASP.NET caching, you need to follow certain best practices and techniques that have been tried and tested by seasoned developers.


So, let's start by understanding what caching is and the different types of caching in ASP.NET. After that, we will discuss some of the most effective ASP.NET caching tips and best practices that will help you maximize your application's speed and efficiency.


What is Caching in ASP.NET?

Caching in ASP.NET is the ability to store a page or data in memory for rapid access. It can improve the performance and responsiveness of your web applications by reducing the load on the server and database. Caching lets you store page output or application data on the server or the client, and the cached information is then used to serve subsequent requests, avoiding the overhead of recreating the same information. It works best with relatively static content, because fully dynamic data has to be regenerated for every request made to the server.


Why is Caching Important in ASP.NET Applications?

Caching plays an important role in the performance and scalability of ASP.NET applications because it can reduce the load on the server and database by storing frequently accessed data or page output in memory. This allows the web server to process the request faster and reuse the cached information without recreating it. This can improve the responsiveness of your application and make it more scalable for a large number of users or requests. Caching can also make data available when the data source is temporarily unavailable.


Caching can significantly improve the performance and scalability of your ASP.NET applications, especially on multi-core machines, where partitioning the cache can reduce contention and improve throughput.


Types of caching in ASP.NET

Below are the different types of caching available in ASP.NET:


1. In-memory caching uses server memory to store cached data. This type of caching is suitable for a single server, or for multiple servers using session affinity. Session affinity (sticky sessions) means that requests from a client are always routed to the same server for processing. In ASP.NET Core, you implement in-memory caching with the IMemoryCache interface and MemoryCache class in the Microsoft.Extensions.Caching.Memory namespace (the System.Runtime.Caching MemoryCache class serves the same purpose in .NET Framework code). A minimal sketch appears after this list.


2. Distributed caching uses a distributed cache service to store data in memory when the app is hosted in a cloud or server farm. The cache is shared across the servers that process requests. A client can submit a request that’s handled by any server in the group if cached data for the client is available. ASP.NET Core works with SQL Server, Redis, and NCache distributed caches. You can use the IDistributedCache interface in Microsoft.Extensions.Caching.Distributed namespace to implement distributed caching.


3. Distributed cache tag helper caches the content from an MVC view or Razor Page in the distributed cloud or web farm scenarios. The distributed cache tag helper uses SQL Server, Redis, or NCache to store data. You can use the <distributed-cache> tag helper in your views or pages to enable distributed cache tag helper.


4. Cache tag helper caches the content from an MVC view or Razor Page with the cache tag helper. The cache tag helper uses in-memory caching to store data. You can use the <cache> tag helper in your views or pages to enable cache tag helper.


5. Response caching enables caching server responses based on HTTP cache headers. It implements the standard HTTP caching semantics and caches based on HTTP cache headers like proxies do. It is typically not beneficial for UI apps such as Razor Pages because browsers generally set request headers that prevent caching. It may be beneficial for public GET or HEAD API requests from clients where the conditions for caching are met. You can enable it with the AddResponseCaching and UseResponseCaching extension methods (the middleware lives in the Microsoft.AspNetCore.ResponseCaching namespace).


6. Output caching enables caching of HTTP responses. Output caching differs from response caching in that the caching behavior is configurable on the server, while response caching behavior is defined by HTTP headers. Output caching benefits UI apps because configuration decides what should be cached independently of HTTP headers. You can enable it with the AddOutputCache and UseOutputCache extension methods (Microsoft.AspNetCore.OutputCaching namespace, available in ASP.NET Core 7.0 and later).
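
As a minimal sketch of the in-memory case from item 1 (the "weather" key and value are purely illustrative), a cache entry can be set and read back like this:

using Microsoft.Extensions.Caching.Memory;

// Create (or, in ASP.NET Core, inject) an IMemoryCache instance
IMemoryCache cache = new MemoryCache(new MemoryCacheOptions());

// Store a value for one minute, then read it back
cache.Set("weather", "sunny", TimeSpan.FromMinutes(1));
string? value = cache.Get<string>("weather"); // "sunny" until the entry expires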


Maximize Your Application's Speed and Efficiency with ASP.NET Caching Tips


1. Use IMemoryCache and IDistributedCache interfaces

IMemoryCache is suitable for single-server or session-affinity scenarios, where the cached data is stored by the app instance on the server where the app is running.


IDistributedCache is suitable for multi-server or cloud scenarios, where the cached data is shared across the servers that process requests.


By using these interfaces, you can:

  • Switch between different cache implementations without changing your code.

  • Take advantage of dependency injection to inject the cache services into your classes.

  • Use the same methods and options to get and set values in the cache.

  • Use cache dependencies, callbacks, and expiration policies to control the cache behavior.

  • Improve the performance and scalability of your applications by reducing the load on the server and database.


To use the IMemoryCache and IDistributedCache interfaces, you may need to add the following NuGet packages to your project (in ASP.NET Core web apps, the first two are already available through the shared framework):

  • Microsoft.Extensions.Caching.Memory for IMemoryCache and its in-memory implementation

  • Microsoft.Extensions.Caching.Abstractions for the IMemoryCache and IDistributedCache interfaces themselves (pulled in transitively by the implementation packages)

  • Microsoft.Extensions.Caching.SqlServer, Microsoft.Extensions.Caching.StackExchangeRedis, or NCache.Microsoft.Extensions.Caching.OpenSource for distributed cache implementations.

Then, register the services in your Program.cs file using AddMemoryCache and one of the distributed cache registration methods, such as AddStackExchangeRedisCache, AddDistributedSqlServerCache, or AddDistributedMemoryCache. For example:

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // Register IMemoryCache service
        services.AddMemoryCache();

        // Register IDistributedCache service with the Redis implementation
        // (from the Microsoft.Extensions.Caching.StackExchangeRedis package)
        services.AddStackExchangeRedisCache(options =>
        {
            options.Configuration = "localhost";
            options.InstanceName = "SampleInstance";
        });
    })
    .Build();

To use the cache interfaces in your classes, you need to inject them using dependency injection. For example:

using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;

public class MyService
{
    private readonly IMemoryCache _memoryCache;
    private readonly IDistributedCache _distributedCache;

    public MyService(IMemoryCache memoryCache, IDistributedCache distributedCache)
    {
        _memoryCache = memoryCache;
        _distributedCache = distributedCache;
    }

    // Use the cache interfaces in your methods
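
    // A minimal sketch of using both interfaces inside a method. The "products:all" key,
    // the five-minute expirations, and the LoadProductsFromDbAsync helper are illustrative
    // assumptions, not framework APIs.
    public async Task<string> GetProductsAsync()
    {
        // In-memory cache: return the value if cached, otherwise create it and cache it for 5 minutes
        var products = await _memoryCache.GetOrCreateAsync("products:all", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return LoadProductsFromDbAsync();
        });

        // Distributed cache: same idea, but values are stored as strings/bytes and shared across servers
        string? shared = await _distributedCache.GetStringAsync("products:all");
        if (shared is null)
        {
            await _distributedCache.SetStringAsync("products:all", products,
                new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
                });
            shared = products;
        }
        return shared;
    }

    // Hypothetical data source used by the sketch above
    private Task<string> LoadProductsFromDbAsync() => Task.FromResult("[]");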
}

2. Use Cache Tag Helper and Distributed Cache Tag Helper

You should use cache tag helper and distributed cache tag helper because they provide a convenient way to cache the content of your views or pages in ASP.NET Core applications.


By using these tag helpers, you can:

  • Improve the performance of your applications by reducing the load on the server and database.

  • Control the cache behavior using attributes such as expires-on, expires-after, expires-sliding, vary-by, and priority.

  • Choose between in-memory caching or distributed caching depending on your scenario and requirements.

  • Use Razor syntax to mark the sections of your markup that you want to cache.

Both tag helpers live in Microsoft.AspNetCore.Mvc.TagHelpers, which ships with the ASP.NET Core shared framework, so no extra NuGet package is normally required. The Distributed Cache Tag Helper stores its content through IDistributedCache, so for web farm scenarios you should also register a distributed cache implementation, for example:

  • Microsoft.Extensions.Caching.StackExchangeRedis for Redis

  • Microsoft.Extensions.Caching.SqlServer for SQL Server

If no distributed cache is registered, the Distributed Cache Tag Helper falls back to the default in-memory distributed cache, which is not shared across servers.

Then, register Razor Pages (or MVC) and, if needed, a distributed cache in your Program.cs file. For example:

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Register Razor Pages (the cache tag helpers ship with the framework)
builder.Services.AddRazorPages();

// Optional: register a distributed cache for the Distributed Cache Tag Helper
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost";
    options.InstanceName = "SampleInstance";
});

var app = builder.Build();

// Configure middleware pipeline
app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();
app.UseAuthorization();
app.MapRazorPages();
app.Run();

To use the tag helpers in your views or pages, make sure the following directive is present (the default _ViewImports.cshtml already includes it):

@addTagHelper *, Microsoft.AspNetCore.Mvc.TagHelpers

Then, you can use the <cache> and <distributed-cache> tags in your markup. For example:

<cache expires-after="@TimeSpan.FromMinutes(5)">
    <h1>This content will be cached for 5 minutes</h1>
</cache>

<distributed-cache name="my-cache" expires-after="@TimeSpan.FromMinutes(10)">
    <h1>This content will be cached in a distributed cache for 10 minutes</h1>
</distributed-cache>
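
You can also combine expiration and vary-by attributes. For example, the following sketch (the "category" route value name is illustrative) caches a separate copy of the fragment per category, with a two-minute sliding expiration:

<cache vary-by-route="category" expires-sliding="@TimeSpan.FromMinutes(2)">
    <h2>Products for the current category</h2>
</cache>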

3. Use Response Caching and Output Caching Middleware

You should use response caching and output caching middleware because they provide different ways to cache server responses based on HTTP cache headers or configuration settings. By using this middleware, you can:

  • Improve the performance of your applications by reducing the load on the server and database.

  • Control the cache behavior using the [ResponseCache] attribute (Duration, Location, VaryByHeader, VaryByQueryKeys, NoStore) for response caching, or the [OutputCache] attribute and named policies for output caching.

  • Choose between response caching or output caching depending on your scenario and requirements.

  • Use middleware configuration to enable and customize caching.

Response caching middleware is part of the ASP.NET Core shared framework, so recent versions need no separate package reference (older projects can add the Microsoft.AspNetCore.ResponseCaching package). Register the middleware in your Program.cs file using the AddResponseCaching and UseResponseCaching methods. For example:

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Register response caching middleware service
builder.Services.AddResponseCaching();

var app = builder.Build();

// Configure middleware pipeline
app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();
app.UseAuthorization();

// Use response caching middleware
app.UseResponseCaching();

app.MapRazorPages();
app.Run();
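
The response caching middleware only stores responses that carry suitable HTTP cache headers. A common way to set them is the [ResponseCache] attribute on a controller action; the controller, route, and 60-second duration below are an illustrative sketch, not part of the middleware itself:

using Microsoft.AspNetCore.Mvc;

public class NewsController : Controller
{
    // Emits "Cache-Control: public,max-age=60" so the response caching middleware
    // (and any downstream proxies) may cache this response for 60 seconds.
    [HttpGet("/news/headlines")]
    [ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any)]
    public IActionResult Headlines()
    {
        return Ok(new[] { "headline 1", "headline 2" });
    }
}

If your app only maps Razor Pages, you would also need to register and map controllers for this route to work.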

Output caching middleware ships with ASP.NET Core 7.0 and later as part of the shared framework, so no separate package reference is needed. Register the middleware in your Program.cs file using the AddOutputCache and UseOutputCache methods. For example:

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Register output caching middleware service
builder.Services.AddOutputCache();

var app = builder.Build();

// Configure middleware pipeline
app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();
app.UseAuthorization();

// Use output caching middleware
app.UseOutputCache();

app.MapRazorPages();
app.Run();
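
Unlike response caching, the rules for output caching live in server-side configuration. Extending the Program.cs above, the following sketch defines a named policy and applies it to a minimal-API endpoint (the policy name, duration, and /time route are illustrative):

// Replace the AddOutputCache() call above with a configured version
builder.Services.AddOutputCache(options =>
{
    // A named policy with a 60-second lifetime
    options.AddPolicy("Expire60", policy => policy.Expire(TimeSpan.FromSeconds(60)));
});

// Apply the named policy to an endpoint
// (MVC actions and Razor Pages can use [OutputCache(PolicyName = "Expire60")] instead)
app.MapGet("/time", () => DateTime.UtcNow.ToString("O"))
   .CacheOutput("Expire60");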


4. Use the Wrapper function to access the cache

A wrapper function in ASP.NET caching is a function that encapsulates the logic of getting and setting values in the cache. Using a wrapper function to access the cache can provide a consistent and convenient way to get and set values in the cache. By using a wrapper function, you can:

  • Abstract away the details of the cache implementation and interface.

  • Simplify the code that interacts with the cache by avoiding repeated checks and casts.

  • Ensure that the cache is always used consistently and correctly by following a standard pattern.

  • Handle common scenarios such as cache miss, expiration, or eviction with a single function call.

To use a wrapper function to access the cache, define a method that takes the cache interface, the cache key, and a delegate that produces the value on a cache miss. Inside the method, use the cache interface to get and set values, and use additional parameters or options to customize the cache behavior. For example, a wrapper built on IMemoryCache:

using System;
using Microsoft.Extensions.Caching.Memory;

public static class CacheWrapper
{
    // Wrapper that encapsulates the get-or-compute pattern:
    // return the cached value if present; otherwise compute it, cache it, and return it.
    public static TValue GetOrCompute<TValue>(
        this IMemoryCache cache,
        string cacheKey,
        Func<TValue> computeValue,
        TimeSpan? absoluteExpiration = null)
    {
        // Try to get the value from the cache using the cache key
        if (cache.TryGetValue(cacheKey, out TValue cachedValue))
        {
            // Cache hit: return the cached value
            return cachedValue;
        }

        // Cache miss: compute the value using the supplied logic
        TValue computedValue = computeValue();

        // Set the value in the cache, optionally with an absolute expiration
        var options = new MemoryCacheEntryOptions();
        if (absoluteExpiration.HasValue)
        {
            options.SetAbsoluteExpiration(absoluteExpiration.Value);
        }
        cache.Set(cacheKey, computedValue, options);

        // Return the computed value
        return computedValue;
    }
}

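A hypothetical caller then shrinks to a single call (the cache key, LoadProductsFromDb loader, and five-minute expiration are illustrative):

// Returns the cached product list if present; otherwise loads it and caches it for 5 minutes
var products = _memoryCache.GetOrCompute("products:all", LoadProductsFromDb, TimeSpan.FromMinutes(5));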

5. Use Versioning and cache groups

Versioning is a technique that involves adding a unique identifier to the URL of a cached file, such as a number, a string, or a hash. This identifier changes whenever the file changes, forcing the browser to download a new version of the file instead of using the old one from the cache. This is also known as cache busting.


Cache groups are a technique that involves grouping your cached files by their type, purpose, or dependency. This can help you optimize the caching behavior and performance of your application. For example, you can use cache groups to:

  • Apply different cache expiration policies for different types of files, such as images, scripts, or styles.

  • Evict multiple cache entries at once based on a common condition or event, such as a change in a database table or a user action.

  • Store and retrieve cache entries using a shared data structure or storage mechanism, such as a partition or a distributed cache service.


Using versioning and cache groups can help you manage and update your cached content more efficiently. By using versioning and cache groups, you can:

  • Force the browser to download a new version of a cached file when it has been changed, instead of using the old version from the cache. This is also known as cache busting.

  • Use a consistent naming scheme for your cached files that indicates their version number and stability level. This can help you track changes and identify bugs in your code.

  • Group your cached files by their type, purpose, or dependency. This can help you optimize the caching behavior and performance of your application.

  • Use different cache expiration policies for different cache groups. This can help you balance between freshness and efficiency of your cached content.

To use versioning and cache groups, you need to configure your caching system to use a version identifier and a cache group name for each cached file. The version identifier can be a number, a string, or a hash that changes when the file changes. The cache group name can be a string that describes the category or function of the file. For example:

// Define a version identifier and a cache group name for each file
const CACHE_VERSION = "v1.0.0";
const CACHE_GROUPS = {
  images: "images",
  scripts: "scripts",
  styles: "styles",
};

// Define an array of files to cache
const FILES_TO_CACHE = [
  {
    url: "/images/logo.png",
    // Append the version identifier and the cache group name to the url
    cacheKey: `/images/logo.png?version=${CACHE_VERSION}&group=${CACHE_GROUPS.images}`,
  },
  {
    url: "/scripts/main.js",
    // Append the version identifier and the cache group name to the url
    cacheKey: `/scripts/main.js?version=${CACHE_VERSION}&group=${CACHE_GROUPS.scripts}`,
  },
  {
    url: "/styles/main.css",
    // Append the version identifier and the cache group name to the url
    cacheKey: `/styles/main.css?version=${CACHE_VERSION}&group=${CACHE_GROUPS.styles}`,
  },
];

// Use the cacheKey to store and retrieve files from the cache
// For example, using a service worker
self.addEventListener("install", (event) => {
  // Open a cache with the version identifier as part of the name
  event.waitUntil(
    caches.open(`my-cache-${CACHE_VERSION}`).then((cache) => {
      // Add all files to cache using their cache keys
      return cache.addAll(FILES_TO_CACHE.map((file) => file.cacheKey));
    })
  );
});

self.addEventListener("fetch", (event) => {
  // Intercept requests for files
  event.respondWith(
    // Try to match the request with a cached file using its cache key
    caches.match(event.request.url + `?version=${CACHE_VERSION}`).then((response) => {
      // If there is a match, return the cached response
      if (response) {
        return response;
      }
      // Otherwise, fetch the file from the network and update the cache
      return fetch(event.request).then((response) => {
        // Clone the response to avoid consuming it
        let responseToCache = response.clone();
        // Open the cache with the version identifier as part of the name
        caches.open(`my-cache-${CACHE_VERSION}`).then((cache) => {
          // Add the file to the cache using its cache key
          cache.put(event.request.url + `?version=${CACHE_VERSION}`, responseToCache);
        });
        // Return the original response
        return response;
      });
    })
  );
});
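
If all you need is cache busting for static files in ASP.NET Core views, the built-in asp-append-version tag helper attribute achieves the same effect without hand-built keys by appending a content hash to the file URL:

<script src="~/scripts/main.js" asp-append-version="true"></script>
<link rel="stylesheet" href="~/styles/main.css" asp-append-version="true" />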

6. Use a Background Service such as IHostedService

IHostedService is an interface that defines two methods for objects that are managed by the host:

  • StartAsync (CancellationToken) contains the logic to start the background task. StartAsync is called before the app’s request processing pipeline is configured and the server is started.

  • StopAsync (CancellationToken) contains the logic to end the background task. StopAsync is triggered when the host is performing a graceful shutdown.

Using IHostedService can help you implement long-running tasks that run in the background of your application. By using IHostedService, you can:

  • Define the logic to start and stop the background task using the StartAsync and StopAsync methods.

  • Register the background service with dependency injection and logging using the AddHostedService extension method.

  • Use the host lifetime events and cancellation tokens to gracefully handle app shutdown scenarios.

  • Use the BackgroundService base class or the Worker Service template to simplify the implementation of common background tasks such as timers, queues, or hosted web services.

To use IHostedService, you need to create a class that implements the IHostedService interface and defines the StartAsync and StopAsync methods. Then, you need to register the class as a hosted service in your Program.cs file using the AddHostedService extension method. For example:

// MyBackgroundService.cs
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class MyBackgroundService : IHostedService
{
    // Define any dependencies that you need for your background task
    private readonly ILogger<MyBackgroundService> _logger;

    // Use constructor injection to get the dependencies
    public MyBackgroundService(ILogger<MyBackgroundService> logger)
    {
        _logger = logger;
    }

    // Define the logic to start the background task
    public Task StartAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("My background service is starting.");

        // Do some initialization or start a timer or a queue
        return Task.CompletedTask;
    }

    // Define the logic to stop the background task
    public Task StopAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("My background service is stopping.");

        // Do some cleanup or stop a timer or a queue
        return Task.CompletedTask;
    }
}

// Program.cs
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // Register your background service as a hosted service
        services.AddHostedService<MyBackgroundService>();
    })
    .Build();

await host.RunAsync();
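
In the context of caching, a common pattern is a hosted service that refreshes a cache entry on a schedule so that requests never pay the reload cost. Here is a minimal sketch using the BackgroundService base class; the cache key, five-minute interval, and LoadProductsFromDbAsync loader are illustrative assumptions:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Hosting;

public class CacheRefreshService : BackgroundService
{
    private readonly IMemoryCache _cache;

    public CacheRefreshService(IMemoryCache cache) => _cache = cache;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        using var timer = new PeriodicTimer(TimeSpan.FromMinutes(5));
        try
        {
            do
            {
                // Reload the data and overwrite the cache entry before it goes stale
                var products = await LoadProductsFromDbAsync(stoppingToken);
                _cache.Set("products:all", products);
            }
            while (await timer.WaitForNextTickAsync(stoppingToken));
        }
        catch (OperationCanceledException)
        {
            // Expected during graceful shutdown
        }
    }

    // Hypothetical data source used by the sketch above
    private static Task<string> LoadProductsFromDbAsync(CancellationToken ct)
        => Task.FromResult("[]");
}

// Register it alongside the memory cache:
// services.AddMemoryCache();
// services.AddHostedService<CacheRefreshService>();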

7. Use PostEvictionCallbacks to run code after a cache entry is evicted

PostEvictionCallbacks is a property of the MemoryCacheEntryOptions class that allows you to set callbacks that will be fired after a cache entry is evicted from the cache. These callbacks can help you perform actions or clean up when a cache entry is no longer available. By using PostEvictionCallbacks, you can:

  • Get notified when a cache entry is evicted due to expiration, capacity, dependency, or manual removal.

  • Access the cache key, value, reason, and state of the evicted cache entry.

  • Log or report the eviction event for debugging or monitoring purposes.

  • Reload or refresh the cache entry from another source if needed.

To use PostEvictionCallbacks, you need to create a delegate that takes the cache key, value, reason, and state as parameters. Then, you need to register the delegate with the cache entry options using the RegisterPostEvictionCallback method. For example:

using Microsoft.Extensions.Caching.Memory;

public class MyService
{
    private readonly IMemoryCache _cache;

    public MyService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public void SetCacheEntry(string key, string value)
    {
        // Define a delegate that will be called when the cache entry is evicted
        void EvictionCallback(object k, object v, EvictionReason r, object s)
        {
            // Do something when the cache entry is evicted
            Console.WriteLine($"Cache entry with key {k} and value {v} was evicted due to {r}.");
        }

        // Define a cache entry options with an expiration policy
        var options = new MemoryCacheEntryOptions()
            .SetAbsoluteExpiration(TimeSpan.FromSeconds(10));

        // Register the eviction callback with the options
        options.RegisterPostEvictionCallback(EvictionCallback);

        // Set the cache entry with the key, value, and options
        _cache.Set(key, value, options);
    }
}

8. Use CancellationTokenSource

CancellationTokenSource is a class that represents a source of cancellation tokens. It can be used to signal to one or more cancelable operations that they should be cancelled. This allows multiple cache entries to be evicted as a group. Using CancellationTokenSource in ASP.NET caching can help you invalidate or evict cache entries based on some conditions or events. By using CancellationTokenSource, you can:

  • Create a cancellation token that can be triggered manually or automatically when a timeout or cancellation request occurs.

  • Pass the cancellation token to the cache entry options using the AddExpirationToken method.

  • Register one or more cache entries with the same cancellation token to evict them as a group.

  • Use the PostEvictionCallbacks to perform some actions or clean up when the cache entry is evicted.

To use CancellationTokenSource in ASP.NET caching, create an instance of CancellationTokenSource and get its token from the Token property, wrap that token in a CancellationChangeToken, and add the change token to the cache entry options using the AddExpirationToken method. For example:

using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

public class MyService
{
    private readonly IMemoryCache _cache;

    public MyService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public void SetCacheEntry(string key, string value)
    {
        // Define a delegate that will be called when the cache entry is evicted
        void EvictionCallback(object k, object v, EvictionReason r, object s)
        {
            // Do something when the cache entry is evicted
            Console.WriteLine($"Cache entry with key {k} and value {v} was evicted due to {r}.");
        }

        // Define a cache entry options with an expiration policy
        var options = new MemoryCacheEntryOptions()
            .SetAbsoluteExpiration(TimeSpan.FromSeconds(10));

        // Create a cancellation token source that can be triggered manually
        var cts = new CancellationTokenSource();

        // Create a cancellation change token that wraps the cancellation token
        var cct = new CancellationChangeToken(cts.Token);

        // Add the cancellation change token to the cache entry options
        options.AddExpirationToken(cct);

        // Register the eviction callback with the options
        options.RegisterPostEvictionCallback(EvictionCallback);

        // Set the cache entry with the key, value, and options
        _cache.Set(key, value, options);

        // Trigger the cancellation token source to evict every entry registered with this token
        // (cancelled immediately here only for demonstration; normally this happens later,
        // for example when the underlying data changes)
        cts.Cancel();
    }
}
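
To actually evict entries as a group, keep the CancellationTokenSource around (for example in a field) and reuse its token for every entry in the group; cancelling it then evicts them all at once. A sketch of that pattern (the class name, keys, and group concept are illustrative):

using System.Threading;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

public class ProductCacheGroup
{
    private readonly IMemoryCache _cache;

    // One token source per logical cache group; cancelling it evicts the whole group
    private CancellationTokenSource _groupCts = new CancellationTokenSource();

    public ProductCacheGroup(IMemoryCache cache) => _cache = cache;

    public void SetEntry(string key, string value)
    {
        var options = new MemoryCacheEntryOptions()
            .AddExpirationToken(new CancellationChangeToken(_groupCts.Token));

        _cache.Set(key, value, options);
    }

    public void InvalidateGroup()
    {
        // Evict every entry that was registered with the current token...
        _groupCts.Cancel();
        _groupCts.Dispose();

        // ...and start a fresh token source for entries added afterwards
        _groupCts = new CancellationTokenSource();
    }
}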

9. Use SetSize, Size, and SizeLimit to limit cache size

You should use SetSize, Size, and SizeLimit to limit cache size because they can help you control the memory consumption of your cache and prevent it from growing too large. By using SetSize, Size, and SizeLimit, you can:

  • Specify a size limit for the entire cache using the SizeLimit property of the MemoryCacheOptions class. This is the maximum amount of memory that the cache can use, measured in units defined by the developer.

  • Specify a size for each cache entry using the SetSize method of the MemoryCacheEntryOptions class. This is the amount of memory that the cache entry uses, measured in units defined by the developer.

  • Get the size of each cache entry using the Size property of the MemoryCacheEntryOptions class. This is the same value that was set by the SetSize method.

  • Inspect the current size of the cache with MemoryCache.GetCurrentStatistics() in .NET 7 or later (when TrackStatistics is enabled in MemoryCacheOptions); its CurrentEstimatedSize is the sum of the sizes of all cache entries.

To use SetSize, Size, and SizeLimit to limit cache size, you need to configure your cache options with a size limit and set a size for each cache entry. Note that once SizeLimit is set, every entry added to that cache must specify a size, and an entry is not cached if adding it would push the cache over the limit. For example:

using Microsoft.Extensions.Caching.Memory;

public class MyService
{
    private readonly IMemoryCache _cache;

    public MyService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public void SetCacheEntry(string key, string value)
    {
        // Define cache options with a size limit of 100 units
        var options = new MemoryCacheOptions()
        {
            SizeLimit = 100
        };

        // Create a new cache instance with the options
        // (for demonstration; normally the size limit is set where the cache is registered)
        var cache = new MemoryCache(options);

        // Define cache entry options with a size of 10 units
        var entryOptions = new MemoryCacheEntryOptions()
            .SetSize(10);

        // Set the cache entry with the key, value, and options
        cache.Set(key, value, entryOptions);
    }
}
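
In practice, the size limit is usually configured once, where the shared IMemoryCache is registered, instead of creating a new MemoryCache on every call. A sketch of that registration in Program.cs (the 100-unit limit is illustrative):

// Give the shared IMemoryCache a size limit of 100 units
builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 100;
});

// Every entry added to that cache must then declare its size, for example:
// _cache.Set(key, value, new MemoryCacheEntryOptions().SetSize(10));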

Conclusion

Caching is a great tool for improving the speed and efficiency of ASP.NET applications. By storing frequently accessed data in memory or on disk, caching can reduce the time and resources required to retrieve data from a database or other external source. In addition to selecting the appropriate caching strategy, there are best practices to follow when implementing caching in your ASP.NET application. These include setting appropriate cache expiration times, using a consistent cache key naming convention, and monitoring cache usage to ensure that it is having the desired impact on application performance.
