Cache Rules are now GA: precision control over every part of your cache

2023-10-24

6 min read
This post is also available in 简体中文, 日本語, 한국어 and 繁體中文.

One year ago we introduced Cache Rules, a new way to customize cache settings on Cloudflare. Cache Rules provide greater flexibility for how users cache content, offering precise controls, a user-friendly API, and seamless Terraform integrations. Since it was released in late September 2022, over 100,000 websites have used Cache Rules to fine-tune their cache settings.

Today, we're thrilled to announce that Cache Rules, along with several other Rules products, are generally available (GA). But that's not all: we're also introducing new configuration options for Cache Rules that give you even more ways to customize how you cache on Cloudflare. These include functionality to define which resources are eligible for Cache Reserve, what timeout values should be respected when receiving data from your origin server, which custom ports we should use when we cache content, and whether we should bypass Cloudflare's cache in the absence of a cache-control header.

Cache Rules give users full control and the ability to tailor their content delivery strategy for almost any use case, without needing to write code. As Cache Rules go GA, we are incredibly excited to see how fast customers can achieve their perfect cache strategy.

History of Customizing Cache on Cloudflare

The journey of cache customization on Cloudflare began more than a decade ago, right at the beginning of the company. From the outset, one of the most frequent requests from our customers involved simplifying their configurations. Customers wanted to easily implement precise cache policies, apply robust security measures, manipulate headers, set up redirects, and more for any page on their website. Using Cloudflare to set these controls was especially crucial for customers utilizing origin servers that only provided convoluted configuration options to add headers or policies to responses, which could later be applied downstream by CDNs.

In response to this demand, we introduced Page Rules, a product that has since witnessed remarkable growth in both its popularity and functionality. Page Rules became the preferred choice for customers seeking granular control over how Cloudflare caches their content. Currently, there are over 5 million active cache-related Page Rules, assisting websites in tailoring their content delivery strategies.

However, behind the scenes, Page Rules encountered a scalability issue.

To apply Page Rules, Cloudflare must transform all of a customer's rule conditions into a single regex pattern, which is then matched against requests for the website to achieve the desired cache configuration. When you consider that the regexes from all customers are compared against tens of millions of requests per second, across more than 300 data centers worldwide, it's easy to see that the computational demands of applying Page Rules can be immense. That pressure directly limited the number of rules we could offer our users: Page Rules allowed only 125 rules to be deployed on a given website.

To address this challenge, we rebuilt all the Page Rules functionality on the new Rulesets Engine. Not only do Rulesets Engine-based products give users more rules to play with, they also offer greater flexibility on when those rules should run. Part of the magic of the Rulesets Engine is that rather than combining all of a page's rules into a single regular expression, rule logic can be evaluated conditionally. For example, if subdomains A and B have different caching policies, a request to subdomain A can be evaluated using only the regex logic specific to A (while omitting any logic that applies to B). This yields meaningful performance benefits and reduces the computational demands of applying rules across Cloudflare's network.

Over the past year, Cache Rules, along with Origin Rules, Configuration Rules, and Single Redirect Rules, have been in beta. Thanks to the invaluable support of our early adopters, we have successfully fine-tuned these products, reaching a stage where they are ready to transition from beta to GA. These products can now accomplish everything that Page Rules could and more. This also marks the beginning of the end-of-life (EOL) process for Page Rules. In the coming months we will announce timelines and information regarding how customers can replace their Page Rules with specific Rules products. We will automate this as much as possible and provide simple steps to ensure a smooth transition away from Page Rules for everyone.

How to use Cache Rules and What’s New

Those who have used Cache Rules know that they are intuitive and work similarly to our other Rulesets Engine products. User-defined criteria such as URLs or request headers are evaluated, and if they match a specified value, the corresponding Cloudflare caching configuration is applied. Each Cache Rule is built from fields, operators, and values. For all the available options, see our Cache Rules documentation.
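
To make that concrete, here are a couple of illustrative expressions, written as Python string literals so they can be dropped into an API payload later. The hostname and paths are made up; the fields, operators, and functions are the standard Rules language ones, but check the Cache Rules documentation for what is available on your plan.

```python
# Two illustrative Cache Rule expressions. Each combines fields
# (http.host, http.request.uri.path), operators or functions
# (eq, ends_with, starts_with), and values into a condition that a
# request either matches or doesn't.
cache_static_css = '(http.host eq "static.example.com" and ends_with(http.request.uri.path, ".css"))'
never_cache_api = 'starts_with(http.request.uri.path, "/api/")'
```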

Below are two examples of how to deploy different strategies to customize your cache. These examples show only the tip of the iceberg of what's possible with Cache Rules, so we encourage you to try them out and let us know what you think.

Example: Cached content is updated at a regular cadence

As an example, let's say that Acme Corp wants to update their caching strategy. They want to customize their cache to take advantage of certain request headers, using the presence of those headers as the criteria for deciding when to apply different Cache Rules. The first thing they'd need to decide is what information should trigger a specific rule. This is defined in the rule's expression.

Once the triggering criteria are defined, Acme Corp should next determine how they want to customize their cache.
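
As a rough sketch of what that could look like through the API (one of the ways Cache Rules can be deployed, alongside the dashboard and Terraform), the Python below creates a rule in the http_request_cache_settings phase that fires when a hypothetical x-acme-cache-hint request header carries a given value. The header name, zone ID, and token are placeholders, and the exact payload fields should be checked against the current Rulesets API documentation.

```python
import requests

ZONE_ID = "YOUR_ZONE_ID"      # placeholder
API_TOKEN = "YOUR_API_TOKEN"  # placeholder

# One Cache Rule: the expression decides when the rule fires, and
# action_parameters describe how matching requests should be cached.
rule = {
    "description": "Trigger on a hypothetical cache-hint request header",
    "expression": 'any(http.request.headers["x-acme-cache-hint"][*] eq "report")',
    "action": "set_cache_settings",
    "action_parameters": {
        "cache": True,  # matching requests are eligible for Cloudflare's cache
    },
}

# Cache Rules live in the zone's http_request_cache_settings phase. Note that
# a PUT to the phase entrypoint replaces the whole list of rules for that
# phase, so any existing rules should be included in the payload too.
resp = requests.put(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}"
    "/rulesets/phases/http_request_cache_settings/entrypoint",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"rules": [rule]},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["result"]["id"])
```

The snippets in the next two subsections only sketch the action_parameters portion of rules like this one.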

Content changing quickly

The most common cache strategy is to update the Edge Cache TTL. If Acme Corp thinks a particular piece of content on their website might change quickly, they can shorten the time Cloudflare considers a resource eligible to be served from cache. That way, Cloudflare goes back to the origin more frequently to revalidate and update the content. The Edge Cache TTL section is also where Acme Corp can define a resource's TTL based on the status code Cloudflare gets back from their origin, and what Cloudflare should cache if there aren't any cache-control instructions sent from Acme's origin server.
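
As a sketch, and assuming the action parameter shape shown in the API example above, a short Edge Cache TTL with per-status-code overrides might look like this. The TTL values and mode names are illustrative, so verify them against the Cache Rules reference.

```python
# Hypothetical action_parameters for fast-changing content: a short edge TTL,
# plus per-status-code overrides.
fast_changing = {
    "cache": True,
    "edge_ttl": {
        "mode": "override_origin",   # use the TTLs below instead of origin cache-control
        "default": 60,               # serve from cache for up to 60 seconds
        "status_code_ttl": [
            {"status_code": 404, "value": 10},                            # cache 404s briefly
            {"status_code_range": {"from": 500, "to": 599}, "value": 5},  # barely cache 5xx
        ],
    },
}

# Alternatively, to bypass the cache whenever the origin sends no cache-control
# header at all, an edge TTL mode along these lines (assumed from the API docs)
# expresses that.
bypass_without_cache_control = {
    "cache": True,
    "edge_ttl": {"mode": "bypass_by_default"},
}
```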

Content changing slowly

On the other hand, if Acme Corp had a lot of content that did not change very frequently (like a favicon or logo) and they preferred to serve that from Cloudflare’s cache instead of their origin, they can define which content should be eligible for Cache Reserve with a new Cache Rule. Cache Reserve reduces egress fees by storing assets persistently in Cloudflare's cache for an extended period of time.

Previously, when a user enabled Cache Reserve, their entire zone was eligible to be written to Cache Reserve. For customers that want to save origin egress fees on every resource on their website, this is still the best path forward. But for customers that want additional control over precisely which assets, or which asset sizes, should be part of their Cache Reserve, the Cache Reserve Eligibility rule provides additional knobs to increase cache hits and reduce origin egress in a customized manner. Note that this rule requires a Cache Reserve subscription.
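
A sketch of what those action parameters could look like, again assuming the Rulesets API field names; the zero-byte threshold is only there so that small assets like a favicon qualify, and the setting has an effect only with a Cache Reserve subscription.

```python
# Hypothetical Cache Reserve eligibility settings for slow-changing assets
# (favicon, logo, etc.) matched by the rule's expression.
slow_changing = {
    "cache": True,
    "cache_reserve": {
        "eligible": True,        # allow matching assets into Cache Reserve
        "minimum_file_size": 0,  # no size floor, so even tiny assets qualify
    },
}
```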

Example: Origin is slow

Let’s consider a hypothetical example. Recently, Acme Corp has been seeing an increase in errors in their Cloudflare logs. These errors are related to a new report that Acme provides to its users based on Acme’s proprietary data. Generating the report requires their origin to query several databases, perform some calculations, and build the report from the results, and the origin must wait for all of this background work to finish before it can respond. Acme’s report is a success, generating an influx of traffic from visitors wanting to see it, but their origin is struggling to keep up. Many of the errors they are seeing are 524s, which indicate that Cloudflare did not receive an origin response before a timeout occurred.

Acme has plans to improve this by scaling their origin infrastructure, but that will take time to deploy. In the meantime, they can turn to Cache Rules to configure a longer timeout. Historically, the timeout between two successive reads from the origin was fixed at 100 seconds, meaning that if the origin went more than 100 seconds without sending data, Cloudflare could return a 524 error. By using a Cache Rule to extend this timeout, Acme Corp can avoid those errors while relying more heavily on Cloudflare's cache.
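
A sketch of such a rule, assuming a read_timeout field expressed in seconds (check the Cache Rules reference for the exact name and limits) and an illustrative /reports/ path:

```python
# Hypothetical rule giving Acme's slow report endpoint more time to respond
# before Cloudflare gives up and returns a 524.
slow_report_rule = {
    "description": "Extend the origin read timeout for report generation",
    "expression": 'http.request.uri.path contains "/reports/"',  # illustrative path
    "action": "set_cache_settings",
    "action_parameters": {
        "cache": True,
        "read_timeout": 300,  # wait up to 300 seconds between successive origin reads
    },
}
```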

The above cache strategies focus on how often a resource changes at the origin and on the origin's performance. But numerous other rules enable other strategies: custom cache keys let customers control how cached content is keyed on Cloudflare, respecting strong ETags helps customers control when Cloudflare should revalidate particular cached assets, and custom ports let customers define non-standard ports that Cloudflare should use when making caching decisions about content.
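
As rough sketches only, and with the caveat that the field names below (cache_key, respect_strong_etags, additional_cacheable_ports) are taken from the Rulesets API and should be verified against the Cache Rules reference, those settings might appear in a rule's action parameters like this:

```python
# Hypothetical action_parameters combining a custom cache key, strong ETag
# revalidation, and an extra cacheable port.
other_strategies = {
    "cache": True,
    "cache_key": {
        "cache_by_device_type": True,      # keep separate cache entries per device type
    },
    "respect_strong_etags": True,          # revalidate using strong ETag comparison
    "additional_cacheable_ports": [8443],  # also consider content served on port 8443
}
```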

The full list of Cache Rules can be found here.

Try Cache Rules today!

We will continue to build and release additional rules that provide powerful, easy-to-enable controls for anyone using Cloudflare’s cache. If you have feature requests for additional Cache Rules, please let us know in the Cloudflare Community.

Go to the dashboard and try Cache Rules out today!
