
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how products at Cloudflare are built, technologies used, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Sat, 04 Apr 2026 08:12:26 GMT</lastBuildDate>
        <item>
            <title><![CDATA[Crawler Hints supports Microsoft’s IndexNow in helping users find new content]]></title>
            <link>https://blog.cloudflare.com/crawler-hints-supports-microsofts-indexnow-in-helping-users-find-new-content/</link>
            <pubDate>Fri, 12 Aug 2022 16:30:20 GMT</pubDate>
            <description><![CDATA[ Cloudflare is uniquely positioned to help give crawlers hints about when they should recrawl, if new content has been added, or if content on a site has recently changed ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/h93ft6NJSg9cczGxRWGu9/eba65d160f1d0701c350a6bc92a21acb/image2-9.png" />
            
            </figure><p>The web is constantly changing. Whether it’s news or updates to your social feed, it’s a constant flow of information. As a user, that’s great. But have you ever stopped to think how search engines deal with all the change?</p><p>It turns out, they “index” the web on a regular basis — sending bots out to constantly crawl webpages, looking for changes. Today, bot traffic accounts for about <a href="https://radar.cloudflare.com/?date_filter=last_30_days">30% of total traffic</a> on the Internet, and given how foundational search is to using the Internet, it should come as no surprise that search engine bots make up a large proportion of that. What might come as a surprise, though, is how inefficient the model is: we estimate that over <a href="/crawler-hints-how-cloudflare-is-reducing-the-environmental-impact-of-web-searches/">50% of crawler traffic is wasted effort</a>.</p><p>This has a huge impact. There’s all the additional capacity that owners of websites need to bake into their sites to absorb the bots crawling all over them. There’s the transmission of the data. There’s the CPU cost of running the bots. And when you’re running at the scale of the Internet, all of this has a pretty big environmental footprint.</p><p>Part of the problem, though, is that nobody had really stopped to ask: maybe there’s a better way?</p><p>Right now, the model for indexing websites is the same as it has been since the 1990s: a “pull” model, where the search engine sends a crawler out to a website after a predetermined amount of time. During Impact Week last year, we asked: what about flipping the model on its head? What about moving to a push model, where a website could simply ping a search engine to let it know an update had been made?</p><p>There are a heap of advantages to such a model. The website wins: it’s not dealing with unnecessary crawls. It also makes sure that as soon as there’s an update to its content, it’s reflected in the search engine — it doesn’t need to wait for the next crawl. The website owner wins because they don’t need to manage distinct search engine crawl submissions. The search engine wins, too: it saves money on crawl costs, and it can make sure it gets the latest content.</p><p>Of course, this requires work on both sides of the equation: websites need a mechanism to alert the search engines, and the search engines need a mechanism to receive the alert so they know when to crawl.</p>
    <div>
      <h3>Crawler Hints — Cloudflare’s Solution for Websites</h3>
      <a href="#crawler-hints-cloudflares-solution-for-websites">
        
      </a>
    </div>
    <p>Solving this problem is why we <a href="/crawler-hints-how-cloudflare-is-reducing-the-environmental-impact-of-web-searches/">launched Crawler Hints</a>. Cloudflare sits in a unique position on the Internet — we’re serving on average 36 million HTTP requests per second. That represents <i>a lot of websites</i>. It also means we’re uniquely positioned to help solve this problem: to help give crawlers hints about when they should recrawl, if new content has been added, or if content on a site has recently changed.</p><p>With Crawler Hints, we send signals to web indexers based on cache data and origin status codes to help them understand when content has likely changed or been added to a site. The aim is to increase the number of relevant crawls and drastically reduce the number of crawls that don’t find fresh content, saving bandwidth and compute for both indexers and sites alike, and improving the experience of using the search engines.</p><p>But, of course, that’s just half the equation.</p>
    <div>
      <h3>IndexNow Protocol — the Search Engine Moves from Pull to Push</h3>
      <a href="#indexnow-protocol-the-search-engine-moves-from-pull-to-push">
        
      </a>
    </div>
    <p>Websites alerting the search engine about changes is useless if the search engines aren’t listening — if they simply continue to crawl the way they always have. Of course, search engines are incredibly complicated, and changing the way they operate is no easy task.</p><p>The IndexNow Protocol is a standard developed by Microsoft, Seznam.cz and Yandex, and it represents a major shift in the way search engines operate. Using IndexNow, search engines have a mechanism by which they can receive signals from Crawler Hints. Once they have that signal, they can shift their crawlers from a pull model to a push model.</p><p>In a recent update, <a href="https://blogs.bing.com/webmaster/august-2022/IndexNow-adoption-gains-momentum">Microsoft announced</a> that millions of websites are now using IndexNow to signal to search engine crawlers when their content needs to be crawled, and that IndexNow was used to <b>index/crawl about 7% of all new URLs clicked</b> when someone selects from web search results.</p><p>On the Cloudflare side, since the release of Crawler Hints in October 2021, Crawler Hints has sent about <b>600 billion</b> signals to IndexNow.</p><p>That’s a lot of saved crawls.</p>
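<p>To make the push model concrete, here is a minimal sketch in Go of what a publish to an IndexNow endpoint looks like per the public spec. The endpoint, host, and key below are placeholders; this illustrates the protocol and is not Cloudflare’s actual dispatch code.</p>

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// indexNowPayload mirrors the JSON body defined by the IndexNow spec:
// the site host, the site's verification key, and the changed URLs.
type indexNowPayload struct {
	Host    string   `json:"host"`
	Key     string   `json:"key"`
	URLList []string `json:"urlList"`
}

// newIndexNowRequest builds a POST announcing changed URLs to an
// IndexNow-compatible endpoint.
func newIndexNowRequest(endpoint, host, key string, urls []string) (*http.Request, error) {
	body, err := json.Marshal(indexNowPayload{Host: host, Key: key, URLList: urls})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, endpoint, bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json; charset=utf-8")
	return req, nil
}

func main() {
	req, err := newIndexNowRequest(
		"https://api.indexnow.org/indexnow", // a public IndexNow endpoint
		"example.com",
		"your-indexnow-key", // placeholder key
		[]string{"https://example.com/new-post"},
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.String())
}
```

Sending the request with an ordinary HTTP client is all a "push" takes — the search engine does the rest.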
    <div>
      <h3>How to enable Crawler Hints</h3>
      <a href="#how-to-enable-crawler-hints">
        
      </a>
    </div>
    <p>By enabling Crawler Hints on your website, with the simple click of a button, Cloudflare will take care of signaling to these search engines when your content has changed via the <a href="https://www.indexnow.org/">IndexNow</a> API. You don’t need to do anything else!</p><p>Crawler Hints is free to use and available to all Cloudflare customers. If you’d like to see how Crawler Hints can benefit how your website is indexed by the world’s biggest search engines, you can opt in to the service as follows:</p><ol><li><p>Sign in to your Cloudflare account.</p></li><li><p>In the dashboard, navigate to the Cache tab.</p></li><li><p>Click on the Configuration section.</p></li><li><p>Locate Crawler Hints and enable it.</p></li></ol>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6gbqdPUtYZlLMw8aVmoygy/3b782370e5d898dd5606088c345f7d33/image1-15.png" />
            
            </figure><p>Upon enabling Crawler Hints, Cloudflare will share when content on your site has changed and needs to be re-crawled with search engines using the IndexNow protocol (<a href="/from-0-to-20-billion-how-we-built-crawler-hints/">this blog</a> can help if you’re interested in finding out more about how the mechanism works).</p>
    <div>
      <h3>What’s Next?</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>Going forward, because the benefits are so substantial for site owners, search operators, and the environment, we plan to start defaulting Crawler Hints on for all our customers. We’re also hopeful that Google, the world’s largest search engine and most wasteful user of Internet resources, will adopt IndexNow or a similar standard and lower the burden of search crawling on the planet.</p><p>When we think of helping to build a better Internet, this is exactly what comes to mind: creating and supporting standards that make it operate better, greener, and faster. We’re really excited about the work to date, and we will continue to improve the signaling to ensure the most valuable information is sent to the search engines in a timely manner. This includes incorporating additional signals such as ETags, Last-Modified headers, and content hash differences. Adding these signals will help further inform crawlers when they should reindex sites, and how often they need to return to a particular site to check whether it has changed. This is only the beginning. We will continue testing more signals and working with industry partners so that we can help crawlers run efficiently with these hints.</p><p>And finally: if you’re on Cloudflare and you’d like to be part of this revolution in how search engines operate on the web (it’s free!), simply follow the instructions in the section above.</p> ]]></content:encoded>
            <category><![CDATA[Crawler Hints]]></category>
            <category><![CDATA[Bots]]></category>
            <category><![CDATA[Cache]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">5LL6jyHzqmppNptbCOrWuQ</guid>
            <dc:creator>Alex Krivit</dc:creator>
        </item>
        <item>
            <title><![CDATA[Automatic Signed Exchanges may dramatically boost your site visitor numbers]]></title>
            <link>https://blog.cloudflare.com/automatic-signed-exchanges-desktop-android/</link>
            <pubDate>Fri, 08 Jul 2022 12:27:53 GMT</pubDate>
            <description><![CDATA[ Automatic Signed Exchanges may dramatically boost your site visitor numbers ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4mUZ1vtinqv4i8DyEobO4K/012a7dcbaca4f70e6664d8432d5374e4/pasted-image-0--4-.png" />
            
            </figure><p>It’s been about nine months since <a href="/from-amp-to-signed-exchanges-or-how-innovation-happens-at-cloudflare/">Cloudflare announced</a> support for Signed Exchanges (SXG), a <a href="https://wicg.github.io/webpackage/draft-yasskin-http-origin-signed-responses.html">web platform specification</a> to deterministically verify the cached version of a website and enable third parties such as search engines and news aggregators to serve it much faster than the origin ever could.</p><p>Giving Internet users fast load times, even on slow connections in remote parts of the globe, is to <i>help build a better Internet</i> (our mission!), and <a href="/from-amp-to-signed-exchanges-or-how-innovation-happens-at-cloudflare/">we couldn’t be more excited about the potential of SXG</a>. Signed Exchanges drive quite impressive performance improvements. <a href="https://web.dev/signed-exchanges/#impact-of-signed-exchanges">Google’s experiments</a> have shown an average 300ms to 400ms reduction in <a href="https://web.dev/lcp/">Largest Contentful Paint (LCP)</a> from SXG-enabled prefetches. <b>And speeding up your website usually results in a</b> <a href="https://www.thinkwithgoogle.com/marketing-strategies/app-and-mobile/mobile-page-speed-new-industry-benchmarks/"><b>significant bounce rate reduction</b></a> <b>and improved SEO</b>.</p><p><i>Faster websites = better SEO and lower bounce rates.</i></p><p>And while setting up and maintaining SXGs through the <a href="https://web.dev/signed-exchanges/#tooling">open source toolkit</a> is a complex yet very valuable endeavor, with Cloudflare’s <a href="/automatic-signed-exchanges/">Automatic Signed Exchanges</a> it becomes a no-brainer. Just enable it with one click and see for yourself.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4O7ZL4hRXnDPWWIqrP68Jr/87665ba214b9a23c13c9ac6518433d4d/pasted-image-0--5-.png" />
            
            </figure>
    <div>
      <h3>Our own measurements</h3>
      <a href="#our-own-measurements">
        
      </a>
    </div>
    <p>Now that Signed Exchanges have been available on Chromium for Android for several months, we dove into the change in performance our customers have experienced in the real world.</p><p>We picked the 500 most visited sites that have Automatic Signed Exchanges enabled and saw that 425 of them (85%) saw an improvement in <a href="https://web.dev/lcp/">LCP</a>, which is widely considered the Core Web Vital with the most impact on SEO and where SXG should make the biggest difference.</p><p>Out of those same 500 Cloudflare sites, 389 (78%) saw an improvement in <a href="https://web.dev/fcp/">First Contentful Paint (FCP)</a> and a whopping 489 (98%) saw an improvement in <a href="https://web.dev/ttfb/">Time to First Byte (TTFB)</a>. The TTFB improvement is an interesting case: if the exchange has already been prefetched, the resource is already in the client browser cache when the user clicks the link, so the TTFB measurement becomes close to zero.</p><p><b>Overall, the median customer saw an improvement of over 20% across these metrics. Some customers saw improvements of up to 80%.</b></p><p>There were also a few customers that did not see an improvement, or saw a slight degradation of their metrics.</p><p>One of the main reasons for this is that SXG wasn’t compatible with server-side personalization (e.g., serving different HTML for logged-in users) until today. To solve that, Google today added ‘Dynamic SXG’, which selectively enables SXG for visits from cookieless users only (more details in the Google blog post <a href="https://developer.chrome.com/blog/sxg-desktop/">here</a>). Dynamic SXG is supported today: all you need to do is add a ‘Vary: Cookie’ annotation to the HTTP headers of pages that contain server-side personalization.</p><p><i>Note: Signed Exchanges are compatible with client-side personalization (lazy-loading).</i></p><p>To see what the <a href="https://www.cloudflare.com/learning/performance/what-are-core-web-vitals/">Core Web Vitals</a> look like for your own users across the world, we recommend a RUM solution such as our free and privacy-first <a href="https://www.cloudflare.com/web-analytics/">Web Analytics</a>.</p>
    <div>
      <h3>Now available for Desktop and Android</h3>
      <a href="#now-available-for-desktop-and-android">
        
      </a>
    </div>
    <p><b>Starting today, Signed Exchanges are also supported by Chromium-based desktop browsers, including Chrome, Edge and Opera.</b></p><p>If you enabled Automatic Signed Exchanges on your Cloudflare dashboard, no further action is needed: the supported desktop browsers will automatically start being served the SXG version of your site’s content. Google estimates that this release will, on average, double SXG’s coverage of your site’s visits, enabling improved loading and performance for more users.</p><p>And if you haven’t enabled it yet but are curious about the impact SXG could have on your site, Automatic Signed Exchanges is available under <a href="https://dash.cloudflare.com/?to=/:account/:zone/speed/optimization">Speed &gt; Optimization</a> on your Cloudflare dashboard (more details <a href="https://support.cloudflare.com/hc/en-us/articles/4411075595661-Automatic-Signed-Exchanges-SXGs-">here</a>).</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7B9QWzclctIMHPsOzHUasO/851d54a61126ea21486d96951a200716/image3.png" />
            
            </figure> ]]></content:encoded>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[Signed Exchanges (SXG)]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">6Me41Kp2wjTeaIi9xVwkrk</guid>
            <dc:creator>João Sousa Botto</dc:creator>
        </item>
        <item>
            <title><![CDATA[From 0 to 20 billion - How We Built Crawler Hints]]></title>
            <link>https://blog.cloudflare.com/from-0-to-20-billion-how-we-built-crawler-hints/</link>
            <pubDate>Thu, 16 Dec 2021 13:58:29 GMT</pubDate>
            <description><![CDATA[ Cloudflare is reducing the environmental impact of web searches, with more than 20 billion crawler hints delivered so far. This blog describes how we built the Crawler Hints system that makes it all possible.
 ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/xs3odwiEbW2X4uv6wNNdA/a08cd0796a54029678fd143edc84e4d6/image2-57.png" />
            
            </figure><p>In July 2021, as part of <a href="/welcome-to-cloudflare-impact-week/">Impact Innovation Week</a>, we announced our intention to launch <a href="/crawler-hints-how-cloudflare-is-reducing-the-environmental-impact-of-web-searches/">Crawler Hints</a> as a means to reduce the environmental impact of web searches. We spent the weeks following the announcement hard at work, and in October 2021, we announced <a href="/cloudflare-now-supports-indexnow/">General Availability for the first iteration of the product</a>. This post explains how we built it, some of the interesting engineering problems we had to solve, and shares some metrics on how it's going so far.</p>
    <div>
      <h2>Before We Begin...</h2>
      <a href="#before-we-begin">
        
      </a>
    </div>
    <p>Search indexers crawl sites periodically to check for new content. Algorithms vary by search provider, but are often based on either a regular interval or the cadence of past updates, and these crawls are often not aligned with real-world content changes. This naive crawling approach may harm customer page rank and also works to the detriment of search engines with respect to their operational costs and environmental impact. To make the Internet greener and more energy efficient, the goal of Crawler Hints is to help search indexers make more informed decisions about when content has changed, saving valuable compute cycles and bandwidth and having a net positive environmental impact.</p><p>Cloudflare is in an advantageous position to help inform crawlers of content changes, as we are often the “front line” of the interface between site visitors and the origin server where the content updates take place. This grants us knowledge of some key data points like headers, content hashes, and site purges, among others. For customers who have opted in to Crawler Hints, we leverage this data to generate a “content freshness score” using an ensemble of active and passive signals from our customer base and request flow. Beyond efficiency, Crawler Hints helps improve SEO for websites behind Cloudflare, improves relevance for search engine users, and improves origin responsiveness by reducing bot traffic to our customers’ origin servers.</p><p>A high-level design of the system we built looks as follows:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4kSfCs22YboeTPDLdbffsG/b21c528beaa8579a0d6e2e3b874fccca/image3-42.png" />
            
            </figure><p>In this blog we will dig into each aspect of it in more detail.</p>
    <div>
      <h2>Keeping Things Fresh</h2>
      <a href="#keeping-things-fresh">
        
      </a>
    </div>
    <p><a href="https://www.cloudflare.com/network/">Cloudflare has a large global network spanning 250 cities</a>. A popular use case for Cloudflare is to use our CDN product to cache your website’s assets so that users accessing your site can benefit from lightning fast response times. You can read more about how Cloudflare manages our cache <a href="/why-we-started-putting-unpopular-assets-in-memory/">here</a>. The important thing to call out for the purpose of this post is that the cache is data center local. A cache hit in London might be a cache miss in San Francisco, unless you have opted in to <a href="/orpheus/">tiered caching</a>, but that is beyond the scope of this post.</p><p>For Crawler Hints to work, we make use of a number of signals available at request time to make an informed decision on the “freshness” of content. For our first iteration of Crawler Hints, we used a cache miss from Cloudflare’s cache as a starting basis. Although a naive signal on its own, getting the data pipelines in place to forward cache miss data from our global network to our control plane meant we would have everything in place to iterate on and improve the signal processing quickly going forward. To do this, we leveraged some existing services from our data team that take request data, marshal it into <a href="https://capnproto.org/">Cap'n Proto format</a>, and forward it to a message bus (we use Apache Kafka). These messages include the URLs of the resources that have met the signal criteria, along with some additional metadata for analytics and future improvement.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1XeWU8Bx1SuIOajaVbu3Kd/76edad32d2eff51af6b2d9887ea2825e/image4-34.png" />
            
            </figure><p>The amount of traffic our global network receives is substantial. We serve over 28 million HTTP requests per second on average, with more than 35 million HTTP requests per second at peak. Typically, Cloudflare teams sample this data to enable products <a href="/get-notified-when-your-site-is-under-attack/">such as being alerted when you are under attack</a>. For Crawler Hints, every cache miss is important. Therefore, 100% of all cache misses for opted-in sites were sent for further processing, and we’ll discuss more on opt-in later.</p>
    <div>
      <h2>Redis as a Distributed Buffer</h2>
      <a href="#redis-as-a-distributed-buffer">
        
      </a>
    </div>
    <p>With messages buffered in Kafka, we can begin the work of aggregation and deduplication. We wrote a consumer service that we call the ingestor. The ingestor reads the data from Kafka, performs validation to ensure proper sanitization and data integrity, and passes the data on to the next stage of the system. We run the ingestor as part of a Kafka consumer group, allowing us to scale our consumer count up to the partition count as throughput increases.</p><p>We ultimately want to deliver a set of “fresh” content to our search partners on a dynamic interval. For example, we might want to send a batch of 10,000 URLs every two minutes. There are, however, a couple of important things to call out:</p><ul><li><p>There should be no duplicate resources in each batch.</p></li><li><p>We should strike a balance in our size and frequency such that the overall request size isn’t too large, but big enough to remove some pressure on the receiving API by not sending too <i>many</i> requests at once.</p></li></ul><p>For the deduplication, the simplest thing to do would be to have an in-memory map in our service to track resources between a pre-specified interval. A naive implementation in Go might look something like this.</p>
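<p>For readers who can’t make out the screenshot below, here is our own sketch of roughly what such a naive in-memory deduplicating buffer looks like (a reconstruction of the idea, not necessarily the exact code pictured):</p>

```go
package main

import (
	"fmt"
	"sync"
)

// Deduper buffers unique URLs between flush intervals.
type Deduper struct {
	mu   sync.Mutex
	seen map[string]struct{}
}

func NewDeduper() *Deduper {
	return &Deduper{seen: make(map[string]struct{})}
}

// Add records a URL; it returns false if the URL was already buffered.
func (d *Deduper) Add(url string) bool {
	d.mu.Lock()
	defer d.mu.Unlock()
	if _, ok := d.seen[url]; ok {
		return false
	}
	d.seen[url] = struct{}{}
	return true
}

// Flush returns the current batch of unique URLs and resets the buffer.
func (d *Deduper) Flush() []string {
	d.mu.Lock()
	defer d.mu.Unlock()
	batch := make([]string, 0, len(d.seen))
	for u := range d.seen {
		batch = append(batch, u)
	}
	d.seen = make(map[string]struct{})
	return batch
}

func main() {
	d := NewDeduper()
	d.Add("https://example.com/a")
	d.Add("https://example.com/a") // duplicate, ignored
	d.Add("https://example.com/b")
	fmt.Println(len(d.Flush())) // 2
}
```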
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2vlmka7b85QzcVxziLLGdR/b3669b1bbd3fd4108d94ffa8052b2f8d/image5-22.png" />
            
            </figure><p>The problem with this approach is that we have little resilience. If the service were to crash, we would lose all the data for our current batch. Furthermore, if we were to run multiple instances of our service, they would each have a different “view” of which resources they had seen before, and therefore we would not be deduplicating. To mitigate this issue, we decided to use a specialist caching service. There are a number of distributed caches that would fit the bill, but we chose Redis given our team’s familiarity with operating it at scale.</p><p>Redis is well known as a key-value (KV) store, often used for caching things, optionally with a specified time to live (TTL). Perhaps slightly less obvious is its value as a distributed buffer, housing ephemeral data with periodic flushes and tear-downs. For Crawler Hints, we leveraged both these traits via a multi-generational, multi-cluster setup to achieve a highly available rolling aggregation service.</p><p>Two standalone Redis clusters were spun up. For each generation of request data, one cluster would be designated as the active primary. The validated records would be inserted as keys on the primary, serving the dual purpose of buffering while also deduplicating, since Redis keys are unique. Separately, a downstream service (more on this later!) would periodically issue the command for the inserters to switch from the active primary (cluster A) to the inactive cluster (cluster B). Cluster A could then be flushed, with records being batch read in a size of our choosing.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4IYn9s1nmM2Qcci2c5A8cn/21069dec8e9b95ec87e17d8b2d31a767/image9-5.png" />
            
            </figure>
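<p>The two-cluster rotation can be modeled in-process to make the mechanism concrete. This is an illustrative sketch only (in production the two “generations” are separate Redis clusters, not Go maps):</p>

```go
package main

import (
	"fmt"
	"sync"
)

// GenBuffer models the two-cluster rotation: writers insert into the
// active generation while the inactive one is drained for dispatch.
type GenBuffer struct {
	mu     sync.Mutex
	active int
	gens   [2]map[string]struct{}
}

func NewGenBuffer() *GenBuffer {
	return &GenBuffer{gens: [2]map[string]struct{}{{}, {}}}
}

// Insert adds a URL to the active generation, deduplicated by key,
// just as inserting Redis keys deduplicates records on the primary.
func (g *GenBuffer) Insert(url string) {
	g.mu.Lock()
	defer g.mu.Unlock()
	g.gens[g.active][url] = struct{}{}
}

// Swap flips the active generation and returns the drained batch from
// the now-inactive one — like flushing cluster A while new writes
// continue to land on cluster B.
func (g *GenBuffer) Swap() []string {
	g.mu.Lock()
	defer g.mu.Unlock()
	old := g.active
	g.active = 1 - g.active
	batch := make([]string, 0, len(g.gens[old]))
	for u := range g.gens[old] {
		batch = append(batch, u)
	}
	g.gens[old] = map[string]struct{}{}
	return batch
}

func main() {
	g := NewGenBuffer()
	g.Insert("https://example.com/a")
	g.Insert("https://example.com/a") // deduplicated by key
	batch := g.Swap()                 // drain "cluster A"; writes now go to "cluster B"
	fmt.Println(len(batch)) // 1
}
```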
    <div>
      <h2>Buffering for Dispatch</h2>
      <a href="#buffering-for-dispatch">
        
      </a>
    </div>
    <p>At this point, we have clean, batched data. Things are looking good! However, there’s one small hiccup in the plan: we’re reading these batches from Redis at some set interval. What if dispatching a batch takes longer than the interval itself? What if the search partner API is having issues?</p><p>We need a way to ensure the durability of the batched URLs and reduce the impact of any dispatch issues. To do this, we revisit an old friend from earlier: Kafka. The batches that get read from Redis are fed into a Kafka topic. We wrote a Kafka consumer that we call the “dispatcher service”, which runs within a consumer group so that we can scale it if necessary, just like the ingestor. The dispatcher reads from the Kafka topic and sends a batch of resources to each of our API partners.</p><p>Launching in tandem with Crawler Hints, IndexNow was a joint venture between a few early adopters in the search engine space to provide a means for sites to inform indexers of content changes. <a href="/cloudflare-now-supports-indexnow/">You can read more about this launch here.</a> IndexNow is a large part of what makes Crawler Hints possible. It provides a common API spec for publishing resources that should be re-indexed. The standardized API makes abstracting the communication layer quite simple for the partners that support it. “Pushing” these signals to our search engine partners is a big step away from the inefficient “pull” based model that is used today (you can read more about that <a href="/crawler-hints-how-cloudflare-is-reducing-the-environmental-impact-of-web-searches/">here</a>). We launched with Yandex and Bing as search engine partners.</p><p>To ensure we can add more partners in the future, we defined an interface which we call a “Hinter”.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7K0wQWVx2hciGxZVwFMxM6/a3b6f0272225bbddd393465592c98721/image8-12.png" />
            
            </figure><p>We then satisfy this interface for each partner that we work with. We return a custom error from the Hinter service of type *indexers.Error, the definition of which is:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7Aq6eqraSEg1TpQ2b2QJK9/e763eb8303e0d1fdef3807ae58503cda/image1-84.png" />
            
            </figure><p>This allows us to “bubble up” information about which indexer has failed and increment metrics and retry only those calls to indexers which have failed.</p><p>This all culminates together with the following in our service layer:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5fwN6ZGTRJ5NARBrJWLfOq/a69a7ef3147e17c02de3c99e0f92ee95/image6-21.png" />
            
            </figure><p>Simple, performant, maintainable, AND easy to add more partners in the future.</p>
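<p>For readers who can’t see the screenshots, here is a hedged sketch of the shapes involved. The “Hinter” name and the bubble-up error idea come from this post; the exact method signatures and field names are our assumptions:</p>

```go
package main

import (
	"errors"
	"fmt"
)

// Hinter is the per-partner abstraction: one implementation per search
// engine partner that accepts a batch of changed URLs.
type Hinter interface {
	Name() string
	Hint(urls []string) error
}

// IndexerError stands in for the post's *indexers.Error: it records which
// indexer failed so callers can increment metrics and retry only that one.
type IndexerError struct {
	Indexer string
	Err     error
}

func (e *IndexerError) Error() string { return fmt.Sprintf("indexer %s: %v", e.Indexer, e.Err) }
func (e *IndexerError) Unwrap() error { return e.Err }

// dispatch fans a batch out to every hinter and returns only the failures.
func dispatch(hinters []Hinter, urls []string) []*IndexerError {
	var failed []*IndexerError
	for _, h := range hinters {
		if err := h.Hint(urls); err != nil {
			failed = append(failed, &IndexerError{Indexer: h.Name(), Err: err})
		}
	}
	return failed
}

// fakeHinter lets us exercise dispatch without real partner APIs.
type fakeHinter struct {
	name string
	fail bool
}

func (f fakeHinter) Name() string { return f.name }
func (f fakeHinter) Hint(urls []string) error {
	if f.fail {
		return errors.New("upstream 500")
	}
	return nil
}

func main() {
	failed := dispatch(
		[]Hinter{fakeHinter{name: "bing"}, fakeHinter{name: "yandex", fail: true}},
		[]string{"https://example.com/changed"},
	)
	for _, e := range failed {
		fmt.Println(e.Error()) // only the failing indexer is reported
	}
}
```

Adding a partner then means writing one more Hinter implementation, with no change to the dispatch loop.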
    <div>
      <h2>Rolling out Crawler Hints</h2>
      <a href="#rolling-out-crawler-hints">
        
      </a>
    </div>
    <p>At Cloudflare, we often release things that haven’t been done before at scale. This project is a great example of that. Trying to gauge how many users would be interested in this product and what the uptake might be like on day one, day ten, and day one thousand is close to impossible. As engineers responsible for running this system, it is essential we build in checks and balances so that the system does not become overwhelmed and responds appropriately. For this particular project, there are three different types of “protection” we put in place. These are:</p><ul><li><p>Customer opt-in</p></li><li><p>Monitoring &amp; Alerts</p></li><li><p>System resilience via “self-healing”</p></li></ul>
    <div>
      <h3>Customer opt-in</h3>
      <a href="#customer-opt-in">
        
      </a>
    </div>
    <p>Cloudflare takes any changes that can impact customer traffic flow seriously. Considering Crawler Hints has the potential to change how sites are seen externally (even if, in this instance, the site’s viewers are robots!) and can impact things like SEO and bandwidth usage, asking customers to opt in is a sensible default. By asking customers to opt in to the service, we can start to get an understanding of our system’s capacity and look for bottlenecks and ways to remove them. To do this, we make extensive use of Prometheus, Grafana, and Kibana.</p>
    <div>
      <h3>Monitoring &amp; Alerting</h3>
      <a href="#monitoring-alerting">
        
      </a>
    </div>
    <p>We do our best to make our systems as “self-healing” and easy to run as possible, but as they say, “By failing to prepare, you are preparing to fail.” We therefore invest a lot of time creating ways to track the health and performance of our system and creating automated alerts when things fall outside of expected bounds.</p><p>Below is a small sample of the Grafana dashboard we created for this project. As you can see, we can track customer enablement and the rate of hint dispatch in real time. The bottom two panels show the throughput of our Kafka clusters by partition. Even just these four metrics give us a lot of insight into how things are going, but we also track (among other things):</p><ul><li><p>Lag on Kafka by partition (how far behind real time we are)</p></li><li><p>Bad messages received from Kafka</p></li><li><p>Number of URLs processed per “run”</p></li><li><p>Response code per index partner over time</p></li><li><p>Response time of partner APIs over time</p></li><li><p>Health of the Redis clusters (how much memory is used, frequency of commands received by the cluster)</p></li><li><p>Memory, CPU usage, and pods available against configured limits/requests</p></li></ul>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7bunGQ2Uvk8bZ5jdRz3t6G/08a155b3ec60b9bf44218a4d2e12b774/image7-12.png" />
            
            </figure><p>It seems like a lot to track, but this information is invaluable to us, and we use it to generate alerts that notify the on-call engineer when a threshold is breached. For example, we have an alert that escalates to an engineer if our Redis cluster approaches 80% capacity. For some thresholds, we want the system to “self-heal,” but in this instance we want an engineer to investigate: usage that far outside the bounds of “normal” might mean something is not working as expected, or it might simply mean our product has grown in popularity beyond our expectations and we need to increase the memory limit. Telling those apart requires context, and the decision is therefore best left to a human.</p>
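<p>As an illustration, that 80% Redis-memory check boils down to a threshold comparison like the following (the function name, message format, and threshold default are hypothetical, not Cloudflare’s actual alerting rule):</p>

```python
def memory_alert(used_bytes, limit_bytes, page_threshold=0.80):
    """Page the on-call engineer once memory usage crosses the threshold;
    below it, return None and leave the system to run (or self-heal)."""
    usage = used_bytes / limit_bytes
    if usage >= page_threshold:
        return f"PAGE: redis cluster at {usage:.0%} of memory limit"
    return None
```

<p>In production this kind of rule typically lives in the monitoring stack (e.g. a Prometheus alerting rule) rather than application code; the sketch just shows the shape of the decision.</p>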
    <div>
      <h3>System Resilience via “self-healing”</h3>
      <a href="#system-resilience-via-self-healing">
        
      </a>
    </div>
    <p>We do everything we can not to disturb on-call engineers, and therefore we try to make the system as “self-healing” as possible. We also don’t want too many spare resources running, as they can be expensive and consume limited capacity that another Cloudflare service might need more; it’s a trade-off. To strike this balance, we use a few patterns and tools common in every distributed-systems engineer’s toolbelt. Firstly, we deploy on Kubernetes. This lets us use great features like <a href="https://kubernetes.io/docs/tasks/run-application/horizontal-pod-autoscale/">Horizontal Pod Autoscaling</a>: when any of our pods reaches ~80% memory usage, a new pod is created to pick up some of the slack, up to a predefined limit.</p><p>Secondly, by using a message bus, we get a lot of control over the amount of “work” our services have to do in a given time frame. In general, a message bus is “pull” based: if we want more work, we ask for it; if we want less, we pull less. This holds for the most part, but in a system where staying close to real time is important, it is essential to monitor the “lag” of the topic, or how far we are behind real time. If we fall too far behind, we may want to introduce more partitions or consumers.</p><p>Finally, networks fail. We therefore add retry policies to all HTTP calls we make before reporting them as failures. For example, if we receive a 500 (Internal Server Error) from one of our partner APIs, we retry up to five times using an exponential backoff strategy before reporting a failure.</p>
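<p>The retry policy described above can be sketched as follows. This is a generic illustration, not Cloudflare’s actual client code; the base delay and jitter factor are assumptions:</p>

```python
import random
import time

def retry_with_backoff(call, max_attempts=5, base_delay=0.5, sleep=time.sleep):
    """Run `call`, retrying on exception with exponentially increasing,
    jittered delays (0.5s, 1s, 2s, 4s, ...). Raise after max_attempts."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: report the failure upward
            # double the delay each attempt, plus up to 10% random jitter
            sleep(base_delay * (2 ** attempt) * (1 + 0.1 * random.random()))
```

<p>In practice you would retry only on retryable responses (like the 500 mentioned above), not on every exception, so that client errors fail fast.</p>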
    <div>
      <h2>Data from the first couple of months</h2>
      <a href="#data-from-the-first-couple-of-months">
        
      </a>
    </div>
    <p>From the release of Crawler Hints on October 18, 2021 through December 15, 2021, Crawler Hints processed over twenty-five billion crawl signals, was opted into by more than 81,000 customers, and handled roughly 18,000 requests per second. It’s been an exciting project to be a part of, and we are just getting started.</p>
    <div>
      <h2>What’s Next?</h2>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>We will continue to work with our partners to improve the standard even further and continue to improve the signaling on our side to ensure the most valuable information is being pushed on behalf of our customers in a timely manner.</p><p><b><i>If you're interested in building scalable services and solving interesting technical problems, we are hiring engineers on our team in</i></b> <a href="https://boards.greenhouse.io/cloudflare/jobs/3129759?gh_jid=3129759"><b><i>Austin</i></b></a><b><i>,</i></b> <a href="https://boards.greenhouse.io/cloudflare/jobs/3231716?gh_jid=3231716"><b><i>Lisbon</i></b></a><b><i>, and</i></b> <a href="https://boards.greenhouse.io/cloudflare/jobs/3231718?gh_jid=3231718"><b><i>London</i></b></a><b><i>.</i></b></p> ]]></content:encoded>
            <category><![CDATA[Crawler Hints]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">3WLRRaI7sl25i1KhmRq7YB</guid>
            <dc:creator>Matt Boyle</dc:creator>
            <dc:creator>Nathan Disidore</dc:creator>
            <dc:creator>Rajesh Bhatia</dc:creator>
        </item>
        <item>
            <title><![CDATA[Custom Headers for Cloudflare Pages]]></title>
            <link>https://blog.cloudflare.com/custom-headers-for-pages/</link>
            <pubDate>Wed, 27 Oct 2021 13:00:15 GMT</pubDate>
            <description><![CDATA[ We're excited to announce that Pages now natively supports custom headers on your projects! Simply create a _headers file in the build directory of your project and within it, define the rules you want to apply. ]]></description>
            <content:encoded><![CDATA[ <p>Until today, Cloudflare Workers has been a great solution for setting headers, but we wanted to create an even smoother developer experience. Today, we're excited to announce that Pages now natively supports custom headers on your projects! Simply create a <code>_headers</code> file in the build directory of your project and, within it, define the rules you want to apply.</p>
            <pre><code>/developer-docs/*
  X-Hiring: Looking for a job? We're hiring engineers (https://www.cloudflare.com/careers/jobs)</code></pre>
            
    <div>
      <h2>What can you set with custom headers?</h2>
      <a href="#what-can-you-set-with-custom-headers">
        
      </a>
    </div>
    <p>Being able to set custom headers is useful for a variety of reasons — let’s explore some of your most popular use cases.</p>
    <div>
      <h3>Search Engine Optimization (SEO)</h3>
      <a href="#search-engine-optimization-seo">
        
      </a>
    </div>
    <p>When you create a Pages project, a <code>pages.dev</code> deployment is created for your project which enables you to <a href="https://developers.cloudflare.com/pages/get-started">get started immediately</a> and easily <a href="https://developers.cloudflare.com/pages/platform/preview-deployments">preview changes</a> as you iterate. However, we realize this poses an issue — publishing multiple copies of your website can harm your rankings in search engine results. One way to solve this is by disabling indexing on all <code>pages.dev</code> subdomains, but we see many using their <code>pages.dev</code> subdomain as their primary domain. With today’s announcement you can attach headers such as <a href="https://developers.google.com/search/docs/advanced/robots/robots_meta_tag#xrobotstag"><code>X-Robots-Tag</code></a> to hint to Google and other search engines how you'd like your deployment to be indexed.</p><p>For example, to prevent your <code>pages.dev</code> deployment from being indexed, you can add the following to your <code>_headers</code> file:</p>
            <pre><code>https://:project.pages.dev/*
  X-Robots-Tag: noindex</code></pre>
            
    <div>
      <h3>Security</h3>
      <a href="#security">
        
      </a>
    </div>
    <p>Customizing headers doesn’t just help with your site’s search result ranking — a number of browser security features can be configured with headers. A few headers that can enhance your site’s security are:</p><ul><li><p><a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Frame-Options"><code><b>X-Frame-Options</b></code></a>: You can prevent <a href="https://owasp.org/www-community/attacks/Clickjacking">clickjacking</a> by informing browsers not to embed your application inside another (e.g. with an <code>&lt;iframe&gt;</code>).</p></li><li><p><a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Content-Type-Options"><code><b>X-Content-Type-Options: nosniff</b></code></a><b>:</b> Prevents browsers from interpreting a response as any content type other than the one defined with the <code>Content-Type</code> header.</p></li><li><p><a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Referrer-Policy"><code><b>Referrer-Policy</b></code></a>: This allows you to customize how much information visitors give about where they're coming from when they navigate away from your page.</p></li><li><p><a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Feature-Policy"><code><b>Permissions-Policy</b></code></a>: Browser features can be disabled to varying degrees with this header (recently renamed from <code>Feature-Policy</code>).</p></li><li><p><a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy"><code><b>Content-Security-Policy</b></code></a>: And if you need fine-grained control over the content in your application, this header allows you to configure a number of security settings, including controls similar to the <code>X-Frame-Options</code> header.</p></li></ul><p>You can configure these headers to protect an <code>/app/*</code> path with the following in your <code>_headers</code> file:</p>
            <pre><code>/app/*
  X-Frame-Options: DENY
  X-Content-Type-Options: nosniff
  Referrer-Policy: no-referrer
  Permissions-Policy: document-domain=()
  Content-Security-Policy: script-src 'self'; frame-ancestors 'none';</code></pre>
            
    <div>
      <h3>CORS</h3>
      <a href="#cors">
        
      </a>
    </div>
    <p>Modern browsers implement a security protection called <i>CORS</i>, or Cross-Origin Resource Sharing. This prevents one domain from being able to force a user's action on another. Without CORS, a malicious site owner might be able to do things like make requests to unsuspecting visitors' banks and initiate a transfer on their behalf. With CORS, such cross-origin requests are blocked to stop the malicious activity.</p><p>There are, however, some cases where it is safe to allow these cross-origin requests. So-called "<a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#simple_requests">simple requests</a>" (such as linking to an image hosted on a different domain) are permitted by the browser. Fetching these resources dynamically is often where the difficulty arises, and the browser is sometimes overzealous in its protection. Simple static assets on Pages are safe to serve to any domain, since the request takes no action and there is no visitor session. Because of this, a domain owner can attach CORS headers in the <code>_headers</code> file to specify exactly which requests should be allowed, for fine-grained and explicit control.</p><p>For example, using the asterisk enables any origin to request any asset from your Pages deployment:</p>
            <pre><code>/*
  Access-Control-Allow-Origin: *</code></pre>
            <p>To be more restrictive and limit requests to only be allowed from a 'staging' subdomain, we can do the following:</p>
            <pre><code>https://:project.pages.dev/*
  Access-Control-Allow-Origin: https://staging.:project.pages.dev</code></pre>
            
    <div>
      <h2>How we built support for custom headers</h2>
      <a href="#how-we-built-support-for-custom-headers">
        
      </a>
    </div>
    <p>To support all these use cases for custom headers, we had to build a new engine to determine which rules to apply for each incoming request. Backed, of course, by Workers, this engine supports splats and placeholders, and allows you to include those matched values in your headers.</p><p>Although we don't support all of its features, we've modeled this matching engine after the <a href="https://wicg.github.io/urlpattern/">URLPattern specification</a> which was recently shipped with Chrome 95. We plan to be able to fully implement this specification for custom headers once URLPattern lands in the Workers runtime, and there should hopefully be no breaking changes to migrate.</p>
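<p>To make the idea of splats and placeholders concrete, here is a toy matcher in Python. This is our own illustration of the concept; the real engine is Workers code modeled on URLPattern and supports far more:</p>

```python
import re

def compile_rule(rule):
    """Translate a rule like '/products/:code/:name' or '/blog/*' into a
    regex. ':name' captures a single path segment; a single '*' splat
    captures the remainder of the path."""
    pattern = ""
    for token in re.split(r"(\*|:[A-Za-z_]\w*)", rule):
        if token == "*":
            pattern += r"(?P<splat>.*)"
        elif token.startswith(":"):
            pattern += rf"(?P<{token[1:]}>[^/]+)"
        else:
            pattern += re.escape(token)
    return re.compile("^" + pattern + "$")

def match_rule(rule, path):
    """Return the captured values for `path`, or None if the rule misses."""
    m = compile_rule(rule).match(path)
    return m.groupdict() if m else None
```

<p>The captured values are exactly what <code>:splat</code> and placeholder names like <code>:name</code> refer to when they are substituted into header values or redirect targets.</p>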
    <div>
      <h2>Enhanced support for redirects</h2>
      <a href="#enhanced-support-for-redirects">
        
      </a>
    </div>
    <p>With this same engine, we’re bringing these features to your <code>_redirects</code> file as well. You can now configure your redirects with splats, placeholders and status codes as shown in the example below:</p>
            <pre><code>/blog/* https://blog.example.com/:splat 301
/products/:code/:name /products?name=:name&amp;code=:code
/submit-form https://static-form.example.com/submit 307</code></pre>
            
    <div>
      <h2>Get started</h2>
      <a href="#get-started">
        
      </a>
    </div>
    <p>Custom <a href="https://developers.cloudflare.com/pages/platform/headers">headers</a> and <a href="https://developers.cloudflare.com/pages/platform/redirects">redirects</a> for Cloudflare Pages can be configured today. Check out <a href="https://developers.cloudflare.com/pages/platform">our documentation</a> to get started, and let us know how you're using it in <a href="https://discord.gg/cloudflaredev">our Discord server</a>. We'd love to hear about what this unlocks for your projects!</p>
    <div>
      <h2>Coming up...</h2>
      <a href="#coming-up">
        
      </a>
    </div>
    <p>And finally, if a <code>_headers</code> file and enhanced support for <code>_redirects</code> just isn't enough for you, we also have something <i>big</i> coming very soon which will give you the power to build even more powerful projects. Stay tuned!</p> ]]></content:encoded>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Cloudflare Pages]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">2Svf2ExXnAO2AwvgiHaSDg</guid>
            <dc:creator>Nevi Shah</dc:creator>
            <dc:creator>Greg Brimble</dc:creator>
        </item>
        <item>
            <title><![CDATA[Crawler Hints Update: Cloudflare Supports IndexNow and Announces General Availability]]></title>
            <link>https://blog.cloudflare.com/cloudflare-now-supports-indexnow/</link>
            <pubDate>Mon, 18 Oct 2021 16:30:53 GMT</pubDate>
            <description><![CDATA[ Crawler Hints now supports IndexNow, a new protocol that allows websites to notify search engines whenever content on their website is created, updated, or deleted.  ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2iltb3QjEq4Bh3H1Cl9wLe/ff96685b7792d346d4810ecbf1be0787/web-crawling.png" />
            
            </figure><p>In the midst of the <a href="https://www.nbcnews.com/science/environment/us-just-hottest-summer-record-rcna1957">hottest summer on record</a>, Cloudflare held its first ever <a href="https://www.cloudflare.com/impact-week/">Impact Week</a>. We announced a variety of products and initiatives that aim to make the Internet and our planet a better place, with a focus on environmental, social, and governance projects. Today, we’re excited to share an update on Crawler Hints, an initiative announced during Impact Week. <a href="/crawler-hints-how-cloudflare-is-reducing-the-environmental-impact-of-web-searches/">Crawler Hints</a> is a service that improves the operating efficiency of the approximately <a href="https://radar.cloudflare.com/">45%</a> of Internet traffic that comes from web crawlers and bots.</p><p>Crawler Hints achieves this efficiency improvement by ensuring that crawlers get information about what they’ve crawled previously and whether it makes sense to crawl a website again.</p><p>Today we are excited to announce two updates for Crawler Hints:</p><ol><li><p><b>The first</b>: Crawler Hints now supports <a href="https://www.indexnow.org/">IndexNow</a>, a new protocol that allows websites to notify search engines whenever content on their website is created, updated, or deleted. By <a href="https://blogs.bing.com/webmaster/october-2021/IndexNow-Instantly-Index-your-web-content-in-Search-Engines">collaborating with Microsoft</a> and Yandex, Cloudflare can help improve the efficiency of their search engine infrastructure, customer origin servers, and the Internet at large.</p></li><li><p><b>The second</b>: Crawler Hints is now generally available to all Cloudflare customers for free. Customers can benefit from these more efficient crawls with a single button click. If you want to enable Crawler Hints, you can do so in the <b>Cache Tab</b> of the Dashboard.</p></li></ol>
    <div>
      <h3>What problem does Crawler Hints solve?</h3>
      <a href="#what-problem-does-crawler-hints-solve">
        
      </a>
    </div>
    <p>Crawlers help make the Internet work. Crawlers are automated services that travel the Internet looking for… well, whatever they are programmed to look for. To power experiences that rely on indexing content from across the web, search engines and similar services operate massive networks of bots that crawl the Internet to identify the content most relevant to a user query. But because content on the web is always changing, and there is no central clearinghouse for <i>when</i> these changes happen on websites, search engine crawlers have a Sisyphean task. They must continuously wander the Internet, making guesses on how frequently they should check a given site for updates to its content.</p><p>Companies that run search engines have worked hard to make the process as efficient as possible, pushing the state-of-the-art for crawl cadence and infrastructure efficiency. But there remains one clear area of waste: excessive crawl.</p><p>At Cloudflare, we see traffic from all the major search crawlers, and have spent the last year studying how often these bots revisit a page that hasn't changed since they last saw it. Every one of these visits is a waste. And, unfortunately, our observation suggests that 53% of this crawler traffic is wasted.</p><p>With Crawler Hints, we expect to make this task a bit more tractable by providing an additional heuristic to the people who run these crawlers. This will allow them to know when content has been changed or added to a site instead of relying on preferences or previous changes that might not reflect the true change cadence for a site. <b>Crawler Hints aims to increase the proportion of relevant crawls and limit crawls that don’t find fresh content, improving customer experience and reducing the need for repeated crawls.</b></p><p>Cloudflare sits in a unique position on the Internet to help give crawlers hints about when they should recrawl a site. 
Don’t knock on a website’s door every 30 seconds to see if anything is new when Cloudflare can proactively tell your crawler when it’s the best time to index new or changed content. That’s Crawler Hints in a nutshell!</p><p>If you want to learn more about Crawler Hints, see the <a href="/crawler-hints-how-cloudflare-is-reducing-the-environmental-impact-of-web-searches/">original blog</a>.</p>
    <div>
      <h3>What is IndexNow?</h3>
      <a href="#what-is-indexnow">
        
      </a>
    </div>
    <p><a href="https://www.indexnow.org/">IndexNow</a> is a standard that was <a href="https://blogs.bing.com/webmaster/october-2021/IndexNow-Instantly-Index-your-web-content-in-Search-Engines">written by Microsoft</a> and Yandex. The standard aims to provide an efficient way of signaling to search engines and other crawlers when they should crawl content. Cloudflare’s Crawler Hints now supports IndexNow.</p><blockquote><p><i>In its simplest form, IndexNow is a simple ping so that search engines know that a URL and its content has been added, updated, or deleted, allowing search engines to quickly reflect this change in their search results.</i> - <a href="https://www.indexnow.org/">www.indexnow.org</a></p></blockquote><p>By enabling Crawler Hints on your website, with the simple click of a button, Cloudflare will take care of signaling to these search engines via the IndexNow protocol when your content has changed. You don’t need to do anything else!</p><p>What does this mean for search engine operators? With Crawler Hints you’ll receive a near real-time, pushed feed of change events for Cloudflare websites (that have opted in). This, in turn, will dramatically improve not just the quality of your results, but also the energy efficiency of running your bots.</p>
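<p>The protocol really is that simple: in its basic form, a single GET request carrying the changed URL and a site-owner key. A sketch of building that ping (the endpoint and key values here are illustrative; see indexnow.org for the full specification, including batch submission via POST):</p>

```python
from urllib.parse import urlencode

def indexnow_ping_url(endpoint, changed_url, key):
    """Build an IndexNow notification URL. Issuing a GET to it tells the
    search engine that `changed_url` was added, updated, or deleted.
    Per the spec, `key` must also be hosted as a text file on your site
    so the engine can verify you own it."""
    return f"{endpoint}?{urlencode({'url': changed_url, 'key': key})}"
```

<p>With Crawler Hints enabled, Cloudflare issues these notifications on your behalf, so the sketch is only for readers curious about what happens under the hood.</p>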
    <div>
      <h3>Collaborating with Industry leaders</h3>
      <a href="#collaborating-with-industry-leaders">
        
      </a>
    </div>
    <p>Cloudflare is in a unique position to have a <a href="https://w3techs.com/technologies/overview/proxy">sizable portion of the Internet</a> proxied behind us. As a result, we are able to see trends in the way bots access web resources. That visibility allows us to be proactive about signaling which crawls are required and which are not. We are excited to work with partners to make these insights useful to our customers. Search engines are key constituents in this equation. We are happy to collaborate and share this vision of a more efficient Internet with Microsoft Bing and Yandex, and we have been testing our interaction via IndexNow with both for months, with some early successes.</p><p>This is just the beginning. Crawler Hints is a continuous process that will require working with more and more partners to improve Internet efficiency more generally. While this may take time and participation from other key parts of the industry, we are open to collaborating with any interested participant who relies on crawling to power user experiences.</p><blockquote><p><i>“The cache data from CDNs is a really valuable signal for content freshness. Cloudflare, as one of the top CDNs, is key in the adoption of IndexNow to become an industry-wide standard with a large portion of the internet actually using it. Cloudflare has built a really easy 1-click button for their users to start using it right away. Cloudflare’s mission of helping build a better Internet resonates well with why I started IndexNow i.e. to build a more efficient and effective Search.”</i><b>- </b><b><i><b>Fabrice Canel</b></i></b><b><i>, Principal Program Manager</i></b></p></blockquote>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6KNZYN7jyh3Tq4IzxLloof/b333dd352855c57f61a50a5e956c3fe7/Screen-Shot-2021-10-18-at-8.25.56-AM.png" />
            
            </figure><blockquote><p><i>“Yandex is excited to join IndexNow as part of our long-term focus on sustainability. We have been working with the Cloudflare team in early testing to incorporate their caching signals in our crawling mechanism via the IndexNow API. The results are great so far.”</i><b>- </b><b><i><b>Maxim Zagrebin</b></i></b><b><i>, Head of Yandex Search</i></b></p></blockquote>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2Wfd1Ma47SAZCEzzRrftnX/d86515ecd41770e133e650892a013436/Screen-Shot-2021-10-18-at-8.25.50-AM.png" />
            
            </figure><blockquote><p><i>"DuckDuckGo is supportive of anything that makes search more environmentally friendly and better for end users without harming privacy. We're looking forward to working with Cloudflare on this proposal."</i><b>- </b><b><i><b>Gabriel Weinberg</b></i></b><b><i>, CEO and Founder</i></b></p></blockquote>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/35W6NfS2GN9ta5qN68BhUg/5020126c5a739758e4b54034f96eb15c/Horizontal_Default--1-.jpg" />
            
            </figure>
    <div>
      <h3>How do Cloudflare customers benefit?</h3>
      <a href="#how-do-cloudflare-customers-benefit">
        
      </a>
    </div>
    <p>Crawler Hints doesn’t just benefit search engines. For our customers and origin owners, Crawler Hints ensures that search engines and other bot-powered experiences always have the freshest version of your content, translating into happier users and ultimately influencing search rankings. Crawler Hints will also mean less traffic hitting your origin, reducing resource consumption. Your site’s performance will improve too: your human customers will not be competing with bots!</p><p>And for Internet users? When you interact with bot-fed experiences — which we all do every day, whether we realize it or not, like search engines or pricing tools — these will now deliver more useful results from crawled data, because Cloudflare has signaled to the owners of the bots the moment they need to update their results.</p>
    <div>
      <h3>How can I enable Crawler Hints for my website?</h3>
      <a href="#how-can-i-enable-crawler-hints-for-my-website">
        
      </a>
    </div>
    <p>Crawler Hints is free to use for all Cloudflare customers and promises to revolutionize web efficiency. If you’d like to see how Crawler Hints can improve how your website is indexed by the world’s biggest search engines, please feel free to opt into the service:</p><ol><li><p>Sign in to your Cloudflare Account.</p></li><li><p>In the dashboard, navigate to the <b>Cache</b> tab.</p></li><li><p>Click on the <b>Configuration</b> section.</p></li><li><p>Locate the Crawler Hints sign-up card and enable it. It’s that easy.</p></li></ol>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6csTJTB4b54BV7Lw1Mn9wC/852526fff3dbcc949586bda216809009/Crawler.png" />
            
            </figure><p>Once you’ve enabled it, we will begin sending hints to search engines about when they should crawl particular parts of your website. Crawler Hints holds tremendous promise to improve the efficiency of the Internet.</p>
    <div>
      <h3>What’s next?</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>We’re thrilled to collaborate with industry leaders Microsoft Bing and Yandex to bring IndexNow to Crawler Hints, and to bring Crawler Hints to a wide audience in general availability. We look forward to working with additional companies that run crawlers to help make this process more efficient for the whole Internet.</p>
            <category><![CDATA[Crawler Hints]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">2HzZcqvqFaoDkhIKR2kkVF</guid>
            <dc:creator>Alex Krivit</dc:creator>
            <dc:creator>Abhi Das</dc:creator>
        </item>
        <item>
            <title><![CDATA[Improving Performance and Search Rankings with Cloudflare for Fun and Profit]]></title>
            <link>https://blog.cloudflare.com/improving-performance-and-search-rankings-with-cloudflare-for-fun-and-profit/</link>
            <pubDate>Thu, 19 Nov 2020 22:38:20 GMT</pubDate>
            <description><![CDATA[ Making things fast is one of the things we do at Cloudflare. More responsive websites, apps, APIs, and networks directly translate into improved conversion and user experience.  ]]></description>
            <content:encoded><![CDATA[ <p>Making things fast is one of the things we do at Cloudflare. More responsive websites, apps, APIs, and networks directly translate into improved conversion and user experience. On November 10th, <a href="https://developers.google.com/search/blog/2020/11/timing-for-page-experience">Google announced</a> that Google Search will directly take web performance and page experience data into account when ranking results on their search engine results pages (SERPs), beginning in May 2021.</p><p>Specifically, Google Search will prioritize results based on how pages score on <a href="https://www.cloudflare.com/learning/performance/what-are-core-web-vitals/">Core Web Vitals</a>, a measurement methodology that Cloudflare worked closely with Google to establish and that we support in our analytics tools.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1qLZjGPDFLkD51Nd74CQZM/0fd589c0fbd657883f03e4ba4ac3246c/1-4.png" />
            
            </figure><p>Source: "Search Page Experience Graphic" by Google is licensed under CC BY 4.0</p><p>The Core Web Vitals metrics are Largest Contentful Paint (LCP, a loading measurement), First Input Delay (FID, a measure of interactivity), and Cumulative Layout Shift (CLS, a measure of visual stability). Each one is directly associated with user perceptible page experience milestones. All three can be improved using our performance products, and all three can be <a href="/start-measuring-web-vitals-with-browser-insights/">measured with our Cloudflare Browser Insights product</a>, and soon, with our free privacy-aware <a href="https://www.cloudflare.com/web-analytics/">Cloudflare Web Analytics</a>.</p><p>SEO experts have always suspected faster pages lead to better search ranking. With the recent announcement from Google, we can say with confidence that <b>Cloudflare helps you achieve the web performance trifecta</b>: our product suite makes your site faster, gives you direct visibility into how it is performing (and use that data to iteratively improve), and directly drives improved search ranking and business results.</p><blockquote><p><i>"Google providing more transparency about how Search ranking works is great for the open Web. The fact they are ranking using real metrics that are easy to measure with tools like Cloudflare's analytics suite makes Google's recent announcement all the more exciting. Cloudflare offers a full set of tools to make sites incredibly fast and measure ‘incredibly’ directly."</i></p><p>– <b>Matt Weinberg</b>, president of <a href="https://www.happycog.com/">Happy Cog</a>, a full-service digital agency.</p></blockquote>
    <div>
      <h3>Cloudflare helps make your site faster</h3>
      <a href="#cloudflare-helps-make-your-site-faster">
        
      </a>
    </div>
    <p>Cloudflare offers a diverse, easy-to-deploy set of products to improve page experience for your visitors. We offer a rich, configurable set of tools to improve page speed, which this post is too small to contain. But unlike Fermat, who famously described a math problem, claimed “the margin is too small to contain the solution,” and left folks to spend three-hundred-plus years unraveling his enigma, I’m going to tell you how to solve web performance problems with Cloudflare. Here are the highlights:</p>
    <div>
      <h3>Caching and Smart Routing</h3>
      <a href="#caching-and-smart-routing">
        
      </a>
    </div>
    <p>The typical website is composed of a mix of static assets, like images and product descriptions, and dynamic content, like the contents of a shopping cart or a user’s profile page. Cloudflare caches customers’ static content at our edge, avoiding the need for a full roundtrip to origin servers each time content is requested. Because our edge network places content very close (in physical terms) to users, there is less distance to travel and page loads are consequently faster. Thanks, Einstein.</p><p>And Argo Smart Routing helps speed page loads that require dynamic content. It analyzes and optimizes routing decisions across the global Internet in real-time. Think Waze, the automobile route optimization app, but for Internet traffic.</p><p>Just as Waze can tell you which route to take when driving by monitoring which roads are congested or blocked, Smart Routing can route connections across the Internet efficiently by avoiding packet loss, congestion, and outages.</p><p>Using caching and Smart Routing directly improves page speed and experience scores like Web Vitals. With Google's recent announcement, this also means improved search ranking.</p>
    <div>
      <h3>Content optimization</h3>
      <a href="#content-optimization">
        
      </a>
    </div>
    <p>Caching and Smart Routing are designed to reduce and speed up round trips from your users to your origin servers, respectively. Cloudflare also offers features to <i>optimize</i> the content we do serve.</p><p>Cloudflare Image Resizing allows on-demand sizing, quality, and format adjustments to images, including the ability to convert images to modern file formats like WebP and AVIF.</p><p>Delivering images this way to your end-users helps you save bandwidth costs and improve performance, since Cloudflare allows you to optimize images already cached at the edge.</p><p>For WordPress operators, we recently launched Automatic Platform Optimization (APO). With APO, Cloudflare will serve your entire site from our edge network, ensuring that customers see improved performance when visiting your site. By default, Cloudflare only caches static content, but with APO we can also cache dynamic content (like HTML) so the entire site is served from cache. This removes round trips to the origin, drastically improving TTFB and other site performance metrics. In addition to caching dynamic content, APO caches third-party scripts to further reduce the need to make requests that leave Cloudflare's edge network.</p>
    <div>
      <h3>Workers and Workers Sites</h3>
      <a href="#workers-and-workers-sites">
        
      </a>
    </div>
    <p>Reducing load on customer origins and making sure we serve the right content to the right clients at the right time are great, but what if customers want to take things a step further and eliminate origin round trips entirely? What if there <i>was no origin</i>? Before we get into Schrödinger’s cat/server territory, we can make this concrete: Cloudflare offers tools to serve entire websites from our edge, without an origin server being involved at all. For more on Workers Sites, check out our <a href="/workers-sites/">introductory blog post</a> and peruse our <a href="https://workers.cloudflare.com/built-with">Built With Workers</a> project gallery.</p><p>As big proponents of dogfooding, many of Cloudflare’s own web properties are deployed to Workers Sites, and we use Web Vitals to measure our customers’ experiences.</p><p>Using Workers Sites, our <a href="https://developers.cloudflare.com/">developers.cloudflare.com</a> site, which gets hundreds of thousands of visits a day and is critical to developers building atop our platform, is able to attain incredible Web Vitals scores:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3icEZxB7WvbUhwQbgspKCA/831b6bd4998312cfb1884a7bf8ec921e/2-2.png" />
            
            </figure><p>These scores are superb, showing the performance and ease of use of our edge, our static website delivery system, and our analytics toolchain.</p>
    <div>
      <h3>Cloudflare Web Analytics and Browser Insights directly measure the signals Google is prioritizing</h3>
      <a href="#cloudflare-web-analytics-and-browser-insights-directly-measure-the-signals-google-is-prioritizing">
        
      </a>
    </div>
    <p>As illustrated above, <a href="/start-measuring-web-vitals-with-browser-insights/#:~:text=Web%20Vitals%20are%20a%20new,data%20from%20the%20whole%20web.">Cloudflare makes it easy</a> to directly measure Web Vitals with Browser Insights. Enabling Browser Insights for websites proxied by Cloudflare takes one click in the Speed tab of the Cloudflare dashboard. And if you’re <i>not</i> proxying sites through Cloudflare, Web Vitals measurements will be supported in our <a href="/free-privacy-first-analytics-for-a-better-web/">upcoming, free, Cloudflare Web Analytics product</a> that any site, using Cloudflare’s proxy or not, can use.</p><p>Web Vitals breaks down user experience into three components:</p><ul><li><p>Loading: How long did it take for content to become available?</p></li><li><p>Interactivity: How responsive is the website when you interact with it?</p></li><li><p>Visual stability: How much does the page move around while loading?</p></li></ul>
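    <p>To make the visual stability component concrete, here is a minimal sketch of how a CLS-style score accumulates. This is an illustration of the metric’s public definition (entries shaped like the Layout Instability API’s <code>LayoutShift</code> entries are assumed), not Cloudflare’s measurement code:</p>

```javascript
// Illustrative sketch: a CLS-style score is the running sum of layout-shift
// values, skipping shifts that immediately follow user input (those are
// expected, so they don't count against the page).
// Assumes entries shaped like the Layout Instability API's LayoutShift
// entries, with `value` and `hadRecentInput` fields.
function cumulativeLayoutShift(layoutShiftEntries) {
  let cls = 0;
  for (const entry of layoutShiftEntries) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  return cls;
}

// Each individual shift's value is impact fraction times distance fraction:
// how much of the viewport was affected, times how far content moved.
function layoutShiftValue(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}
```

    <p>In a real page, these entries come from a <code>PerformanceObserver</code> watching <code>layout-shift</code> entries; the functions above only illustrate the arithmetic.</p>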
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/54Igksx7thHQiDLwzhpeFz/472224ed33ef956018811bafbc07a323/3.png" />
            
            </figure><p>This <a href="https://web.dev/vitals/">image</a> is reproduced from work created and <a href="https://developers.google.com/terms/site-policies">shared by Google</a> and used according to terms described in the <a href="https://creativecommons.org/licenses/by/4.0/">Creative Commons 4.0 Attribution License</a>.</p><p>It’s challenging to create a single metric that captures these high-level components. Thankfully, the folks on the Google Chrome team have thought about this, and earlier this year introduced three “Core” Web Vitals metrics: <a href="https://web.dev/lcp/">Largest Contentful Paint</a>, <a href="https://web.dev/fid/">First Input Delay</a>, and <a href="https://web.dev/cls/">Cumulative Layout Shift</a>.</p><p>Cloudflare Browser Insights measures all three metrics directly in your users’ browsers, all with one-click enablement from the Cloudflare dashboard.</p><p>Once enabled, Browser Insights works by inserting a JavaScript "beacon" into HTML pages. You can control where the beacon loads if you only want to measure specific pages or hostnames. If you’re using CSP version 3, we’ll even automatically detect the nonce (if present) and add it to the script.</p><p>To start using Browser Insights, just head over to the Speed tab in the dashboard.</p>
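    <p>That nonce-detection step can be pictured with a small sketch (hypothetical code, not the actual beacon injector): look for a nonce attribute on an existing script tag and reuse it on the tag being inserted.</p>

```javascript
// Hypothetical sketch of CSP-aware beacon injection. If the page already
// carries a script nonce, reuse it so the injected tag satisfies a CSP v3
// nonce policy; otherwise, inject a plain tag.
function injectBeacon(html, beaconSrc) {
  const match = html.match(/<script[^>]*\snonce="([^"]+)"/i);
  const nonceAttr = match ? ` nonce="${match[1]}"` : "";
  const beaconTag = `<script${nonceAttr} defer src="${beaconSrc}"></script>`;
  // Insert just before </body>, falling back to appending at the end.
  return html.includes("</body>")
    ? html.replace("</body>", `${beaconTag}</body>`)
    : html + beaconTag;
}
```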
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2IHRjBAijCoedt55p1Ezne/d1082ff70c6e99bec6cde854e622ea96/4.png" />
            
            </figure><p><i>An example Browser Insights report, showing what pages on blog.cloudflare.com need improvement.</i></p>
    <div>
      <h3>Making pages fast is better for everyone</h3>
      <a href="#making-pages-fast-is-better-for-everyone">
        
      </a>
    </div>
    <p>Google’s announcement that Web Vitals measurements will be a key part of search ranking starting in May 2021 places even more emphasis on running fast, accessible websites.</p><p>Using Cloudflare’s performance tools, like our best-of-breed caching, Argo Smart Routing, content optimization, and Cloudflare Workers® products, directly improves page experience and Core Web Vitals measurements, and now, very directly, where your pages appear in Google Search results. And you don’t have to take our word for this — our analytics tools directly measure Web Vitals scores, instrumenting your real users’ experiences.</p><p>We’re excited to help our customers build fast websites, understand exactly <i>how</i> fast they are, and rank highly on Google search as a result. Render on!</p> ]]></content:encoded>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Performance]]></category>
            <category><![CDATA[Browser Insights]]></category>
            <category><![CDATA[Analytics]]></category>
            <guid isPermaLink="false">2Re8RrgBMhISwGoB1YQw1r</guid>
            <dc:creator>Rustam Lalkaka</dc:creator>
            <dc:creator>Rita Kozlov</dc:creator>
        </item>
        <item>
            <title><![CDATA[Diving into Technical SEO using Cloudflare Workers]]></title>
            <link>https://blog.cloudflare.com/diving-into-technical-seo-cloudflare-workers/</link>
            <pubDate>Thu, 07 Mar 2019 16:05:41 GMT</pubDate>
            <description><![CDATA[ With this post we illustrate the potential applications of Cloudflare Workers in relation to search engine optimization, which is more commonly referred to as ‘SEO’, using our research and testing over the past year making Sloth. ]]></description>
            <content:encoded><![CDATA[ <p><i>This is a guest post by Igor Krestov and Dan Taylor. Igor is a lead software developer at SALT.agency, and Dan a lead technical SEO consultant, who has also been credited with </i><a href="https://searchengineland.com/service-workers-and-seo-seo-for-developers-311292"><i>coining the term “edge SEO”</i></a><i>. </i><a href="https://salt.agency/"><i>SALT.agency</i></a><i> is a technical SEO agency with offices in London, Leeds, and Boston, offering bespoke consultancy to brands around the world. You can reach them both via </i><a href="https://twitter.com/salt_agency"><i>Twitter</i></a><i>.</i></p><p>With this post we illustrate the potential applications of <a href="https://www.cloudflare.com/products/cloudflare-workers/">Cloudflare Workers</a> in relation to search engine optimization, which is more commonly referred to as ‘SEO’, using our research and testing over the past year making Sloth.</p><p>This post is aimed both at readers who are proficient in writing performant JavaScript and at complete newcomers and less technical stakeholders who haven’t really written many lines of code before.</p>
    <div>
      <h2>Endless practical applications to overcome obstacles</h2>
      <a href="#endless-practical-applications-to-overcome-obstacles">
        
      </a>
    </div>
    <p>Working with various clients and projects over the years, we’ve continuously encountered the same problems and obstacles in getting their websites to a point of “technical SEO excellence”. A lot of these problems come from platform restrictions at an enterprise level, legacy tech stacks, incorrect builds, and years of patching together various services and infrastructures.</p><p>As a team of technical SEO consultants, we can often be left frustrated by these barriers, which often lead to essential fixes and implementations either being impossible or delayed for months (if not years) at a time – and in this time, the business is often losing traffic and revenue.</p><p>Workers offers us a Hail Mary solution to a lot of common frustrations in getting technical SEO implemented, and we believe that in the long run it can become an integral part of overcoming legacy issues, reducing DevOps costs, and speeding up lead times, all in addition to utilising a globally distributed serverless platform with blazing fast cold start times.</p>
    <div>
      <h2>Creating accessibility at scale</h2>
      <a href="#creating-accessibility-at-scale">
        
      </a>
    </div>
    <p>When we first started out, we needed to implement simple redirects, which should be easy to create on the majority of platforms but weren’t supported in this instance.</p><p>When the second barrier arose, we needed to inject Hreflang tags, cross-linking an old multi-lingual website on a bespoke platform built to an outdated spec. This required experiments to find an efficient way of implementing the tags without increasing latency or adding new code to the server – in a manner befitting of search engine crawling.</p><p>At this point we had a number of other applications for Workers, along with an arising need for non-developers to be able to modify and deploy new Worker code. This has since grown into the idea of Worker code generation, via Web UI or command line.</p><p>Having established a number of different use cases for Workers, we identified three processing phases:</p><ul><li><p>Incoming request modification – changing the origin request URL or adding authorization headers.</p></li><li><p>Outgoing response modification – adding security headers, Hreflang header injection, logging.</p></li><li><p>Response body modification – injecting/changing content, e.g. canonicals, robots, and JSON-LD.</p></li></ul><p>We wanted to generate lean Worker code, which would enable us to keep each piece of functionality contained and independent of the others, and went with the idea of filter chains, which can be used to compose fairly complex request processing.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5XNqSuHWNflu8gXityDn7E/71deac0da1e6aa40f55fb22e387e3ffd/request-filter-chain.png" />
            
            </figure><p>A request chain depicting the path of a request as it is transformed while moving from client to origin server and back again.</p><p>A key accessibility issue we identified from a non-technical perspective was the goal of making this serverless technology accessible to all in SEO, because with understanding comes buy-in from stakeholders. In order to do this, we had to make Workers:</p><ul><li><p>Accessible to users who don’t understand how to write JavaScript, or performant JavaScript</p></li><li><p>Process of implementation can complement existing deployment processes</p></li><li><p>Process of implementation is secure (internally and externally)</p></li><li><p>Process of implementation is compliant with data and privacy policies</p></li><li><p>Implementations must be able to be verified through existing processes and practices (BAU)</p></li></ul><p>Before we dive into actual filters, here are partial TypeScript interfaces to illustrate filter APIs:</p>
            <pre><code>interface FilterExecutor&lt;Type, Context, ReturnType extends Type | void&gt; {
    apply(filterChain: { next: (c: Context, obj: Type) =&gt; ReturnType | Promise&lt;ReturnType&gt; }, context: Context, obj: Type): ReturnType | Promise&lt;ReturnType&gt;;
}
interface RequestFilterContext {
    // Request URL
    url: URL;
    // Short-circuit request filters 
    respondWith(response: Response | Promise&lt;Response&gt;): void;
    // Short-circuit all filters
    respondWithAndStop(response: Response | Promise&lt;Response&gt;): void;
    // Add additional response filter
    appendResponseFilter(filter: ResponseFilter): void;
    // Add body filter
    appendBodyFilter(filter: BodyFilter): void;
}
interface RequestFilter extends FilterExecutor&lt;Request, RequestFilterContext, Request&gt; { };
interface ResponseFilterContext {
    readonly startMs: number;
    readonly endMs: number;
    readonly url: URL;
    waitUntil(promise: Promise&lt;any&gt;): void;
    respondWith(response: Response | Promise&lt;Response&gt;): void;
    respondWithAndStop(response: Response | Promise&lt;Response&gt;): void;
    appendBodyFilter(filter: BodyFilter): void;
}
interface ResponseFilter extends FilterExecutor&lt;Response, ResponseFilterContext, Response&gt; { };
interface BodyFilterContext {
    waitUntil(promise: Promise&lt;any&gt;): void;
}
interface ChunkChain {
    next: ChunkChain | null;
    chunk: Uint8Array;
}
interface BodyFilter extends MutableFilterExecutor&lt;ChunkChain | null, BodyFilterContext, ChunkChain | null&gt; { };</code></pre>
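            <p>To make the interfaces above concrete, here is a minimal, runnable sketch of how filters of this shape can be chained together (illustrative only; the names and composition are ours, not the actual Sloth implementation):</p>

```javascript
// Compose filters so each one either handles the request itself or hands
// off to the next filter via `chain.next`, mirroring the interfaces above.
function buildChain(filters, terminal) {
  // reduceRight wraps from the inside out, so filters run in array order.
  return filters.reduceRight(
    (next, filter) => (context, obj) => filter.apply({ next }, context, obj),
    terminal
  );
}

// A request filter that rewrites a URL prefix, then delegates onward.
const rewriteFilter = {
  apply(chain, context, request) {
    context.url = context.url.replace("/old/", "/new/");
    return chain.next(context, request);
  },
};

// A request filter that short-circuits with a redirect for one exact URL.
const redirectFilter = {
  apply(chain, context, request) {
    if (context.url === "/new/home") return { status: 301, location: "/" };
    return chain.next(context, request);
  },
};

// The terminal handler stands in for fetching from the origin.
const handle = buildChain(
  [rewriteFilter, redirectFilter],
  (context, request) => ({ status: 200, url: context.url })
);
```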
            <p>Request filter — Simple Redirects</p><hr /><p>Firstly, we would like to point out that this is a very niche use case: if your platform supports redirects natively, you should absolutely do it through your platform. But there are a number of limited, legacy or bespoke platforms where redirects are not supported, are limited, or are charged for (per line) by your <a href="https://www.cloudflare.com/developer-platform/solutions/hosting/">hosting</a> or platform. For example, GitHub Pages only supports redirects via an HTML meta refresh tag.</p><p>The most basic redirect filter would look like this:</p>
            <pre><code>class RedirectRequestFilter {
    constructor(redirects) {
        this.redirects = redirects;
    }

    apply(filterChain, context, request) {
        const redirect = this.redirects[context.url.href];
        if (redirect)
            context.respondWith(new Response('', {
                status: 301,
                headers: { 'Location': redirect }
            }));
        else
            return filterChain.next(context, request);
    }
}

const { requestFilterHandle } = self.slothRequire('./worker.js');
requestFilterHandle.append(new RedirectRequestFilter({
    "https://sloth.cloud/old-homepage": "https://sloth.cloud/"
}));</code></pre>
            <p>You can see it live in Cloudflare’s playground <a href="https://cloudflareworkers.com/#f59a71db84c026a5d2bee72170113f26:https://sloth.cloud/old-homepage">here</a>.</p><p>The one implemented in Sloth supports basic path matching, hostname matching and query string matching, as well as wildcards.</p>
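            <p>As an illustration of what wildcard support can look like (a sketch under our own naming, not Sloth’s actual matcher), each rule’s <code>*</code> can be compiled to a regular expression, and the first matching rule wins:</p>

```javascript
// Compile a wildcard redirect pattern into an anchored RegExp.
// '*' matches any run of characters; everything else is literal.
function wildcardToRegExp(pattern) {
  // Escape regex metacharacters, then turn the escaped '\*' back into '.*'.
  const escaped = pattern.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  return new RegExp("^" + escaped.replace(/\\\*/g, ".*") + "$");
}

// Walk the rules in order and return the first matching target, or null.
function findRedirect(rules, url) {
  for (const [pattern, target] of rules) {
    if (wildcardToRegExp(pattern).test(url)) return target;
  }
  return null;
}
```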
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7bOFFqK57Xr5IsGHGsxCyd/683f3bb5a86104abf6112949e4ddcb6d/sloth-redirect-manager.png" />
            
            </figure><p>The Sloth dashboard for visually creating and modifying redirects.</p><p>It is all well and good when you do not have a lot of redirects to manage, but what do you do when the size of your redirect list starts to take up a significant share of the memory available to a Worker? This is where we faced another scaling issue: going from a small handful of possible redirects to tens of thousands.</p>
    <div>
      <h2>Managing Redirects with Workers KV and Cuckoo Filters</h2>
      <a href="#managing-redirects-with-workers-kv-and-cuckoo-filters">
        
      </a>
    </div>
    <p>Well, here is one way you can solve it by using <a href="https://developers.cloudflare.com/workers/kv/">Workers KV</a>, a key-value data store.</p><p>Instead of hard-coding redirects inside Worker code, we will store them inside Workers KV. A naive approach would be to check for a redirect on each URL. But with Workers KV, maximum performance is not reached until a key is being read on the order of once per second in any given data center.</p><p>An alternative is to use a probabilistic data structure, like <a href="https://en.wikipedia.org/wiki/Cuckoo_hashing">Cuckoo Filters</a>, stored in KV, possibly split between a couple of keys as KV is limited to 64KB. Such a filter flow would be:</p><ol><li><p>Retrieve the frequently read filter key.</p></li><li><p>Check whether the full URL (or only the pathname) is in the filter.</p></li><li><p>Get the redirect from Workers KV using the URL as a key.</p></li></ol><p>In our tests, we managed to pack 20 thousand redirects into a Cuckoo Filter taking up 128KB, split between 2 keys, verified against 100 thousand active URLs with a false-positive rate of 0.5-1%.</p>
    <div>
      <h2>Body filter - Hreflang Injection</h2>
      <a href="#body-filter-hreflang-injection">
        
      </a>
    </div>
    <p>Hreflang meta tags need to be placed inside the HTML head element, so before actually injecting them, we need to find either the start or the end of the head HTML tag, which in itself is a streaming search problem.</p><p>The caveat here is that the naive method of decoding UTF-8 into a JavaScript string, performing the search, and re-encoding back into UTF-8 is fairly slow. Instead, we attempted pure JavaScript search on byte strings (<i>Uint8Array</i>), which straight away showed promising results.</p><p>For our use case, we picked the <a href="https://en.wikipedia.org/wiki/Boyer%E2%80%93Moore%E2%80%93Horspool_algorithm">Boyer-Moore-Horspool</a> algorithm as the base of our streaming search, as it is simple, has great average-case performance, and only requires pre-processing the search pattern, with manual prefix/suffix matching at chunk boundaries.</p><p>Here is a comparison of the methods we tested on Node v10.15.0:</p>
            <pre><code>| Chunk Size | Method                               | Ops/s               |
|------------|--------------------------------------|---------------------|
|            |                                      |                     |
| 1024 bytes | Boyer-Moore-Horspool over byte array | 163,086 ops/sec     |
| 1024 bytes | **precomputed BMH over byte array**  | **424,948 ops/sec** |
| 1024 bytes | decode utf8 into strings &amp; indexOf() | 91,685 ops/sec      |
|            |                                      |                     |
| 2048 bytes | Boyer-Moore-Horspool over byte array | 119,634 ops/sec     |
| 2048 bytes | **precomputed BMH over byte array**  | **232,192 ops/sec** |
| 2048 bytes | decode utf8 into strings &amp; indexOf() | 52,787 ops/sec      |
|            |                                      |                     |
| 4096 bytes | Boyer-Moore-Horspool over byte array | 78,729 ops/sec      |
| 4096 bytes | **precomputed BMH over byte array**  | **117,010 ops/sec** |
| 4096 bytes | decode utf8 into strings &amp; indexOf() | 25,835 ops/sec      |</code></pre>
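            <p>For reference, byte-array search of this kind looks roughly like the sketch below (Boyer-Moore-Horspool with the shift table precomputed once per pattern; the chunk-boundary prefix/suffix matching mentioned above is omitted). The "precomputed" rows in the table correspond to building the shift table once and reusing it for every chunk:</p>

```javascript
// Build the Boyer-Moore-Horspool bad-character shift table for a byte
// pattern. Default shift is the full pattern length; bytes that occur in
// the pattern (except its last position) allow smaller shifts.
function buildShiftTable(pattern) {
  const table = new Uint32Array(256).fill(pattern.length);
  for (let i = 0; i < pattern.length - 1; i++) {
    table[pattern[i]] = pattern.length - 1 - i;
  }
  return table;
}

// Search `haystack` (a Uint8Array) for `pattern`, reusing a precomputed
// shift table. Returns the byte offset of the first match, or -1.
function bmhIndexOf(haystack, pattern, table) {
  const n = haystack.length;
  const m = pattern.length;
  let i = 0;
  while (i <= n - m) {
    let j = m - 1;
    while (j >= 0 && haystack[i + j] === pattern[j]) j--;
    if (j < 0) return i; // full match at offset i
    i += table[haystack[i + m - 1]]; // skip based on the last aligned byte
  }
  return -1;
}
```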
            
    <div>
      <h2>Can we do better?</h2>
      <a href="#can-we-do-better">
        
      </a>
    </div>
    <p>Having achieved a decent performance improvement with pure JavaScript search over the naive method, we wanted to see whether we could do better. As Workers support <a href="https://developers.cloudflare.com/workers/api/resource-bindings/webassembly-modules/">WASM</a>, we used Rust to build a simple WASM module, which exposed the standard Rust string search.</p>
            <pre><code>| Chunk Size | Method                              | Ops/s               |
|------------|-------------------------------------|---------------------|
|            |                                     |                     |
| 1024 bytes | Rust WASM                           | 348,197 ops/sec     |
| 1024 bytes | **precomputed BMH over byte array** | **424,948 ops/sec** |
|            |                                     |                     |
| 2048 bytes | Rust WASM                           | 225,904 ops/sec     |
| 2048 bytes | **precomputed BMH over byte array** | **232,192 ops/sec** |
|            |                                     |                     |
| 4096 bytes | **Rust WASM**                       | **129,144 ops/sec** |
| 4096 bytes | precomputed BMH over byte array     | 117,010 ops/sec     |</code></pre>
            <p>As the Rust version did not use a precomputed search pattern, it should be significantly faster if we precomputed and cached search patterns.</p><p>In our case, we were searching for a single pattern and stopping once it was found, where the pure JavaScript version was fast enough, but if you need multi-pattern, advanced search, WASM is the way to go.</p><p>We could not record a statistically significant change in latency between a basic Worker and one with a body filter deployed to production, due to unstable network latency, with a mean response latency of 150ms and a 90th-percentile standard deviation of 10%.</p>
    <div>
      <h2>What’s next?</h2>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>We believe that Workers and serverless applications can open up new opportunities to overcome a lot of issues faced by the SEO community when working with legacy tech stacks, platform limitations, and heavily congested development queues.</p><p>We are also investigating whether Workers can allow us to make a more efficient Tag Manager, which bundles and pushes only matching Tags with their code, to minimize the number of external requests caused by trackers and thus reduce load on the user's browser.</p><p>You can experiment with Cloudflare Workers yourself through <a href="https://sloth.cloud/">Sloth</a>, even if you don’t know how to write JavaScript.</p>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Serverless]]></category>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[Cloudflare Workers KV]]></category>
            <category><![CDATA[Salt]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <category><![CDATA[Developers]]></category>
            <guid isPermaLink="false">4hfeeckurXOimDtbX2H6cO</guid>
            <dc:creator>Guest Author</dc:creator>
        </item>
        <item>
            <title><![CDATA[SEO Best Practices with Cloudflare Workers, Part 2: Implementing Subdomains]]></title>
            <link>https://blog.cloudflare.com/subdomains-vs-subdirectories-improved-seo-part-2/</link>
            <pubDate>Fri, 15 Feb 2019 17:09:26 GMT</pubDate>
            <description><![CDATA[ In Part 1, the pros and cons of subdirectories vs subdomains were discussed.  The subdirectory strategy is typically superior to subdomains since subdomains suffer from keyword and backlink dilution.  ]]></description>
            <content:encoded><![CDATA[ 
    <div>
      <h4>Recap</h4>
      <a href="#recap">
        
      </a>
    </div>
    <p>In Part 1, the merits and tradeoffs of <a href="https://blog.cloudflare.com/subdomains-vs-subdirectories-best-practices-workers-part-1/"><i>subdirectories and subdomains</i></a> were discussed.  The subdirectory strategy is typically superior to subdomains because subdomains suffer from <i>keyword</i> and <i>backlink dilution</i>.  The subdirectory strategy more effectively boosts a site's search rankings by ensuring that every keyword is attributed to the root domain instead of diluting across subdomains.</p>
    <div>
      <h4>Subdirectory Strategy without the NGINX</h4>
      <a href="#subdirectory-strategy-without-the-nginx">
        
      </a>
    </div>
    <p>In the first part, our friend Bob set up a hosted Ghost blog at <i>bobtopia.coolghosthost.com</i> that he connected to <i>blog.bobtopia.com</i> using a <code>CNAME</code> DNS record.  But what if he wanted his blog to live at <i>bobtopia.com/blog</i> to gain the SEO advantages of subdirectories?</p><p>A reverse proxy like NGINX is normally needed to route traffic from subdirectories to remotely hosted services.  We'll demonstrate how to implement the subdirectory strategy with Cloudflare Workers and eliminate our dependency on NGINX. (Cloudflare Workers are <a href="https://www.cloudflare.com/learning/serverless/what-is-serverless/">serverless</a> functions that run on the Cloudflare global network.)</p>
    <div>
      <h4>Back to Bobtopia</h4>
      <a href="#back-to-bobtopia">
        
      </a>
    </div>
    <p>Let's write a Worker that proxies traffic from a subdirectory – <i>bobtopia.com/blog –</i> to a remotely hosted platform – <i>bobtopia.coolghosthost.com</i>.  This means that if I go to <i>bobtopia.com/blog</i>, I should see the content of <i>bobtopia.coolghosthost.com,</i> but my browser should still think it's on <i>bobtopia.com</i>.</p>
    <div>
      <h4>Configuration Options</h4>
      <a href="#configuration-options">
        
      </a>
    </div>
    <p>In the <a href="https://dash.cloudflare.com/?zone=workers">Workers</a> editor, we'll start a new script with some basic configuration options.</p>
            <pre><code>// keep track of all our blog endpoints here
const myBlog = {
  hostname: "bobtopia.coolghosthost.com",
  targetSubdirectory: "/articles",
  assetsPathnames: ["/public/", "/assets/"]
}</code></pre>
            <p>The script will proxy traffic from <code>myBlog.targetSubdirectory</code> to Bob's hosted Ghost endpoint, <code>myBlog.hostname</code>.  We'll talk about <code>myBlog.assetsPathnames</code> a little later.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4euHWaCm0YKJDnMpWIsFy9/90cb71bf33f1637863413a04af86f5e4/Screen-Shot-2019-01-27-at-2.47.39-PM.png" />
            
            </figure><p>Requests are proxied from bobtopia.com/articles to bobtopia.coolghosthost.com (Uh oh... the error is because the hosted Ghost blog doesn't actually exist)</p>
    <div>
      <h4>Request Handlers</h4>
      <a href="#request-handlers">
        
      </a>
    </div>
    <p>Next, we'll add a request handler:</p>
            <pre><code>async function handleRequest(request) {
  return fetch(request)
}

addEventListener("fetch", event =&gt; {
  event.respondWith(handleRequest(event.request))
})</code></pre>
            <p>So far we're just passing requests through <code>handleRequest</code> unmodified.  Let's make it do something:</p>
            <pre><code>
async function handleRequest(request) { 
  ...

  // if the request is for blog html, get it
  if (requestMatches(myBlog.targetSubdirectory)) {
    console.log("this is a request for a blog document", parsedUrl.pathname)
    const targetPath = formatPath(parsedUrl)
    
    return fetch(`https://${myBlog.hostname}/${targetPath}`)
  }

  ...
  
  console.log("this is a request to my root domain", parsedUrl.pathname)
  // if it's not a request for blog-related stuff, do nothing
  return fetch(request)
}

addEventListener("fetch", event =&gt; {
  event.respondWith(handleRequest(event.request))
})</code></pre>
            <p>In the above code, we added a conditional statement to handle traffic to <code>myBlog.targetSubdirectory</code>.  Note that we've omitted our helper functions here.  The relevant code lives inside the <code>if</code> block near the top of the function. The <code>requestMatches</code> helper checks if the incoming request contains <code>targetSubdirectory</code>.  If it does, a request is made to <code>myBlog.hostname</code> to fetch the HTML document which is returned to the browser.</p><p>When the browser parses the HTML, it makes additional asset requests required by the document (think images, stylesheets, and scripts).  We'll need another conditional statement to handle these kinds of requests.</p>
            <pre><code>// if it's blog assets, get them
if (myBlog.assetsPathnames.some(requestMatches)) {
    console.log("this is a request for blog assets", parsedUrl.pathname)
    const assetUrl = request.url.replace(parsedUrl.hostname, myBlog.hostname);

    return fetch(assetUrl)
  }</code></pre>
            <p>This similarly shaped block checks if the request matches any pathnames enumerated in <code>myBlog.assetsPathnames</code> and fetches the assets required to fully render the page.  Assets happen to live in <i>/public</i> and <i>/assets</i> on a Ghost blog.  You'll be able to identify your assets directories when you <code>fetch</code> the HTML and see logs for scripts, images, and stylesheets.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/333FXL8DTk3xbnNH3CDM2G/9818beae76d36f64663d71a99e41fcb1/Screen-Shot-2019-01-27-at-5.51.44-PM.png" />
            
            </figure><p>Logs show the various scripts and stylesheets required by Ghost live in <i>/assets</i> and <i>/public</i></p><p>The full script with helper functions included is:</p>
            <pre><code>
// keep track of all our blog endpoints here
const myBlog = {
  hostname: "bobtopia.coolghosthost.com",
  targetSubdirectory: "/articles",
  assetsPathnames: ["/public/", "/assets/"]
}

async function handleRequest(request) {
  // returns an empty string or a path if one exists
  const formatPath = (url) =&gt; {
    const pruned = url.pathname.split("/").filter(part =&gt; part)
    return pruned &amp;&amp; pruned.length &gt; 1 ? `${pruned.join("/")}` : ""
  }
  
  const parsedUrl = new URL(request.url)
  const requestMatches = match =&gt; new RegExp(match).test(parsedUrl.pathname)
  
  // if it's blog HTML, get it
  if (requestMatches(myBlog.targetSubdirectory)) {
    console.log("this is a request for a blog document", parsedUrl.pathname)
    const targetPath = formatPath(parsedUrl)
    
    return fetch(`https://${myBlog.hostname}/${targetPath}`)
  }
  
  // if it's blog assets, get them
  if (myBlog.assetsPathnames.some(requestMatches)) {
    console.log("this is a request for blog assets", parsedUrl.pathname)
    const assetUrl = request.url.replace(parsedUrl.hostname, myBlog.hostname);

    return fetch(assetUrl)
  }

  console.log("this is a request to my root domain", parsedUrl.host, parsedUrl.pathname);
  // if it's not a request for blog-related stuff, do nothing
  return fetch(request)
}

addEventListener("fetch", event =&gt; {
  event.respondWith(handleRequest(event.request))
})</code></pre>
            
    <div>
      <h4>Caveat</h4>
      <a href="#caveat">
        
      </a>
    </div>
    <p>There is one important caveat about the current implementation that bears mentioning. This script will not work if your hosted service assets are stored in a folder that shares a name with a route on your root domain.  For example, if you're serving assets from the root directory of your hosted service, any request made to the <i>bobtopia.com</i> home page will be masked by these asset requests, and the home page won't load.</p><p>The solution here involves modifying the blog assets block to handle asset requests without using paths.  I'll leave it to the reader to solve this, but a more general solution might involve changing <code>myBlog.assetsPathnames</code> to <code>myBlog.assetFileExtensions</code>, which is a list of all asset file extensions (like .png and .css).  Then, the assets block would handle requests that contain <code>assetFileExtensions</code> instead of <code>assetsPathnames</code>.</p>
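    <p>A sketch of that extension-based variant might look like this (the <code>assetFileExtensions</code> field and helper are hypothetical, named only for illustration, following the structure of the script above):</p>

```javascript
// Hypothetical extension-based matching: match on how the path ends rather
// than which folder it lives in, so asset routes can no longer shadow
// pages on the root domain.
const myBlog = {
  hostname: "bobtopia.coolghosthost.com",
  targetSubdirectory: "/articles",
  assetFileExtensions: [".png", ".jpg", ".css", ".js", ".woff2"],
};

function isAssetRequest(pathname) {
  return myBlog.assetFileExtensions.some((ext) => pathname.endsWith(ext));
}
```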
    <div>
      <h4>Conclusion</h4>
      <a href="#conclusion">
        
      </a>
    </div>
    <p>Bob is now enjoying the same SEO advantages as Alice after converting his subdomains to subdirectories using Cloudflare Workers.  Bobs of the world, rejoice!</p><hr /><p>Interested in deploying a Cloudflare Worker without setting up a domain on Cloudflare? We’re making it easier to get started building serverless applications with custom subdomains on <a href="https://workers.dev">workers.dev</a>. <i>If you’re already a Cloudflare customer, you can add Workers to your existing website</i> <a href="https://dash.cloudflare.com/workers"><i>here</i></a>.</p><p><a href="https://workers.dev">Reserve a workers.dev subdomain</a></p><hr /><p></p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[NGINX]]></category>
            <category><![CDATA[Serverless]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <category><![CDATA[Developers]]></category>
            <guid isPermaLink="false">btWeOWsm3GJicWtFI691s</guid>
            <dc:creator>Michael Pinter</dc:creator>
        </item>
        <item>
            <title><![CDATA[SEO Best Practices with Cloudflare Workers, Part 1: Subdomain vs. Subdirectory]]></title>
            <link>https://blog.cloudflare.com/subdomains-vs-subdirectories-best-practices-workers-part-1/</link>
            <pubDate>Fri, 15 Feb 2019 17:09:05 GMT</pubDate>
            <description><![CDATA[ Alice and Bob are budding blogger buddies who met up at a meetup and purchased some root domains to start writing.  Alice bought aliceblogs.com and Bob scooped up bobtopia.com. ]]></description>
            <content:encoded><![CDATA[ 
    <div>
      <h4>Subdomain vs. Subdirectory: 2 Different SEO Strategies</h4>
      <a href="#subdomain-vs-subdirectory-2-different-seo-strategies">
        
      </a>
    </div>
    <p>Alice and Bob are budding blogger buddies who met up at a meetup and <a href="https://www.cloudflare.com/products/registrar/"><i>purchased some root domains</i></a> to start writing.  Alice bought <i>aliceblogs.com</i> and Bob scooped up <i>bobtopia.com</i>.</p><p>Alice and Bob decided against WordPress because it's what their parents use and purchased subscriptions to a popular cloud-based Ghost blogging platform instead.</p><p>Bob decides his blog should live at <i>blog.bobtopia.com</i> – a <i>subdomain</i> of bobtopia.com. Alice keeps it old school and builds hers at <i>aliceblogs.com/blog</i> – a <i>subdirectory</i> of aliceblogs.com.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4bfADYO8jfROjJXl42KfXV/d7722e13418b6568bedfa1501c17e626/Untitled-1-1.png" />
            
            </figure><p><i>Subdomains</i> and <i>subdirectories</i> are different strategies for instrumenting root domains with new features (think a blog or a storefront).  Alice and Bob chose their strategies on a whim, but <i>which strategy is technically better</i>?  The short answer is, <i>it depends</i>. But the long answer can actually improve your SEO.  In this article, we'll review the merits and tradeoffs of each. In <a href="/subdomains-vs-subdirectories-improved-seo-part-2/">Part 2</a>, we'll show you how to convert subdomains to subdirectories using <a href="https://www.cloudflare.com/workers">Cloudflare Workers</a>.</p>
    <div>
      <h4>Setting Up Subdomains and Subdirectories</h4>
      <a href="#setting-up-subdomains-and-subdirectories">
        
      </a>
    </div>
    <p>Setting up subdirectories is trivial on basic websites.  A web server treats its subdirectories (aka subfolders) the same as regular old folders in a file system.  In other words, basic sites are already organized using subdirectories out of the box.  No set up or configuration is required.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/Rsiz813gZMirbUtjKPo1H/8d097bb447c0874bd9dcc9325ab3930f/Screen-Shot-2019-01-22-at-4.07.02-PM.png" />
            
            </figure><p>In the old-school site above, we'll assume the <i>blog</i> folder contains an <i>index.html</i> file. The web server renders <i>blog/index.html</i> when a user navigates to the <i>oldschoolsite.com/blog</i> subdirectory.  But Alice and Bob's sites don't have a <i>blog</i> folder because their blogs are hosted remotely – so this approach won't work.</p><p>On the modern Internet, subdirectory setup is more complicated because the services that comprise a root domain are often hosted on machines scattered across the world.</p><p>Because DNS records only operate on the domain level, records like <code>CNAME</code> have no effect on a URL like <i>aliceblogs.com/blog</i> – and because her blog is hosted remotely, Alice needs to install <a href="https://www.nginx.com/">NGINX</a> or another reverse proxy and write some configuration code that proxies traffic from <i>aliceblogs.com/blog</i> to her hosted blog. It takes time, patience, and experience to connect her domain to her hosted blog.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5280nHa1SMYG1xAwGUYWDa/844e384aa3c5851c36a6218982aada0c/Screen-Shot-2019-02-05-at-8.25.01-PM.png" />
            
            </figure><p>A location block in NGINX is necessary to proxy traffic from a subdirectory to a remote host</p><p>Bob's subdomain strategy is the easier approach with his remotely hosted blog.  A DNS <code>CNAME</code> record is often all that's required to connect Bob's blog to his subdomain.  No additional configuration is needed if he can remember to pay his monthly subscription.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6kWlWnq3CzsVQIk5POm8DJ/b945ad38484de1f5f46abb950da40b4a/Screen-Shot-2019-02-11-at-10.57.06-AM.png" />
            
            </figure><p>Configuring a DNS record to point a hosted service at your blog subdomain</p><p>To recap, subdirectories are already built into simple sites that serve structured content from the same machine, but modern sites often rely on various remote services.  Subdomain setup is comparatively easy for sites that take advantage of hosted cloud-based platforms.</p>
    <div>
      <h4>Are Subdomains or Subdirectories Better for SEO?</h4>
      <a href="#are-subdomains-or-subdirectories-better-for-seo">
        
      </a>
    </div>
    <p>Subdomains are neat. If you ask me, <i>blog.bobtopia.com</i> is more appealing than <i>bobtopia.com/blog</i>. But if we want to make an informed decision about the best strategy, where do we look?  If we're interested in SEO, we ought to consult the Google Bot.</p><p>Subdomains and subdirectories are equal in the eyes of the Google Bot, according to Google itself.  This means that Alice and Bob have the same chance at ranking in search results.  This is because Alice's root domain and Bob's subdomain build their own sets of <i>keywords</i>.  Relevant keywords help your audience find your site in a search. There is one important caveat to point out for Bob:</p><blockquote><p>A subdomain is equal and distinct from a root domain.  This means that a subdomain's keywords are treated separately from the root domain.</p></blockquote><p>What does this mean for Bob?  Let's imagine <i>bobtopia.com</i> is already a popular online platform for folks named Bob to seek kinship with other Bobs.  In this peculiar world, searches that rank for <i>bobtopia.com</i> wouldn't automatically rank for <i>blog.bobtopia.com</i> because each domain has its own separate keywords.  The lesson here is that keywords are diluted across subdomains.  Each additional subdomain decreases the likelihood that any particular domain ranks in a given search.  A high ranking subdomain does not imply your root domain ranks well.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7KQm8LGoY1UrmNc6Nlwmz0/82706de305fd0243ff593202731d31eb/Untitled-Diagram.png" />
            
            </figure><p>In a search for "Cool Blog", <i>bobtopia.com</i> suffers from <i>keyword dilution.</i> It doesn't rank because its blog keyword is owned by <i>blog.bobtopia.com</i>.</p><p>Subdomains also suffer from <i>backlink dilution.</i>  A <i>backlink</i> is simply a hyperlink that points back to your site. Alice's attribution to a post on the etymology of Bob from <i>blog.bobtopia.com</i> does not help <i>bobtopia.com</i> because the subdomain is treated as separate from, but equal to, the root domain.  If Bob used subdirectories instead, Bob's blog posts would feed the authority of <i>bobtopia.com</i> and Bobs everywhere would rejoice.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/69iuli3pGl8aIxAYNY7FVg/f35165a5417dc18278435bcf13e6aa91/Untitled-Diagram--2-.png" />
            
            </figure><p>The authority of <i>blog.bobtopia.com</i> is increased when Alice links to Bob's interesting blog post, but the authority of <i>bobtopia.com</i> is not affected.</p><p>Although search engines have improved at identifying subdomains and attributing keywords back to the root domain, they still have a long way to go.  A prudent marketer would avoid risk by assuming search engines will always be bad at cataloguing subdomains.</p><p>So when would you want to use subdomains?  A good use case is for companies who are interested in expanding into foreign markets.  Pretend <i>bobtopia.com</i> is an American company whose website is in English.  Their English keywords won't rank well in German searches – so they translate their site into German to begin building new keywords on <i>deutsch.bobtopia.com</i>. Erfolg!</p><p>Other use cases for subdomains include product stratification (think global brands with presence across many markets) and corporate internal tools (think productivity and organization tools that aren't user facing).  But unless you're a huge corporation or just finished your Series C round of funding, subdomaining your site into many silos is not helping your SEO.</p>
    <div>
      <h4>Conclusion</h4>
      <a href="#conclusion">
        
      </a>
    </div>
    <p>If you're a startup or small business looking to optimize your SEO, consider subdirectories over subdomains.  Boosting the authority of your root domain should be a universal goal of any organization. The subdirectory strategy concentrates your keywords onto a single domain while the subdomain strategy spreads your keywords across multiple distinct domains. In a word, the subdirectory strategy results in better root domain authority. Higher domain authority leads to better search rankings, which translate to more engagement.</p><p>Consider the multitude of disruptive PaaS startups with <i>docs.disruptivepaas.com</i> and <i>blog.disruptivepaas.com</i>.  Why not switch to <i>disruptivepaas.com/docs</i> and <i>disruptivepaas.com/blog</i> to boost the authority of your root domain with all those docs searches and StackOverflow backlinks?</p>
    <div>
      <h4>Want to Switch Your Subdomains to Subdirectories?</h4>
      <a href="#want-to-switch-your-subdomains-to-subdirectories">
        
      </a>
    </div>
    <p>Interested in switching your subdomains to subdirectories without a reverse proxy? In <a href="/subdomains-vs-subdirectories-improved-seo-part-2/">Part 2</a>, we'll show you how using Cloudflare Workers.</p><hr /><p>Interested in deploying a Cloudflare Worker without setting up a domain on Cloudflare? We’re making it easier to get started building serverless applications with custom subdomains on <a href="https://workers.dev">workers.dev</a>. <i>If you’re already a Cloudflare customer, you can add Workers to your existing website</i> <a href="https://dash.cloudflare.com/workers"><i>here</i></a>.</p><p><a href="https://workers.dev">Reserve a workers.dev subdomain</a></p><hr /><p></p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[NGINX]]></category>
            <category><![CDATA[Serverless]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <guid isPermaLink="false">2bz363mDKlEfN0RypJoRjr</guid>
            <dc:creator>Michael Pinter</dc:creator>
        </item>
        <item>
            <title><![CDATA[Better business results from faster web applications - Cloudflare is the fastest]]></title>
            <link>https://blog.cloudflare.com/the-recipe-for-better-business-results-faster-web-applications-cloudflare-provides-the-fastest-web-application-performance-fr/</link>
            <pubDate>Wed, 06 Feb 2019 16:00:00 GMT</pubDate>
            <description><![CDATA[ Does page speed affect revenue? It does, and to a large degree. Learn why Cloudflare is the most effective when it comes to boosting site speed and performance. ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1TcgRq8yKa3gNOLytw8JsX/87b53674b25d19699dfbbd142bd78277/paper-airplane-_2x.png" />
            
            </figure><p>Web performance encompasses a lot of things: page load time and responsiveness for web and mobile applications. But overall, the key element is response times. How quickly can the origin server or the cache fulfill a user request? How quickly can a DNS response reach the client device?</p><p>The Cloudflare mission is to help build a better Internet for everyone, and we offer multiple services for boosting the speed and performance of our customers' sites and applications. Cloudflare is faster than the competition when it comes to accelerating performance.</p>
    <div>
      <h3>How site speed impacts the bottom line</h3>
      <a href="#how-site-speed-impacts-the-bottom-line">
        
      </a>
    </div>
    <p>There is a lot of research out there that confirms what many businesses and web developers already know: speed affects revenue. Better web performance means better user engagement and more conversions. Better performance also results in <a href="/seo-performance-in-2018-using-cloudflare/">better SEO</a>, driving up overall traffic numbers and increasing lead generation, sales, or ad revenue.</p><p>One study by Google and Bing concluded that on average, a two-second delay in a website's page rendering led to a 4.3% loss in revenue per visitor. Another independent <a href="https://www.yottaa.com/marketing-web-performance-101-how-site-speed-impacts-your-metrics/">study</a> has shown that 1 additional second of load time reduces conversions by 7%.</p>
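As a back-of-envelope illustration of that second figure (a linear 7% conversion loss per additional second, which is a simplification of how the study's result is usually quoted; the function and example numbers below are purely illustrative):

```javascript
// Back-of-envelope sketch: linear conversion loss per added second of load time.
// The 7%-per-second rate comes from the study cited above; everything else
// (the function, the example numbers) is illustrative.
function conversionsAfterDelay(baselineConversions, extraSeconds, lossPerSecond = 0.07) {
  // Remaining conversion rate can't drop below zero.
  const remainingRate = Math.max(0, 1 - lossPerSecond * extraSeconds);
  return baselineConversions * remainingRate;
}
```

Under this model, a site converting 1,000 visitors a month that slows by 2 seconds keeps roughly 860 of those conversions.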
    <div>
      <h3>How does using Cloudflare affect performance?</h3>
      <a href="#how-does-using-cloudflare-affect-performance">
        
      </a>
    </div>
    <p>According to testing from Cedexis (a company that evaluates CDN performance):</p><ul><li><p>Cloudflare is over 50 milliseconds or 23% faster than the nearest competitor over HTTPS for the 95th percentile</p></li><li><p>Cloudflare performs better than all competitors over HTTPS at both the 50th and 95th percentile</p></li><li><p>Cloudflare performs better than all competitors over HTTP at both the 50th and 95th percentile</p></li></ul>
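For readers unfamiliar with the metric: the 50th and 95th percentiles are read off the sorted distribution of request timings. A minimal sketch of the computation (nearest-rank method; the sample latencies are invented for illustration):

```javascript
// Sketch: nearest-rank percentile over a set of latency samples (ms).
// The sample values below are invented for illustration.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  // Nearest-rank: smallest value with at least p% of samples at or below it.
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}

const latenciesMs = [12, 18, 21, 25, 30, 34, 41, 55, 80, 210];
const median = percentile(latenciesMs, 50); // typical request
const p95 = percentile(latenciesMs, 95);    // tail request
```

The 95th percentile is dominated by the slowest tail (210 ms here), which is why CDN comparisons quote it separately from the median.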
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/nb3xsw5wyDj8oMVKQnj42/c48eb9d3068bdb277a0c7a5f45e3374b/SFRUUFNfcDk1Q2VkZXhpcy5wbmc--1.png" />
            
            </figure><p>HTTPS performance at 95th percentile</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6ObkWDJttRfJvD98aULhGo/99fff29cc133cf0af78529552ff8ddd3/SFRUUF9wOTVDZWRleGlzLnBuZw--.png" />
            
            </figure><p>HTTP performance at 95th percentile</p><p>Translating domain names into IP addresses quickly and with authority requires an enterprise-ready DNS.  Each webpage requires multiple requests and responses in order to load, so an improvement of even a few milliseconds in how DNS queries are answered adds up quickly.  That's why it's so important to us that our DNS resolvers are the fastest available. According to DNSPerf (a DNS performance benchmarking service):</p><ul><li><p>Cloudflare is the fastest authoritative DNS provider, 30% faster globally than the next-fastest competitor</p></li><li><p>Cloudflare is the fastest public resolver, almost 30% faster globally than the next-fastest public DNS resolver</p></li></ul>
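To make "adds up quickly" concrete, here is the back-of-envelope arithmetic (every number below is a hypothetical example, not a measurement):

```javascript
// Illustrative arithmetic only: per-page DNS cost = uncached lookups × per-query latency.
function pageDnsCostMs(uncachedLookups, perQueryMs) {
  return uncachedLookups * perQueryMs;
}

// A page pulling resources from 8 distinct domains, each needing a fresh lookup:
const slowResolver = pageDnsCostMs(8, 30); // 240 ms spent on DNS alone
const fastResolver = pageDnsCostMs(8, 12); // 96 ms with a faster resolver
const savedMs = slowResolver - fastResolver; // 144 ms recovered before any content renders
```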
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5nlocAX29JlJjIbKLnYVqi/f5df44a81e343b91e07ce6d8438e3cc1/Resolver-Simulation.png" />
            
            </figure><p>DNS performance</p><p>We've seen this in action; customers have reported large performance gains as a result of implementing Cloudflare. Zendesk, for example, <a href="https://www.cloudflare.com/case-studies/zendesk/">improved their response times by 10x</a>, and OKCupid was able to cut their page load times <a href="https://www.cloudflare.com/case-studies/okcupid/">by up to 50%</a> by implementing Cloudflare.</p>
    <div>
      <h3>How Cloudflare technology boosts business results</h3>
      <a href="#how-cloudflare-technology-boosts-business-results">
        
      </a>
    </div>
    <p>Cloudflare has a wide range of customers, from small businesses to large enterprises, but across all of them there's a similar story: Users engage more, convert more, and bounce less when content renders quickly. For instance, improving page load speed resulted in <a href="https://www.cloudflare.com/case-studies/us-xpress/">62% more conversions for U.S. Xpress</a>. Bidu <a href="https://www.cloudflare.com/case-studies/bidu/">grew leads by 30%</a> year over year by reducing average full page load times from 13 seconds to 2.3 seconds.</p><p>Our innovative products and services help speed up applications and websites:</p><ul><li><p><a href="https://www.cloudflare.com/products/argo-smart-routing"><b>Argo</b></a>: Cloudflare Argo uses smart routing to find the fastest, least congested path across the Cloudflare network.</p></li><li><p><a href="https://www.cloudflare.com/cdn"><b>CDN</b></a>: The Cloudflare globally distributed CDN caches web content across 165+ global data centers to supercharge web performance.</p></li><li><p><a href="https://www.cloudflare.com/dns"><b>DNS</b></a>: Cloudflare DNS services shave crucial milliseconds off requests for a DNS lookup.</p></li><li><p><a href="https://www.cloudflare.com/load-balancing"><b>Load Balancing</b></a>: Cloudflare balances traffic across multiple servers or regions and uses health checks to identify offline servers.</p></li><li><p><a href="https://www.cloudflare.com/website-optimization"><b>Web Optimizations</b></a>: Cloudflare optimizes web content delivery by bundling JavaScript files, leveraging local storage, adjusting cache headers automatically, and more.</p></li></ul><p>Improving page speed and performance results in higher conversion rates and better user engagement. Cloudflare provides the fastest performance for web applications.</p><p>Sign up <a href="https://dash.cloudflare.com/sign-up">here</a> or speak to one of our experts to get started.</p>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[DNS]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">284q2bsfyw3nqxNroFVb5r</guid>
            <dc:creator>Rahul Deshmukh</dc:creator>
        </item>
        <item>
            <title><![CDATA[Integrating redirection.io with Cloudflare Workers]]></title>
            <link>https://blog.cloudflare.com/integrating-redirection-io-with-cloudflare-workers/</link>
            <pubDate>Tue, 21 Aug 2018 17:56:39 GMT</pubDate>
            <description><![CDATA[ Redirection.io manages web traffic redirects, offering tools for web admins, SEO agencies, and devs to analyze HTTP errors, set up redirects, customize responses, and monitor traffic efficiently. ]]></description>
            <content:encoded><![CDATA[ <p><i>The following is a guest post by </i><a href="https://twitter.com/xavierlacot"><i>Xavier Lacot</i></a><i>, a developer at </i><a href="https://redirection.io/"><i>redirection.io</i></a><i> and founder at </i><a href="https://jolicode.com/"><i>JoliCode</i></a><i>. He works primarily on Web and mobile projects as a consultant, trainer and technical expert.</i></p>
    <div>
      <h3>What is redirection.io</h3>
      <a href="#what-is-redirection-io">
        
      </a>
    </div>
    <p>Redirection.io is a Web traffic redirection manager. It provides a collection of tools for website administrators, SEO agencies, and developers, which help analyze HTTP errors, set up HTTP redirections, customize HTTP responses, and monitor traffic efficiently.</p><p>The main part of a traditional redirection.io setup is the proxy, a software component which parses every request to check if a redirection or another response override is required. This "proxy" can be of several types - we <a href="https://redirection.io/documentation/developer-documentation/integrations">provide libraries in several languages</a> - but this setup can be simplified for Cloudflare clients by taking advantage of Cloudflare Workers.</p>
    <div>
      <h3>Here come Cloudflare Workers</h3>
      <a href="#here-come-cloudflare-workers">
        
      </a>
    </div>
    <p>Earlier this year, Cloudflare unveiled its <a href="https://www.cloudflare.com/fr-fr/products/cloudflare-workers/">Workers</a> product, a smart way of running code <i>on the edge</i> of Cloudflare locations. This computing feature is particularly interesting, as it allows you to perform traffic operations without requiring any change to your own platform, code, or infrastructure: just enable Workers, write some code, and let Cloudflare handle the magic ✨</p><p>In practical terms, Workers act as application middleware. They proxy incoming HTTP requests to your stack, and can modify both requests and responses. Cloudflare has <a href="https://developers.cloudflare.com/workers/recipes/">examples and code ideas</a> to help you build your own Worker.</p><p>Here is a very basic Cloudflare Worker:</p>
            <pre><code>addEventListener('fetch', event =&gt; {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  console.log('Got request', request)
  const response = await fetch(request)
  console.log('Got response', response)
  return response
}</code></pre>
            <p>Luckily, redirection.io already has a public HTTP endpoint for exposing hosted agents (for shared hosting, for instance, where it is impossible to install your own binaries). Offering redirection.io to all Cloudflare Workers users is as simple as querying our hosted agent at each incoming request:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7xa23T55yNTFqNt71UL4He/a7c2d04334db32c398aaf6f6e3b6ae3a/cloudflare-redirectionio.png" />
            
            </figure><p>In terms of code, the pattern we use is very similar to the Workers example: we listen on the incoming request:</p>
            <pre><code>addEventListener('fetch', (event) =&gt; {
    event.respondWith(redirectAndLog(event.request));
});

async function redirectAndLog(request) {
    const response = await redirectOrPass(request);
    log(request, response);

    return response;
}</code></pre>
            <p>Of course, we must implement the <code>redirectOrPass()</code> and <code>log()</code> functions, which will do all the magic:</p><ul><li><p><code>redirectOrPass()</code> queries redirection.io's API to know if some action has to be run for the current request, and lets the request pass through if not. This call is blocking; it will stop the request until a response is received from redirection.io's backend.</p></li><li><p><code>log()</code> sends out log data to our backend for analysis and statistics purposes. This call is non-blocking, as it can be executed even after the response has been sent to the user.</p></li></ul><p>Hence, both of these functions are defined as asynchronous functions:</p>
            <pre><code>async function redirectOrPass(request) {
    ...
}

async function log(request, response) {
    ...
}</code></pre>
            <p>Let's go with the <code>redirectOrPass()</code> implementation! Using the <a href="https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API">fetch API</a>, we can call redirection.io's agent API:</p>
            <pre><code>const urlObject = new URL(request.url);
const context = {
    'host': urlObject.host,
    'request_uri': urlObject.pathname,
    'user_agent': request.headers.get('user-agent'),
    'scheme': urlObject.protocol.includes('https') ? 'https' : 'http'
};
let response = null;

try {
    response = await fetch('https://proxy.redirection.io/' + options.token + '/get', {
        method: 'POST',
        body: JSON.stringify(context),
    });
} catch (error) {
    // if no action found, play the regular request
    return await fetch(request);
}</code></pre>
            <p>Should our API respond with a <code>404</code> status (which means that no redirection or action rule is defined for a request of that type), we simply return the standard response from your website's backend - this is achieved with <code>return await fetch(request)</code>.</p><p>If a redirection has to be run, redirection.io's API will send a status <code>200</code> response with all the information on how to transform the response. The payload could, for instance, look like:</p>
            <pre><code>{
  "status_code": 302,
  "location": "/blog-yo"
}</code></pre>
            <p>... and we can simply use it to change the response:</p>
            <pre><code>const data = await response.text();

try {
    response = JSON.parse(data);
} catch (error) {
    // If some errors, play the regular request
    return await fetch(request);
}

// Send gone response
if (response.status_code === 410) {
    return new Response('', { status: 410 });
}

// Send redirection response
return new Response('', {
    status: Number(response.status_code),
    headers: {
        'Location': response.location
    }
});</code></pre>
            <p>The <code>log()</code> function implementation is pretty similar:</p>
            <pre><code>async function log(request, response) {
    const urlObject = new URL(request.url);
    const context = {
        'status_code': response.status,
        'host': urlObject.host,
        'request_uri': urlObject.pathname,
        'user_agent': request.headers.get('user-agent'),
        'scheme': urlObject.protocol.includes('https') ? 'https' : 'http',
    };

    try {
        return await fetch('https://proxy.redirection.io/' + options.token + '/log', {
            method: 'POST',
            body: JSON.stringify(context),
        });
    } catch (error) {
        // do nothing, does not matter if some logs are lost
    }
}</code></pre>
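One caveat worth flagging about the non-blocking <code>log()</code> call: in the Workers runtime, work that is still pending when the response is returned is only guaranteed to complete if it is registered with <code>event.waitUntil()</code>. A sketch of the listener using it (the factory wrapper exists only to keep the example self-contained; <code>redirectOrPass</code> and <code>log</code> are the functions shown above):

```javascript
// Sketch: guarantee the log subrequest completes after the response is sent.
// The factory wrapper is illustrative; in a Worker you would inline this in
// the 'fetch' listener with the redirectOrPass()/log() functions defined above.
function makeFetchHandler(redirectOrPass, log) {
  return (event) => {
    const responsePromise = redirectOrPass(event.request);
    // respondWith sends the response as soon as it resolves...
    event.respondWith(responsePromise);
    // ...while waitUntil extends the Worker's lifetime until logging finishes.
    event.waitUntil(responsePromise.then((response) => log(event.request, response)));
  };
}

// In a real Worker:
// addEventListener('fetch', makeFetchHandler(redirectOrPass, log));
```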
            
    <div>
      <h3>Impacts and performance concerns</h3>
      <a href="#impacts-and-performance-concerns">
        
      </a>
    </div>
    <p>Cloudflare's workers run <i>very fast</i>. And by "very fast", we mean in microseconds, with the ability for Cloudflare to execute "many thousands of Worker scripts per second" as stated in their documentation.</p><p>The stack behind redirection.io is also very efficient, able to manage a similar level of traffic without any latency or slowdown. However, there can be an occasional network slowdown, which could hurt the perceived performance of your web application. There are several ways to mitigate this possible impact.</p><p>First, Cloudflare Workers support caching subrequests made with the Fetch API. This means that, if a URL of your application is often hit (for instance the homepage), you may want to cache the result from redirection.io's API. With this approach, the impact on the vast majority of your incoming requests will be close to zero.</p><p>Another approach is to cap the maximum timing overhead that using redirection.io will add: we can simply set a timeout on the <code>fetch()</code> calls made by our Worker! Just nest the <code>fetch()</code> call inside a <code>Promise.race()</code>:</p>
            <pre><code>response = await fetch('https://proxy.redirection.io/' + options.token + '/get', {
    method: 'POST',
    body: JSON.stringify(context),
});</code></pre>
            <p>becomes:</p>
            <pre><code>response = await Promise.race([
  fetch('https://proxy.redirection.io/' + options.token + '/get', {
    method: 'POST',
    body: JSON.stringify(context),
  }),
  new Promise((_, reject) =&gt;
    setTimeout(() =&gt; reject(new Error('Timeout')), options.timeout)
  ),
])</code></pre>
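The first mitigation, caching the agent's answer for hot URLs, can be sketched as a per-URL memoization (this is an assumption-laden sketch: a plain Map with no expiry stands in for a real edge cache, and production code would instead use Cloudflare's caching facilities and expire entries):

```javascript
// Sketch only: memoize the agent's verdict per visited URL so repeated hits
// for hot pages (e.g. the homepage) skip the blocking subrequest entirely.
// A Map with no expiry stands in for a real edge cache.
function makeCachedLookup(queryAgent, store = new Map()) {
  return (url, context) => {
    if (!store.has(url)) {
      // Store the pending promise so concurrent requests share one subrequest.
      store.set(url, queryAgent(context));
    }
    return store.get(url);
  };
}
```

Wrapping the `fetch()` to the hosted agent in `makeCachedLookup` means the blocking call is paid once per URL per Worker instance; subsequent requests resolve from memory.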
            <p>When talking about performance, it is a good idea to provide some numbers for a valid comparison:</p><table><tr><td><p><b>Cloudflare enabled</b></p></td><td><p><b>Workers enabled</b></p></td><td><p><b>Request timing</b></p></td></tr><tr><td><p>✅</p></td><td><p>❌</p></td><td><p>~32ms</p></td></tr><tr><td><p>✅</p></td><td><p>✅ (empty worker)</p></td><td><p>~36ms</p></td></tr><tr><td><p>✅</p></td><td><p>✅ (redirection.io Worker with no cache)</p></td><td><p>~44ms</p></td></tr></table><p>This means that you can benefit from redirection.io on your website through Cloudflare workers at the price of ~8ms. And, once responses from redirection.io's API are cached, we do not notice any difference compared to a standard Cloudflare-enabled website. We strongly believe Cloudflare Workers is an amazing way to integrate redirection.io into your website. Try <a href="https://redirection.io/">redirection.io</a> and <a href="https://www.cloudflare.com/products/cloudflare-workers/">Cloudflare Workers</a> today, and report back!</p> ]]></content:encoded>
            <category><![CDATA[Serverless]]></category>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[JavaScript]]></category>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <guid isPermaLink="false">2p8SdFZjHHCPeScimj3Nss</guid>
            <dc:creator>Guest Author</dc:creator>
        </item>
        <item>
            <title><![CDATA[SEO Performance in 2018 Using Cloudflare]]></title>
            <link>https://blog.cloudflare.com/seo-performance-in-2018-using-cloudflare/</link>
            <pubDate>Sun, 28 Jan 2018 15:00:00 GMT</pubDate>
            <description><![CDATA[ For some businesses SEO is a bad word, and for good reason. Google and other search engines keep their algorithms a well-guarded secret making SEO implementation not unlike playing a game where the referee won’t tell you all the rules.  ]]></description>
            <content:encoded><![CDATA[ <p>For some businesses SEO is a bad word, and for good reason. Google and other search engines keep their algorithms a well-guarded secret making SEO implementation not unlike playing a game where the referee won’t tell you all the rules. While SEO experts exist, the ambiguity around search creates an opening for grandiose claims and misinformation by unscrupulous profiteers claiming expertise.</p><p>If you’ve done SEO research, you may have come across an admixture of legitimate SEO practices, outdated optimizations, and misguided advice. You might have read that using the keyword meta tag in your HTML will help your SEO (<a href="https://webmasters.googleblog.com/2009/09/google-does-not-use-keywords-meta-tag.html">it won’t</a>), that there’s a specific number of instances a keyword should occur on a webpage (<a href="https://www.youtube.com/watch?v=Rk4qgQdp2UA">there isn’t</a>), or that buying links will improve your rankings (<a href="https://webmasters.googleblog.com/2013/02/a-reminder-about-selling-links.html">it likely won’t and will get the site penalized</a>). Let’s sift through the noise and highlight some dos and don’ts for performance-based SEO in 2018.</p>
    <div>
      <h3>SEO is dead, long live SEO!</h3>
      <a href="#seo-is-dead-long-live-seo">
        
      </a>
    </div>
    <p>Nearly every year since its inception, SEO is declared dead. It is true that the scope of best practices for search engines has narrowed over the years as search engines have become smarter, and much of the benefit of SEO can be had by following these two rules:</p><ol><li><p>Create good content</p></li><li><p>Don’t be creepy</p></li></ol><p>Beyond the fairly obvious, there are a number of tactics that can help improve how favorably a website is evaluated by Google, Bing, and others. This post will focus on optimizing for Google, though the principles and practices likely apply to all search engines.</p>
    <div>
      <h3>Does using Cloudflare hurt my SEO?</h3>
      <a href="#does-using-cloudflare-hurt-my-seo">
        
      </a>
    </div>
    <p>The short answer is, no. When asked whether or not Cloudflare can damage search rankings, John Mueller from Google stated <a href="https://twitter.com/JohnMu/status/862265871678529536">CDNs can work great for both users and search engines</a> when properly configured. This is consistent with our findings at Cloudflare, as we have millions of web properties, including SEO agencies, who use our service to improve both performance and SEO.</p>
    <div>
      <h3>Can load time affect a site's SEO ranking?</h3>
      <a href="#can-load-time-affect-a-sites-seo-ranking">
        
      </a>
    </div>
    <p>Yes, it can. Since at least 2010, Google has publicly stated that <a href="https://webmasters.googleblog.com/2010/04/using-site-speed-in-web-search-ranking.html">site speed affects your Google ranking</a>. While most sites at that time were not affected, times have changed and heavier sites with frontend frameworks, images, CMS platforms and/or a slew of other javascript dependencies are the new normal. Google promotes websites that result in a good user experience, and slow sites are frustrating and penalized in rankings as a result.</p><p>The cost of slow websites on user experience is particularly dramatic in mobile, where limited bandwidth results in further constraints. Aside from low search rankings, slow loading sites result in bad outcomes; research by Google indicates <a href="https://storage.googleapis.com/doubleclick-prod/documents/The_Need_for_Mobile_Speed_-_FINAL.pdf">53% of mobile sites are abandoned if load time is more than 3 seconds</a>. Separate research from Google using a deep <a href="https://www.cloudflare.com/learning/ai/what-is-neural-network/">neural network</a> found that as a mobile site’s <a href="https://www.thinkwithgoogle.com/marketing-resources/data-measurement/mobile-page-speed-new-industry-benchmarks/">load time goes from 1 to 7 seconds, the probability of a visitor bouncing increases 113%</a>. The problems surrounding page speed increase the longer a site takes to load; mobile sites that load in 5 seconds earn 2x more ad revenue than those that take 19 seconds to load (the average time to completely load a site on a 3G connection).</p>
    <div>
      <h3>What tools can I use to evaluate my site's performance?</h3>
      <a href="#what-tools-can-i-use-to-evaluate-my-sites-performance">
        
      </a>
    </div>
    <p>A number of free and reputable tools are available for checking a website’s performance. Based on Google’s research, you can <a href="https://testmysite.thinkwithgoogle.com/">estimate the number of visitors you will lose</a> due to excessive loading time on mobile. Not to sound clickbaity, but the results may surprise you.</p><p>As more web traffic continues to shift to mobile, mobile optimization must be prioritized for most websites. Google has announced that in July 2018 mobile speed will also affect SEO placement. If you want to do more research on your site’s overall mobile readiness, you can <a href="https://search.google.com/test/mobile-friendly">check to see if your site is mobile friendly</a>.</p><p>If you’re technically minded and use Chrome, you can open the Chrome devtools and click on the Audits tab to access Lighthouse, Chrome’s built-in analysis tool.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3grz0h0TEpSW0jusWakOhH/a6f38a10bf755c7967da7863a2b09823/audits-tab-seo-screenshot.png" />
            
            </figure><p>Other key metrics used for judging your site's performance include FCP and DCL speeds. First Contentful Paint (FCP) measures the first moment content is loaded onto the screen of the user, answering the user’s question: “is this useful?”. The other metric, DOM Content Loaded (DCL), measures when all stylesheets have loaded and the DOM tree is able to be rendered. Google provides a tool for you to <a href="https://developers.google.com/speed/pagespeed/insights/">measure your website’s FCP and DCL speeds relative to other sites</a>.</p>
    <div>
      <h3>Can spammy websites hosted on the same platform hurt SEO?</h3>
      <a href="#can-spammy-websites-hosted-on-the-same-platform-hurt-seo">
        
      </a>
    </div>
    <p>Generally speaking, there is no cause for concern as shared hosts <a href="https://www.youtube.com/watch?v=AsSwqo16C8s">shouldn’t hurt your SEO</a>, even if some of the sites on the shared host are less reputable. In the unlikely event you find yourself as the only legitimate website on the host that is almost entirely spam, it might be time to rethink your hosting strategy.</p>
    <div>
      <h3>Does downtime hurt SEO?</h3>
      <a href="#does-downtime-hurt-seo">
        
      </a>
    </div>
    <p>If your site is down when it’s crawled, it may be <a href="http://www.thesempost.com/how-an-offline-website-impacts-google-rankings-seo/">temporarily pulled from results</a>. This is why service interruptions such as getting <a href="https://www.cloudflare.com/learning/ddos/what-is-a-ddos-attack/">DDoSed</a> during peak purchase times can be more damaging. Typically, a site’s ranking will recover when it comes back online. If it’s down for an entire day, it may take up to a few weeks to recover.</p>
    <div>
      <h3>Don’t be creepy in SEO: an incomplete guide</h3>
      <a href="#dont-be-creepy-in-seo-an-incomplete-guide">
        
      </a>
    </div>
    <p>Everybody likes to win, but playing outside the rules can have consequences. For websites that attempt to circumvent Google’s guidelines in an attempt to trick the search algorithms and web crawlers, a perilous future awaits. Here are a few things that you should make sure you avoid.</p><p><b>Permitting user-generated spam</b> - sometimes unmoderated comment sections run amok with user generated spam ads, complete with links to online pharmacies and other unrelated topics. Leaving these types of links in place lowers the quality of your content and may subject you to penalization. Having trouble handling a spam situation? There are <a href="https://support.google.com/webmasters/answer/81749">strategies you can implement</a>.</p><p><b>Link schemes</b> - while sharing links with reputable sources is still a legitimate tactic, excessively sharing links is not. Likewise, purchasing large bundles of links in an attempt to boost SEO by artificially passing PageRank is best avoided. There are many link schemes, and if you’re curious whether or not you’re in violation, look at <a href="https://support.google.com/webmasters/answer/66356">Google’s documentation</a>. If you feel like you might’ve made questionable link decisions in the past and you want to undo them, you can <a href="https://support.google.com/webmasters/answer/2648487">disavow links that point to your site</a>, but use this feature with extreme caution.</p><p><b>Doorway pages</b> - By creating many pages that optimize for specific search phrases, but ultimately point to the same page, some sites attempt to saturate all the search terms around a particular topic. 
While this might be a tempting strategy to gain rankings quickly, it may result in all pages losing rank.</p><p><b>Scraping content</b> - In an attempt to artificially build content, some websites will <a href="https://www.cloudflare.com/learning/ai/how-to-prevent-web-scraping/">scrape content</a> from other reputable sources and call it their own. Aside from the fact that this behavior can get a site flagged by the Panda algorithm for unrelated or excessive content, it is also in violation of the guidelines and can result in penalization or removal of a website from results.</p><p><b>Hidden text and links</b> - by hiding text inside a webpage so it’s not visible to users, some websites will try to artificially inflate the amount of content on their site or the number of times a keyword occurs. Hiding text behind an image, setting a font size to zero, using CSS to position an element off of the screen, or the classic “white text on a white background” are all tactics to be avoided.</p><p><b>Sneaky redirects</b> - as the name implies, it’s possible to surreptitiously redirect users from the result that they were expecting onto something different. Split cases can also occur where the desktop version of a site is directed to the intended page while the mobile version is forwarded to full-screen advertising.</p><p><b>Cloaking</b> - by attempting to show different content to search engines and users, some sites will attempt to circumvent the processes a search engine has in place to filter out low value content. While cloaking might have a cool name, it’s in violation and can result in rank reduction or listing removal.</p>
    <div>
      <h3>What SEO resources does Google provide?</h3>
      <a href="#what-seo-resources-does-google-provide">
        
      </a>
    </div>
    <p>There are a number of sources that can be considered authoritative when it comes to Google SEO. John Mueller, Gary Illyes and (formerly) Matt Cutts collectively represent a large portion of the official voice of Google search and provide much of the official SEO best practices content. Aside from the videos, blogs, office hours, and other content provided by these experts, Google also provides the <a href="https://webmasters.googleblog.com/">Google webmaster blog</a> and <a href="https://www.google.com/webmasters/tools/">Google search console</a>, which house various resources and updates.</p><p>Last but not least, if you have web properties currently on Cloudflare there are <a href="https://support.cloudflare.com/hc/en-us/articles/231109348-How-do-I-Improve-SEO-Rankings-On-My-Website-Using-Cloudflare-">technical optimizations you can make to improve your SEO</a>.</p> ]]></content:encoded>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Best Practices]]></category>
            <guid isPermaLink="false">PaB6LBBFido4JmgjNw29U</guid>
            <dc:creator>Matthew Williams</dc:creator>
        </item>
        <item>
            <title><![CDATA[5 Strategies to Promote Your App]]></title>
            <link>https://blog.cloudflare.com/5-strategies-to-best-promote-your-app/</link>
            <pubDate>Fri, 27 Oct 2017 17:30:00 GMT</pubDate>
            <description><![CDATA[ Brady Gentile from Cloudflare's product team wrote an App Developer Playbook, embedded within the developer documentation page.  ]]></description>
            <content:encoded><![CDATA[ <p>Brady Gentile from Cloudflare's product team wrote an <a href="https://www.cloudflare.com/apps/assets/Cloudflare%20Apps%20Developer%20Playbook.pdf">App Developer Playbook</a>, embedded within the developer documentation <a href="https://www.cloudflare.com/apps/developer/docs/getting-started">page</a>. He decided to write it after he and his team conducted several app developer interviews, finding that many developers wanted to learn how to better promote their apps.</p><p>They wanted to help app authors in areas outside of a developer's core expertise. Social media posting, community outreach, email deployment, SEO, blog posting and syndication, etc. can be daunting.</p><p>I wanted to take a moment to highlight some of the tips from the App Developer Playbook because I think Brady did a great job of providing clear ways to approach promotional strategies.</p>
    <div>
      <h3>5 Promotional Strategies</h3>
      <a href="#5-promotional-strategies">
        
      </a>
    </div>
    <hr />
    <div>
      <h4>1. Share with online communities</h4>
      <a href="#1-share-with-online-communities">
        
      </a>
    </div>
    <p>Your app’s potential audience likely reads community-aggregated news sites such as <a href="https://news.ycombinator.com/">HackerNews</a>, <a href="https://www.producthunt.com/">Product Hunt</a>, or <a href="https://www.reddit.com/">reddit</a>. Sharing your app across these websites is a great way for users to find your app.</p>
            <figure>
            <a href="https://news.ycombinator.com/">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6XnoCKWlUMSfUxDVJvIF5q/e325d5efba8522fb30fcf0db285a0feb/hacker-news.jpg" />
            </a>
            </figure><p>For apps that are interesting to developers, designers, scientists, entrepreneurs, etc., be sure to share your work with the Hacker News community. Be sure to follow the <a href="https://news.ycombinator.com/newsguidelines.html">official guidelines</a> when posting and when engaging with the community. It may be tempting to ask your friends to upvote you, but honesty is the best policy, and the vote-ring detector will bury your post if you try to game it. Instead, if you don’t frontpage on the first try, consider re-posting on another day, with any of these options: the frontpage of your site, the blog post about the launch of your app, a demo of your app in action, a github repo. It may be worth taking into consideration the rate at which new posts are being added to /newest per minute or per hour, which impacts the likelihood of your post making it to the frontpage.</p><p>Since you’re sharing a project that people can play with, be sure to: 1) use “Show HN” and follow <a href="https://news.ycombinator.com/showhn.html">Show HN guidelines</a>, and 2) be available to answer questions in the comments.</p><p>Be sure to start your title with the words ‘Show HN:’ (this indicates that you’ll be sharing something interesting that you’ve built with the HN community with a live demo people can try), then briefly explain your app within the same field. Rather than just use the name of your app, consider adding something informative, like the short description you use in your Cloudflare Apps marketplace tile. For instance, “Show HN: Trebble (embed voice and music on your site)” is more informative than “Show HN: Trebble” as a post title. 
Next, you’ll have the option of either submitting the URL of your app or explaining a little bit about yourself, the app, and pasting a link to the app itself.</p><p>Lastly, you should probably take the time to explain yourself and what you're all about in a first comment, as it helps build good rapport with the community. Block off some time on your calendar so you’re available to answer questions and engage with the community for however long your post is on the frontpage. In addition to gathering their valuable feedback, a signal that the app author is there (“Hi, I’m <b>Name</b> and I made this app to solve <b>this problem</b> -- I’d love to get your feedback.”) will often make your project more approachable and put a face on a product.</p>
            <figure>
            <a href="https://www.producthunt.com/">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5cBCwVUlhNEO4roRrLRCgH/4c8694efed570d1281707ecf6e2d177b/Product-Hunt.svg" />
            </a>
            </figure><p>Product Hunt has released <a href="https://blog.producthunt.com/how-to-launch-on-product-hunt-7c1843e06399">a blog post</a> which outlines how to properly submit your app or product to their community. I highly recommended you review this post in its entirety prior to launching your Cloudflare App.</p>
            <figure>
            <a href="https://www.reddit.com/">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/36KOnCDO8hM5Qvg7RsOFNi/dbf986230b19b64e353103c966e7a0d7/Reddit.png" />
            </a>
            </figure><p>Submit a link to your app, along with some screenshots/videos, and a descriptive title for your post, and select a subreddit to post into. For the title of your post, you’ll want to use something descriptive about your app; for example you could say “I just built an app that does [X].”</p><p>If your app isn't relevant to the subreddit in which you post, it'll likely be removed by a moderator, so think carefully about which subreddits would find your app genuinely useful. I also recommend you take some time to engage with that community prior to posting your app, in part because their feedback is valuable, and in part so that you’re not a stranger. Here are two subreddits you should definitely include, though: <a href="https://www.reddit.com/r/apps/">Apps</a>, <a href="https://www.reddit.com/r/Cloudflare/">Cloudflare</a>.</p>
    <div>
      <h4>2. Optimize your app for discoverability</h4>
      <a href="#2-optimize-your-app-for-discoverability">
        
      </a>
    </div>
    <p>One of the most important steps of the Cloudflare app deployment process is ensuring that both visitors browsing <a href="https://www.cloudflare.com/apps/">Cloudflare Apps</a> and anyone searching the web can quickly and easily find your app. By optimizing your Cloudflare app for discoverability, you’ll see more views, installations, and revenue.</p>
    <div>
      <h4>Title and description</h4>
      <a href="#title-and-description">
        
      </a>
    </div>
    <p>Your app’s title and short description are the first thing millions of website owners are going to see when coming across your Cloudflare app, whether it’s through browsing <a href="https://www.cloudflare.com/apps/">Cloudflare Apps</a> or on a search engine. It’s important that an app’s title is unique, descriptive, and identifiable.</p><p><a href="https://www.cloudflare.com/apps/noadblock">NoAdBlock</a> is a great example.</p>
            <figure>
            <a href="https://www.cloudflare.com/apps/noadblock">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1tJau7uPvNc7W1WRxLLDRL/1392d227dd674c452a7737782a869c16/NoAdBlock-example.png" />
            </a>
            </figure>
    <div>
      <h4>Screenshots</h4>
      <a href="#screenshots">
        
      </a>
    </div>
    <p>Showcasing how your app might appear on a user’s website gives confidence to users thinking about previewing and installing. Include a variety of screenshots, showing multiple ways in which the software can be configured on a user’s website.</p><p>Read more about how to configure your full app description and categories in the <a href="https://www.cloudflare.com/apps/assets/Cloudflare%20Apps%20Developer%20Playbook.pdf">App Developer Playbook</a>.</p>
    <div>
      <h4>3. Promote through your properties</h4>
      <a href="#3-promote-through-your-properties">
        
      </a>
    </div>
    <p>Once your app has launched on <a href="https://www.cloudflare.com/apps/">Cloudflare Apps</a>, it’s important that users are able to envision how your app will work for them and that they're easily able to use it.</p>
    <div>
      <h4>Building an app preview link</h4>
      <a href="#building-an-app-preview-link">
        
      </a>
    </div>
    <p>Preview links allow you to generate a link to the install page for your app, which includes customization options for users to play around with.</p><p>Check out this preview for the <a href="https://www.cloudflare.com/apps/spotify/install">Spotify app</a>:</p>
            <figure>
            <a href="https://www.cloudflare.com/apps/spotify/install">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/49j9hnvytMvtqocDsKhsRC/007beca2030f9ef655347a69c88d9bc2/Screen-Recording-2017-10-20-at-12.01-PM.gif" />
            </a>
            </figure>
    <div>
      <h4>Install badge and placement</h4>
      <a href="#install-badge-and-placement">
        
      </a>
    </div>
    <p>Make it easy and obvious for users. The Cloudflare Install Button is an interactive badge which can be embedded in any online asset, including websites and emails.</p><p>To use the full Cloudflare App install badge, you can paste the code listed in the Playbook onto your website or marketing page. You just need to replace [ appTitle ] and [ appId or appAlias ] with the appropriate details for your app. You can choose a standard button or customize it to your app.</p><p>Here's what <a href="https://www.cloudflare.com/apps/NoAdBlock">NoAdBlock</a> used:</p>
            <figure>
            <a href="https://www.cloudflare.com/apps/NoAdBlock">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5orFHu4BRSHz2ll6ixfEWP/4d8d7816371bf65addae801c78fb24df/Install-button.png" />
            </a>
            </figure>
    <div>
      <h4>4. Spread word to existing users</h4>
      <a href="#4-spread-word-to-existing-users">
        
      </a>
    </div>
    <p>A quick and easy way to announce your app’s availability is to notify your user base that the app is now available for them to preview and install. Read more in the Playbook on how to grow your user base before the launch.</p><p>Here's a good starting template for an email announcement:</p>
            <figure>
            <a href="https://www.cloudflare.com/apps/assets/Cloudflare%20Apps%20Developer%20Playbook.pdf">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1DshQbe5MzECR3AGMlSdJZ/aedcbdcc12f848d65894915427ed10c2/Email-Template.png" />
            </a>
            </figure>
    <div>
      <h4>5. Form a presence on social media</h4>
      <a href="#5-form-a-presence-on-social-media">
        
      </a>
    </div>
    <p>Targeting users across multiple channels is an easy way to make sure website owners know your app is available. Cloudflare can help you with this: tag @Cloudflare in your posts so they can be retweeted and reshared.</p>
            <figure>
            <a href="https://www.cloudflare.com/apps/assets/Cloudflare%20Apps%20Developer%20Playbook.pdf">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7hNt3pTVZmStDmaMwVWqS4/8ef2fa8195208cf0797417cd43788510/Twitter-Announcementsw.png" />
            </a>
            </figure>
            <figure>
            <a href="https://www.cloudflare.com/apps/assets/Cloudflare%20Apps%20Developer%20Playbook.pdf">
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1yOVoMAV90FDsbGPbNwhV3/7ea04b0c043bad5e3b10e85994c22abd/Facebook-Announcements.png" />
            </a>
            </figure>
    <div>
      <h4>Blog Stuff</h4>
      <a href="#blog-stuff">
        
      </a>
    </div>
    <p>Another way to promote the release of your app is by writing a blog post (or several) on your app’s website, delving into the features and benefits that your app brings to users. In addition to your launch post, you can enumerate the new features and bug fixes in a new and improved release, highlight different use cases from your own user base, or deep dive into a fascinating aspect of how you implemented your app.</p><p>Here's a <a href="https://blog.getadmiral.com/admiral-launches-adblock-solution-for-cloudflare-publishers/">well-written launch blog post</a>, from the makers of <a href="https://www.cloudflare.com/apps/Admiral">Admiral</a>.</p><p>Other blogs may help you with this as well. Syndication is a great way to gain significant exposure for your posts. Brainstorm a list of blogs facing the core audience for your app, and reach out and ask if you can contribute a guest blog post. If developers are the core audience, drop a line to <a>community@cloudflare.com</a>. I’d love to have a conversation about whether a guest post featuring your app would be right for the Cloudflare blog.</p><hr /><p>Again, this is just a glimpse into the guidance that the <a href="https://www.cloudflare.com/apps/assets/Cloudflare%20Apps%20Developer%20Playbook.pdf">App Developer Playbook</a> provides. Check it out and share it with your community of app developers.</p><p>Happy, productive app launching to you!</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Apps]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Best Practices]]></category>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Community]]></category>
            <guid isPermaLink="false">1XiY3FmKSA7yqFwNVlB0pS</guid>
            <dc:creator>Andrew Fitch</dc:creator>
        </item>
        <item>
            <title><![CDATA[Advanced Technical "Hacks" for your site's SEO]]></title>
            <link>https://blog.cloudflare.com/advanced-technical-hacks-for-your-sites-seo-2/</link>
            <pubDate>Mon, 25 Jan 2016 21:24:51 GMT</pubDate>
            <description><![CDATA[ Improving your site’s SEO is probably top of mind for you, but doing so takes a lot of hard work and the rules of the game are constantly changing. ]]></description>
            <content:encoded><![CDATA[ <p>Improving your site’s SEO is probably top of mind for you, but doing so takes a lot of hard work, and the rules of the game are constantly changing. On <b>Tuesday, January 26th at 10am PT/1pm ET</b>, CloudFlare is hosting a live discussion with some of the leading experts in technical SEO. They will share advanced technical hacks to help you reap the benefits of higher search rankings. In the live discussion, <a href="https://www.linkedin.com/in/martinwoods">Martin Woods</a>, <a href="https://uk.linkedin.com/in/moaiandin">Reza Moaiandin</a>, and <a href="https://www.linkedin.com/in/patrickstox">Patrick Stox</a> will cover:</p><ul><li><p>Tangible tips about on-page code excellency</p></li><li><p>Semantic markup</p></li><li><p>Web server optimization with GZIP and HTTP/2</p></li><li><p>Web content optimization</p></li><li><p>Site security with malware and DDoS prevention</p></li></ul><p>In addition to the webinar, Reza and Martin from SALT.agency have offered <b>a free 30 minute technical SEO consult</b> on your website. Consults are limited to the first 50 people who <a href="https://salt.agency/audit/">sign up here</a> and also attend the live webinar event on <b>January 26th at 10am PT</b>. Be sure to <a href="https://cc.readytalk.com/r/181bq3yxd7hn&amp;eom">register for the webinar</a>, too.</p> ]]></content:encoded>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Events]]></category>
            <category><![CDATA[Webinars]]></category>
            <guid isPermaLink="false">1o7FPfijIjjPFly66V24ET</guid>
            <dc:creator>Elenitsa Staykova</dc:creator>
        </item>
        <item>
            <title><![CDATA[Google Now Factoring HTTPS Support Into Ranking; CloudFlare On Track to Make it Free and Easy]]></title>
            <link>https://blog.cloudflare.com/google-now-factoring-https-support-into-ranking-cloudflare-on-track-to-make-it-free-and-easy/</link>
            <pubDate>Wed, 06 Aug 2014 14:00:00 GMT</pubDate>
            <description><![CDATA[ As of today, there are only about 2 million websites that support HTTPS. That's a shamefully low number. Two things are about to happen that we at CloudFlare are hopeful will begin to change that and make everyone love locks (at least on the web!). ]]></description>
            <content:encoded><![CDATA[ <p>As of today, there are only about 2 million websites that support HTTPS. That's a shamefully low number. Two things are about to happen that we at CloudFlare are hopeful will begin to change that and make everyone love locks (at least on the web!).</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/68sVz4KPq4HjnBuy7onwfb/af37f57829e084a38f7c87ceb4e698c7/10290363093_9ff8f91c1c_z_1.jpg" />
            
            </figure><p>CC BY 2.0 by <a href="https://www.flickr.com/photos/greggman/">Gregg Tavares</a></p>
    <div>
      <h3>Google Ranks Crypto</h3>
      <a href="#google-ranks-crypto">
        
      </a>
    </div>
    <p>First, Google <a href="http://googleonlinesecurity.blogspot.co.uk/2014/08/https-as-ranking-signal_6.html">just announced</a> that they will begin taking into account whether a site supports HTTPS connections in their ranking algorithm. This means that if you care about SEO, ensuring your site supports HTTPS should be a top priority. Kudos to Google for giving webmasters a big incentive to add SSL to their sites.</p>
    <div>
      <h3>SSL All Things</h3>
      <a href="#ssl-all-things">
        
      </a>
    </div>
    <p>Second, at CloudFlare we've cleared one of the last major technical hurdles before making SSL available for every one of our customers -- even free customers. One of the challenges we had was ensuring we still had the flexibility to move traffic to sites dynamically between the servers that make up our network. While we can do this easily when traffic is over an HTTP connection, when a connection uses HTTPS we need to ensure that the correct certificates are in place and loaded into memory before requests are processed by a server.</p><p>To accomplish this, we needed to redesign how certificates are loaded into a server's memory. Previously, we'd load certificates into memory before traffic was directed to a server. That creates challenges when dealing with millions of domains and when shifting traffic to help isolate or mitigate an attack.</p>
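    <p>The redesign described above amounts to loading certificates on demand rather than ahead of time. The following is only a minimal Python sketch of that general pattern, not Cloudflare's implementation; <code>fetch_from_storage</code> is a hypothetical stand-in for whatever retrieves a certificate from shared storage within the data center.</p>

```python
import threading

class LazyCertStore:
    """Load certificates on first use instead of ahead of time.

    `load_cert` is a caller-supplied function (hypothetical here) that
    fetches the certificate for a hostname from shared storage.
    """

    def __init__(self, load_cert):
        self._load_cert = load_cert
        self._cache = {}
        self._lock = threading.Lock()

    def get(self, hostname):
        # Fast path: certificate already in this server's memory.
        with self._lock:
            if hostname in self._cache:
                return self._cache[hostname]
        # Slow path: the first request for this hostname triggers a fetch.
        cert = self._load_cert(hostname)
        with self._lock:
            # setdefault keeps whichever copy won a concurrent race.
            self._cache.setdefault(hostname, cert)
            return self._cache[hostname]

# Usage: traffic can be shifted to this server before any certificate
# has been loaded; the first request for a hostname pulls its cert in.
fetches = []

def fetch_from_storage(hostname):
    fetches.append(hostname)
    return f"cert-for-{hostname}"

store = LazyCertStore(fetch_from_storage)
first = store.get("example.com")   # triggers one fetch
second = store.get("example.com")  # served from memory, no new fetch
```

    <p>The design point is that a server no longer needs every certificate resident in memory before traffic arrives, only a way to fetch each one when its first HTTPS request shows up.</p>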
    <div>
      <h3>Lazy Loading Certs</h3>
      <a href="#lazy-loading-certs">
        
      </a>
    </div>
    <p>Last week we pushed new code that allows us to "lazy load" <a href="https://www.cloudflare.com/application-services/products/ssl/">SSL certificates</a> on demand. This means that a certificate only needs to be in a data center, not on a particular server, before HTTPS traffic needing the certificate is directed to that server. When a request is received, the server can now dynamically retrieve the correct certificate even if it hasn't been previously loaded into memory. This allows us to continue to shift traffic to manage our network even if we are managing SSL certificates for millions of domains.</p><p>We're on track to roll out SSL for all CloudFlare customers by mid-October. When we do, the number of sites that support HTTPS on the Internet will more than double. That they'll also rank a bit higher is pretty cool too.</p><p>In the meantime, if you want a quick way to boost your Google ranking, upgrading to any paid CloudFlare account will enable HTTPS by default. Even before we make it free, it's already the fastest, easiest way to get HTTPS support on any site.</p> ]]></content:encoded>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Google]]></category>
            <category><![CDATA[HTTPS]]></category>
            <category><![CDATA[SSL]]></category>
            <category><![CDATA[Optimization]]></category>
            <category><![CDATA[Security]]></category>
            <guid isPermaLink="false">7uUcuydFEdXscBcjp7P4ex</guid>
            <dc:creator>Matthew Prince</dc:creator>
        </item>
        <item>
            <title><![CDATA[App: Clearspike automates search engine optimization]]></title>
            <link>https://blog.cloudflare.com/app-clearspike-automates-search-engine-optimi/</link>
            <pubDate>Wed, 09 Jan 2013 18:55:00 GMT</pubDate>
            <description><![CDATA[ You care about your website, and you want it to be found. For many visitors, finding your website starts with search engines. Together, Google, Bing, Baidu and others are huge sources of traffic for every website. ]]></description>
            <content:encoded><![CDATA[ <p>You care about your website, and you want it to be found. For many visitors, finding your website starts with search engines. Together, Google, Bing, Baidu and others are huge sources of traffic for every website.</p><p>The extra speed and security CloudFlare delivers are helpful for search engine ranking, but there are many other factors, including site content, organization and proper promotion.</p><p>The newest CloudFlare App, <a href="https://www.cloudflare.com/apps/clearspike">Clearspike</a>, automates the search engine optimization (SEO) process to help your website attract more organic search engine traffic.</p><p>We know you cared enough to make your website faster and safer. Improving your SEO is a complementary step, and we're pleased to make it easy to use the Clearspike service and tap into the expertise of the Clearspike team for additional benefits.</p>
    <div>
      <h3>How it works</h3>
    </div>
    <p>Like other CloudFlare Apps, <a href="https://www.cloudflare.com/apps/clearspike">Clearspike</a> is easy to activate, with different levels of service available immediately, and no long-term commitment.</p><ul><li><p>Self-Service Plan: Get custom recommendations and update your website yourself. $24 / month.</p></li><li><p>Automated Plan: Use Clearspike tools to get your website optimized automatically. $49 / month.</p></li><li><p>Do-It-For-Me Plan: Have Clearspike experts optimize your website. $199 / month.</p></li></ul><p>There are no tricks: the experts at Clearspike have captured a wealth of experience in an easy-to-use service that makes their expertise simple to apply.</p><p>At every level of service, Clearspike actively reviews your site for possible improvements, making recommendations and giving you tools to take action. The service includes keyword recommendations, page title optimizations, submission to appropriate directories, finding broken links, checking sitemaps and more. Clearspike helps you measure your progress, too, so you can see the return on your investment in SEO.</p><p><i>P.S. Clearspike made their service available to CloudFlare customers using the </i><a href="http://appdev.cloudflare.com/"><i>app development platform</i></a><i>.</i></p><p><i>CloudFlare is </i><a href="http://www.jobscore.com/jobs/cloudflare/partner-engineer-platform/c9SmO6kR8r4RhneJe4efaV?ref=rss&amp;sid=68"><i>hiring</i></a><i> to extend the platform.</i></p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Apps]]></category>
            <category><![CDATA[SEO]]></category>
            <guid isPermaLink="false">qMhgFx2TvlJGRwR3JZU54</guid>
            <dc:creator>John Roberts</dc:creator>
        </item>
        <item>
            <title><![CDATA[SEO and your website]]></title>
            <link>https://blog.cloudflare.com/seo-and-your-website/</link>
            <pubDate>Mon, 19 Nov 2012 21:26:00 GMT</pubDate>
            <description><![CDATA[ We get a lot of questions from our customers about CloudFlare and how we impact SEO. So when SEO.com signed up for CloudFlare, I thought it would be a great opportunity to talk to an expert to get the scoop on all things SEO.  ]]></description>
            <content:encoded><![CDATA[ <p><i>We get a lot of questions from our customers about CloudFlare and how we impact SEO. So when </i><a href="http://www.seo.com/"><i>SEO.com</i></a><i> signed up for CloudFlare, I thought it would be a great opportunity to talk to an expert to get the scoop on all things SEO. I was fortunate enough to connect with </i><a href="https://twitter.com/DerekPerkins"><i>Derek Perkins</i></a><i>, Vice President of Technology at SEO.com. With more than 12 years of industry experience, Derek provided his insight on SEO in general, debunked some of the myths out there, and gave us his take on what really works, and what doesn't, when it comes to SEO and your site.</i></p>
    <div>
      <h4>CF - What are the top three tips you can offer for website owners looking to improve their SEO?</h4>
    </div>
    <p>DP - Step one - use <a href="http://wordpress.com/">WordPress</a> and WordPress SEO. For most website owners, a Content Management System is key. The WordPress platform would be my first choice; you don't have to wrestle with the structure of a WordPress site to make sure it's easily searchable and findable. WordPress ranks high, especially if you activate <a href="http://yoast.com/wordpress/seo/">WordPress SEO by Yoast</a>. A combination of those two alone puts you a long way ahead of where smaller businesses are. Correct structure is a good thing.</p><p>Step two - Focus on great content. Sporadic posting is never going to yield tangible results. The more Google changes their algorithms, the more likely you are to be ranked lower if you're not posting often. Just posting frequently, however, isn't enough. Content has always risen to the top of rankings, and as search engines mature, they are continuing to increase the signal-to-noise ratio. Posting great content regularly is the key to SEO success.</p><p>Step three - Find good website hosting that will be elastic. Great content that gets picked up on TechCrunch, Digg, Reddit - any viral site - is going to see heavy spikes in traffic. If you're on a cheap hosting plan you often won't be able to scale to meet demand.</p><p>Actually, one of the first things I do is recommend CloudFlare. I love the CDN and scalability; it takes load off the server so you don't have to worry so much about load spikes.</p>
    <div>
      <h4>CF - What are some of the misperceptions with SEO?</h4>
    </div>
    <p>DP - A big misperception about SEO is the idea that you have to somehow change your writing or write things that are for search engines instead of humans. That's not the case. A lot of people also say you have to have unnaturally high keyword density and that's the only way you're going to rank. That's not good; it's harmful. Google sees that as if you're writing it specifically for SEO. Search engines try to read content as if they were human. If the content doesn't flow well or read well for a human, chances are it's not going to read well for a search engine or spider.</p><p>SEO is all about having good content. Write content you and others would like to read. It is more likely to be shared socially, bringing more people to your site, and more people will link back to your site, growing your online presence.</p><p>When people think of SEO they tend to focus on the 10 percent of little tweaks that SEO companies can do for you, whereas the bulk of the value comes from writing good content.</p>
    <div>
      <h4>CF - Web properties care a lot about SEO, what are some good resources for site owners looking to better their SEO rank?</h4>
    </div>
    <p>DP - There are a number of places on the web that have good SEO forums. We have a page on our own site, <a href="http://www.SEO.com/forums">www.SEO.com/forums</a>, that links to a number of the best forums out there. Another great resource is <a href="http://www.seomoz.org/beginners-guide-to-seo">SEOmoz</a>; it's a great place for website owners to start.</p>
    <div>
      <h4>CF - What is one thing site owners should be doing to improve their SEO, but probably aren't?</h4>
    </div>
    <p>DP - The number one thing that people don't do - they don't set any sort of targets. Content is king, but a lot of it is knowing what specific content is going to be most valuable to them. You can write about two different things that are both interesting, exciting and relevant to your audience, but one is relevant to maybe 10 searches a month, whereas the other is relevant to 10,000 searches a month. Having an idea of what pages, blog posts or keywords you're targeting with each will help you tailor the content.</p>
    <div>
      <h4>CF - What are some of the things site owners do that might negatively impact SEO?</h4>
    </div>
    <p>DP - Picking the wrong page titles and/or having a malformed HTML structure. There's a lot of SEO weight on title and header tags; you need to have your page title similar to whatever is in your H1 tag. A lot of site owners out there don't have header tags or don't have them set up correctly. Even if they have a good title, the title might not be in the HTML header tag. The title and headers help both search engines and, more importantly, humans identify the focus of the page.</p>
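    <p><i>To illustrate the point about titles and headers, here is a minimal, hypothetical page where the title tag and the H1 carry the same focus (the page name is invented for the example):</i></p>

```html
<!DOCTYPE html>
<html>
<head>
  <!-- The page title mirrors the main heading below -->
  <title>Chocolate Chip Cookie Recipe | Example Bakery</title>
</head>
<body>
  <!-- A single H1 per page, matching the title's focus -->
  <h1>Chocolate Chip Cookie Recipe</h1>
  <h2>Ingredients</h2>
  <h2>Directions</h2>
</body>
</html>
```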
    <div>
      <h4>CF - How has your industry changed in the last five years?</h4>
    </div>
    <p>DP - I think SEO has a bit of a stigma because of old tactics that people used to use. You used to be able to immediately rank for SEO by using tricks like white text on a white background and various other tactics to game the system. Even just recently Google has released things like Penguin, making it harder and harder to game the system. It changes how SEO agencies function, shifting the focus from link building to strategic, content-driven approaches. That has driven a proliferation of socially shareable content like infographics.</p>
    <div>
      <h4>CF - You've seen the Google vs Bing commercial search challenge. The commercial claims people choose Bing 2 to 1 over Google. Do you think that's right? What are your thoughts?</h4>
    </div>
    <p>DP - Personally, I occasionally use Bing, but I tend to go back to Google. I took the test myself and Bing won 3 to 2, but I felt like the stripped-down result pages weren't a perfect test.</p>
    <div>
      <h4>CF - Google, Yahoo! and Bing are huge competitors in the search space. What are your thoughts on each? Do any of them stand out as being front runners in the near future?</h4>
    </div>
    <p>DP - Bing has been gaining ground. Yahoo!'s results are powered by Microsoft, so Bing and Yahoo! will both show the same results. The big two are definitely Google and Bing. You can't ignore Bing when you're tracking rankings, but they are definitely playing second fiddle to Google at this point.</p><p>For a long time Google has provided the best search rankings. Whether or not Bing has closed the gap on search, they have an uphill battle. People are used to Google; it's becoming part of the English language. I doubt that anyone outside of Microsoft headquarters has ever said "I don't know the answer, let me go Bing it."</p>
    <div>
      <h4>CloudFlare...in his own words</h4>
    </div>
    <p>I've been using CloudFlare for over a year now. I had a personal website called <a href="http://soccerreviews.com/">soccerreviews.com</a>. I built it up to have significant traffic and I was really having server load issues, in addition to having been hacked twice. Because of that, security and scalability were very important to me.</p><p>I tried CloudFlare on that site and I now have it on 30 other sites. I have yet to have any of those sites compromised, which has been fantastic.</p><p>Once I joined SEO.com, I put us on CloudFlare. One feature I really like about CloudFlare is Rocket Loader. It combines all my JavaScript files and speeds them up; they aren't all being downloaded separately, decreasing download time.</p><p>As for the impact on SEO, bounce rate plays a very important role in how Google does their rankings - they see it as a human factor in SEO. If someone immediately jumps back to Google, that's obviously not a good human signal. A fast site that's always online is sure to help your rankings with lower bounce rates, and CloudFlare helps make this possible.</p> ]]></content:encoded>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Customers]]></category>
            <category><![CDATA[WordPress]]></category>
            <guid isPermaLink="false">2TwADmuL15zRfv0L5BsFep</guid>
            <dc:creator>Kristin Tarr</dc:creator>
        </item>
        <item>
            <title><![CDATA[Attracta: Solving a big problem for every host]]></title>
            <link>https://blog.cloudflare.com/attracta-solving-a-big-problem-for-every-host/</link>
            <pubDate>Wed, 08 Aug 2012 21:52:00 GMT</pubDate>
            <description><![CDATA[ Troy McCasland is the co-founder and Vice President of Business Development at Attracta. He has been at the company since its inception, working on the executive management team and building the business. ]]></description>
            <content:encoded><![CDATA[ <p><a href="https://www.twitter.com/troymccas">Troy McCasland</a> is the co-founder and Vice President of Business Development at <a href="https://www.attracta.com/">Attracta</a>. He has been at the company since its inception, working on the executive management team and building the business.</p><p>We sat down with Troy at HostingCon and asked, what exactly does Attracta do? "Attracta makes the world's most popular search engine optimization platform," said Troy. "We solve a really big problem for every host here at HostingCon."</p><p>According to Troy, if a customer purchases a site through a host and then searches for their site in Google, only to find their site isn't there, it can be a big problem for the host. Attracta solves that problem by crawling the site and offering an XML Sitemap of the site to major search engines such as Google, Yahoo!, Bing and Ask.</p><p>Attracta currently has more than 2.3 million customers and adds over 100,000 customers a month.</p><p><i>At HostingCon 2012, </i><a href="http://www.cloudflare.com/"><i>CloudFlare</i></a><i> co-founder </i><a href="https://www.twitter.com/zatlyn"><i>Michelle Zatlyn</i></a><i> sat down with 28 leading experts in the hosting industry. Their conversations were captured live and offer insight into the latest trends and news in hosting.</i></p> ]]></content:encoded>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Hosting Con]]></category>
            <guid isPermaLink="false">4GiBSiGyGyZYtU2l9oeAfv</guid>
            <dc:creator>Kristin Tarr</dc:creator>
        </item>
        <item>
            <title><![CDATA[Introducing CloudFlare's Stop Censorship App]]></title>
            <link>https://blog.cloudflare.com/introducing-cloudflares-stop-censorship-app/</link>
            <pubDate>Mon, 16 Jan 2012 23:50:00 GMT</pubDate>
            <description><![CDATA[ CloudFlare, as a service, illustrates the power of using DNS to make the Internet better. Unfortunately, some current legislation up for consideration in the United States illustrates the power of using DNS to make the Internet worse.  ]]></description>
            <content:encoded><![CDATA[ <p>CloudFlare, as a service, illustrates the power of using DNS to make the Internet better. Unfortunately, some current legislation up for consideration in the United States illustrates the power of using DNS to make the Internet worse. SOPA and PIPA aim to address the challenge of policing copyright online by monkeying with the underlying infrastructure of the Internet.</p><p>Spearheaded by <a href="http://icanhascheezburger.com/">Ben Huh</a> and others, many sites are planning on "blacking out" their pages on Wednesday, January 18 to raise awareness about the dangers of laws like SOPA and PIPA. Several CloudFlare users wrote to us asking if there was a way we could help them participate in such a protest. The problem is that blacking out your site entirely can have some negative results:</p><ol><li><p>Taking a site offline does nothing to help educate people who are not already aware of the problems of SOPA and PIPA; and</p></li><li><p>Removing your site from the Internet, even if for only one day, can have a significant impact on your search ranking and crawl rates.</p></li></ol><p>We wanted to provide a way for people who wanted to raise awareness about SOPA and PIPA to do so effectively and without hurting themselves in the process.</p><p>What's great about the CloudFlare Stop Censorship App is that it will work without you having to modify any of the code on your site. If you own your own domain, you can sign up for CloudFlare. And, if you sign up for CloudFlare, you can participate in the blackout with one click. This means that if you're on Tumblr, TypePad, WordPress, Posterous, or any other platform, so long as you have your own domain you can use the app.</p><p>The app is available <a href="https://www.cloudflare.com/apps/stop_censorship">here beginning today</a>. If you want to participate in the blackout, you should turn it on by Wednesday, January 18. 
We will continue to make the app available for the next 30 days or until the threat from laws like SOPA and PIPA has passed.</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Apps]]></category>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Freedom of Speech]]></category>
            <category><![CDATA[Policy & Legal]]></category>
            <guid isPermaLink="false">lraugrgBDcIUOiC9PczeW</guid>
            <dc:creator>Matthew Prince</dc:creator>
        </item>
        <item>
            <title><![CDATA[Avoid Losing SEO Link Juice to Traditional CDNs]]></title>
            <link>https://blog.cloudflare.com/losing-seo-link-juice-to-traditional-cdns/</link>
            <pubDate>Thu, 04 Aug 2011 16:37:00 GMT</pubDate>
            <description><![CDATA[ One of the problems with many traditional CDNs it that they require you to rewrite the URLs of your static resources to point to a third party domain. ]]></description>
            <content:encoded><![CDATA[ <p>One of the <a href="https://www.cloudflare.com/learning/cdn/common-cdn-issues/">problems</a> with many traditional CDNs is that they require you to rewrite the URLs of your static resources to point to a third-party domain. This is problematic for two reasons: one obvious, one less obvious. The obvious reason is that it can be difficult to do. As a web administrator, you used to have to manually edit the links to your images and other static files to assign them to a CDN. Today there are tools like W3 Total Cache (W3TC) that can make that process more automatic, and some new services are promising to automate the process, but in the end, even if automated, this adds an extra step to the content creation process.</p><p>The less obvious problem is that these rewritten URLs point to someone else's domain, not yours, and that can potentially hurt your SEO. Deep in the bowels of search engines, backlinks to your site give you credit and increase the likelihood you'll appear near the top of search results. While the algorithms of search engines are secret and ever-changing, pointing links to a third party's domain, whether automatically or otherwise, poses a risk that you're not getting full credit for the content you create.</p><p>You can see this for yourself via Twitter. Try a search for a <a href="http://twitter.com/#!/search/cloudfront">traditional CDN like Amazon's CloudFront</a>. What you see are not people posting about the service itself, but instead links to photos from third-party sites using CloudFront to store them. What's too bad is that even if you were interested in the photo and wanted to follow the link back to the original source, there's no way to figure it out because the domain doesn't reference the original content creator.</p><p>There is a time and a place for traditional CDNs, and CloudFlare works great in conjunction with them. 
However, we designed CloudFlare to make sure that we were completely transparent to your visitors and search engines. This means we never rewrite the URLs of your static content to point away from your domain. As a result, if you do a search for a CloudFlare user's domain on Twitter, you'll see everything being shared about them, even if some of the shared items are photos cached on our network. Similarly, if you do a <a href="http://twitter.com/#!/search/realtime/cloudflare">search on Twitter for CloudFlare</a>, all you'll see are people talking about our service.</p> ]]></content:encoded>
            <category><![CDATA[SEO]]></category>
            <category><![CDATA[Google]]></category>
            <guid isPermaLink="false">LeZXR7McfFDEDqx1Vv8vL</guid>
            <dc:creator>Matthew Prince</dc:creator>
        </item>
    </channel>
</rss>