
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how products at Cloudflare are built, technologies used, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Fri, 03 Apr 2026 17:14:20 GMT</lastBuildDate>
        <item>
            <title><![CDATA[Investigating multi-vector attacks in Log Explorer]]></title>
            <link>https://blog.cloudflare.com/investigating-multi-vector-attacks-in-log-explorer/</link>
            <pubDate>Tue, 10 Mar 2026 13:00:00 GMT</pubDate>
            <description><![CDATA[ Log Explorer customers can now identify and investigate multi-vector attacks. Log Explorer supports 14 additional Cloudflare datasets, enabling users to have a 360-degree view of their network. ]]></description>
            <content:encoded><![CDATA[ <p>In the world of cybersecurity, a single data point is rarely the whole story. Modern attackers don’t just knock on the front door; they probe your APIs, flood your network with "noise" to distract your team, and attempt to slide through applications and servers using stolen credentials.</p><p>To stop these multi-vector attacks, you need the full picture. By using Cloudflare Log Explorer to conduct security forensics, you get 360-degree visibility through the integration of 14 new datasets, covering the full surface of Cloudflare’s Application Services and Cloudflare One product portfolios. By correlating telemetry from application-layer HTTP requests, network-layer DDoS and Firewall logs, and Zero Trust Access events, security analysts can significantly reduce Mean Time to Detect (MTTD) and effectively unmask sophisticated, multi-layered attacks.</p><p>Read on to learn more about how Log Explorer gives security teams the ultimate landscape for rapid, deep-dive forensics.</p>
    <div>
      <h2>The flight recorder for your entire stack</h2>
      <a href="#the-flight-recorder-for-your-entire-stack">
        
      </a>
    </div>
    <p>The contemporary digital landscape requires deep, correlated telemetry to defend against adversaries using multiple attack vectors. Raw logs serve as the "flight recorder" for an application, capturing every single interaction, attack attempt, and performance bottleneck. And because Cloudflare sits at the edge, between your users and your servers, all of these events are logged before the requests even reach your infrastructure. </p><p>Cloudflare Log Explorer centralizes these logs into a unified interface for rapid investigation.</p>
    <div>
      <h3>Log Types Supported</h3>
      <a href="#log-types-supported">
        
      </a>
    </div>
    
    <div>
      <h4>Zone-Scoped Logs</h4>
      <a href="#zone-scoped-logs">
        
      </a>
    </div>
    <p><i>Focus: Website traffic, security events, and edge performance.</i></p><table><tr><td><p><b>HTTP Requests</b></p></td><td><p>As the most comprehensive dataset, it serves as the "primary record" of all application-layer traffic, enabling the reconstruction of session activity, exploit attempts, and bot patterns.</p></td></tr><tr><td><p><b>Firewall Events</b></p></td><td><p>Provides critical evidence of blocked or challenged threats, allowing analysts to identify the specific WAF rules, IP reputations, or custom filters that intercepted an attack.</p></td></tr><tr><td><p><b>DNS Logs</b></p></td><td><p>Identify cache poisoning attempts, domain hijacking, and infrastructure-level reconnaissance by tracking every query resolved at the authoritative edge.</p></td></tr><tr><td><p><b>NEL (Network Error Logging) Reports</b></p></td><td><p>Distinguish between a coordinated Layer 7 DDoS attack and legitimate network connectivity issues by tracking client-side browser errors.</p></td></tr><tr><td><p><b>Spectrum Events</b></p></td><td><p>For non-web applications, these logs provide visibility into L4 traffic (TCP/UDP), helping to identify anomalies or brute-force attacks against protocols like SSH, RDP, or custom gaming traffic.</p></td></tr><tr><td><p><b>Page Shield</b></p></td><td><p>Track and audit unauthorized changes to your site's client-side environment, such as JavaScript dependencies and outbound connections.</p></td></tr><tr><td><p><b>Zaraz Events</b></p></td><td><p>Examine how third-party tools and trackers are interacting with user data, which is vital for auditing privacy compliance and detecting unauthorized script behaviors.</p></td></tr></table>
    <div>
      <h4>Account-Scoped Logs</h4>
      <a href="#account-scoped-logs">
        
      </a>
    </div>
    <p><i>Focus: Internal security, Zero Trust, administrative changes, and network activity.</i></p><table><tr><td><p><b>Access Requests</b></p></td><td><p>Tracks identity-based authentication events to determine which users accessed specific internal applications and whether those attempts were authorized.</p></td></tr><tr><td><p><b>Audit Logs</b></p></td><td><p>Provides a trail of configuration changes within the Cloudflare dashboard to identify unauthorized administrative actions or modifications.</p></td></tr><tr><td><p><b>CASB Findings</b></p></td><td><p>Identifies security misconfigurations and data risks within SaaS applications (like Google Drive or Microsoft 365) to prevent unauthorized data exposure.</p></td></tr><tr><td><p><b>Magic Transit / IPSec Logs</b></p></td><td><p>Helps network engineers perform network-level (L3) monitoring, such as reviewing tunnel health and viewing BGP routing changes.</p></td></tr><tr><td><p><b>Browser Isolation Logs</b></p></td><td><p>Tracks user actions <i>inside</i> an isolated browser session (e.g., copy-paste, print, or file uploads) to prevent data leaks on untrusted sites.</p></td></tr><tr><td><p><b>Device Posture Results</b></p></td><td><p>Details the security health and compliance status of devices connecting to your network, helping to identify compromised or non-compliant endpoints.</p></td></tr><tr><td><p><b>DEX Application Tests</b></p></td><td><p>Monitors application performance from the user's perspective, which can help distinguish between a security-related outage and a standard performance degradation.</p></td></tr><tr><td><p><b>DEX Device State Events</b></p></td><td><p>Provides telemetry on the physical state of user devices, useful for correlating hardware or OS-level anomalies with potential security incidents.</p></td></tr><tr><td><p><b>DNS Firewall Logs</b></p></td><td><p>Tracks DNS queries filtered through the DNS Firewall to identify communication with known malicious domains or command-and-control
(C2) servers.</p></td></tr><tr><td><p><b>Email Security Alerts</b></p></td><td><p>Logs malicious email activity and phishing attempts detected at the gateway to trace the origin of email-based entry vectors.</p></td></tr><tr><td><p><b>Gateway DNS</b></p></td><td><p>Monitors every DNS query made by users on your network to identify shadow IT, malware callbacks, or domain-generation algorithms (DGAs).</p></td></tr><tr><td><p><b>Gateway HTTP</b></p></td><td><p>Provides full visibility into encrypted and unencrypted web traffic to detect hidden payloads, malicious file downloads, or unauthorized SaaS usage.</p></td></tr><tr><td><p><b>Gateway Network</b></p></td><td><p>Tracks L3/L4 network traffic (non-HTTP) to identify unauthorized port usage, protocol anomalies, or lateral movement within the network.</p></td></tr><tr><td><p><b>IPSec Logs</b></p></td><td><p>Monitors the status and traffic of encrypted site-to-site tunnels to ensure the integrity and availability of secure network connections.</p></td></tr><tr><td><p><b>Magic IDS Detections</b></p></td><td><p>Surfaces matches against intrusion detection signatures to alert investigators to known exploit patterns or malware behavior traversing the network.</p></td></tr><tr><td><p><b>Network Analytics Logs</b></p></td><td><p>Provides high-level visibility into packet-level data to identify volumetric DDoS attacks or unusual traffic spikes targeting specific infrastructure.</p></td></tr><tr><td><p><b>Sinkhole HTTP Logs</b></p></td><td><p>Captures traffic directed to "sinkholed" IP addresses to confirm which internal devices are attempting to communicate with known botnet infrastructure.</p></td></tr><tr><td><p><b>WARP Config Changes</b></p></td><td><p>Tracks modifications to the WARP client settings on end-user devices to ensure that security agents haven't been tampered with or disabled.</p></td></tr><tr><td><p><b>WARP Toggle Changes</b></p></td><td><p>Specifically logs when users enable or disable their secure 
connectivity, helping to identify periods where a device may have been unprotected.</p></td></tr><tr><td><p><b>Zero Trust Network Session Logs</b></p></td><td><p>Logs the duration and status of authenticated user sessions to map out the complete lifecycle of a user's access within the protected perimeter.</p></td></tr></table>
    <div>
      <h2>Log Explorer can identify malicious activity at every stage</h2>
      <a href="#log-explorer-can-identify-malicious-activity-at-every-stage">
        
      </a>
    </div>
    <p>Get granular application layer visibility with <b>HTTP Requests</b>, <b>Firewall Events</b>, and <b>DNS logs</b> to see exactly how traffic is hitting your public-facing properties. Track internal movement with <b>Access Requests</b>, <b>Gateway logs</b>, and <b>Audit logs</b>. If a credential is compromised, you’ll see exactly where the attacker went. Use <b>Magic IDS</b> and <b>Network Analytics logs</b> to spot volumetric attacks and "East-West" lateral movement within your private network.</p>
    <div>
      <h3>Identify the reconnaissance</h3>
      <a href="#identify-the-reconnaissance">
        
      </a>
    </div>
    <p>Attackers use scanners and other tools to look for entry points, hidden directories, or software vulnerabilities. To identify this activity in Log Explorer, query <code>http_requests</code> for <code>EdgeResponseStatus</code> codes of 401, 403, or 404 coming from a single IP, or for requests to sensitive paths (e.g. <code>/.env</code>, <code>/.git</code>, <code>/wp-admin</code>). </p><p><code>magic_ids_detections</code> logs can also be used to identify scanning at the network layer. These logs provide packet-level visibility into threats targeting your network. Unlike standard HTTP logs, they focus on <b>signature-based detections</b> at the network and transport layers (IP, TCP, UDP). Query for cases where a single <code>SourceIP</code> triggers multiple unique detections across a wide range of <code>DestinationPort</code> values in a short timeframe. Magic IDS signatures can specifically flag activities like Nmap scans or SYN stealth scans.</p>
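            <p>As a sketch, a reconnaissance hunt in <code>http_requests</code> might look like the following (the date and the status-code set are illustrative; adjust them to your investigation window):</p>
            <pre><code>SELECT ClientIP, ClientRequestPath, EdgeResponseStatus, COUNT(*) AS attempts
FROM http_requests
WHERE date = '2026-02-22'
  AND EdgeResponseStatus IN (401, 403, 404)
GROUP BY ClientIP, ClientRequestPath, EdgeResponseStatus
ORDER BY attempts DESC
LIMIT 100</code></pre>
            <p>A single <code>ClientIP</code> accumulating hundreds of errors across paths like <code>/.env</code> or <code>/wp-admin</code> is a strong reconnaissance signal.</p>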
    <div>
      <h3>Check for diversions</h3>
      <a href="#check-for-diversions">
        
      </a>
    </div>
    <p>While the attacker is conducting reconnaissance, they may attempt to disguise this with a simultaneous network flood. Pivot to <code>network_analytics_logs</code> to see if a volumetric attack is being used as a smokescreen.</p>
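            <p>A quick volumetric check might count sampled packets by destination over the same window. The field names below are illustrative placeholders; consult the <code>network_analytics_logs</code> dataset reference for the exact schema:</p>
            <pre><code>-- Field names are illustrative; verify against the dataset schema
SELECT DestinationIP, COUNT(*) AS packet_samples
FROM network_analytics_logs
WHERE date = '2026-02-22'
GROUP BY DestinationIP
ORDER BY packet_samples DESC
LIMIT 20</code></pre>
            <p>A sudden spike against a single destination during the same window as the scanning activity suggests the flood is a diversion.</p>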
    <div>
      <h3>Identify the approach </h3>
      <a href="#identify-the-approach">
        
      </a>
    </div>
    <p>Once attackers identify a potential vulnerability, they begin to craft their weapon. The attacker sends malicious payloads (e.g. SQL injection or large/corrupt file uploads) to confirm the vulnerability. Review <code>http_requests</code> and/or <code>fw_events</code> to identify any Cloudflare detection tools that have triggered. Cloudflare logs security signals in these datasets so you can easily identify requests with malicious payloads using fields such as <code>WAFAttackScore</code>, <code>WAFSQLiAttackScore</code>, <code>FraudAttack</code>, <code>ContentScanJobResults</code>, and several more. Review <a href="https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/http_requests/"><u>our documentation</u></a> to get a full understanding of these fields. The <code>fw_events</code> logs can be used to determine whether these requests made it past Cloudflare’s defenses by examining the <code>action</code>, <code>source</code>, and <code>ruleID</code> fields. Cloudflare’s managed rules block many of these payloads by default. Review the Application Security Overview to confirm that your application is protected.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1zpFguYrnbOPwyASGQCqZK/63f398acce2226e453a5eea1cc749241/image3.png" />
          </figure><p><sup><i>Showing the Managed rules Insight that displays on Security Overview if the current zone does not have Managed Rules enabled</i></sup></p>
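            <p>As a sketch, low WAF attack scores can surface likely malicious payloads (in Cloudflare's scoring, lower values indicate a higher likelihood of attack; the threshold and date here are illustrative):</p>
            <pre><code>SELECT RayID, ClientIP, ClientRequestPath, WAFAttackScore, WAFSQLiAttackScore
FROM http_requests
WHERE date = '2026-02-22'
  AND WAFAttackScore &lt; 20
LIMIT 100</code></pre>
            <p>From here, pivot on <code>RayID</code> into <code>fw_events</code> to see which action Cloudflare took on each request.</p>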
    <div>
      <h3>Audit the identity</h3>
      <a href="#audit-the-identity">
        
      </a>
    </div>
    <p>Did that suspicious IP manage to log in? Use the <code>ClientIP</code> to search <code>access_requests</code>. If you see a "<code>Decision: Allow</code>" for a sensitive internal app, you know you have a compromised account.</p>
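            <p>A minimal sketch of that pivot, using the <code>access_requests</code> fields shown elsewhere in this post (the IP is a placeholder):</p>
            <pre><code>SELECT Email, IPAddress, Allowed
FROM access_requests
WHERE date = '2026-02-22'
  AND IPAddress = 'INSERT_SUSPICIOUS_IP_HERE'</code></pre>
            <p>Any row with <code>Allowed = true</code> ties the suspicious IP to a specific identity, giving you a compromised account to lock down.</p>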
    <div>
      <h3>Stop the leak (data exfiltration)</h3>
      <a href="#stop-the-leak-data-exfiltration">
        
      </a>
    </div>
    <p>Attackers sometimes use DNS tunneling to bypass firewalls by encoding sensitive data (like passwords or SSH keys) into DNS queries. Instead of a normal request like <code>google.com</code>, the logs will show long, encoded strings. Look for an unusually high volume of queries for unique, long, and high-entropy subdomains by examining these fields:</p><ul><li><p><code>QueryName</code>: Look for strings like <a href="http://h3ldo293js92.example.com"><code><u>h3ldo293js92.example.com</u></code></a>.</p></li><li><p><code>QueryType</code>: Tunneling often uses <code>TXT</code>, <code>CNAME</code>, or <code>NULL</code> records to carry the payload.</p></li><li><p><code>ClientIP</code>: Identify whether a single internal host is generating thousands of these unique requests.</p></li></ul><p>Additionally, attackers may attempt to leak sensitive data by hiding it within non-standard protocols or by using common protocols (like DNS or ICMP) in unusual ways to bypass standard firewalls. Discover this by querying the <code>magic_ids_detections</code> logs to look for signatures that flag protocol anomalies, such as "ICMP tunneling" or "DNS tunneling" detections in the <code>SignatureMessage</code>.</p><p>Whether you are investigating a zero-day vulnerability or tracking a sophisticated botnet, the data you need is now at your fingertips.</p>
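            <p>One way to surface tunneling candidates in <code>gateway_dns</code>, assuming standard SQL string functions are available (the length threshold and record types are illustrative):</p>
            <pre><code>SELECT SrcIP, QueryType, COUNT(*) AS queries
FROM gateway_dns
WHERE date = '2026-02-22'
  AND LENGTH(QueryName) &gt; 50
  AND QueryType IN ('TXT', 'CNAME', 'NULL')
GROUP BY SrcIP, QueryType
ORDER BY queries DESC
LIMIT 20</code></pre>
            <p>A single internal host generating thousands of long, unique queries is the classic tunneling fingerprint.</p>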
    <div>
      <h2>Correlate across datasets</h2>
      <a href="#correlate-across-datasets">
        
      </a>
    </div>
    <p>Investigate malicious activity across multiple datasets by pivoting between multiple concurrent searches. With Log Explorer, you can now work with multiple queries simultaneously using the new Tabs feature. Switch between tabs to query different datasets, or pivot and refine a query by filtering on your query results.</p><p>When you correlate data across multiple Cloudflare log sources, you can detect sophisticated multi-stage attacks that appear benign when viewed in isolation. This cross-dataset analysis allows you to see the full attack chain from reconnaissance to exfiltration.</p>
    <div>
      <h3>Session hijacking (token theft)</h3>
      <a href="#session-hijacking-token-theft">
        
      </a>
    </div>
    <p><b>Scenario:</b> A user authenticates via Cloudflare Access, but their subsequent HTTP request traffic looks like a bot's.</p><p><b>Step 1:</b> Identify high-risk sessions in <code>http_requests</code>.</p>
            <pre><code>SELECT RayID, ClientIP, ClientRequestUserAgent, BotScore
FROM http_requests
WHERE date = '2026-02-22' 
  AND BotScore &lt; 20 
LIMIT 100</code></pre>
            <p><b>Step 2:</b> Copy the <code>RayID</code> and search <code>access_requests</code> to see which user account is associated with that suspicious bot activity.</p>
            <pre><code>SELECT Email, IPAddress, Allowed
FROM access_requests
WHERE date = '2026-02-22'
  AND RayID = 'INSERT_RAY_ID_HERE'</code></pre>
            
    <div>
      <h3>Post-phishing C2 beaconing</h3>
      <a href="#post-phishing-c2-beaconing">
        
      </a>
    </div>
    <p><b>Scenario:</b> An employee clicked a link in a phishing email, compromising their workstation. The workstation sends a DNS query for a known malicious domain, then immediately triggers an IDS alert.</p><p><b>Step 1:</b> Find phishing attacks by examining <code>email_security_alerts</code> for violations. </p>
            <pre><code>SELECT Timestamp, Threatcategories, To, Alertreason
FROM email_security_alerts
WHERE date = '2026-02-22' 
  AND Threatcategories LIKE '%phishing%'</code></pre>
            <p><b>Step 2:</b> Use Access logs to correlate the user’s email (To) to their IP Address.</p>
            <pre><code>SELECT Email, IPAddress
FROM access_requests
WHERE date = '2026-02-22'
  AND Email = 'INSERT_EMAIL_FROM_PREVIOUS_QUERY'</code></pre>
            <p><b>Step 3:</b> Find internal IPs querying a specific malicious domain in <code>gateway_dns</code> logs.</p>
            <pre><code>SELECT SrcIP, QueryName, DstIP
FROM gateway_dns
WHERE date = '2026-02-22'
  AND SrcIP = 'INSERT_IP_FROM_PREVIOUS_QUERY'
  AND QueryName LIKE '%malicious_domain_name%'</code></pre>
            
    <div>
      <h3>Lateral movement (Access → network probing)</h3>
      <a href="#lateral-movement-access-network-probing">
        
      </a>
    </div>
    <p><b>Scenario:</b> A user logs in via Zero Trust and then tries to scan the internal network.</p><p><b>Step 1:</b> Find successful logins from unexpected locations in <code>access_requests</code>.</p>
            <pre><code>SELECT IPAddress, Email, Country
FROM access_requests
WHERE date = '2026-02-22' 
  AND Allowed = true 
  AND Country != 'US' -- Replace with your HQ country</code></pre>
            <p><b>Step 2:</b> Check if that <code>IPAddress</code> is triggering network-level signatures in <code>magic_ids_detections</code>.</p>
            <pre><code>SELECT SignatureMessage, DestinationIP, Protocol
FROM magic_ids_detections
WHERE date = '2026-02-22' 
  AND SourceIP = 'INSERT_IP_ADDRESS_HERE'</code></pre>
            
    <div>
      <h3>Opening doors for more data </h3>
      <a href="#opening-doors-for-more-data">
        
      </a>
    </div>
    <p>From the beginning, Log Explorer was designed with extensibility in mind. Every dataset schema is defined using JSON Schema, a widely-adopted standard for describing the structure and types of JSON data. This design decision has enabled us to easily expand beyond HTTP Requests and Firewall Events to the full breadth of Cloudflare's telemetry. The same schema-driven approach that powered our initial datasets scaled naturally to accommodate Zero Trust logs, network analytics, email security alerts, and everything in between.</p><p>More importantly, this standardization opens the door to ingesting data beyond Cloudflare's native telemetry. Because our ingestion pipeline is schema-driven rather than hard-coded, we're positioned to accept any structured data that can be expressed in JSON format. For security teams managing hybrid environments, this means Log Explorer could eventually serve as a single pane of glass, correlating Cloudflare's edge telemetry with logs from third-party sources, all queryable through the same SQL interface. While today's release focuses on completing coverage of Cloudflare's product portfolio, the architectural groundwork is laid for a future where customers can bring their own data sources with custom schemas.</p>
    <div>
      <h3>Faster data, faster response: architectural upgrades</h3>
      <a href="#faster-data-faster-response-architectural-upgrades">
        
      </a>
    </div>
    <p>To investigate a multi-vector attack effectively, timing is everything. A delay of even a few minutes in log availability can be the difference between proactive defense and reactive damage control.</p><p>That is why we have optimized our ingestion for better speed and resilience. By increasing concurrency in one part of our ingestion path, we have eliminated bottlenecks that could cause “noisy neighbor” issues, ensuring that one client’s data surge doesn’t slow down another’s visibility. This architectural work has reduced our P99 ingestion latency by approximately 55%, and our P50 by 25%, cutting the time it takes for an event at the edge to become available for your SQL queries.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/41M2eWP0BwrQFSZW4GzZbV/7a6139354abb561aba17e77d83beb17a/image4.png" />
          </figure><p><sup><i>Grafana chart displaying the drop in ingest latency after architectural upgrades</i></sup></p>
    <div>
      <h2>Follow along for more updates</h2>
      <a href="#follow-along-for-more-updates">
        
      </a>
    </div>
    <p>We're just getting started. We're actively working on even more powerful features to further enhance your experience with Log Explorer, including the ability to run these detection queries on a custom defined schedule. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2JIOu9PXDwVAVcmbgq456q/1eace4b5d38bb705e82442a4ee8045dc/Scheduled_Queries_List.png" />
          </figure><p><sup><i>Design mockup of upcoming Log Explorer Scheduled Queries feature</i></sup></p><p><a href="https://blog.cloudflare.com/"><u>Subscribe to the blog</u></a> and keep an eye out for more Log Explorer updates soon in our <a href="https://developers.cloudflare.com/changelog/product/log-explorer/"><u>Change Log</u></a>. </p>
    <div>
      <h2>Get access to Log Explorer</h2>
      <a href="#get-access-to-log-explorer">
        
      </a>
    </div>
    <p>To get access to Log Explorer, purchase it self-serve directly from the dashboard, or, if you are a contract customer, reach out for a <a href="https://www.cloudflare.com/application-services/products/log-explorer/"><u>consultation</u></a> or contact your account manager. You can also read more in our <a href="https://developers.cloudflare.com/logs/log-explorer/"><u>Developer Documentation</u></a>.</p> ]]></content:encoded>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[R2]]></category>
            <category><![CDATA[Storage]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <guid isPermaLink="false">1hirraqs3droftHovXp1G6</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Claudio Jolowicz</dc:creator>
            <dc:creator>Nico Gutierrez</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare Log Explorer is now GA, providing native observability and forensics]]></title>
            <link>https://blog.cloudflare.com/logexplorer-ga/</link>
            <pubDate>Wed, 18 Jun 2025 13:00:00 GMT</pubDate>
            <description><![CDATA[ We are happy to announce the General Availability of Cloudflare Log Explorer, a powerful product designed to bring observability and forensics capabilities directly into your Cloudflare dashboard. ]]></description>
            <content:encoded><![CDATA[ <p>We are thrilled to announce the General Availability of <a href="http://cloudflare.com/application-services/products/log-explorer/"><u>Cloudflare Log Explorer</u></a>, a powerful new product designed to bring <a href="https://www.cloudflare.com/learning/performance/what-is-observability/">observability and forensics capabilities</a> directly into your Cloudflare dashboard. Built on the foundation of Cloudflare's vast <a href="https://www.cloudflare.com/network/"><u>global network</u></a>, Log Explorer leverages the unique position of our platform to provide a comprehensive and contextualized view of your environment.</p><p>Security teams and developers use Cloudflare to detect and mitigate threats in real-time and to optimize application performance. Over the years, users have asked for additional telemetry with full context to investigate security incidents or troubleshoot application performance issues without having to forward data to third party log analytics and Security Information and Event Management (SIEM) tools. Besides avoidable costs, forwarding data externally comes with other drawbacks such as: complex setups, delayed access to crucial data, and a frustrating lack of context that complicates quick mitigation. </p><p>Log Explorer has been previewed by several hundred customers over the last year, and they attest to its benefits: </p><blockquote><p><i>“Having WAF logs (firewall events) instantly available in Log Explorer with full context — no waiting, no external tools — has completely changed how we manage our firewall rules. I can spot an issue, adjust the rule with a single click, and immediately see the effect. It’s made tuning for false positives faster, cheaper, and far more effective.” </i></p></blockquote><blockquote><p><i>“While we use Logpush to ingest Cloudflare logs into our SIEM, when our development team needs to analyze logs, it can be more effective to utilize </i><b><i>Log Explorer</i></b><i>. 
SIEMs make it difficult for development teams to write their own queries and manipulate the console to see the logs they need. Cloudflare's Log Explorer, on the other hand, makes it much </i><b><i>easier</i></b><i> for dev teams to look at logs and directly search for the information they need.”</i></p></blockquote><p>With Log Explorer, customers have access to Cloudflare logs with all the context available within the Cloudflare platform. Compared to external tools, customers benefit from: </p><ul><li><p><b>Reduced cost and complexity:</b> Drastically reduce the expense and operational overhead associated with forwarding, storing, and analyzing terabytes of log data in external tools.</p></li><li><p><b>Faster detection and triage:</b> Access Cloudflare-native logs directly, eliminating cumbersome data pipelines and the ingest lags that delay critical security insights.</p></li><li><p><b>Accelerated investigations with full context:</b> Investigate incidents with Cloudflare's unparalleled contextual data, accelerating your analysis and understanding of "What exactly happened?" and "How did it happen?"</p></li><li><p><b>Minimal recovery time:</b> Seamlessly transition from investigation to action with direct mitigation capabilities via the Cloudflare platform.</p></li></ul><p>Log Explorer is available as an add-on product for customers on our self serve or Enterprise plans. Read on to learn how each of the capabilities of Log Explorer can help you detect and diagnose issues more quickly.</p>
    <div>
      <h3>Monitor security and performance issues with custom dashboards</h3>
      <a href="#monitor-security-and-performance-issues-with-custom-dashboards">
        
      </a>
    </div>
    <p>Custom dashboards allow you to define the specific metrics you need in order to monitor unusual or unexpected activity in your environment.</p><p>Getting started is easy, with the ability to create a chart using natural language. A natural language interface is integrated into the chart create/edit experience, enabling you to describe in your own words the chart you want to create. Similar to the <a href="https://blog.cloudflare.com/security-analytics-ai-assistant/"><u>AI Assistant we announced during Security Week 2024</u></a>, the prompt translates your language to the appropriate chart configuration, which can then be added to a new or existing custom dashboard.</p><p>As an example, you can create a dashboard for monitoring for Remote Code Execution (RCE) attacks in your environment. In an RCE attack, an attacker compromises a machine in your environment and executes arbitrary commands. The good news is that RCE is a detection available in Cloudflare WAF. In the dashboard example below, you can not only watch for RCE attacks, but also correlate them with other security events such as malicious content uploads, source IP addresses, and JA3/JA4 fingerprints. Such a scenario could mean one or more machines in your environment are compromised and being used to spread malware — surely, a very high-risk incident!</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1UWOHhIaIFiBTtnohdvbAx/40eeac0b52bc278d0687f7d48cd875fd/BLOG-2838_2.png" />
          </figure><p>A reliability engineer might want to create a dashboard for monitoring errors. They could use the natural language prompt to enter a query like “Compare HTTP status code ranges over time.” The AI model then decides the most appropriate visualization and constructs their chart configuration.</p><p>While you can create custom dashboards from scratch, you could also use an expert-curated dashboard template to jumpstart your security and performance monitoring. </p><p>Available templates include: </p><ul><li><p><b>Bot monitoring:</b> Identify automated traffic accessing your website</p></li><li><p><b>API Security:</b> Monitor the data transfer and exceptions of API endpoints within your application</p></li><li><p><b>API Performance:</b> See timing data for API endpoints in your application, along with error rates</p></li><li><p><b>Account Takeover:</b> View login attempts and usage of leaked credentials, and identify account takeover attacks</p></li><li><p><b>Performance Monitoring:</b> Identify slow hosts and paths on your origin server, and view <a href="https://blog.cloudflare.com/ttfb-is-not-what-it-used-to-be/">time to first byte (TTFB)</a> metrics over time</p></li><li><p><b>Security Monitoring:</b> Monitor attack distribution across top hosts and paths, and correlate DDoS traffic with origin response time to understand the impact of DDoS attacks</p></li></ul>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3PO726Rhjol9khGOdMMnQJ/55462052782974b0fc5b0c885e42e41b/BLOG-2838_3.png" />
          </figure>
    <div>
      <h3>Investigate and troubleshoot issues with Log Search </h3>
      <a href="#investigate-and-troubleshoot-issues-with-log-search">
        
      </a>
    </div>
    <p>Continuing with the example from the prior section, after successfully diagnosing that some machines were compromised through the RCE issue, analysts can pivot over to Log Search in order to investigate whether the attacker was able to access and compromise other internal systems. To do that, the analyst could search logs from Zero Trust services, using context, such as compromised IP addresses from the custom dashboard, shown in the screenshot below: </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4iPrTc1ZtLU4ZxQWojvmje/d09bb0bf25bd17cea1d2f955371d991e/BLOG-2838_4.png" />
          </figure><p>Log Search is a streamlined experience that includes data type-aware search filters and the ability to switch to a custom SQL interface for more powerful queries. Log searches are also available via a <a href="https://developers.cloudflare.com/logs/log-explorer/"><u>public API</u></a>. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4AytV9wASU5kUuThnhl0CQ/de8c9f4b829e1ccfebdb33bd9522ae5b/BLOG-2838_5.png" />
          </figure>
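<p>Because log searches are also exposed over the public API, queries can be scripted. The Python sketch below only assembles the request; the endpoint path, query parameter name, and token handling are illustrative assumptions, not the documented API surface, so check the developer documentation for the real route.</p>

```python
from urllib.parse import urlencode

# Illustrative base URL and endpoint path -- assumptions for this sketch,
# not taken from this post; consult the Log Explorer developer docs.
API_BASE = "https://api.cloudflare.com/client/v4"

def build_log_search_request(account_id, sql):
    """Assemble the URL and auth headers for a Log Explorer SQL query."""
    url = (
        f"{API_BASE}/accounts/{account_id}/logs/explorer/query/sql?"
        + urlencode({"query": sql})
    )
    headers = {"Authorization": "Bearer <API_TOKEN>"}  # placeholder token
    return url, headers

url, headers = build_log_search_request(
    "0123456789abcdef",  # hypothetical account ID
    "SELECT ClientIP, EdgeResponseStatus FROM http_requests LIMIT 10",
)
```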
    <div>
      <h3>Save time and collaborate with saved queries</h3>
      <a href="#save-time-and-collaborate-with-saved-queries">
        
      </a>
    </div>
    <p>Queries built in Log Search can now be saved for repeated use and are accessible to other Log Explorer users in your account. This makes it easier than ever to investigate issues together. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4ouInu3nk7iZnAcJAs39F8/cc7ca6a61d19d3d9c1371ad2ca87e913/BLOG-2838_6.png" />
          </figure>
    <div>
      <h3>Monitor proactively with Custom Alerting (coming soon)</h3>
      <a href="#monitor-proactively-with-custom-alerting-coming-soon">
        
      </a>
    </div>
    <p>With custom alerting, you can configure alert policies to proactively monitor the indicators that are important to your business. </p><p>Starting from Log Search, define and test your query. From there, you can save the query and configure a schedule interval and alerting policy. The query will then run automatically on the schedule you define.</p>
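<p>Conceptually, a custom alert is a saved query plus a threshold check run on a schedule. The sketch below illustrates one such evaluation with a hypothetical policy structure and notification hook; it is not the product's actual configuration API.</p>

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AlertPolicy:
    name: str
    threshold: float               # fire when the metric meets or exceeds this
    notify: Callable[[str], None]  # e.g. a webhook or PagerDuty hook

def evaluate(policy, metric_value):
    """Run one scheduled evaluation; returns True if the alert fired."""
    if metric_value >= policy.threshold:
        policy.notify(f"{policy.name}: value {metric_value} >= {policy.threshold}")
        return True
    return False

fired = []
policy = AlertPolicy("5xx error rate", threshold=5.0, notify=fired.append)
evaluate(policy, 2.1)   # below threshold: no notification
evaluate(policy, 7.8)   # meets threshold: notifies
```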
    <div>
      <h4>Tracking error rate for a custom hostname</h4>
      <a href="#tracking-error-rate-for-a-custom-hostname">
        
      </a>
    </div>
    <p>If you want to monitor the error rate for a particular host, you can use this Log Search query to calculate the error rate per time interval:</p>
            <pre><code>SELECT SUBSTRING(EdgeStartTimestamp, 1, 14) || '00:00' AS time_interval,
       COUNT(*) AS total_requests,
       COUNT(CASE WHEN EdgeResponseStatus &gt;= 500 THEN 1 ELSE NULL END) AS error_requests,
       COUNT(CASE WHEN EdgeResponseStatus &gt;= 500 THEN 1 ELSE NULL END) * 100.0 / COUNT(*) AS error_rate_percentage
  FROM http_requests
 WHERE EdgeStartTimestamp &gt;= '2025-06-09T20:56:58Z'
   AND EdgeStartTimestamp &lt;= '2025-06-10T21:26:58Z'
   AND ClientRequestHost = 'customhostname.com'
 GROUP BY time_interval
 ORDER BY time_interval ASC;
</code></pre>
            <p>Running the above query returns the following results. You can see the overall error rate percentage in the far-right column.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5v8SNmHt4OJrLSkiM2EKtJ/182c7f5709eef1fbb9e93c5423fc1bae/BLOG-2838_7.png" />
          </figure>
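<p>As a cross-check on the query's arithmetic, the same hourly bucketing (the first 14 characters of the RFC 3339 timestamp, with <code>'00:00'</code> appended) and the error-rate calculation can be mirrored in a few lines of Python. The sample rows below are made up for illustration.</p>

```python
from collections import defaultdict

def error_rate_by_hour(rows):
    """rows: (EdgeStartTimestamp, EdgeResponseStatus) pairs.
    Buckets by hour exactly like SUBSTRING(ts, 1, 14) || '00:00'."""
    totals, errors = defaultdict(int), defaultdict(int)
    for ts, status in rows:
        bucket = ts[:14] + "00:00"   # '2025-06-09T20:' + '00:00'
        totals[bucket] += 1
        if status >= 500:
            errors[bucket] += 1
    return {
        b: (totals[b], errors[b], errors[b] * 100.0 / totals[b])
        for b in sorted(totals)
    }

# Made-up sample rows: one 502 in the 20:00 hour, none in the 21:00 hour.
rows = [
    ("2025-06-09T20:01:11Z", 200),
    ("2025-06-09T20:15:42Z", 502),
    ("2025-06-09T21:03:05Z", 200),
    ("2025-06-09T21:10:09Z", 200),
]
```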
    <div>
      <h4>Proactively detect malware</h4>
      <a href="#proactively-detect-malware">
        
      </a>
    </div>
    <p>We can identify malware in the environment by monitoring logs from <a href="https://developers.cloudflare.com/cloudflare-one/policies/gateway/">Cloudflare Secure Web Gateway</a>. As an example, <a href="https://www.broadcom.com/support/security-center/protection-bulletin/new-katz-stealer-malware-as-a-service-compromises-web-browsers"><u>Katz Stealer</u></a> is malware-as-a-service designed to steal credentials. We can monitor DNS queries and HTTP requests from users within the company to identify any machines that may be infected with Katz Stealer. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7jgBTCWYpnWoNrFh8xe6ki/306e644ec3753976315c16c9d1560eec/BLOG-2838_8.png" />
          </figure>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7jFwBxsk8rfD3VLAYkG2iA/ebd2ebcd95d12b40978f22bf1bc7be39/BLOG-2838_9.png" />
          </figure><p>And with custom alerts, you can configure an alert policy so that you are notified via webhook or PagerDuty.</p>
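<p>A simple way to reason about this kind of detection is to match each Gateway DNS record's queried hostname (and its parent domains) against a threat-intel indicator list. In the sketch below, the indicator domains are placeholders and the <code>QueryName</code> field name is assumed for illustration.</p>

```python
# Placeholder indicator domains -- a real list would come from threat intel.
INDICATORS = {"malicious.example", "c2.example"}

def matches_indicator(query_name):
    """True if the hostname or any parent domain is on the indicator list."""
    labels = query_name.lower().rstrip(".").split(".")
    return any(".".join(labels[i:]) in INDICATORS for i in range(len(labels)))

# Hypothetical Gateway DNS records; the field name is illustrative.
records = [
    {"QueryName": "updates.c2.example"},
    {"QueryName": "www.cloudflare.com"},
]
flagged = [r for r in records if matches_indicator(r["QueryName"])]
```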
    <div>
      <h3>Maintain audit &amp; compliance with flexible retention (coming soon)</h3>
      <a href="#maintain-audit-compliance-with-flexible-retention-coming-soon">
        
      </a>
    </div>
    <p>With flexible retention, you can set the precise length of time you want to store your logs, allowing you to meet specific compliance and audit requirements with ease. Other providers require archiving or hot and cold storage, making it difficult to query older logs. Log Explorer is built on top of our R2 storage tier, so historical logs can be queried as easily as current logs. </p>
    <div>
      <h3>How we built Log Explorer to run at Cloudflare scale</h3>
      <a href="#how-we-built-log-explorer-to-run-at-cloudflare-scale">
        
      </a>
    </div>
    <p>With Log Explorer, we have built a scalable log storage platform on top of <a href="https://www.cloudflare.com/developer-platform/products/r2/"><u>Cloudflare R2</u></a> that lets you efficiently search your Cloudflare logs using familiar SQL queries. In this section, we’ll look into how we did this and how we solved some technical challenges along the way.</p><p>Log Explorer consists of three components: ingestors, compactors, and queriers. Ingestors are responsible for writing logs from Cloudflare’s data pipeline to R2. Compactors optimize storage files, so they can be queried more efficiently. Queriers execute SQL queries from users by fetching, transforming, and aggregating matching logs from R2.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1qEH0futV2are5GnT6vjta/e50c0ec4bbb1cacada117d31b71502e2/BLOG-2838_10.png" />
          </figure><p>During ingestion, Log Explorer writes each batch of log records to a Parquet file in R2. <a href="https://parquet.apache.org/"><u>Apache Parquet</u></a> is an open-source columnar storage file format, and it was an obvious choice for us: it’s optimized for efficient data storage and retrieval, embedding metadata such as the minimum and maximum values of each column in the file, which enables the queriers to quickly locate the data needed to serve a query.</p><p>Log Explorer stores logs on a per-customer level, just like Cloudflare D1, so that your data isn't mixed with that of other customers. In Q3 2025, per-customer logs will allow you the flexibility to create your own retention policies and decide in which regions you want to store your data.</p><p>But how does Log Explorer find those Parquet files when you query your logs? Log Explorer leverages the <a href="https://databricks.com/wp-content/uploads/2020/08/p975-armbrust.pdf"><u>Delta Lake</u></a> open table format to provide a database table abstraction atop R2 object storage. A table in Delta Lake pairs data files in Parquet format with a transaction log. The transaction log registers every addition, removal, or modification of a data file for the table – it’s stored right next to the data files in R2.</p><p>Given a SQL query for a particular log dataset such as <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/http_requests/"><u>HTTP Requests</u></a> or <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_dns/"><u>Gateway DNS</u></a>, Log Explorer first has to load the transaction log of the corresponding Delta table from R2. Transaction logs are checkpointed periodically to avoid having to read the entire table history every time a user queries their logs.</p><p>Besides listing Parquet files for a table, the transaction log also includes per-column min/max statistics for each Parquet file. This has the benefit that Log Explorer only needs to fetch files from R2 that can possibly satisfy a user query. Finally, queriers use the min/max statistics embedded in each Parquet file to decide which row groups to fetch from the file.</p><p>Log Explorer processes SQL queries using <a href="https://arrow.apache.org/datafusion/"><u>Apache DataFusion</u></a>, a fast, extensible query engine written in Rust, and <a href="https://github.com/delta-io/delta-rs"><u>delta-rs</u></a>, a community-driven Rust implementation of the Delta Lake protocol. While standing on the shoulders of giants, our team had to solve some unique problems to provide log search at Cloudflare scale.</p><p>Log Explorer ingests logs from across Cloudflare’s vast global network, <a href="https://www.cloudflare.com/network"><u>spanning more than 330 cities in over 125 countries</u></a>. If Log Explorer were to write logs from our servers straight to R2, its storage would quickly fragment into a myriad of small files, rendering log queries prohibitively expensive.</p><p>Log Explorer’s strategy to avoid this fragmentation is threefold. First, it leverages Cloudflare’s data pipeline, which collects and batches logs from the edge, ultimately buffering each stream of logs in an internal system named <a href="https://blog.cloudflare.com/cloudflare-incident-on-november-14-2024-resulting-in-lost-logs/"><u>Buftee</u></a>. Second, log batches ingested from Buftee aren’t immediately committed to the transaction log; rather, Log Explorer stages commits for multiple batches in an intermediate area and “squashes” these commits before they’re written to the transaction log. Third, once log batches have been committed, a process called compaction merges them into larger files in the background.</p><p>While the open-source implementation of Delta Lake provides compaction out of the box, we soon encountered an issue when using it for our workloads. Stock compaction merges data files to a desired target size S by sorting the files in reverse order of their size and greedily filling bins of size S with them. By merging logs irrespective of their timestamps, this process distributed ingested batches randomly across merged files, destroying data locality. Despite compaction, a user querying for a specific time frame would still end up fetching hundreds or thousands of files from R2.</p><p>For this reason, we wrote a custom compaction algorithm that merges ingested batches in order of their minimum log timestamp, leveraging the min/max statistics mentioned previously. This algorithm reduced the number of overlaps between merged files by two orders of magnitude. As a result, we saw a significant improvement in query performance, with some large queries that had previously taken over a minute completing in just a few seconds.</p>
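<p>The two ideas in this section, statistics-based file pruning and timestamp-ordered compaction, can be sketched in a few lines of Python. Here each data file is reduced to its min/max timestamp statistics and a size; this is an illustration of the approach described, not Cloudflare's implementation.</p>

```python
from dataclasses import dataclass

@dataclass
class DataFile:
    min_ts: int   # min/max log-timestamp statistics from the file's metadata
    max_ts: int
    size: int     # bytes

def prune(files, query_start, query_end):
    """Keep only files whose [min_ts, max_ts] range can overlap the query."""
    return [f for f in files if f.max_ts >= query_start and f.min_ts <= query_end]

def compact(files, target_size):
    """Greedily bin-pack files in order of minimum timestamp (not size),
    so each merged file covers a contiguous time range."""
    bins, current, current_size = [], [], 0
    for f in sorted(files, key=lambda f: f.min_ts):
        if current and current_size + f.size > target_size:
            bins.append(current)
            current, current_size = [], 0
        current.append(f)
        current_size += f.size
    if current:
        bins.append(current)
    return bins

# Four ingested batches, arriving out of time order, 60 bytes each:
files = [DataFile(0, 9, 60), DataFile(30, 39, 60), DataFile(10, 19, 60), DataFile(20, 29, 60)]
groups = compact(files, target_size=120)  # merged files cover [0-19] and [20-39]
pruned = prune(files, 15, 25)             # only files overlapping the query range
```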
    <div>
      <h3>Follow along for more updates</h3>
      <a href="#follow-along-for-more-updates">
        
      </a>
    </div>
    <p>We're just getting started! We're actively working on even more powerful features to further enhance your experience with Log Explorer. <a href="https://blog.cloudflare.com/"><u>Subscribe to the blog</u></a> and keep an eye on our <a href="https://developers.cloudflare.com/changelog/"><u>Change Log</u></a> for more updates to our observability and forensics offering.</p>
    <div>
      <h3>Get access to Log Explorer</h3>
      <a href="#get-access-to-log-explorer">
        
      </a>
    </div>
    <p>To get started with Log Explorer, <a href="https://www.cloudflare.com/application-services/products/log-explorer/">sign up here</a> or contact your account manager. You can also read more in our <a href="https://developers.cloudflare.com/logs/log-explorer/"><u>Developer Documentation</u></a>.</p> ]]></content:encoded>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Security]]></category>
            <guid isPermaLink="false">kg7dxMzYcRnJdVFrxQmCw</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Claudio Jolowicz</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare enables native monitoring and forensics with Log Explorer and custom dashboards]]></title>
            <link>https://blog.cloudflare.com/monitoring-and-forensics/</link>
            <pubDate>Tue, 18 Mar 2025 13:00:00 GMT</pubDate>
            <description><![CDATA[ Today we are excited to announce support for Zero Trust datasets, and custom dashboards where customers can monitor critical metrics for suspicious or unusual activity.  ]]></description>
            <content:encoded><![CDATA[ <p>In 2024, we <a href="https://blog.cloudflare.com/log-explorer/"><u>announced Log Explorer</u></a>, giving customers the ability to store and query their HTTP and security event logs natively within the Cloudflare network. Today, we are excited to announce that Log Explorer now supports logs from our Zero Trust product suite. In addition, customers can create custom dashboards to monitor suspicious or unusual activity.</p><p>Every day, Cloudflare detects and protects customers against billions of threats, including DDoS attacks, bots, web application exploits, and more. SOC analysts, who are charged with keeping their companies safe from the growing spectre of Internet threats, may want to investigate these threats to gain additional insights on attacker behavior and protect against future attacks. Log Explorer, by collecting logs from various Cloudflare products, provides a single starting point for investigations. As a result, analysts can avoid forwarding logs to other tools, maximizing productivity and minimizing costs. Further, analysts can monitor signals specific to their organizations using custom dashboards.</p>
    <div>
      <h2>Zero Trust dataset support in Log Explorer</h2>
      <a href="#zero-trust-dataset-support-in-log-explorer">
        
      </a>
    </div>
    <p>Log Explorer stores your Cloudflare logs for a 30-day retention period so that you can analyze them natively and in a single interface, within the Cloudflare Dashboard. Cloudflare log data is diverse, reflecting the breadth of capabilities available. For example, HTTP requests contain information about the client such as their IP address, request method, <a href="https://www.cloudflare.com/learning/network-layer/what-is-an-autonomous-system/"><u>autonomous system (ASN)</u></a>, request paths, and TLS versions used. Additionally, Cloudflare’s Application Security <a href="https://developers.cloudflare.com/waf/detections/"><u>WAF Detections</u></a> enrich these HTTP request logs with additional context, such as the <a href="https://developers.cloudflare.com/waf/detections/attack-score/"><u>WAF attack score</u></a>, to identify threats.</p><p>Today we are announcing that seven additional Cloudflare product datasets are now available in Log Explorer. These seven datasets are the logs generated from our Zero Trust product suite, and include logs from <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/access_requests/"><u>Access</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_dns/"><u>Gateway DNS</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_http/"><u>Gateway HTTP</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_network/"><u>Gateway Network</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/casb_findings/"><u>CASB</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/zero_trust_network_sessions/"><u>Zero Trust Network Session</u></a>, and <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/device_posture_results/"><u>Device Posture Results</u></a>. Read on for examples of how to use these logs to identify common threats.</p>
    <div>
      <h3>Investigating unauthorized access</h3>
      <a href="#investigating-unauthorized-access">
        
      </a>
    </div>
    <p>By reviewing Access logs and HTTP request logs, we can reveal attempts to access resources or systems without proper permissions, including brute force password attacks, indicating potential security breaches or malicious activity.</p><p>Below, we filter Access Logs on the <code>Allowed</code> field, to see activity related to unauthorized access.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2piOIdnNz9OWskJqrZJfcf/f88673fc184c23de493920661020e7b3/access_requests.png" />
          </figure><p>By then reviewing the HTTP logs for the requests identified in the previous query, we can assess if bot networks are the source of unauthorized activity.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4b38nYNdpLbmHFt0BHkapa/88e1acf82d8bbc257a7cbbe102cbd723/http_requests.png" />
          </figure><p>With this information, you can craft targeted <a href="https://developers.cloudflare.com/waf/custom-rules/"><u>Custom Rules</u></a> to block the offending traffic. </p>
    <div>
      <h3>Detecting malware</h3>
      <a href="#detecting-malware">
        
      </a>
    </div>
    <p>Cloudflare's <a href="https://developers.cloudflare.com/cloudflare-one/policies/gateway/"><u>Web Gateway</u></a> can track which websites users are accessing, allowing administrators to identify and block access to malicious or inappropriate sites. These logs can be used to detect if a user’s machine or account is compromised by malware attacks. When reviewing logs, this may become apparent when we look for records that show a rapid succession of attempts to browse known malicious sites, such as hostnames that have long strings of seemingly random characters that hide their true destination. In this example, we can query logs looking for requests to a spoofed YouTube URL.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Nkm4udjUw9tmzPk0Fk1eK/524dc1a6d4070a1f6cc9478e09b67ffd/gateway_requests.png" />
          </figure>
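<p>One common heuristic for spotting "long strings of seemingly random characters" in hostnames is the Shannon entropy of the first label. The sketch below is a generic illustration of that idea, not Cloudflare's detection logic, and the length and entropy thresholds are arbitrary.</p>

```python
import math
from collections import Counter

def label_entropy(label):
    """Shannon entropy (bits per character) of a hostname label."""
    counts = Counter(label)
    n = len(label)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_random(hostname, min_len=16, min_entropy=3.5):
    """Flag hostnames whose first label is both long and high-entropy."""
    label = hostname.split(".")[0]
    return len(label) >= min_len and label_entropy(label) >= min_entropy

dga_like = looks_random("x9k2qv7mzp4wh8rt3c.example.net")  # long, high-entropy label
benign = looks_random("www.youtube.com")                   # short, ordinary label
```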
    <div>
      <h2>Monitoring what matters using custom dashboards</h2>
      <a href="#monitoring-what-matters-using-custom-dashboards">
        
      </a>
    </div>
    <p>Security monitoring is not one size fits all. For instance, companies in the retail or financial industries worry about fraud, while every company is concerned about the exfiltration of information like trade secrets. And any form of personally identifiable information (PII) is a target for data breaches or ransomware attacks.</p><p>While log exploration helps you react to threats, our new custom dashboards allow you to define the specific metrics you need in order to monitor threats you are concerned about. </p><p>Getting started is easy, with the ability to create a chart using natural language. A natural language interface is integrated into the chart create/edit experience, enabling you to describe in your own words the chart you want to create. Similar to the <a href="https://blog.cloudflare.com/security-analytics-ai-assistant/"><u>AI Assistant</u></a> we announced during Security Week 2024, the prompt translates your language to the appropriate chart configuration, which can then be added to a new or existing custom dashboard.</p><ul><li><p><b>Use a prompt</b>: Enter a query like “Compare status code ranges over time”. The AI model decides the most appropriate visualization and constructs your chart configuration.</p></li><li><p><b>Customize your chart</b>: Select the chart elements manually, including the chart type, title, dataset to query, metrics, and filters. This option gives you full control over your chart’s structure. </p></li></ul><div>
  
</div>
<br /><p><sup><i>Video shows entering a natural language description of desired metric “compare status code ranges over time”, preview chart shown is a time series grouped by error code ranges, selects “add chart” to save to dashboard.</i></sup></p><p>For more help getting started, we have some pre-built templates that you can use for monitoring specific uses. Available templates currently include: </p><ul><li><p><b>Bot monitoring</b>: Identify automated traffic accessing your website</p></li><li><p><b>API Security:</b> Monitor the data transfer and exceptions of API endpoints within your application</p></li><li><p><b>API Performance</b>: See timing data for API endpoints in your application, along with error rates</p></li><li><p><b>Account Takeover:</b> View login attempts, usage of leaked credentials, and identify account takeover attacks</p></li><li><p><b>Performance Monitoring</b>: Identify slow hosts and paths on your origin server, and view <a href="https://blog.cloudflare.com/ttfb-is-not-what-it-used-to-be/"><u>time to first byte (TTFB)</u></a> metrics over time</p></li></ul><p>Templates provide a good starting point, and once you create your dashboard, you can add or remove individual charts using the same natural language chart creator. </p><div>
  
</div>
<br /><p><sup><i>Video shows editing chart from an existing dashboard and moving individual charts via drag and drop.</i></sup></p>
    <div>
      <h3>Example use cases</h3>
      <a href="#example-use-cases">
        
      </a>
    </div>
    <p>Custom dashboards can be used to monitor for suspicious activity, or to keep an eye on performance and errors for your domains. Let’s explore some examples of suspicious activity that we can monitor using custom dashboards.</p><p>Take, for example, our use case from above: investigating unauthorized access. With custom dashboards, you can create a dashboard using the <b>Account takeover</b> template to monitor for suspicious login activity related to your domain.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/72KBaEdr0bEn4SNwKOfPfJ/e28997b94630cf856d3924e9ba443063/image7.png" />
          </figure><p>As another example, spikes in requests or errors are common indicators that something is wrong, and they can sometimes be signals of suspicious activity. With the Performance Monitoring template, you can view origin response time and time to first byte metrics as well as monitor for common errors. For example, in this chart, the spikes in 404 errors could be an indication of an unauthorized scan of your endpoints.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3krBxVm8dB5pr5XEoHnVtK/44f682436c3d5a63baa1105987347433/image1.jpg" />
          </figure>
    <div>
      <h3>Seamlessly integrated into the Cloudflare platform</h3>
      <a href="#seamlessly-integrated-into-the-cloudflare-platform">
        
      </a>
    </div>
    <p>When using custom dashboards, if you observe a traffic pattern or spike in errors that you would like to further investigate, you can click the button to “View in Security Analytics” in order to drill down further into the data and craft custom WAF rules to mitigate the threat.  </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5XfvQ24bvDmnNKeInyA8eU/e96798a72e55fa454439f8b85197e02b/image2.png" />
          </figure><p>These tools, seamlessly integrated into the Cloudflare platform, will enable users to discover, investigate, and mitigate threats all in one place, reducing time to resolution and overall cost of ownership by eliminating the need to forward logs to third party security analysis tools. And because it is a native part of Cloudflare, you can immediately use the data from your investigation to craft targeted rules that will block these threats. </p>
    <div>
      <h2>What’s next</h2>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>Stay tuned as we continue to develop more capabilities in the areas of <a href="https://www.cloudflare.com/learning/performance/what-is-observability/">observability and forensics</a>, with additional features including: </p><ul><li><p><b>Custom alerts</b>: Create alerts based on specific metrics or anomalies</p></li><li><p><b>Scheduled query detections</b>: Craft log queries and run them on a schedule to detect malicious activity</p></li><li><p><b>More integration</b>: Further streamlining the journey from detection to investigation to mitigation across the full Cloudflare platform</p></li></ul>
    <div>
      <h2>How to get it</h2>
      <a href="#how-to-get-it">
        
      </a>
    </div>
    <p>Current Log Explorer beta users get immediate access to the new custom dashboards feature. Pricing will be made available to everyone during Q2 2025. Between now and then, these features continue to be available at no cost.</p><p>Let us know if you are interested in joining our Beta program by completing <a href="https://www.cloudflare.com/lp/log-explorer/"><u>this form</u></a>, and a member of our team will contact you.</p>
    <div>
      <h2>Watch on Cloudflare TV</h2>
      <a href="#watch-on-cloudflare-tv">
        
      </a>
    </div>
    <div>
  
</div><p></p> ]]></content:encoded>
            <category><![CDATA[Security Week]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <guid isPermaLink="false">76XBFojN0mhfyCoz6VRe1G</guid>
            <dc:creator>Jen Sells</dc:creator>
        </item>
        <item>
            <title><![CDATA[Log Explorer: monitor security events without third-party storage]]></title>
            <link>https://blog.cloudflare.com/log-explorer/</link>
            <pubDate>Fri, 08 Mar 2024 14:05:00 GMT</pubDate>
            <description><![CDATA[ With the combined power of Security Analytics + Log Explorer, security teams can analyze, investigate, and monitor for security attacks natively within Cloudflare, reducing time to resolution and overall cost of ownership for customers by eliminating the need to forward logs to third-party SIEMs ]]></description>
            <content:encoded><![CDATA[ <p></p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1GhVBYNZAsGZtOfgo8C3VY/42fc180d060574162071cbdd13ad6a88/image6-6.png" />
            
            </figure><p>Today, we are excited to announce beta availability of <a href="https://developers.cloudflare.com/logs/log-explorer/">Log Explorer</a>, which allows you to investigate your HTTP and Security Event logs directly from the Cloudflare Dashboard. Log Explorer is an extension of <a href="/security-analytics">Security Analytics</a>, giving you the ability to review related raw logs. You can analyze, investigate, and monitor for security attacks natively within the Cloudflare Dashboard, reducing time to resolution and overall cost of ownership by eliminating the need to forward logs to third party security analysis tools.</p>
    <div>
      <h3>Background</h3>
      <a href="#background">
        
      </a>
    </div>
    <p>Security Analytics enables you to analyze all of your HTTP traffic in one place, giving you the security lens you need to identify and act upon what matters most: potentially malicious traffic that has not been mitigated. Security Analytics includes built-in views such as top statistics and in-context quick filters on an intuitive page layout that enables rapid exploration and validation.</p><p>In order to power our rich analytics dashboards with fast query performance, we implemented <a href="https://developers.cloudflare.com/analytics/graphql-api/sampling/">data sampling</a> using <a href="/explaining-cloudflares-abr-analytics">Adaptive Bit Rate</a> (ABR) analytics. This is a great fit for providing high level aggregate views of the data. However, we received feedback from many Security Analytics power users that sometimes they need access to a more granular view of the data — they need logs.</p><p>Logs provide critical visibility into the operations of today's computer systems. Engineers and SOC analysts rely on logs every day to troubleshoot issues, identify and investigate security incidents, and tune the performance, reliability, and <a href="https://www.cloudflare.com/application-services/solutions/">security</a> of their applications and infrastructure. Traditional metrics or monitoring solutions provide aggregated or statistical data that can be used to identify trends. Metrics are wonderful at identifying THAT an issue happened, but lack the detailed events to help engineers uncover WHY it happened. Engineers and SOC Analysts rely on raw log data to answer questions such as:</p><ul><li><p>What is causing this increase in 403 errors?</p></li><li><p>What data was accessed by this IP address?</p></li><li><p>What was the user experience of this particular user’s session?</p></li></ul><p>Traditionally, these engineers and analysts would stand up a collection of various monitoring tools in order to capture logs and get this visibility. 
With more organizations using multiple clouds, or a hybrid environment with both cloud and on-premise tools and architecture, it is crucial to have a unified platform to regain visibility into this increasingly complex environment.  As more and more companies are moving towards a cloud native architecture, we see Cloudflare’s <a href="https://www.cloudflare.com/en-gb/learning/cloud/what-is-a-connectivity-cloud/">connectivity cloud</a> as an integral part of their performance and security strategy.</p><p>Log Explorer provides a lower cost option for storing and exploring log data within Cloudflare. Until today, we have offered the ability to export logs to expensive third party tools, and now with Log Explorer, you can quickly and easily explore your log data without leaving the Cloudflare Dashboard.</p>
    <div>
      <h3>Log Explorer Features</h3>
      <a href="#log-explorer-features">
        
      </a>
    </div>
    <p>Whether you're a SOC Engineer investigating potential incidents, or a Compliance Officer with specific log retention requirements, Log Explorer has you covered. It stores your Cloudflare logs for an uncapped and customizable period of time, making them accessible natively within the Cloudflare Dashboard. The supported features include:</p><ul><li><p>Searching through your HTTP Request or Security Event logs</p></li><li><p>Filtering based on any field and a number of standard operators</p></li><li><p>Switching between basic filter mode or SQL query interface</p></li><li><p>Selecting fields to display</p></li><li><p>Viewing log events in tabular format</p></li><li><p>Finding the HTTP request records associated with a Ray ID</p></li></ul>
    <div>
      <h3>Narrow in on unmitigated traffic</h3>
      <a href="#narrow-in-on-unmitigated-traffic">
        
      </a>
    </div>
    <p>As a SOC analyst, your job is to monitor and respond to threats and incidents within your organization’s network. Using Security Analytics, and now with Log Explorer, you can identify anomalies and conduct a forensic investigation all in one place.</p><p>Let’s walk through an example to see this in action:</p><p>On the Security Analytics dashboard, you can see in the Insights panel that there is some traffic that has been tagged as a likely attack, but not mitigated.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Oq3oqY8JXMigK8OJKPFZ4/5d3a8751a56f06f58e96538f1d46a480/Screenshot-2024-03-07-at-20.20.41.png" />
            
            </figure><p>Clicking the filter button narrows in on these requests for further investigation.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7sWkjUYz1J0So4nphy4FSs/769d5ebb0b706a073a616b706783030c/image11.jpg" />
            
            </figure><p>In the sampled logs view, you can see that most of these requests are coming from a common client IP address.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5gtrP14GKbnB0YeV0ySgL1/6b476ec9d19e255912315eac9730604d/Sampled-logs.png" />
            
            </figure><p>You can also see that Cloudflare has flagged all of these requests as bot traffic. With this information, you can craft a WAF rule to either block all traffic from this IP address, or block all traffic with a bot score lower than 10.</p>
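<p>The blocking logic can be expressed in Cloudflare’s Rules language. As an illustrative sketch (the IP address is a placeholder, and the exact field names should be checked against the custom rules documentation), the rule’s filter expression might look like:</p><pre><code>(ip.src eq 203.0.113.42) or (cf.bot_management.score lt 10)</code></pre>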
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2YgFTRD7u3KYh0bbInLylK/f076a70bc09f41d8ace0569bca172b39/Screenshot-2024-03-07-at-20.22.04.png" />
            
            </figure><p>Let’s say that the Compliance Team would like to gather documentation on the scope and impact of this attack. We can dig further into the logs during this time period to see everything that this attacker attempted to access.</p><p>First, we can use Log Explorer to query HTTP requests from the suspect IP address during the time range of the spike seen in Security Analytics.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/qsi7UnxjtygCQHnMCIx02/cda0aacf6d783b05c15196f27907c611/Log-Explorer.png" />
            
            </figure>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/YyewPADkPzjofXHXSihyi/e494ca5d4b9a3ad7071d5d3e27f57887/Query-results.png" />
            
            </figure><p>We can also review whether the attacker was able to <a href="https://www.cloudflare.com/learning/security/what-is-data-exfiltration/">exfiltrate</a> data by adding the OriginResponseBytes field and updating the query to show requests with OriginResponseBytes &gt; 0. The results show that no data was exfiltrated.</p>
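<p>As an illustration, a query along these lines could be used for that check (the table and field names are assumptions about Log Explorer’s HTTP request dataset, and the IP address and time range are placeholders):</p><pre><code>SELECT clientRequestPath, clientRequestMethod, originResponseBytes
FROM http_requests
WHERE clientIp = '203.0.113.42'
  AND date &gt;= '2024-03-06' AND date &lt;= '2024-03-08'
  AND originResponseBytes &gt; 0
LIMIT 100</code></pre>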
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3TgN2aPwWTDo5niA95TGQx/71143be92550dfea1ce284507fa688ac/No-logs-found.png" />
            
            </figure>
    <div>
      <h3>Find and investigate false positives</h3>
      <a href="#find-and-investigate-false-positives">
        
      </a>
    </div>
    <p>With access to the full logs via Log Explorer, you can now perform a search to find specific requests.</p><p>A 403 error occurs when a user’s request to a particular site is blocked. Cloudflare’s security products use signals such as <a href="/introducing-ip-lists/">IP reputation</a> and machine-learning-based <a href="/stop-attacks-before-they-are-known-making-the-cloudflare-waf-smarter/">WAF attack scores</a> to assess whether a given HTTP request is malicious. This is extremely effective, but sometimes requests are mistakenly flagged as malicious and blocked.</p><p>In these situations, we can now use Log Explorer to identify those requests, understand why they were blocked, and then adjust the relevant WAF rules accordingly.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5ecKfF4lRDQSyLCoOgDjTw/6f0872962caff8d719958e6fdfcc8dbc/Log-Explorer-2.png" />
            
            </figure><p>Or, if you are interested in tracking down a specific request by Ray ID, an identifier given to every request that goes through Cloudflare, you can do that via Log Explorer with one query.</p>
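<p>A Ray ID lookup is a single short query. As a sketch (the Ray ID below is a made-up placeholder, and the field name should be confirmed against the Log Explorer documentation):</p><pre><code>SELECT *
FROM http_requests
WHERE rayId = '80f9fc1a3b2d9866'
LIMIT 1</code></pre>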
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3NCDOP0axFC3wZ4qs03Jei/86286da2fd9fca3cdb68214c1d0f472a/Log-Explorer-3.png" />
            
            </figure><p>Note that the LIMIT clause is included in the query by default, but it has no effect on Ray ID queries: each Ray ID is unique, so at most one record is returned when filtering on the RayID field.</p>
    <div>
      <h3>How we built Log Explorer</h3>
      <a href="#how-we-built-log-explorer">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3pNBd5iSyVW7aqfJ0JfYDP/89cd0341cc35ad9d86fd8687ba7a9147/How-we-built-Log-Explorer.png" />
            
            </figure><p>With Log Explorer, we have built a long-term, append-only log storage platform on top of <a href="https://www.cloudflare.com/developer-platform/r2/">Cloudflare R2</a>. Log Explorer leverages the <a href="https://databricks.com/wp-content/uploads/2020/08/p975-armbrust.pdf">Delta Lake</a> protocol, an open-source storage framework for building highly performant, <a href="https://en.wikipedia.org/wiki/ACID">ACID</a>-compliant databases atop a cloud object store. In other words, Log Explorer combines a large and cost-effective storage system – <a href="https://www.cloudflare.com/developer-platform/r2/">Cloudflare R2</a> – with the benefits of strong consistency and high performance. Additionally, Log Explorer gives you a SQL interface to your Cloudflare logs.</p><p>Each Log Explorer dataset is stored on a per-customer level, just like Cloudflare D1, so that your data isn't placed with that of other customers. In the future, this single-tenant storage model will give you the flexibility to create your own retention policies and decide in which regions you want to store your data.</p><p>Under the hood, the datasets for each customer are stored as Delta tables in R2 buckets. A <i>Delta table</i> is a storage format that organizes Apache Parquet objects into directories using Hive's partitioning naming convention. Crucially, Delta tables pair these storage objects with an append-only, checkpointed transaction log. This design allows Log Explorer to support multiple writers with optimistic concurrency.</p><p>Many of the products Cloudflare builds are a direct result of the challenges our own team is looking to address. Log Explorer is a perfect example of this <a href="/tag/dogfooding">culture of dogfooding</a>. Optimistic concurrent writes require atomic updates in the underlying object store, and as a result of our needs, R2 added a PutIfAbsent operation with strong consistency. Thanks, R2! The atomic operation sets Log Explorer apart from Delta Lake solutions based on Amazon Web Services’ S3, which incur the operational burden of using an <a href="https://delta.io/blog/2022-05-18-multi-cluster-writes-to-delta-lake-storage-in-s3/">external store</a> for synchronizing writes.</p><p>Log Explorer is written in the Rust programming language using open-source libraries, such as <a href="https://github.com/delta-io/delta-rs">delta-rs</a>, a native Rust implementation of the Delta Lake protocol, and <a href="https://arrow.apache.org/datafusion/">Apache Arrow DataFusion</a>, a very fast, extensible query engine. At Cloudflare, Rust has emerged as a popular choice for new product development due to its safety and performance benefits.</p>
    <div>
      <h3>What’s next</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>We know that application security logs are only part of the puzzle in understanding what’s going on in your environment. Stay tuned for future developments including tighter, more seamless integration between Analytics and Log Explorer, the addition of more datasets including Zero Trust logs, the ability to define custom retention periods, and integrated custom alerting.</p><p>Please use the <a href="https://forms.gle/tvKQDdXmCk98zyV9A">feedback link</a> to let us know how Log Explorer is working for you and what else would help make your job easier.</p>
    <div>
      <h3>How to get it</h3>
      <a href="#how-to-get-it">
        
      </a>
    </div>
    <p>We’d love to hear from you! Let us know if you are interested in joining our Beta program by completing <a href="https://cloudflare.com/lp/log-explorer/">this form</a>, and a member of our team will contact you.</p><p>Pricing will be finalized prior to a General Availability (GA) launch.</p><div>
  
</div><p>Tune in for more news, announcements and thought-provoking discussions! Don't miss the full <a href="https://cloudflare.tv/shows/security-week">Security Week hub page</a>.</p> ]]></content:encoded>
            <category><![CDATA[Security Week]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <guid isPermaLink="false">3K5UjFarMC09kkM507HshK</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Claudio Jolowicz</dc:creator>
            <dc:creator>Cole MacKenzie</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare launches AI Assistant for Security Analytics]]></title>
            <link>https://blog.cloudflare.com/security-analytics-ai-assistant/</link>
            <pubDate>Mon, 04 Mar 2024 14:00:29 GMT</pubDate>
            <description><![CDATA[ Introducing AI Assistant for Security Analytics. Now it is easier than ever to get powerful insights about your web security. Use the new integrated natural language query interface to explore Security Analytics ]]></description>
            <content:encoded><![CDATA[ <p></p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1XqHKAmbIZ4NBeFBXsnaFp/4d66b61833021c41d054dbfd5d8d23c6/AI-Assistant-for-Security-AnalyticsNatural-Language.png" />
            
            </figure><p>Imagine you are in the middle of an attack on your most crucial production application, and you need to understand what’s going on. How happy would you be if you could simply log into the Dashboard and type a question such as: “Compare attack traffic between US and UK” or “Compare rate limiting blocks for automated traffic with rate limiting blocks from human traffic” and see a time series chart appear on your screen without needing to select a complex set of filters?</p><p>Today, we are introducing an AI assistant to help you query your security event data, enabling you to more quickly discover anomalies and potential security attacks. You can now use plain language to interrogate Cloudflare analytics and let us do the magic.</p>
    <div>
      <h2>What did we build?</h2>
      <a href="#what-did-we-build">
        
      </a>
    </div>
    <p>One of the big challenges when analyzing a spike in traffic or any anomaly in your traffic is to create filters that isolate the root cause of an issue. This means knowing your way around often complex dashboards and tools, knowing where to click and what to filter on.</p><p>On top of this, any traditional security dashboard is limited by the way data is stored, how databases are indexed, and which fields are allowed when creating filters. With our Security Analytics view, for example, it was difficult to compare time series with different characteristics: you couldn’t compare the traffic from IP address x.x.x.x with automated traffic from Germany without opening multiple tabs to Security Analytics and filtering separately. From an engineering perspective, it would be extremely hard to build a system that allows these types of unconstrained comparisons.</p><p>With the AI Assistant, we are removing this complexity by leveraging our Workers AI platform to build a tool that can help you query your HTTP request and security event data and generate time series charts based on a request formulated with natural language. Now the AI Assistant does the hard work of figuring out the necessary filters and additionally can plot multiple series of data on a single graph to aid in comparisons. This new tool opens up a new way of interrogating data and logs, unconstrained by the restrictions introduced by traditional dashboards.</p><p>Now it is easier than ever to get powerful insights about your application security by using plain language to interrogate your data and better understand how Cloudflare is protecting your business. The new AI Assistant is located in the Security Analytics dashboard and works seamlessly with the existing filters. The answers you need are just a question away.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/65fA9saL0RoHlbErGhKGhL/53e0498be059490a3d442ea383136d8e/Screenshot-2024-02-29-at-13.35.32.png" />
            
            </figure>
    <div>
      <h2>What can you ask?</h2>
      <a href="#what-can-you-ask">
        
      </a>
    </div>
    <p>To demonstrate the capabilities of AI Assistant, we started by considering the questions that we ask ourselves every day when helping customers to deploy the best security solutions for their applications.</p><p>We’ve included some clickable examples in the dashboard to get you started.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7lHKu9aErFzilFDlPId4lL/80a67fb2e3e558ea0a4626ff166fdbd3/ai-analytics.png" />
            
            </figure><p>You can use the AI Assistant to:</p><ul><li><p>Identify the source of a spike in attack traffic by asking: “Compare attack traffic between US and UK”</p></li><li><p>Identify the root cause of 5xx errors by asking: “Compare origin and edge 5xx errors”</p></li><li><p>See which browsers are most commonly used by your users by asking: “Compare traffic across major web browsers”</p></li><li><p>For an ecommerce site, understand what percentage of users visit versus add items to their shopping cart by asking: “Compare traffic between /api/login and /api/basket”</p></li><li><p>Identify bot attacks against your ecommerce site by asking: “Show requests to /api/basket with a bot score less than 20”</p></li><li><p>Identify the HTTP versions used by clients by asking: “Compare traffic by each HTTP version”</p></li><li><p>Identify unwanted automated traffic to specific endpoints by asking: “Show POST requests to /admin with a Bot Score over 30”</p></li></ul><div>
  
</div><p>These examples are a good starting point for exploring the AI Assistant.</p>
    <div>
      <h2>How does it work?</h2>
      <a href="#how-does-it-work">
        
      </a>
    </div>
    <p>Using Cloudflare’s powerful <a href="https://ai.cloudflare.com/">Workers AI</a> global network inference platform, we were able to use one of the off-the-shelf large language models (LLMs) offered on the platform to convert customer queries into GraphQL filters. By teaching an AI model about the available filters we have on our Security Analytics GraphQL dataset, we can have the AI model turn a request such as “<i>Compare attack traffic on /api and /admin endpoints</i>”  into a matching set of structured filters:</p>
            <pre><code>[
  {"name": "Attack Traffic on /api", "filters": [{"key": "clientRequestPath", "operator": "eq", "value": "/api"}, {"key": "wafAttackScoreClass", "operator": "eq", "value": "attack"}]},
  {"name": "Attack Traffic on /admin", "filters": [{"key": "clientRequestPath", "operator": "eq", "value": "/admin"}, {"key": "wafAttackScoreClass", "operator": "eq", "value": "attack"}]}
]</code></pre>
            <p>Then, using the filters provided by the AI model, we can make requests to our <a href="https://developers.cloudflare.com/analytics/graphql-api/">GraphQL APIs</a>, gather the requisite data, and plot a data visualization to answer the customer query.</p><p>By using this method, we are able to keep customer information private and avoid exposing any security analytics data to the AI model itself, while still allowing humans to query their data with ease. This ensures that your queries will never be used to train the model. And because Workers AI hosts a local instance of the LLM on Cloudflare’s own network, your queries and resulting data never leave Cloudflare’s network.</p>
    <div>
      <h2>Future Development</h2>
      <a href="#future-development">
        
      </a>
    </div>
    <p>We are in the early stages of developing this capability and plan to rapidly extend the capabilities of the Security Analytics AI Assistant. Don’t be surprised if we cannot handle some of your requests at the beginning. At launch, we are able to support basic inquiries that can be plotted in a time series chart such as “show me” or “compare” for any currently filterable fields.</p><p>However, we realize there are a number of use cases that we haven’t even thought of, and we are excited to release the Beta version of AI Assistant to all Business and Enterprise customers to let you test the feature and see what you can do with it. We would love to hear your feedback and learn more about what you find useful and what you would like to see in it next. With future versions, you’ll be able to ask questions such as “Did I experience any attacks yesterday?” and use AI to automatically generate WAF rules for you to apply to mitigate them.</p>
    <div>
      <h2>Beta availability</h2>
      <a href="#beta-availability">
        
      </a>
    </div>
    <p>Starting today, AI Assistant is available to a select few users and is rolling out to all Business and Enterprise customers throughout March. Look out for it, try it for free, and let us know what you think by using the <a href="https://docs.google.com/forms/d/e/1FAIpQLSfKtXvPvKUZpjoKZa93ceTk_NAdRY4CF_PpjvFwZwa69o7i6A/viewform?entry.2073820081=Account%20security%20analytics">Feedback</a> link at the top of the Security Analytics page.</p><p>Final pricing will be determined prior to general availability.</p>
            <category><![CDATA[Security Week]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[WAF]]></category>
            <category><![CDATA[Workers AI]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[AI]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <category><![CDATA[Application Services]]></category>
            <guid isPermaLink="false">7rHa5ZDtie6BcqcvMDndWH</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Harley Turan</dc:creator>
        </item>
        <item>
            <title><![CDATA[How Cloudflare instruments services using Workers Analytics Engine]]></title>
            <link>https://blog.cloudflare.com/using-analytics-engine-to-improve-analytics-engine/</link>
            <pubDate>Fri, 18 Nov 2022 14:00:00 GMT</pubDate>
            <description><![CDATA[ Learn how Cloudflare uses our own Workers Analytics Engine product to capture analytics about our own products! ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/16jjPVsiGz8fzNxVYIgsbR/8c4c93497cd82108d0d74efaf52a96ad/image1-62.png" />
            
            </figure><p>Workers Analytics Engine is a new tool, <a href="/workers-analytics-engine/">announced earlier this year</a>, that enables developers and product teams to build time series analytics about anything, with high dimensionality, high cardinality, and effortless scaling. We built Analytics Engine for teams to gain insights into their code running in Workers, provide analytics to end customers, or even build usage based billing.</p><p>In this blog post we’re going to tell you about how we use Analytics Engine to build Analytics Engine. We’ve instrumented our own Analytics Engine SQL API using Analytics Engine itself and use this data to find bugs and prioritize new product features. We hope this serves as inspiration for other teams who are looking for ways to instrument their own products and gather feedback.</p>
    <div>
      <h3>Why do we need Analytics Engine?</h3>
      <a href="#why-do-we-need-analytics-engine">
        
      </a>
    </div>
    <p>Analytics Engine enables you to generate events (or “data points”) from Workers with <a href="https://developers.cloudflare.com/analytics/analytics-engine/get-started/">just a few lines of code</a>. Using the GraphQL or <a href="https://developers.cloudflare.com/analytics/analytics-engine/sql-api/">SQL API</a>, you can query these events and create useful insights about the business or technology stack. For more about how to get started using Analytics Engine, check out our <a href="https://developers.cloudflare.com/analytics/analytics-engine/">developer docs</a>.</p><p>Since we released the <a href="/analytics-engine-open-beta/">Analytics Engine open beta</a> in September, we’ve been adding new features at a rapid clip based on feedback from developers. However, we’ve had two big gaps in our visibility into the product.</p><p>First, our engineering team needs to answer <a href="https://www.cloudflare.com/learning/performance/what-is-observability/">classic observability questions</a>, such as: how many requests are we getting, how many of those requests result in errors, what is the nature of these errors, etc. They need to be able to view both aggregated data (like average error rate, or p99 response time) and drill into individual events.</p><p>Second, because this is a newly launched product, we are looking for product insights. By instrumenting the SQL API, we can understand the queries our customers write, and the errors they see, which helps us prioritize missing features.</p><p>We realized that Analytics Engine would be an amazing tool for both answering our technical observability questions, and also gathering product insight. That’s because we can log an event for every query to our SQL API, and then query for both aggregated performance issues as well as individual errors and queries that our customers run.</p><p>In the next section, we’re going to walk you through how we use Analytics Engine to monitor that API.</p>
    <div>
      <h2>Adding instrumentation to our SQL API</h2>
      <a href="#adding-instrumentation-to-our-sql-api">
        
      </a>
    </div>
    <p>The Analytics Engine SQL API lets you query event data in the same way you would an ordinary database. For decades, SQL has been the most common language for querying data. We wanted to provide an interface that allows you to immediately start asking questions about your data without having to learn a new query language.</p><p>Our SQL API parses user SQL queries, transforms and validates them, and then executes them against backend database servers. We then write information about the query back into Analytics Engine so that we can run our own analytics.</p><p>Writing data into Analytics Engine from a Cloudflare Worker is very simple and <a href="https://developers.cloudflare.com/analytics/analytics-engine/get-started/">explained in our documentation</a>. We instrument our SQL API in the same way our users do, and this code excerpt shows the data we write into Analytics Engine:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/L30suydy27OFKzv6ua9ML/c49d03afbb62a1e3df7229e6c30e087c/carbon--3--1.png" />
            
            </figure><p>With that data now being stored in Analytics Engine, we can then pull out insights about every field we’re reporting.</p>
    <div>
      <h2>Querying for insights</h2>
      <a href="#querying-for-insights">
        
      </a>
    </div>
    <p>Having your analytics in an SQL database gives you the freedom to write any query you might want. Compared to using something like metrics, which are often predefined and purpose-specific, you can define any custom dataset desired, and interrogate your data to ask new questions with ease.</p><p>We need to support datasets comprising trillions of data points. In order to accomplish this, we have implemented a sampling method called <a href="/explaining-cloudflares-abr-analytics/">Adaptive Bit Rate</a> (ABR). With ABR, if you have large amounts of data, your queries may return sampled events in order to respond in a reasonable time. If you have more typical amounts of data, Analytics Engine will query all your data. This allows you to run any query you like and still get responses quickly. Right now, you have to <a href="https://developers.cloudflare.com/analytics/analytics-engine/sql-api/#sampling">account for sampling in how you make your queries</a>, but we are exploring making it automatic.</p><p>Any data visualization tool can be used to visualize your analytics. At Cloudflare, we heavily use Grafana (<a href="https://developers.cloudflare.com/analytics/analytics-engine/grafana/">and you can too!</a>). This is particularly useful for observability use cases.</p>
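<p>Accounting for sampling generally means weighting aggregates by the sample interval instead of counting raw rows. As a hedged sketch (the dataset name is hypothetical), an estimated total event count looks like:</p><pre><code>SELECT sum(_sample_interval) AS estimated_events
FROM your_dataset</code></pre><p>The <code>_sample_interval</code> column records how many events each stored row represents, so summing it estimates the true total even when ABR returns sampled data.</p>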
    <div>
      <h3>Observing query response times</h3>
      <a href="#observing-query-response-times">
        
      </a>
    </div>
    <p>One query we pay attention to gives us information about the performance of our backend database clusters:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/q8KADDDRyASR7nWPQHoKc/c633325aca2b64e464fc820abfd5e653/image2-45.png" />
            
            </figure><p>As you can see, the 99th percentile (corresponding to the 1% most complex queries to execute) sometimes spikes up to about 300ms. But on average, our backend responds to queries within 100ms.</p><p>This visualization is itself generated from an SQL query:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6UchtUYBtnDuXwKO7afWUc/1a584c18a7ce7fb74bcc2599756ed6f7/carbon--2-.png" />
            
            </figure>
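<p>A query of roughly this shape can produce such a chart (the dataset and column names here are assumptions for illustration; in Analytics Engine, numeric values are stored in columns such as <code>double1</code>):</p><pre><code>SELECT
  toStartOfInterval(timestamp, INTERVAL '5' MINUTE) AS t,
  quantileWeighted(0.99)(double1, _sample_interval) AS p99_ms
FROM sql_api_requests
WHERE timestamp &gt; NOW() - INTERVAL '1' DAY
GROUP BY t
ORDER BY t</code></pre>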
    <div>
      <h3>Customer insights from high-cardinality data</h3>
      <a href="#customer-insights-from-high-cardinality-data">
        
      </a>
    </div>
    <p>Another use of Analytics Engine is to draw insights out of customer behavior. Our SQL API is particularly well-suited for this, as you can take full advantage of the power of SQL. Thanks to our ABR technology, even expensive queries can be carried out against huge datasets.</p><p>We use this ability to help prioritize improvements to Analytics Engine. Our SQL API supports a fairly standard dialect of SQL but isn’t feature-complete yet. If a user tries to do something unsupported in an SQL query, they get back a structured error message. Those error messages are reported into Analytics Engine. We’re able to aggregate the kinds of errors that our customers encounter, which helps inform which features to prioritize next.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/AmDjwutzQH089GhoJHvzw/b734eaa557a88f2d513968f20f10f28a/image3-36.png" />
            
            </figure><p>The SQL API returns errors in the format of <code>type of error: more details</code>, and so we can take the first portion before the colon to give us the type of error. We group by that, and get a count of how many times that error happened and how many users it affected:</p>
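<p>Sketched in SQL, that grouping could look like this (the dataset and column names are assumptions for illustration: in Analytics Engine, string values land in blob columns, here with the error message in <code>blob1</code> and a user identifier in <code>blob2</code>):</p><pre><code>SELECT
  substring(blob1, 1, position(blob1, ':') - 1) AS error_type,
  sum(_sample_interval) AS errors,
  count(DISTINCT blob2) AS affected_users
FROM sql_api_requests
WHERE blob1 != ''
GROUP BY error_type
ORDER BY errors DESC</code></pre>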
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Z1KYNLPlcb3rYTPJ9Fi8f/78ac0462fa7b5b1ae2db27d1dfd67d2b/Screenshot-2022-11-18-at-08.33.57.png" />
            
            </figure><p>To perform the above query using an ordinary metrics system, we would need to represent each error type with a different metric. Reporting that many metrics from each microservice creates scalability challenges. That problem doesn’t happen with Analytics Engine, because it’s designed to handle high-cardinality data.</p><p>Another big advantage of a high-cardinality store like Analytics Engine is that you can dig into specifics. If there’s a large spike in SQL errors, we may want to find which customers are having a problem in order to help them or identify what function they want to use. That’s easy to do with another SQL query:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5ZghZO2Jyk153qPnvS13Mk/05891f4f5db7f1b3247e615c7d2373e1/carbon-3.png" />
            
            </figure><p>Inside Cloudflare, we have historically relied on querying our backend database servers for this type of information. Analytics Engine’s SQL API now enables us to open up our technology to our customers, so they can easily gather insights about their services at any scale!</p>
    <div>
      <h2>Conclusion and what’s next</h2>
      <a href="#conclusion-and-whats-next">
        
      </a>
    </div>
    <p>The insights we gathered about usage of the SQL API are a super helpful input to our product prioritization decisions. We already added <a href="https://developers.cloudflare.com/analytics/analytics-engine/sql-reference/">support for <code>substring</code> and <code>position</code> functions</a>, which were used in the visualizations above.</p><p>Looking at the top SQL errors, we see numerous errors related to selecting columns. These errors are mostly coming from some usability issues related to the Grafana plugin. Adding support for the DESCRIBE function should alleviate this, because without it the Grafana plugin doesn’t understand the table structure. This, as well as other improvements to our Grafana plugin, is on our roadmap.</p><p>We can also see that users are trying to query time ranges for older data that no longer exists. This suggests that our customers would appreciate having extended data retention. We’ve recently extended our retention from 31 to 92 days, and we will keep an eye on this to see if we should offer further extensions.</p><p>We saw lots of errors related to common mistakes or misunderstandings of proper SQL syntax. This indicates that we could provide better examples or error explanations in our documentation to assist users with troubleshooting their queries.</p><p>Stay tuned to our <a href="https://developers.cloudflare.com/analytics/analytics-engine/">developer docs</a> to be informed as we continue to iterate and add more features!</p><p>You can start using Workers Analytics Engine now! Analytics Engine is in open beta with free 90-day retention. <a href="https://dash.cloudflare.com/?to=/:account/workers/analytics-engine">Start using it today</a> or <a href="https://discord.gg/cloudflaredev">join our Discord community</a> to talk with the team.</p>
            <category><![CDATA[Developer Week]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Data]]></category>
            <category><![CDATA[Cloudflare Workers]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <guid isPermaLink="false">5Vbtic7QOMABAMIJbPSm7v</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Miki Mokrysz</dc:creator>
        </item>
        <item>
            <title><![CDATA[Store and process your Cloudflare Logs... with Cloudflare]]></title>
            <link>https://blog.cloudflare.com/announcing-logs-engine/</link>
            <pubDate>Tue, 15 Nov 2022 14:00:00 GMT</pubDate>
            <description><![CDATA[ Today we’re announcing Cloudflare Logs Engine — a new system that will enable you to do anything you need with Cloudflare Logs, all within Cloudflare ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/11il4y7Tq0Uyuzfc3SlYMl/b8f346cf6f306b43c69f1aa07cf71f38/image1-32.png" />
            
            </figure><p>Millions of customers trust Cloudflare to accelerate their website, protect their network, or as a platform to build their own applications. But, once you’re running in production, how do you know what’s going on with your application? You need <a href="https://developers.cloudflare.com/logs/">logs from Cloudflare</a> – a record of what happened on our network when your customers interacted with your product that uses Cloudflare.</p><p>Cloudflare Logs are an indispensable tool for debugging applications, identifying security vulnerabilities, or just understanding how users are interacting with your product. However, our customers generate petabytes of logs, and store them for months or years at a time. Log data is tantalizing: all those answers, just waiting to be revealed with the right query! But until now, it’s been too hard for customers to actually store, search, and understand their logs without expensive and cumbersome third party tools.</p><p>Today we’re announcing Cloudflare Logs Engine: a new product to enable any kind of investigation with Cloudflare Logs — all within Cloudflare.</p><p>Starting today, Cloudflare customers who push their logs to R2 can retrieve them by time range and unique identifier. Over the coming months we want to enable customers to:</p><ul><li><p>Store logs for any Cloudflare dataset, for as long as you want, with a few clicks</p></li><li><p>Access logs no matter what plan you use, without relying on third party tools</p></li><li><p>Write queries that include multiple datasets</p></li><li><p>Quickly identify the logs you need and take action based on what you find</p></li></ul>
    <div>
      <h3>Why Cloudflare Logs?</h3>
      <a href="#why-cloudflare-logs">
        
      </a>
    </div>
    <p>When it comes to visibility into your traffic, most customers start with <i>analytics</i>. The Cloudflare dashboard is full of analytics for all of our products, which give a high-level overview of what’s happening: for example, the number of requests served, the ratio of cache hits, or the amount of CPU time used.</p><p>But sometimes, more detail is needed. Developers especially need to be able to read individual log lines to debug applications. For example, suppose you notice a problem where your application throws an error in an unexpected way – you need to know the cause of that error and see every request with that pattern.</p><p>Cloudflare offers tools like <a href="https://developers.cloudflare.com/logs/instant-logs/">Instant Logs</a> and <a href="https://developers.cloudflare.com/workers/wrangler/commands/#tail">wrangler tail</a>, which excel at real-time debugging. These are incredibly helpful if you’re making changes on the fly, or if the problem occurs frequently enough that it will appear during your debugging session.</p><p>In other cases, you need to find that needle in a haystack — the one rare event that causes everything to go wrong. Or you might have identified a security issue and want to find <i>every</i> time it could have been exploited in your application’s history.</p><p>When this happens, you need logs. In particular, you need <i>forensics</i>: the ability to search the entire history of your logs.</p>
    <div>
      <h3>A brief overview of log analysis</h3>
      <a href="#a-brief-overview-of-log-analysis">
        
      </a>
    </div>
    <p>Before we take a look at Logs Engine itself, I want to briefly talk about the alternatives – how have our customers been dealing with their logs so far?</p><p>Cloudflare has long offered <a href="https://developers.cloudflare.com/logs/logpull/">Logpull</a> and <a href="https://developers.cloudflare.com/logs/about/">Logpush</a>. Logpull enables Enterprise customers to store their HTTP logs on Cloudflare for up to seven days, and to retrieve them by either time or RayID. Logpush can send your Cloudflare logs just about anywhere on the Internet, quickly and reliably. While Logpush provides more flexibility, it’s been up to customers to actually store and analyze those logs.</p><p>Cloudflare has a number of <a href="https://developers.cloudflare.com/logs/get-started/enable-destinations/">partnerships</a> with SIEMs and data warehouses/data lakes. Many of these tools even have pre-built Cloudflare dashboards for easy visibility. And third party tools have a big advantage in that you can store and search across many log sources, not just Cloudflare.</p><p>That said, we’ve heard from customers that these solutions come with some challenges.</p><p>First, third party log tooling can be expensive! Most tools require that you pay not just for storage, but for indexing all of that data when it’s ingested. While that enables powerful search functionality later on, Cloudflare (by its nature) is often one of the largest emitters of logs a developer will have. If you store and index every log line we generate, analyzing the logs can cost more than delivering the actual service.</p><p>Second, these tools can be hard to use. Logs are often used to track down an issue that you discover via analytics in the Cloudflare dashboard. After finding what you need in the logs, it can be hard to get back to the right part of the Cloudflare dashboard to make the appropriate configuration changes.</p><p>Finally, Logpush has so far been limited to Enterprise plans. Soon, we will start offering these services to customers at any scale, regardless of plan type or how they choose to pay.</p>
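<p>The indexing-versus-storage tradeoff is easy to see with some back-of-the-envelope arithmetic. The sketch below uses invented per-GB prices (they are pure assumptions for illustration, not Cloudflare’s or any vendor’s actual pricing) to show how the bill changes when you pay only for the data you scan.</p>

```python
# Back-of-the-envelope comparison of "index everything on ingest" versus
# "store cheaply, pay only for what you scan". Every price below is an
# illustrative assumption, not actual Cloudflare or SIEM pricing.

def monthly_cost_indexing(ingest_gb, price_per_gb_indexed=2.50):
    # Typical SIEM model: every ingested GB is indexed and billed.
    return ingest_gb * price_per_gb_indexed

def monthly_cost_store_then_query(ingest_gb, queried_fraction,
                                  storage_price_per_gb=0.015,
                                  scan_price_per_gb=0.005):
    # Separated model: cheap storage, compute billed only on GB scanned.
    return (ingest_gb * storage_price_per_gb
            + ingest_gb * queried_fraction * scan_price_per_gb)

ingest = 10_000  # 10 TB of logs per month
print(monthly_cost_indexing(ingest))                                 # 25000.0
print(monthly_cost_store_then_query(ingest, queried_fraction=0.02))  # 151.0
```

<p>With these assumed numbers, indexing everything on ingest costs two orders of magnitude more than storing everything and scanning the 2% of it you actually query – which is the whole argument for separating storage from compute.</p>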
    <div>
      <h3>Why Logs Engine?</h3>
      <a href="#why-logs-engine">
        
      </a>
    </div>
    <p>With Logs Engine, we wanted to solve these problems. We wanted to build something affordable, easy to use, and accessible to any Cloudflare customer. And we wanted it to work for any Cloudflare logs dataset, for any span of time.</p><p>Our first insight was that to make logs affordable, we need to separate storage and compute. The cost of storage is actually quite low! Thanks to R2, there’s no reason many of our customers can’t store all of their logs for long periods of time. At the same time, we want to separate out the <i>analysis</i> of logs so that customers only pay for the compute over the logs they analyze – not every line ingested. While we’re still developing our query pricing, our aim is to be predictable, transparent, and upfront. You should never be surprised by the cost of a query (or accidentally run up a huge bill).</p><p>It’s great to separate storage and compute. But if you need to scan all of your logs anyway to answer your first question, you haven’t gained anything from this separation. To realize cost savings, it’s critical to narrow down your search before executing a query. That’s where our next big idea came in: a tight integration with analytics.</p><p>Most of the time, when analyzing logs, you don’t know exactly what you’re looking for. For example, if you’re trying to find the cause of a specific origin status code, you may need to spend some time understanding which origins are impacted, which clients are sending the requests that trigger it, and the time range in which these errors happened. Thanks to our <a href="/explaining-cloudflares-abr-analytics/">ABR analytics</a>, we can provide a good summary of the data very quickly – but not the exact details of what happened. By integrating with analytics, we can help customers narrow down their queries, then switch to Logs Engine once they know exactly what they’re looking for.</p><p>Finally, we wanted to make logs accessible to anyone. That means all plan types – not just Enterprise.</p><p>Additionally, we want to make it easy both to set up log storage and analysis and to take action on logs once you find problems. With Logs Engine, it will be possible to search logs right from the dashboard, and to immediately create <a href="https://developers.cloudflare.com/rules/">rules</a> based on the patterns you find there.</p>
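<p>The “narrow first, then scan” workflow can be sketched in a few lines of Python. The example below is a hypothetical illustration: the field names mimic Cloudflare HTTP request log fields, but the sample data and helper functions are invented for the sketch. Analytics would tell you that 522s spiked between 10:00 and 11:00; the scan then touches only that window.</p>

```python
import json
from datetime import datetime, timezone

# A handful of newline-delimited JSON log lines standing in for logs in
# storage. Field names mimic Cloudflare HTTP request logs but the data
# is invented for this sketch.
LOG_LINES = [
    '{"EdgeStartTimestamp": "2022-11-15T09:12:03Z", "OriginResponseStatus": 200, "ClientRequestPath": "/"}',
    '{"EdgeStartTimestamp": "2022-11-15T10:41:55Z", "OriginResponseStatus": 522, "ClientRequestPath": "/api/checkout"}',
    '{"EdgeStartTimestamp": "2022-11-15T10:42:10Z", "OriginResponseStatus": 522, "ClientRequestPath": "/api/checkout"}',
    '{"EdgeStartTimestamp": "2022-11-15T11:05:00Z", "OriginResponseStatus": 200, "ClientRequestPath": "/"}',
]

def parse_ts(s):
    # RFC 3339 "Z" suffix -> offset form that fromisoformat accepts.
    return datetime.fromisoformat(s.replace("Z", "+00:00"))

def scan(lines, start, end, status):
    # Scan only the narrowed time window for the target origin status,
    # yielding the request paths that need investigating.
    for line in lines:
        rec = json.loads(line)
        ts = parse_ts(rec["EdgeStartTimestamp"])
        if start <= ts < end and rec["OriginResponseStatus"] == status:
            yield rec["ClientRequestPath"]

# The window analytics pointed us at: the 10:00-11:00 UTC spike.
start = datetime(2022, 11, 15, 10, tzinfo=timezone.utc)
end = datetime(2022, 11, 15, 11, tzinfo=timezone.utc)
print(list(scan(LOG_LINES, start, end, 522)))  # ['/api/checkout', '/api/checkout']
```

<p>In a real deployment the narrowing happens in analytics and the scan happens in Logs Engine; the compute savings come from the same principle, just at petabyte scale.</p>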
    <div>
      <h3>What’s available today and our roadmap</h3>
      <a href="#whats-available-today-and-our-roadmap">
        
      </a>
    </div>
    <p>Today, Enterprise customers can store logs in R2 and <a href="https://developers.cloudflare.com/logs/r2-log-retrieval/">retrieve them</a> by time range. Currently in beta, we also allow customers to retrieve logs by RayID (see our <a href="/r2-rayid-retrieval/">companion blog post</a>) — to join the beta, please email <a href="mailto:logs-engine-beta@cloudflare.com">logs-engine-beta@cloudflare.com</a>.</p><p>Coming soon, we will enable customers on <i>all</i> plan types — not just Enterprise — to ingest logs into Logs Engine. Details on pricing will follow.</p><p>We also plan to build more powerful querying capability, beyond time range and RayID lookup. For example, we plan to support arbitrary filtering on any column, plus more expressive queries that can look across datasets or aggregate data.</p><p>But why stop at logs? This lays the groundwork to support other types of data sources and queries one day. We are just getting started. Over the long term, we’re also exploring the ability to ingest data sources from <i>outside</i> Cloudflare and query them. Paired with <a href="https://developers.cloudflare.com/analytics/analytics-engine/">Analytics Engine</a>, this is a formidable, cost-effective way to explore any data set!</p> ]]></content:encoded>
            <category><![CDATA[Developer Week]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">7l7PSKkuMDp5hbuUzIeZU5</guid>
            <dc:creator>Jon Levine</dc:creator>
            <dc:creator>Jen Sells</dc:creator>
        </item>
    </channel>
</rss>