
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how products at Cloudflare are built and the technologies used, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Fri, 03 Apr 2026 17:14:12 GMT</lastBuildDate>
        <item>
            <title><![CDATA[Investigating multi-vector attacks in Log Explorer]]></title>
            <link>https://blog.cloudflare.com/investigating-multi-vector-attacks-in-log-explorer/</link>
            <pubDate>Tue, 10 Mar 2026 13:00:00 GMT</pubDate>
            <description><![CDATA[ Log Explorer customers can now identify and investigate multi-vector attacks. Log Explorer supports 14 additional Cloudflare datasets, enabling users to have a 360-degree view of their network. ]]></description>
            <content:encoded><![CDATA[ <p>In the world of cybersecurity, a single data point is rarely the whole story. Modern attackers don’t just knock on the front door; they probe your APIs, flood your network with "noise" to distract your team, and attempt to slide through applications and servers using stolen credentials.</p><p>To stop these multi-vector attacks, you need the full picture. By using Cloudflare Log Explorer to conduct security forensics, you get 360-degree visibility through the integration of 14 new datasets, covering the full surface of Cloudflare’s Application Services and Cloudflare One product portfolios. By correlating telemetry from application-layer HTTP requests, network-layer DDoS and Firewall logs, and Zero Trust Access events, security analysts can significantly reduce Mean Time to Detect (MTTD) and effectively unmask sophisticated, multi-layered attacks.</p><p>Read on to learn more about how Log Explorer gives security teams a unified workspace for rapid, deep-dive forensics.</p>
    <div>
      <h2>The flight recorder for your entire stack</h2>
      <a href="#the-flight-recorder-for-your-entire-stack">
        
      </a>
    </div>
    <p>The contemporary digital landscape requires deep, correlated telemetry to defend against adversaries using multiple attack vectors. Raw logs serve as the "flight recorder" for an application, capturing every single interaction, attack attempt, and performance bottleneck. And because Cloudflare sits at the edge, between your users and your servers, all of these events are logged before the requests even reach your infrastructure. </p><p>Cloudflare Log Explorer centralizes these logs into a unified interface for rapid investigation.</p>
    <div>
      <h3>Log Types Supported</h3>
      <a href="#log-types-supported">
        
      </a>
    </div>
    
    <div>
      <h4>Zone-Scoped Logs</h4>
      <a href="#zone-scoped-logs">
        
      </a>
    </div>
    <p><i>Focus: Website traffic, security events, and edge performance.</i></p><table><tr><td><p><b>HTTP Requests</b></p></td><td><p>As the most comprehensive dataset, it serves as the "primary record" of all application-layer traffic, enabling the reconstruction of session activity, exploit attempts, and bot patterns.</p></td></tr><tr><td><p><b>Firewall Events</b></p></td><td><p>Provides critical evidence of blocked or challenged threats, allowing analysts to identify the specific WAF rules, IP reputations, or custom filters that intercepted an attack.</p></td></tr><tr><td><p><b>DNS Logs</b></p></td><td><p>Identify cache poisoning attempts, domain hijacking, and infrastructure-level reconnaissance by tracking every query resolved at the authoritative edge.</p></td></tr><tr><td><p><b>NEL (Network Error Logging) Reports</b></p></td><td><p>Distinguish between a coordinated Layer 7 DDoS attack and legitimate network connectivity issues by tracking client-side browser errors.</p></td></tr><tr><td><p><b>Spectrum Events</b></p></td><td><p>For non-web applications, these logs provide visibility into L4 traffic (TCP/UDP), helping to identify anomalies or brute-force attacks against protocols like SSH, RDP, or custom gaming traffic.</p></td></tr><tr><td><p><b>Page Shield</b></p></td><td><p>Track and audit unauthorized changes to your site's client-side environment, such as JavaScript dependencies and outbound connections.</p></td></tr><tr><td><p><b>Zaraz Events</b></p></td><td><p>Examine how third-party tools and trackers are interacting with user data, which is vital for auditing privacy compliance and detecting unauthorized script behaviors.</p></td></tr></table>
    <div>
      <h4>Account-Scoped Logs</h4>
      <a href="#account-scoped-logs">
        
      </a>
    </div>
    <p><i>Focus: Internal security, Zero Trust, administrative changes, and network activity.</i></p><table><tr><td><p><b>Access Requests</b></p></td><td><p>Tracks identity-based authentication events to determine which users accessed specific internal applications and whether those attempts were authorized.</p></td></tr><tr><td><p><b>Audit Logs</b></p></td><td><p>Provides a trail of configuration changes within the Cloudflare dashboard to identify unauthorized administrative actions or modifications.</p></td></tr><tr><td><p><b>CASB Findings</b></p></td><td><p>Identifies security misconfigurations and data risks within SaaS applications (like Google Drive or Microsoft 365) to prevent unauthorized data exposure.</p></td></tr><tr><td><p><b>Magic Transit / IPSec Logs</b></p></td><td><p>Helps network engineers perform network-level (L3) monitoring, such as reviewing tunnel health and viewing BGP routing changes.</p></td></tr><tr><td><p><b>Browser Isolation Logs</b></p></td><td><p>Tracks user actions <i>inside</i> an isolated browser session (e.g., copy-paste, print, or file uploads) to prevent data leaks on untrusted sites.</p></td></tr><tr><td><p><b>Device Posture Results</b></p></td><td><p>Details the security health and compliance status of devices connecting to your network, helping to identify compromised or non-compliant endpoints.</p></td></tr><tr><td><p><b>DEX Application Tests</b></p></td><td><p>Monitors application performance from the user's perspective, which can help distinguish between a security-related outage and a standard performance degradation.</p></td></tr><tr><td><p><b>DEX Device State Events</b></p></td><td><p>Provides telemetry on the physical state of user devices, useful for correlating hardware or OS-level anomalies with potential security incidents.</p></td></tr><tr><td><p><b>DNS Firewall Logs</b></p></td><td><p>Tracks DNS queries filtered through the DNS Firewall to identify communication with known malicious domains or command-and-control 
(C2) servers.</p></td></tr><tr><td><p><b>Email Security Alerts</b></p></td><td><p>Logs malicious email activity and phishing attempts detected at the gateway to trace the origin of email-based entry vectors.</p></td></tr><tr><td><p><b>Gateway DNS</b></p></td><td><p>Monitors every DNS query made by users on your network to identify shadow IT, malware callbacks, or domain-generation algorithms (DGAs).</p></td></tr><tr><td><p><b>Gateway HTTP</b></p></td><td><p>Provides full visibility into encrypted and unencrypted web traffic to detect hidden payloads, malicious file downloads, or unauthorized SaaS usage.</p></td></tr><tr><td><p><b>Gateway Network</b></p></td><td><p>Tracks L3/L4 network traffic (non-HTTP) to identify unauthorized port usage, protocol anomalies, or lateral movement within the network.</p></td></tr><tr><td><p><b>IPSec Logs</b></p></td><td><p>Monitors the status and traffic of encrypted site-to-site tunnels to ensure the integrity and availability of secure network connections.</p></td></tr><tr><td><p><b>Magic IDS Detections</b></p></td><td><p>Surfaces matches against intrusion detection signatures to alert investigators to known exploit patterns or malware behavior traversing the network.</p></td></tr><tr><td><p><b>Network Analytics Logs</b></p></td><td><p>Provides high-level visibility into packet-level data to identify volumetric DDoS attacks or unusual traffic spikes targeting specific infrastructure.</p></td></tr><tr><td><p><b>Sinkhole HTTP Logs</b></p></td><td><p>Captures traffic directed to "sinkholed" IP addresses to confirm which internal devices are attempting to communicate with known botnet infrastructure.</p></td></tr><tr><td><p><b>WARP Config Changes</b></p></td><td><p>Tracks modifications to the WARP client settings on end-user devices to ensure that security agents haven't been tampered with or disabled.</p></td></tr><tr><td><p><b>WARP Toggle Changes</b></p></td><td><p>Specifically logs when users enable or disable their secure 
connectivity, helping to identify periods where a device may have been unprotected.</p></td></tr><tr><td><p><b>Zero Trust Network Session Logs</b></p></td><td><p>Logs the duration and status of authenticated user sessions to map out the complete lifecycle of a user's access within the protected perimeter.</p></td></tr></table>
    <div>
      <h2>Log Explorer can identify malicious activity at every stage</h2>
      <a href="#log-explorer-can-identify-malicious-activity-at-every-stage">
        
      </a>
    </div>
    <p>Get granular application layer visibility with <b>HTTP Requests</b>, <b>Firewall Events</b>, and <b>DNS logs</b> to see exactly how traffic is hitting your public-facing properties. Track internal movement with <b>Access Requests</b>, <b>Gateway logs</b>, and <b>Audit logs</b>. If a credential is compromised, you’ll see where the attacker went. Use <b>Magic IDS</b> and <b>Network Analytics logs</b> to spot volumetric attacks and "East-West" lateral movement within your private network.</p>
    <div>
      <h3>Identify the reconnaissance</h3>
      <a href="#identify-the-reconnaissance">
        
      </a>
    </div>
    <p>Attackers use scanners and other tools to look for entry points, hidden directories, or software vulnerabilities. To identify this, using Log Explorer, you can query <code>http_requests</code> for any <code>EdgeResponseStatus</code> codes of 401, 403, or 404 coming from a single IP, or requests to sensitive paths (e.g. <code>/.env</code>, <code>/.git</code>, <code>/wp-admin</code>). </p><p>Additionally, <code>magic_ids_detections</code> logs can also be used to identify scanning at the network layer. These logs provide packet-level visibility into threats targeting your network. Unlike standard HTTP logs, these logs focus on <b>signature-based detections</b> at the network and transport layers (IP, TCP, UDP). Query to discover cases where a single <code>SourceIP</code> is triggering multiple unique detections across a wide range of <code>DestinationPort</code> values in a short timeframe. Magic IDS signatures can specifically flag activities like Nmap scans or SYN stealth scans.</p>
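    <p>As an illustrative sketch, a reconnaissance hunt along these lines could look like the query below. The date, status codes, and paths are placeholders to adapt to your own investigation, and the field names follow the <code>http_requests</code> dataset:</p>
            <pre><code>SELECT ClientIP, ClientRequestPath, EdgeResponseStatus, COUNT(*) AS hits
FROM http_requests
WHERE date = '2026-02-22'
  AND (EdgeResponseStatus IN (401, 403, 404)
       OR ClientRequestPath IN ('/.env', '/.git', '/wp-admin'))
GROUP BY ClientIP, ClientRequestPath, EdgeResponseStatus
ORDER BY hits DESC
LIMIT 100</code></pre>
    <p>A single <code>ClientIP</code> accumulating hits across many distinct paths and error codes is the classic scanner fingerprint.</p>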
    <div>
      <h3>Check for diversions</h3>
      <a href="#check-for-diversions">
        
      </a>
    </div>
    <p>While the attacker is conducting reconnaissance, they may attempt to disguise this with a simultaneous network flood. Pivot to <code>network_analytics_logs</code> to see if a volumetric attack is being used as a smokescreen.</p>
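    <p>As a sketch, that pivot might look like the following. The column names here are illustrative placeholders rather than a confirmed <code>network_analytics_logs</code> schema, so check the dataset documentation before running it:</p>
            <pre><code>-- Field names are illustrative; confirm them against the dataset schema
SELECT AttackVector, DestinationPort, COUNT(*) AS packets
FROM network_analytics_logs
WHERE date = '2026-02-22'
GROUP BY AttackVector, DestinationPort
ORDER BY packets DESC
LIMIT 20</code></pre>
    <p>A sudden volumetric spike concentrated on a handful of ports, while reconnaissance traffic continues elsewhere, is a strong hint that the flood is a diversion.</p>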
    <div>
      <h3>Identify the approach </h3>
      <a href="#identify-the-approach">
        
      </a>
    </div>
    <p>Once attackers identify a potential vulnerability, they begin to craft their weapon. The attacker sends malicious payloads (e.g. SQL injection or large/corrupt file uploads) to confirm the vulnerability. Review <code>http_requests</code> and/or <code>fw_events</code> to identify any Cloudflare detection tools that have triggered. Cloudflare logs security signals in these datasets so you can easily identify requests with malicious payloads using fields such as <code>WAFAttackScore</code>, <code>WAFSQLiAttackScore</code>, <code>FraudAttack</code>, <code>ContentScanJobResults</code>, and several more. Review <a href="https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/http_requests/"><u>our documentation</u></a> to get a full understanding of these fields. The <code>fw_events</code> logs can be used to determine whether these requests made it past Cloudflare’s defenses by examining the <code>action</code>, <code>source</code>, and <code>ruleID</code> fields. Cloudflare’s managed rules block many of these payloads by default. Review the Application Security Overview to confirm that your application is protected.</p>
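    <p>As a hedged starting point, a triage query over <code>fw_events</code> using the <code>action</code>, <code>source</code>, and <code>ruleID</code> fields mentioned above might look like this (the date is a placeholder, and exact field casing may differ in your dataset):</p>
            <pre><code>SELECT ClientIP, action, source, ruleID
FROM fw_events
WHERE date = '2026-02-22'
  AND action != 'block'
LIMIT 100</code></pre>
    <p>Rows where a detection fired but the action was not a block are the requests most worth a closer look.</p>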
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1zpFguYrnbOPwyASGQCqZK/63f398acce2226e453a5eea1cc749241/image3.png" />
          </figure><p><sup><i>The Managed Rules insight shown on the Security Overview when the current zone does not have Managed Rules enabled</i></sup></p>
    <div>
      <h3>Audit the identity</h3>
      <a href="#audit-the-identity">
        
      </a>
    </div>
    <p>Did that suspicious IP manage to log in? Use the <code>ClientIP</code> to search <code>access_requests</code>. If you see a "<code>Decision: Allow</code>" for a sensitive internal app, you know you have a compromised account.</p>
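    <p>A minimal sketch of that lookup, with the IP as a placeholder (field names follow the <code>access_requests</code> examples later in this post):</p>
            <pre><code>SELECT Email, IPAddress, Allowed
FROM access_requests
WHERE date = '2026-02-22'
  AND IPAddress = 'INSERT_SUSPICIOUS_IP_HERE'
  AND Allowed = true</code></pre>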
    <div>
      <h3>Stop the leak (data exfiltration)</h3>
      <a href="#stop-the-leak-data-exfiltration">
        
      </a>
    </div>
    <p>Attackers sometimes use DNS tunneling to bypass firewalls by encoding sensitive data (like passwords or SSH keys) into DNS queries. Instead of a normal request like <code>google.com</code>, the logs will show long, encoded strings. Look for an unusually high volume of queries for unique, long, and high-entropy subdomains by examining these fields:</p><ul><li><p><code>QueryName</code>: Look for strings like <code>h3ldo293js92.example.com</code>.</p></li><li><p><code>QueryType</code>: Tunneling often uses <code>TXT</code>, <code>CNAME</code>, or <code>NULL</code> records to carry the payload.</p></li><li><p><code>ClientIP</code>: Identify whether a single internal host is generating thousands of these unique requests.</p></li></ul><p>Additionally, attackers may attempt to leak sensitive data by hiding it within non-standard protocols or by using common protocols (like DNS or ICMP) in unusual ways to bypass standard firewalls. Discover this by querying the <code>magic_ids_detections</code> logs to look for signatures that flag protocol anomalies, such as "ICMP tunneling" or "DNS tunneling" detections in the <code>SignatureMessage</code>.</p><p>Whether you are investigating a zero-day vulnerability or tracking a sophisticated botnet, the data you need is now at your fingertips.</p>
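    <p>A sketch of such a tunneling hunt over the <code>gateway_dns</code> dataset (the length threshold and date are illustrative, and this assumes standard SQL functions such as <code>LENGTH</code> and <code>COUNT(DISTINCT ...)</code> are available):</p>
            <pre><code>SELECT SrcIP, COUNT(DISTINCT QueryName) AS unique_names
FROM gateway_dns
WHERE date = '2026-02-22'
  AND LENGTH(QueryName) &gt; 50
GROUP BY SrcIP
ORDER BY unique_names DESC
LIMIT 20</code></pre>
    <p>A single internal host generating thousands of unique, very long query names is the tunneling signature described above.</p>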
    <div>
      <h2>Correlate across datasets</h2>
      <a href="#correlate-across-datasets">
        
      </a>
    </div>
    <p>Investigate malicious activity across multiple datasets by pivoting between multiple concurrent searches. With Log Explorer, you can now work with multiple queries simultaneously using the new Tabs feature. Switch between tabs to query different datasets, or pivot and refine queries by filtering directly from your query results.</p><div>
  
</div>
<p>When you correlate data across multiple Cloudflare log sources, you can detect sophisticated multi-stage attacks that appear benign when viewed in isolation. This cross-dataset analysis allows you to see the full attack chain from reconnaissance to exfiltration.</p>
    <div>
      <h3>Session hijacking (token theft)</h3>
      <a href="#session-hijacking-token-theft">
        
      </a>
    </div>
    <p><b>Scenario:</b> A user authenticates via Cloudflare Access, but their subsequent HTTP_request traffic looks like a bot.</p><p><b>Step 1:</b> Identify high-risk sessions in <code>http_requests</code>.</p>
            <pre><code>SELECT RayID, ClientIP, ClientRequestUserAgent, BotScore
FROM http_requests
WHERE date = '2026-02-22' 
  AND BotScore &lt; 20 
LIMIT 100</code></pre>
            <p><b>Step 2:</b> Copy the <code>RayID</code> and search <code>access_requests</code> to see which user account is associated with that suspicious bot activity.</p>
            <pre><code>SELECT Email, IPAddress, Allowed
FROM access_requests
WHERE date = '2026-02-22' 
  AND RayID = 'INSERT_RAY_ID_HERE'</code></pre>
            
    <div>
      <h3>Post-phishing C2 beaconing</h3>
      <a href="#post-phishing-c2-beaconing">
        
      </a>
    </div>
    <p><b>Scenario:</b> An employee clicked a link in a phishing email, compromising their workstation. The workstation sends a DNS query for a known malicious domain, then immediately triggers an IDS alert.</p><p><b>Step 1:</b> Find phishing attacks by examining <code>email_security_alerts</code> for violations. </p>
            <pre><code>SELECT Timestamp, Threatcategories, To, Alertreason
FROM email_security_alerts
WHERE date = '2026-02-22' 
  AND Threatcategories LIKE '%phishing%'</code></pre>
            <p><b>Step 2:</b> Use Access logs to correlate the user’s email (<code>To</code>) to their IP address.</p>
            <pre><code>SELECT Email, IPAddress
FROM access_requests
WHERE date = '2026-02-22' 
  AND Email = 'INSERT_EMAIL_FROM_PREVIOUS_QUERY'</code></pre>
            <p><b>Step 3:</b> Find internal IPs querying a specific malicious domain in <code>gateway_dns</code> logs.</p>
            <pre><code>SELECT SrcIP, QueryName, DstIP
FROM gateway_dns
WHERE date = '2026-02-22' 
  AND SrcIP = 'INSERT_IP_FROM_PREVIOUS_QUERY'
  AND QueryName LIKE '%malicious_domain_name%'</code></pre>
            
    <div>
      <h3>Lateral movement (Access → network probing)</h3>
      <a href="#lateral-movement-access-network-probing">
        
      </a>
    </div>
    <p><b>Scenario:</b> A user logs in via Zero Trust and then tries to scan the internal network.</p><p><b>Step 1:</b> Find successful logins from unexpected locations in <code>access_requests</code>.</p>
            <pre><code>SELECT IPAddress, Email, Country
FROM access_requests
WHERE date = '2026-02-22' 
  AND Allowed = true 
  AND Country != 'US' -- Replace with your HQ country</code></pre>
            <p><b>Step 2:</b> Check if that <code>IPAddress</code> is triggering network-level signatures in <code>magic_ids_detections</code>.</p>
            <pre><code>SELECT SignatureMessage, DestinationIP, Protocol
FROM magic_ids_detections
WHERE date = '2026-02-22' 
  AND SourceIP = 'INSERT_IP_ADDRESS_HERE'</code></pre>
            
    <div>
      <h3>Opening doors for more data </h3>
      <a href="#opening-doors-for-more-data">
        
      </a>
    </div>
    <p>From the beginning, Log Explorer was designed with extensibility in mind. Every dataset schema is defined using JSON Schema, a widely adopted standard for describing the structure and types of JSON data. This design decision has enabled us to easily expand beyond HTTP Requests and Firewall Events to the full breadth of Cloudflare's telemetry. The same schema-driven approach that powered our initial datasets scaled naturally to accommodate Zero Trust logs, network analytics, email security alerts, and everything in between.</p><p>More importantly, this standardization opens the door to ingesting data beyond Cloudflare's native telemetry. Because our ingestion pipeline is schema-driven rather than hard-coded, we're positioned to accept any structured data that can be expressed in JSON format. For security teams managing hybrid environments, this means Log Explorer could eventually serve as a single pane of glass, correlating Cloudflare's edge telemetry with logs from third-party sources, all queryable through the same SQL interface. While today's release focuses on completing coverage of Cloudflare's product portfolio, the architectural groundwork is laid for a future where customers can bring their own data sources with custom schemas.</p>
    <div>
      <h3>Faster data, faster response: architectural upgrades</h3>
      <a href="#faster-data-faster-response-architectural-upgrades">
        
      </a>
    </div>
    <p>To investigate a multi-vector attack effectively, timing is everything. A delay of even a few minutes in log availability can be the difference between proactive defense and reactive damage control.</p><p>That is why we have optimized our ingestion for better speed and resilience. By increasing concurrency in one part of our ingestion path, we have eliminated bottlenecks that could cause “noisy neighbor” issues, ensuring that one client’s data surge doesn’t slow down another’s visibility. This architectural work has reduced our P99 ingestion latency by approximately 55%, and our P50 by 25%, cutting the time it takes for an event at the edge to become available for your SQL queries.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/41M2eWP0BwrQFSZW4GzZbV/7a6139354abb561aba17e77d83beb17a/image4.png" />
          </figure><p><sup><i>Grafana chart displaying the drop in ingest latency after architectural upgrades</i></sup></p>
    <div>
      <h2>Follow along for more updates</h2>
      <a href="#follow-along-for-more-updates">
        
      </a>
    </div>
    <p>We're just getting started. We're actively working on even more powerful features to further enhance your experience with Log Explorer, including the ability to run these detection queries on a custom defined schedule. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2JIOu9PXDwVAVcmbgq456q/1eace4b5d38bb705e82442a4ee8045dc/Scheduled_Queries_List.png" />
          </figure><p><sup><i>Design mockup of upcoming Log Explorer Scheduled Queries feature</i></sup></p><p><a href="https://blog.cloudflare.com/"><u>Subscribe to the blog</u></a> and keep an eye out for more Log Explorer updates soon in our <a href="https://developers.cloudflare.com/changelog/product/log-explorer/"><u>Change Log</u></a>. </p>
    <div>
      <h2>Get access to Log Explorer</h2>
      <a href="#get-access-to-log-explorer">
        
      </a>
    </div>
    <p>To get access to Log Explorer, you can purchase it self-serve directly from the dashboard. Contract customers can reach out for a <a href="https://www.cloudflare.com/application-services/products/log-explorer/"><u>consultation</u></a> or contact their account manager. Additionally, you can read more in our <a href="https://developers.cloudflare.com/logs/log-explorer/"><u>Developer Documentation</u></a>.</p> ]]></content:encoded>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[R2]]></category>
            <category><![CDATA[Storage]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <guid isPermaLink="false">1hirraqs3droftHovXp1G6</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Claudio Jolowicz</dc:creator>
            <dc:creator>Nico Gutierrez</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare Log Explorer is now GA, providing native observability and forensics]]></title>
            <link>https://blog.cloudflare.com/logexplorer-ga/</link>
            <pubDate>Wed, 18 Jun 2025 13:00:00 GMT</pubDate>
            <description><![CDATA[ We are happy to announce the General Availability of Cloudflare Log Explorer, a powerful product designed to bring observability and forensics capabilities directly into your Cloudflare dashboard. ]]></description>
            <content:encoded><![CDATA[ <p>We are thrilled to announce the General Availability of <a href="http://cloudflare.com/application-services/products/log-explorer/"><u>Cloudflare Log Explorer</u></a>, a powerful new product designed to bring <a href="https://www.cloudflare.com/learning/performance/what-is-observability/">observability and forensics capabilities</a> directly into your Cloudflare dashboard. Built on the foundation of Cloudflare's vast <a href="https://www.cloudflare.com/network/"><u>global network</u></a>, Log Explorer leverages the unique position of our platform to provide a comprehensive and contextualized view of your environment.</p><p>Security teams and developers use Cloudflare to detect and mitigate threats in real-time and to optimize application performance. Over the years, users have asked for additional telemetry with full context to investigate security incidents or troubleshoot application performance issues without having to forward data to third party log analytics and Security Information and Event Management (SIEM) tools. Besides avoidable costs, forwarding data externally comes with other drawbacks such as: complex setups, delayed access to crucial data, and a frustrating lack of context that complicates quick mitigation. </p><p>Log Explorer has been previewed by several hundred customers over the last year, and they attest to its benefits: </p><blockquote><p><i>“Having WAF logs (firewall events) instantly available in Log Explorer with full context — no waiting, no external tools — has completely changed how we manage our firewall rules. I can spot an issue, adjust the rule with a single click, and immediately see the effect. It’s made tuning for false positives faster, cheaper, and far more effective.” </i></p></blockquote><blockquote><p><i>“While we use Logpush to ingest Cloudflare logs into our SIEM, when our development team needs to analyze logs, it can be more effective to utilize </i><b><i>Log Explorer</i></b><i>. 
SIEMs make it difficult for development teams to write their own queries and manipulate the console to see the logs they need. Cloudflare's Log Explorer, on the other hand, makes it much </i><b><i>easier</i></b><i> for dev teams to look at logs and directly search for the information they need.”</i></p></blockquote><p>With Log Explorer, customers have access to Cloudflare logs with all the context available within the Cloudflare platform. Compared to external tools, customers benefit from: </p><ul><li><p><b>Reduced cost and complexity:</b> Drastically reduce the expense and operational overhead associated with forwarding, storing, and analyzing terabytes of log data in external tools.</p></li><li><p><b>Faster detection and triage:</b> Access Cloudflare-native logs directly, eliminating cumbersome data pipelines and the ingest lags that delay critical security insights.</p></li><li><p><b>Accelerated investigations with full context:</b> Investigate incidents with Cloudflare's unparalleled contextual data, accelerating your analysis and understanding of "What exactly happened?" and "How did it happen?"</p></li><li><p><b>Minimal recovery time:</b> Seamlessly transition from investigation to action with direct mitigation capabilities via the Cloudflare platform.</p></li></ul><p>Log Explorer is available as an add-on product for customers on our self-serve or Enterprise plans. Read on to learn how each of the capabilities of Log Explorer can help you detect and diagnose issues more quickly.</p>
    <div>
      <h3>Monitor security and performance issues with custom dashboards</h3>
      <a href="#monitor-security-and-performance-issues-with-custom-dashboards">
        
      </a>
    </div>
    <p>Custom dashboards allow you to define the specific metrics you need in order to monitor unusual or unexpected activity in your environment.</p><p>Getting started is easy, with the ability to create a chart using natural language. A natural language interface is integrated into the chart create/edit experience, enabling you to describe in your own words the chart you want to create. Similar to the <a href="https://blog.cloudflare.com/security-analytics-ai-assistant/"><u>AI Assistant we announced during Security Week 2024</u></a>, the prompt translates your language to the appropriate chart configuration, which can then be added to a new or existing custom dashboard.</p><p>As an example, you can create a dashboard for monitoring for the presence of Remote Code Execution (RCE) attacks happening in your environment. An RCE attack is one in which an attacker compromises a machine in your environment and executes commands. The good news is that RCE is a detection available in Cloudflare WAF. In the dashboard example below, you can not only watch for RCE attacks, but also correlate them with other security events such as malicious content uploads, source IP addresses, and JA3/JA4 fingerprints. Such a scenario could mean one or more machines in your environment are compromised and being used to spread malware — surely, a very high-risk incident!</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1UWOHhIaIFiBTtnohdvbAx/40eeac0b52bc278d0687f7d48cd875fd/BLOG-2838_2.png" />
          </figure><p>A reliability engineer might want to create a dashboard for monitoring errors. They could use the natural language prompt to enter a query like “Compare HTTP status code ranges over time.” The AI model then decides the most appropriate visualization and constructs their chart configuration.</p><p>While you can create custom dashboards from scratch, you could also use an expert-curated dashboard template to jumpstart your security and performance monitoring. </p><p>Available templates include: </p><ul><li><p><b>Bot monitoring:</b> Identify automated traffic accessing your website</p></li><li><p><b>API Security:</b> Monitor the data transfer and exceptions of API endpoints within your application</p></li><li><p><b>API Performance:</b> See timing data for API endpoints in your application, along with error rates</p></li><li><p><b>Account Takeover:</b> View login attempts, usage of leaked credentials, and identify account takeover attacks</p></li><li><p><b>Performance Monitoring:</b> Identify slow hosts and paths on your origin server, and view <a href="https://blog.cloudflare.com/ttfb-is-not-what-it-used-to-be/">time to first byte (TTFB)</a> metrics over time</p></li><li><p><b>Security Monitoring:</b> Monitor attack distribution across top hosts and paths, and correlate DDoS traffic with origin response time to understand the impact of DDoS attacks.</p></li></ul>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3PO726Rhjol9khGOdMMnQJ/55462052782974b0fc5b0c885e42e41b/BLOG-2838_3.png" />
          </figure>
    <div>
      <h3>Investigate and troubleshoot issues with Log Search </h3>
      <a href="#investigate-and-troubleshoot-issues-with-log-search">
        
      </a>
    </div>
    <p>Continuing with the example from the prior section, after successfully diagnosing that some machines were compromised through the RCE issue, analysts can pivot over to Log Search in order to investigate whether the attacker was able to access and compromise other internal systems. To do that, the analyst could search logs from Zero Trust services, using context, such as compromised IP addresses from the custom dashboard, shown in the screenshot below: </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4iPrTc1ZtLU4ZxQWojvmje/d09bb0bf25bd17cea1d2f955371d991e/BLOG-2838_4.png" />
          </figure><p>Log Search offers a streamlined experience, including data type-aware search filters and the ability to switch to a custom SQL interface for more powerful queries. Log searches are also available via a <a href="https://developers.cloudflare.com/logs/log-explorer/"><u>public API</u></a>. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4AytV9wASU5kUuThnhl0CQ/de8c9f4b829e1ccfebdb33bd9522ae5b/BLOG-2838_5.png" />
          </figure>
    <div>
      <h3>Save time and collaborate with saved queries</h3>
      <a href="#save-time-and-collaborate-with-saved-queries">
        
      </a>
    </div>
    <p>Queries built in Log Search can now be saved for repeated use and are accessible to other Log Explorer users in your account. This makes it easier than ever to investigate issues together. </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4ouInu3nk7iZnAcJAs39F8/cc7ca6a61d19d3d9c1371ad2ca87e913/BLOG-2838_6.png" />
          </figure>
    <div>
      <h3>Monitor proactively with Custom Alerting (coming soon)</h3>
      <a href="#monitor-proactively-with-custom-alerting-coming-soon">
        
      </a>
    </div>
    <p>With custom alerting, you can configure alert policies to proactively monitor the indicators that matter to your business. </p><p>Starting from Log Search, define and test your query. From there, you can save it and configure a schedule interval and an alerting policy. The query will then run automatically on the schedule you define.</p>
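<p>Conceptually, the evaluation step of such a policy is a threshold check over each scheduled query run. The sketch below is illustrative only (the field names and notification shape are assumptions, not Log Explorer's implementation):</p>

```python
def breaching_intervals(rows, threshold):
    # rows: scheduled query results, e.g.
    #   {"time_interval": "2025-06-09T20:00:00", "error_rate_percentage": 12.5}
    return [
        row["time_interval"]
        for row in rows
        if row["error_rate_percentage"] > threshold
    ]

def evaluate_policy(rows, threshold, notify):
    # Fire the notifier (e.g. a webhook or PagerDuty call) once per breach.
    hits = breaching_intervals(rows, threshold)
    for interval in hits:
        notify(f"error rate above {threshold}% in interval {interval}")
    return hits
```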
    <div>
      <h4>Tracking error rate for a custom hostname</h4>
      <a href="#tracking-error-rate-for-a-custom-hostname">
        
      </a>
    </div>
    <p>If you want to monitor the error rate for a particular host, you can use this Log Search query to calculate the error rate per time interval:</p>
            <pre><code>SELECT SUBSTRING(EdgeStartTimestamp, 1, 14) || '00:00' AS time_interval,
       COUNT(*) AS total_requests,
       COUNT(CASE WHEN EdgeResponseStatus &gt;= 500 THEN 1 ELSE NULL END) AS error_requests,
       COUNT(CASE WHEN EdgeResponseStatus &gt;= 500 THEN 1 ELSE NULL END) * 100.0 / COUNT(*) AS error_rate_percentage
  FROM http_requests
 WHERE EdgeStartTimestamp &gt;= '2025-06-09T20:56:58Z'
   AND EdgeStartTimestamp &lt;= '2025-06-10T21:26:58Z'
   AND ClientRequestHost = 'customhostname.com'
 GROUP BY time_interval
 ORDER BY time_interval ASC;
</code></pre>
            <p>Running the above query returns the following results. You can see the overall error rate percentage in the far right column of the query results.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5v8SNmHt4OJrLSkiM2EKtJ/182c7f5709eef1fbb9e93c5423fc1bae/BLOG-2838_7.png" />
          </figure>
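<p>To see the aggregation logic in action locally, the same hourly error-rate calculation can be reproduced against a toy table in SQLite (the table and data here are made up for illustration; SQLite's <code>substr</code> stands in for <code>SUBSTRING</code>):</p>

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE http_requests (EdgeStartTimestamp TEXT, EdgeResponseStatus INTEGER)"
)
# Toy data: four requests in the 20:00 hour (one 502), one in the 21:00 hour.
con.executemany(
    "INSERT INTO http_requests VALUES (?, ?)",
    [
        ("2025-06-09T20:05:00Z", 200),
        ("2025-06-09T20:15:00Z", 502),
        ("2025-06-09T20:25:00Z", 200),
        ("2025-06-09T20:35:00Z", 200),
        ("2025-06-09T21:05:00Z", 200),
    ],
)

rows = con.execute(
    """
    SELECT substr(EdgeStartTimestamp, 1, 14) || '00:00' AS time_interval,
           COUNT(*) AS total_requests,
           COUNT(CASE WHEN EdgeResponseStatus >= 500 THEN 1 END) AS error_requests,
           COUNT(CASE WHEN EdgeResponseStatus >= 500 THEN 1 END) * 100.0 / COUNT(*)
               AS error_rate_percentage
      FROM http_requests
     GROUP BY time_interval
     ORDER BY time_interval ASC
    """
).fetchall()
# rows → [('2025-06-09T20:00:00', 4, 1, 25.0), ('2025-06-09T21:00:00', 1, 0, 0.0)]
```

<p>The first 14 characters of an RFC 3339 timestamp end just after the hour, so appending <code>'00:00'</code> buckets requests into hourly intervals.</p>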
    <div>
      <h4>Proactively detect malware</h4>
      <a href="#proactively-detect-malware">
        
      </a>
    </div>
    <p>We can identify malware in the environment by monitoring logs from <a href="https://developers.cloudflare.com/cloudflare-one/policies/gateway/">Cloudflare Secure Web Gateway</a>. As an example, <a href="https://www.broadcom.com/support/security-center/protection-bulletin/new-katz-stealer-malware-as-a-service-compromises-web-browsers"><u>Katz Stealer</u></a> is malware-as-a-service designed for stealing credentials. We can monitor DNS queries and HTTP requests from users within the company to identify any machines that may be infected with Katz Stealer. </p>
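<p>A minimal sketch of this kind of detection, assuming Gateway DNS-style log records with a query name and a device identifier, and an illustrative (not real) indicator list:</p>

```python
# Hypothetical indicator domains; in practice these come from threat intel feeds.
INDICATORS = {"katz-c2.example", "stealer-drop.example"}

def infected_devices(dns_logs):
    # dns_logs: iterable of dicts shaped like Gateway DNS log lines, e.g.
    #   {"QueryName": "katz-c2.example", "DeviceID": "laptop-42"}
    # Returns the set of devices that resolved a known-bad domain or a
    # subdomain of one.
    hits = set()
    for record in dns_logs:
        name = record["QueryName"].lower().rstrip(".")
        if name in INDICATORS or any(name.endswith("." + d) for d in INDICATORS):
            hits.add(record["DeviceID"])
    return hits
```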
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7jgBTCWYpnWoNrFh8xe6ki/306e644ec3753976315c16c9d1560eec/BLOG-2838_8.png" />
          </figure>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7jFwBxsk8rfD3VLAYkG2iA/ebd2ebcd95d12b40978f22bf1bc7be39/BLOG-2838_9.png" />
          </figure><p>With custom alerts, you can configure an alert policy to be notified via webhook or PagerDuty.</p>
    <div>
      <h3>Maintain audit &amp; compliance with flexible retention (coming soon)</h3>
      <a href="#maintain-audit-compliance-with-flexible-retention-coming-soon">
        
      </a>
    </div>
    <p>With flexible retention, you can set the precise length of time you want to store your logs, allowing you to meet specific compliance and audit requirements with ease. Other providers require archiving or hot and cold storage, making it difficult to query older logs. Log Explorer is built on top of our R2 storage tier, so historical logs can be queried as easily as current logs. </p>
    <div>
      <h3>How we built Log Explorer to run at Cloudflare scale</h3>
      <a href="#how-we-built-log-explorer-to-run-at-cloudflare-scale">
        
      </a>
    </div>
    <p>With Log Explorer, we have built a scalable log storage platform on top of <a href="https://www.cloudflare.com/developer-platform/products/r2/"><u>Cloudflare R2</u></a> that lets you efficiently search your Cloudflare logs using familiar SQL queries. In this section, we’ll look into how we did this and how we solved some technical challenges along the way.</p><p>Log Explorer consists of three components: ingestors, compactors, and queriers. Ingestors are responsible for writing logs from Cloudflare’s data pipeline to R2. Compactors optimize storage files so they can be queried more efficiently. Queriers execute SQL queries from users by fetching, transforming, and aggregating matching logs from R2.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1qEH0futV2are5GnT6vjta/e50c0ec4bbb1cacada117d31b71502e2/BLOG-2838_10.png" />
          </figure><p>During ingestion, Log Explorer writes each batch of log records to a Parquet file in R2. <a href="https://parquet.apache.org/"><u>Apache Parquet</u></a> is an open-source columnar storage file format, and it was an obvious choice for us: it’s optimized for efficient data storage and retrieval, such as by embedding metadata like the minimum and maximum values of each column across the file, which enables the queriers to quickly locate the data needed to serve a query.</p><p>Log Explorer stores logs on a per-customer level, just like Cloudflare D1, so that your data isn't mixed with that of other customers. In Q3 2025, per-customer logs will allow you the flexibility to create your own retention policies and decide in which regions you want to store your data.</p><p>But how does Log Explorer find those Parquet files when you query your logs? Log Explorer leverages the <a href="https://databricks.com/wp-content/uploads/2020/08/p975-armbrust.pdf"><u>Delta Lake</u></a> open table format to provide a database table abstraction atop R2 object storage. A table in Delta Lake pairs data files in Parquet format with a transaction log. The transaction log registers every addition, removal, or modification of a data file for the table – it’s stored right next to the data files in R2.</p><p>Given a SQL query for a particular log dataset such as <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/http_requests/"><u>HTTP Requests</u></a> or <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_dns/"><u>Gateway DNS</u></a>, Log Explorer first has to load the transaction log of the corresponding Delta table from R2. Transaction logs are checkpointed periodically to avoid having to read the entire table history every time a user queries their logs.</p><p>Besides listing Parquet files for a table, the transaction log also includes per-column min/max statistics for each Parquet file. This has the benefit that Log Explorer only needs to fetch files from R2 that can possibly satisfy a user query. Finally, queriers use the min/max statistics embedded in each Parquet file to decide which row groups to fetch from the file.</p><p>Log Explorer processes SQL queries using <a href="https://arrow.apache.org/datafusion/"><u>Apache DataFusion</u></a>, a fast, extensible query engine written in Rust, and <a href="https://github.com/delta-io/delta-rs"><u>delta-rs</u></a>, a community-driven Rust implementation of the Delta Lake protocol. 
While standing on the shoulders of giants, our team had to solve some unique problems to provide log search at Cloudflare scale.</p><p>Log Explorer ingests logs from across Cloudflare’s vast global network, <a href="https://www.cloudflare.com/network"><u>spanning more than 330 cities in over 125 countries</u></a>. If Log Explorer were to write logs from our servers straight to R2, its storage would quickly fragment into a myriad of small files, rendering log queries prohibitively expensive.</p><p>Log Explorer’s strategy to avoid this fragmentation is threefold. First, it leverages Cloudflare’s data pipeline, which collects and batches logs from the edge, ultimately buffering each stream of logs in an internal system named <a href="https://blog.cloudflare.com/cloudflare-incident-on-november-14-2024-resulting-in-lost-logs/"><u>Buftee</u></a>. Second, log batches ingested from Buftee aren’t immediately committed to the transaction log; rather, Log Explorer stages commits for multiple batches in an intermediate area and “squashes” these commits before they’re written to the transaction log. Third, once log batches have been committed, a process called compaction merges them into larger files in the background.</p><p>While the open-source implementation of Delta Lake provides compaction out of the box, we soon encountered an issue when using it for our workloads. Stock compaction merges data files to a desired target size S by sorting the files in reverse order of their size and greedily filling bins of size S with them. By merging logs irrespective of their timestamps, this process distributed ingested batches randomly across merged files, destroying data locality. 
Despite compaction, a user querying for a specific time frame would still end up fetching hundreds or thousands of files from R2.</p><p>For this reason, we wrote a custom compaction algorithm that merges ingested batches in order of their minimum log timestamp, leveraging the min/max statistics mentioned previously. This algorithm reduced the number of overlaps between merged files by two orders of magnitude. As a result, we saw a significant improvement in query performance, with some large queries that had previously taken over a minute completing in just a few seconds.</p>
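<p>The difference between the two strategies can be sketched in a few lines of Python (a toy model, not the production algorithm): each ingested batch carries min/max timestamp statistics, and compaction greedily fills bins up to a target size, either in reverse size order (stock) or in minimum-timestamp order (custom):</p>

```python
def compact(batches, target_size, by_time):
    # Greedily bin-pack batches into merged files of roughly target_size.
    # batches: list of (min_ts, max_ts, size) tuples for ingested files.
    # by_time: sort by minimum timestamp (custom) vs. reverse size (stock).
    key = (lambda b: b[0]) if by_time else (lambda b: -b[2])
    merged, current, current_size = [], [], 0
    for batch in sorted(batches, key=key):
        if current and current_size + batch[2] > target_size:
            merged.append(current)
            current, current_size = [], 0
        current.append(batch)
        current_size += batch[2]
    if current:
        merged.append(current)
    # Return the min/max timestamp range of each merged file.
    return [
        (min(b[0] for b in group), max(b[1] for b in group))
        for group in merged
    ]

def overlaps(ranges):
    # Count pairs of merged files whose time ranges strictly overlap.
    return sum(
        1
        for i, (lo1, hi1) in enumerate(ranges)
        for lo2, hi2 in ranges[i + 1:]
        if lo1 < hi2 and lo2 < hi1
    )
```

<p>With batches arriving in time order but varying in size, time-ordered packing yields merged files whose time ranges barely overlap, so a query for a narrow time window has to fetch far fewer files.</p>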
    <div>
      <h3>Follow along for more updates</h3>
      <a href="#follow-along-for-more-updates">
        
      </a>
    </div>
    <p>We're just getting started! We're actively working on even more powerful features to further enhance your experience with Log Explorer. <a href="https://blog.cloudflare.com/"><u>Subscribe to the blog</u></a> and keep an eye on our <a href="https://developers.cloudflare.com/changelog/"><u>Change Log</u></a> for more updates to our observability and forensics offering.</p>
    <div>
      <h3>Get access to Log Explorer</h3>
      <a href="#get-access-to-log-explorer">
        
      </a>
    </div>
    <p>To get started with Log Explorer, <a href="https://www.cloudflare.com/application-services/products/log-explorer/">sign up here</a> or contact your account manager. You can also read more in our <a href="https://developers.cloudflare.com/logs/log-explorer/"><u>Developer Documentation</u></a>.</p>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Developer Platform]]></category>
            <category><![CDATA[Developers]]></category>
            <category><![CDATA[Security]]></category>
            <guid isPermaLink="false">kg7dxMzYcRnJdVFrxQmCw</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Claudio Jolowicz</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare enables native monitoring and forensics with Log Explorer and custom dashboards]]></title>
            <link>https://blog.cloudflare.com/monitoring-and-forensics/</link>
            <pubDate>Tue, 18 Mar 2025 13:00:00 GMT</pubDate>
            <description><![CDATA[ Today we are excited to announce support for Zero Trust datasets, and custom dashboards where customers can monitor critical metrics for suspicious or unusual activity.  ]]></description>
            <content:encoded><![CDATA[ <p>In 2024, we <a href="https://blog.cloudflare.com/log-explorer/"><u>announced Log Explorer</u></a>, giving customers the ability to store and query their HTTP and security event logs natively within the Cloudflare network. Today, we are excited to announce that Log Explorer now supports logs from our Zero Trust product suite. In addition, customers can create custom dashboards to monitor suspicious or unusual activity.</p><p>Every day, Cloudflare detects and protects customers against billions of threats, including DDoS attacks, bots, web application exploits, and more. SOC analysts, who are charged with keeping their companies safe from the growing spectre of Internet threats, may want to investigate these threats to gain additional insights on attacker behavior and protect against future attacks. Log Explorer, by collecting logs from various Cloudflare products, provides a single starting point for investigations. As a result, analysts can avoid forwarding logs to other tools, maximizing productivity and minimizing costs. Further, analysts can monitor signals specific to their organizations using custom dashboards.</p>
    <div>
      <h2>Zero Trust dataset support in Log Explorer</h2>
      <a href="#zero-trust-dataset-support-in-log-explorer">
        
      </a>
    </div>
    <p>Log Explorer stores your Cloudflare logs for a 30-day retention period so that you can analyze them natively and in a single interface, within the Cloudflare Dashboard. Cloudflare log data is diverse, reflecting the breadth of capabilities available. For example, HTTP requests contain information about the client such as their IP address, request method, <a href="https://www.cloudflare.com/learning/network-layer/what-is-an-autonomous-system/"><u>autonomous system (ASN)</u></a>, request paths, and TLS versions used. Additionally, Cloudflare’s Application Security <a href="https://developers.cloudflare.com/waf/detections/"><u>WAF Detections</u></a> enrich these HTTP request logs with additional context, such as the <a href="https://developers.cloudflare.com/waf/detections/attack-score/"><u>WAF attack score</u></a>, to identify threats.</p><p>Today we are announcing that seven additional Cloudflare product datasets are now available in Log Explorer. These seven datasets are the logs generated from our Zero Trust product suite, and include logs from <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/access_requests/"><u>Access</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_dns/"><u>Gateway DNS</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_http/"><u>Gateway HTTP</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_network/"><u>Gateway Network</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/casb_findings/"><u>CASB</u></a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/zero_trust_network_sessions/"><u>Zero Trust Network Session</u></a>, and <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/device_posture_results/"><u>Device Posture Results</u></a>. Read on for examples of how to use these logs to identify common threats.</p>
    <div>
      <h3>Investigating unauthorized access</h3>
      <a href="#investigating-unauthorized-access">
        
      </a>
    </div>
    <p>By reviewing Access logs and HTTP request logs, we can reveal attempts to access resources or systems without proper permissions, including brute force password attacks, indicating potential security breaches or malicious activity.</p><p>Below, we filter Access Logs on the <code>Allowed</code> field, to see activity related to unauthorized access.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2piOIdnNz9OWskJqrZJfcf/f88673fc184c23de493920661020e7b3/access_requests.png" />
          </figure><p>By then reviewing the HTTP logs for the requests identified in the previous query, we can assess if bot networks are the source of unauthorized activity.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4b38nYNdpLbmHFt0BHkapa/88e1acf82d8bbc257a7cbbe102cbd723/http_requests.png" />
          </figure><p>With this information, you can craft targeted <a href="https://developers.cloudflare.com/waf/custom-rules/"><u>Custom Rules</u></a> to block the offending traffic. </p>
    <div>
      <h3>Detecting malware</h3>
      <a href="#detecting-malware">
        
      </a>
    </div>
    <p>Cloudflare's <a href="https://developers.cloudflare.com/cloudflare-one/policies/gateway/"><u>Web Gateway</u></a> can track which websites users are accessing, allowing administrators to identify and block access to malicious or inappropriate sites. These logs can be used to detect if a user’s machine or account is compromised by malware attacks. When reviewing logs, this may become apparent when we look for records that show a rapid succession of attempts to browse known malicious sites, such as hostnames that have long strings of seemingly random characters that hide their true destination. In this example, we can query logs looking for requests to a spoofed YouTube URL.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Nkm4udjUw9tmzPk0Fk1eK/524dc1a6d4070a1f6cc9478e09b67ffd/gateway_requests.png" />
          </figure>
    <div>
      <h2>Monitoring what matters using custom dashboards</h2>
      <a href="#monitoring-what-matters-using-custom-dashboards">
        
      </a>
    </div>
    <p>Security monitoring is not one size fits all. For instance, companies in the retail or financial industries worry about fraud, while every company is concerned about data exfiltration of information like trade secrets. And any form of personally identifiable information (PII) is a target for data breaches or ransomware attacks.</p><p>While log exploration helps you react to threats, our new custom dashboards allow you to define the specific metrics you need to monitor the threats you are concerned about. </p><p>Getting started is easy, with the ability to create a chart using natural language. A natural language interface is integrated into the chart create/edit experience, enabling you to describe in your own words the chart you want to create. Similar to the <a href="https://blog.cloudflare.com/security-analytics-ai-assistant/"><u>AI Assistant</u></a> we announced during Security Week 2024, the prompt translates your language to the appropriate chart configuration, which can then be added to a new or existing custom dashboard.</p><ul><li><p><b>Use a prompt</b>: Enter a query like “Compare status code ranges over time”. The AI model decides the most appropriate visualization and constructs your chart configuration.</p></li><li><p><b>Customize your chart</b>: Select the chart elements manually, including the chart type, title, dataset to query, metrics, and filters. This option gives you full control over your chart’s structure. </p></li></ul><div>
  
</div>
<br /><p><sup><i>Video shows entering a natural language description of desired metric “compare status code ranges over time”, preview chart shown is a time series grouped by error code ranges, selects “add chart” to save to dashboard.</i></sup></p><p>For more help getting started, we have some pre-built templates that you can use for monitoring specific uses. Available templates currently include: </p><ul><li><p><b>Bot monitoring</b>: Identify automated traffic accessing your website</p></li><li><p><b>API Security:</b> Monitor the data transfer and exceptions of API endpoints within your application</p></li><li><p><b>API Performance</b>: See timing data for API endpoints in your application, along with error rates</p></li><li><p><b>Account Takeover:</b> View login attempts, usage of leaked credentials, and identify account takeover attacks</p></li><li><p><b>Performance Monitoring</b>: Identify slow hosts and paths on your origin server, and view <a href="https://blog.cloudflare.com/ttfb-is-not-what-it-used-to-be/"><u>time to first byte (TTFB)</u></a> metrics over time</p></li></ul><p>Templates provide a good starting point, and once you create your dashboard, you can add or remove individual charts using the same natural language chart creator. </p><div>
  
</div>
<br /><p><sup><i>Video shows editing chart from an existing dashboard and moving individual charts via drag and drop.</i></sup></p>
    <div>
      <h3>Example use cases</h3>
      <a href="#example-use-cases">
        
      </a>
    </div>
    <p>Custom dashboards can be used to monitor for suspicious activity, or to keep an eye on performance and errors for your domains. Let’s explore some examples of suspicious activity that we can monitor using custom dashboards.</p><p>Take, for example, our use case from above: investigating unauthorized access. With custom dashboards, you can create a dashboard using the <b>Account takeover</b> template to monitor for suspicious login activity related to your domain.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/72KBaEdr0bEn4SNwKOfPfJ/e28997b94630cf856d3924e9ba443063/image7.png" />
          </figure><p>As another example, spikes in requests or errors are common indicators that something is wrong, and they can sometimes be signals of suspicious activity. With the Performance Monitoring template, you can view origin response time and time to first byte metrics as well as monitor for common errors. For example, in this chart, the spikes in 404 errors could be an indication of an unauthorized scan of your endpoints.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3krBxVm8dB5pr5XEoHnVtK/44f682436c3d5a63baa1105987347433/image1.jpg" />
          </figure>
    <div>
      <h3>Seamlessly integrated into the Cloudflare platform</h3>
      <a href="#seamlessly-integrated-into-the-cloudflare-platform">
        
      </a>
    </div>
    <p>When using custom dashboards, if you observe a traffic pattern or spike in errors that you would like to further investigate, you can click the button to “View in Security Analytics” in order to drill down further into the data and craft custom WAF rules to mitigate the threat.  </p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5XfvQ24bvDmnNKeInyA8eU/e96798a72e55fa454439f8b85197e02b/image2.png" />
          </figure><p>These tools, seamlessly integrated into the Cloudflare platform, will enable users to discover, investigate, and mitigate threats all in one place, reducing time to resolution and overall cost of ownership by eliminating the need to forward logs to third party security analysis tools. And because it is a native part of Cloudflare, you can immediately use the data from your investigation to craft targeted rules that will block these threats. </p>
    <div>
      <h2>What’s next</h2>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>Stay tuned as we continue to develop more capabilities in the areas of <a href="https://www.cloudflare.com/learning/performance/what-is-observability/">observability and forensics</a>, with additional features including: </p><ul><li><p><b>Custom alerts</b>: create alerts based on specific metrics or anomalies</p></li><li><p><b>Scheduled query detections</b>: craft log queries and run them on a schedule to detect malicious activity</p></li><li><p><b>More integration</b>: further streamlining the journey between detect, investigate, and mitigate across the full Cloudflare platform.</p></li></ul>
    <div>
      <h2>How to get it</h2>
      <a href="#how-to-get-it">
        
      </a>
    </div>
    <p>Current Log Explorer beta users get immediate access to the new custom dashboards feature. Pricing will be announced during Q2 2025; until then, these features remain available at no cost.</p><p>Let us know if you are interested in joining our Beta program by completing <a href="https://www.cloudflare.com/lp/log-explorer/"><u>this form</u></a>, and a member of our team will contact you.</p>
    <div>
      <h2>Watch on Cloudflare TV</h2>
      <a href="#watch-on-cloudflare-tv">
        
      </a>
    </div>
    <div>
  
</div><p></p> ]]></content:encoded>
            <category><![CDATA[Security Week]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <guid isPermaLink="false">76XBFojN0mhfyCoz6VRe1G</guid>
            <dc:creator>Jen Sells</dc:creator>
        </item>
        <item>
            <title><![CDATA[Log Explorer: monitor security events without third-party storage]]></title>
            <link>https://blog.cloudflare.com/log-explorer/</link>
            <pubDate>Fri, 08 Mar 2024 14:05:00 GMT</pubDate>
            <description><![CDATA[ With the combined power of Security Analytics + Log Explorer, security teams can analyze, investigate, and monitor for security attacks natively within Cloudflare, reducing time to resolution and overall cost of ownership for customers by eliminating the need to forward logs to third-party SIEMs ]]></description>
            <content:encoded><![CDATA[ <p></p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1GhVBYNZAsGZtOfgo8C3VY/42fc180d060574162071cbdd13ad6a88/image6-6.png" />
            
            </figure><p>Today, we are excited to announce beta availability of <a href="https://developers.cloudflare.com/logs/log-explorer/">Log Explorer</a>, which allows you to investigate your HTTP and Security Event logs directly from the Cloudflare Dashboard. Log Explorer is an extension of <a href="/security-analytics">Security Analytics</a>, giving you the ability to review related raw logs. You can analyze, investigate, and monitor for security attacks natively within the Cloudflare Dashboard, reducing time to resolution and overall cost of ownership by eliminating the need to forward logs to third party security analysis tools.</p>
    <div>
      <h3>Background</h3>
      <a href="#background">
        
      </a>
    </div>
    <p>Security Analytics enables you to analyze all of your HTTP traffic in one place, giving you the security lens you need to identify and act upon what matters most: potentially malicious traffic that has not been mitigated. Security Analytics includes built-in views such as top statistics and in-context quick filters on an intuitive page layout that enables rapid exploration and validation.</p><p>In order to power our rich analytics dashboards with fast query performance, we implemented <a href="https://developers.cloudflare.com/analytics/graphql-api/sampling/">data sampling</a> using <a href="/explaining-cloudflares-abr-analytics">Adaptive Bit Rate</a> (ABR) analytics. This is a great fit for providing high level aggregate views of the data. However, we received feedback from many Security Analytics power users that sometimes they need access to a more granular view of the data — they need logs.</p><p>Logs provide critical visibility into the operations of today's computer systems. Engineers and SOC analysts rely on logs every day to troubleshoot issues, identify and investigate security incidents, and tune the performance, reliability, and <a href="https://www.cloudflare.com/application-services/solutions/">security</a> of their applications and infrastructure. Traditional metrics or monitoring solutions provide aggregated or statistical data that can be used to identify trends. Metrics are wonderful at identifying THAT an issue happened, but lack the detailed events to help engineers uncover WHY it happened. Engineers and SOC Analysts rely on raw log data to answer questions such as:</p><ul><li><p>What is causing this increase in 403 errors?</p></li><li><p>What data was accessed by this IP address?</p></li><li><p>What was the user experience of this particular user’s session?</p></li></ul><p>Traditionally, these engineers and analysts would stand up a collection of various monitoring tools in order to capture logs and get this visibility. 
With more organizations using multiple clouds, or a hybrid environment with both cloud and on-premise tools and architecture, it is crucial to have a unified platform to regain visibility into this increasingly complex environment.  As more and more companies are moving towards a cloud native architecture, we see Cloudflare’s <a href="https://www.cloudflare.com/en-gb/learning/cloud/what-is-a-connectivity-cloud/">connectivity cloud</a> as an integral part of their performance and security strategy.</p><p>Log Explorer provides a lower cost option for storing and exploring log data within Cloudflare. Until today, we have offered the ability to export logs to expensive third party tools, and now with Log Explorer, you can quickly and easily explore your log data without leaving the Cloudflare Dashboard.</p>
    <div>
      <h3>Log Explorer Features</h3>
      <a href="#log-explorer-features">
        
      </a>
    </div>
    <p>Whether you're a SOC Engineer investigating potential incidents, or a Compliance Officer with specific log retention requirements, Log Explorer has you covered. It stores your Cloudflare logs for an uncapped and customizable period of time, making them accessible natively within the Cloudflare Dashboard. The supported features include:</p><ul><li><p>Searching through your HTTP Request or Security Event logs</p></li><li><p>Filtering based on any field and a number of standard operators</p></li><li><p>Switching between basic filter mode or SQL query interface</p></li><li><p>Selecting fields to display</p></li><li><p>Viewing log events in tabular format</p></li><li><p>Finding the HTTP request records associated with a Ray ID</p></li></ul>
    <div>
      <h3>Narrow in on unmitigated traffic</h3>
      <a href="#narrow-in-on-unmitigated-traffic">
        
      </a>
    </div>
    <p>As a SOC analyst, your job is to monitor and respond to threats and incidents within your organization’s network. Using Security Analytics, and now with Log Explorer, you can identify anomalies and conduct a forensic investigation all in one place.</p><p>Let’s walk through an example to see this in action:</p><p>On the Security Analytics dashboard, you can see in the Insights panel that there is some traffic that has been tagged as a likely attack, but not mitigated.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5Oq3oqY8JXMigK8OJKPFZ4/5d3a8751a56f06f58e96538f1d46a480/Screenshot-2024-03-07-at-20.20.41.png" />
            
            </figure><p>Clicking the filter button narrows in on these requests for further investigation.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7sWkjUYz1J0So4nphy4FSs/769d5ebb0b706a073a616b706783030c/image11.jpg" />
            
            </figure><p>In the sampled logs view, you can see that most of these requests are coming from a common client IP address.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5gtrP14GKbnB0YeV0ySgL1/6b476ec9d19e255912315eac9730604d/Sampled-logs.png" />
            
            </figure><p>You can also see that Cloudflare has flagged all of these requests as bot traffic. With this information, you can craft a WAF rule to either block all traffic from this IP address, or block all traffic with a bot score lower than 10.</p>
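<p>Such a rule can be expressed in Cloudflare's rules language. A hedged sketch, assuming Bot Management is enabled on the zone and using a placeholder IP address:</p>
<pre><code>(ip.src eq 203.0.113.100) or (cf.bot_management.score lt 10)</code></pre>
<p>Here <code>ip.src</code> and <code>cf.bot_management.score</code> are standard rule fields; blocking on either condition covers both observations from the sampled logs.</p>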
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2YgFTRD7u3KYh0bbInLylK/f076a70bc09f41d8ace0569bca172b39/Screenshot-2024-03-07-at-20.22.04.png" />
            
            </figure><p>Let’s say that the Compliance Team would like to gather documentation on the scope and impact of this attack. We can dig further into the logs during this time period to see everything that this attacker attempted to access.</p><p>First, we can use Log Explorer to query HTTP requests from the suspect IP address during the time range of the spike seen in Security Analytics.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/qsi7UnxjtygCQHnMCIx02/cda0aacf6d783b05c15196f27907c611/Log-Explorer.png" />
            
            </figure>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/YyewPADkPzjofXHXSihyi/e494ca5d4b9a3ad7071d5d3e27f57887/Query-results.png" />
            
            </figure><p>We can also review whether the attacker was able to <a href="https://www.cloudflare.com/learning/security/what-is-data-exfiltration/">exfiltrate</a> data by adding the OriginResponseBytes field and updating the query to show requests with OriginResponseBytes &gt; 0. The results show that no data was exfiltrated.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3TgN2aPwWTDo5niA95TGQx/71143be92550dfea1ce284507fa688ac/No-logs-found.png" />
            
            </figure>
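<p>In SQL terms, the exfiltration check described above looks roughly like the following sketch. The column names follow the HTTP requests dataset; the table name, IP address, and time range are illustrative placeholders:</p>
<pre><code>SELECT ClientIP, ClientRequestPath, OriginResponseBytes
FROM http_requests
WHERE ClientIP = '203.0.113.100'
  AND EdgeStartTimestamp BETWEEN '2024-03-07 20:00:00' AND '2024-03-07 21:00:00'
  AND OriginResponseBytes &gt; 0
LIMIT 100</code></pre>
<p>An empty result set, as in the screenshot above, indicates that no origin bytes were returned to the suspect client during the window.</p>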
    <div>
      <h3>Find and investigate false positives</h3>
      <a href="#find-and-investigate-false-positives">
        
      </a>
    </div>
    <p>With access to the full logs via Log Explorer, you can now search for specific requests.</p><p>A 403 error occurs when a user’s request to a particular site is blocked. Cloudflare’s security products use signals such as <a href="/introducing-ip-lists/">IP reputation</a> and ML-based <a href="/stop-attacks-before-they-are-known-making-the-cloudflare-waf-smarter/">WAF attack scores</a> to assess whether a given HTTP request is malicious. This is extremely effective, but sometimes a legitimate request is mistakenly flagged as malicious and blocked.</p><p>In these situations, we can now use Log Explorer to identify these requests, determine why they were blocked, and adjust the relevant WAF rules accordingly.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5ecKfF4lRDQSyLCoOgDjTw/6f0872962caff8d719958e6fdfcc8dbc/Log-Explorer-2.png" />
            
            </figure><p>Or, if you are interested in tracking down a specific request by Ray ID, an identifier given to every request that goes through Cloudflare, you can do that via Log Explorer with one query.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3NCDOP0axFC3wZ4qs03Jei/86286da2fd9fca3cdb68214c1d0f472a/Log-Explorer-3.png" />
            
            </figure><p>Note that the LIMIT clause is included in the query by default, but it has no practical effect on Ray ID lookups: each Ray ID is unique, so a query filtering on RayID returns at most one record.</p>
    <div>
      <h3>How we built Log Explorer</h3>
      <a href="#how-we-built-log-explorer">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3pNBd5iSyVW7aqfJ0JfYDP/89cd0341cc35ad9d86fd8687ba7a9147/How-we-built-Log-Explorer.png" />
            
            </figure><p>With Log Explorer, we have built a long-term, append-only log storage platform on top of <a href="https://www.cloudflare.com/developer-platform/r2/">Cloudflare R2</a>. Log Explorer leverages the <a href="https://databricks.com/wp-content/uploads/2020/08/p975-armbrust.pdf">Delta Lake</a> protocol, an open-source storage framework for building highly performant, <a href="https://en.wikipedia.org/wiki/ACID">ACID</a>-compliant databases atop a cloud object store. In other words, Log Explorer combines a large and cost-effective storage system – <a href="https://www.cloudflare.com/developer-platform/r2/">Cloudflare R2</a> – with the benefits of strong consistency and high performance. Additionally, Log Explorer gives you a SQL interface to your Cloudflare logs.</p><p>Each Log Explorer dataset is stored on a per-customer level, just like Cloudflare D1, so that your data isn't placed with that of other customers. In the future, this single-tenant storage model will give you the flexibility to create your own retention policies and decide in which regions you want to store your data.</p><p>Under the hood, the datasets for each customer are stored as Delta tables in R2 buckets. A <i>Delta table</i> is a storage format that organizes Apache Parquet objects into directories using Hive's partitioning naming convention. Crucially, Delta tables pair these storage objects with an append-only, checkpointed transaction log. This design allows Log Explorer to support multiple writers with optimistic concurrency.</p><p>Many of the products Cloudflare builds are a direct result of the challenges our own team is looking to address. Log Explorer is a perfect example of this <a href="/tag/dogfooding">culture of dogfooding</a>. Optimistic concurrent writes require atomic updates in the underlying object store, and as a result of our needs, R2 added a PutIfAbsent operation with strong consistency. Thanks, R2! The atomic operation sets Log Explorer apart from Delta Lake solutions based on Amazon Web Services’ S3, which incur the operational burden of using an <a href="https://delta.io/blog/2022-05-18-multi-cluster-writes-to-delta-lake-storage-in-s3/">external store</a> for synchronizing writes.</p><p>Log Explorer is written in the Rust programming language using open-source libraries, such as <a href="https://github.com/delta-io/delta-rs">delta-rs</a>, a native Rust implementation of the Delta Lake protocol, and <a href="https://arrow.apache.org/datafusion/">Apache Arrow DataFusion</a>, a very fast, extensible query engine. At Cloudflare, Rust has emerged as a popular choice for new product development due to its safety and performance benefits.</p>
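<p>The transaction-log mechanics described above can be sketched in a few lines of Python. This is a simplified, hypothetical model, not Log Explorer's code: each commit is a numbered log object, and an atomic put-if-absent decides which writer wins a given version.</p>

```python
class ObjectStore:
    """Toy in-memory object store with an atomic put-if-absent,
    standing in for R2's PutIfAbsent operation (illustrative only)."""

    def __init__(self):
        self._objects = {}

    def put_if_absent(self, key, value):
        if key in self._objects:
            return False  # another writer already created this object
        self._objects[key] = value
        return True


def commit(store, table, entry, version, max_retries=10):
    """Optimistically append an entry to the Delta transaction log:
    try to create the next numbered log object atomically; if another
    writer got there first, advance the version and retry. (Real Delta
    writers also re-read the log and rebase before retrying.)"""
    for _ in range(max_retries):
        key = f"{table}/_delta_log/{version:020d}.json"
        if store.put_if_absent(key, entry):
            return version
        version += 1
    raise RuntimeError("too much write contention")


store = ObjectStore()
v1 = commit(store, "logs", '{"add": "part-00000.parquet"}', version=0)
v2 = commit(store, "logs", '{"add": "part-00001.parquet"}', version=0)
print(v1, v2)  # 0 1
```

<p>The second writer starts from a stale version, loses the race for version 0, and safely commits as version 1 — no external lock service required.</p>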
    <div>
      <h3>What’s next</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>We know that application security logs are only part of the puzzle in understanding what’s going on in your environment. Stay tuned for future developments including tighter, more seamless integration between Analytics and Log Explorer, the addition of more datasets including Zero Trust logs, the ability to define custom retention periods, and integrated custom alerting.</p><p>Please use the <a href="https://forms.gle/tvKQDdXmCk98zyV9A">feedback link</a> to let us know how Log Explorer is working for you and what else would help make your job easier.</p>
    <div>
      <h3>How to get it</h3>
      <a href="#how-to-get-it">
        
      </a>
    </div>
    <p>We’d love to hear from you! If you are interested in joining our Beta program, complete <a href="https://cloudflare.com/lp/log-explorer/">this form</a> and a member of our team will contact you.</p><p>Pricing will be finalized prior to a General Availability (GA) launch.</p><div>
  
</div><p>Tune in for more news, announcements and thought-provoking discussions! Don't miss the full <a href="https://cloudflare.tv/shows/security-week">Security Week hub page</a>.</p> ]]></content:encoded>
            <category><![CDATA[Security Week]]></category>
            <category><![CDATA[Analytics]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Connectivity Cloud]]></category>
            <guid isPermaLink="false">3K5UjFarMC09kkM507HshK</guid>
            <dc:creator>Jen Sells</dc:creator>
            <dc:creator>Claudio Jolowicz</dc:creator>
            <dc:creator>Cole MacKenzie</dc:creator>
        </item>
        <item>
            <title><![CDATA[Enhancing security analysis with Cloudflare Zero Trust logs and Elastic SIEM]]></title>
            <link>https://blog.cloudflare.com/enhancing-security-analysis-with-cloudflare-zero-trust-logs-and-elastic-siem/</link>
            <pubDate>Thu, 22 Feb 2024 14:00:26 GMT</pubDate>
            <description><![CDATA[ Today, we are thrilled to announce new Cloudflare Zero Trust dashboards on Elastic. Shared customers using Elastic can now use these pre-built dashboards to store, search, and analyze their Zero Trust logs ]]></description>
            <content:encoded><![CDATA[
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/696ov5uPtgNwN7Qm735ESm/6f88ef27e4cacb8057d6e600fd20d378/image3-7.png" />
            
            </figure><p>Today, we are thrilled to announce new Cloudflare Zero Trust dashboards on Elastic. Shared customers using Elastic can now use these pre-built <a href="https://docs.elastic.co/integrations/cloudflare_logpush#zero-trust-events">dashboards to store, search, and analyze</a> their Zero Trust logs.</p><p>When organizations look to adopt a <a href="https://www.cloudflare.com/learning/security/glossary/what-is-zero-trust/">Zero Trust architecture</a>, there are many components to get right. If products are configured incorrectly, used maliciously, or breached along the way, your organization can be exposed to underlying security risks, especially without the ability to get insight from your data quickly and efficiently.</p><p>As a Cloudflare technology partner, Elastic helps Cloudflare customers find what they need faster, while keeping applications running smoothly and <a href="https://www.cloudflare.com/products/zero-trust/threat-defense/">protecting against cyber threats</a>. “I'm pleased to share our collaboration with Cloudflare, making it even easier to deploy log and analytics dashboards. This partnership combines Elastic's open approach with Cloudflare's practical solutions, offering straightforward tools for enterprise search, observability, and security deployment,” explained Mark Dodds, Chief Revenue Officer at Elastic.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7kDqbu2kQvUL1P47N6aDMY/8dacf9b75432a900b32cb900f080366a/image5-3.png" />
            
            </figure>
    <div>
      <h2>Value of Zero Trust logs in Elastic</h2>
      <a href="#value-of-zero-trust-logs-in-elastic">
        
      </a>
    </div>
    <p>With this joint solution, we’ve made it easy for customers to seamlessly forward their Zero Trust logs to Elastic via Logpush jobs. This can be achieved directly via a RESTful API or through an intermediary storage solution such as AWS S3 or Google Cloud. Additionally, Cloudflare's integration with Elastic has been improved to cover all categories of Zero Trust logs generated by Cloudflare.</p><p><b>Here are some highlights of what the integration offers:</b></p><ul><li><p><b>Comprehensive Visibility:</b> Integrating Cloudflare Logpush into Elastic provides organizations with a real-time, comprehensive view of events related to Zero Trust. This enables a detailed understanding of who is accessing resources and applications, from where, and at what times. Enhanced visibility helps detect anomalous behavior and potential security threats more effectively, allowing for early response and mitigation.</p></li><li><p><b>Field Normalization:</b> By unifying data from Zero Trust logs in Elastic, it's possible to apply consistent field normalization not only for Zero Trust logs but also for other sources. This simplifies search and analysis, as data is presented in a uniform format. Normalization also facilitates the creation of alerts and the identification of patterns of malicious or unusual activity.</p></li><li><p><b>Efficient Search and Analysis:</b> Elastic provides powerful data search and analysis capabilities. Having Zero Trust logs in Elastic enables quick and precise searching for specific information. This is crucial for investigating security incidents, understanding workflows, and making informed decisions.</p></li><li><p><b>Correlation and Threat Detection:</b> By combining Zero Trust data with other security events and data, Elastic enables deeper and more effective correlation. This is essential for detecting threats that might go unnoticed when analyzing each data source separately. Correlation aids in pattern identification and the detection of sophisticated attacks.</p></li><li><p><b>Prebuilt Dashboards:</b> The integration provides out-of-the-box dashboards offering a quick start to visualizing key metrics and patterns. These dashboards help security teams visualize the security landscape in a clear and concise manner. The integration not only provides the advantage of prebuilt dashboards designed for Zero Trust datasets but also empowers users to curate their own visualizations.</p></li></ul>
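<p>At its core, field normalization amounts to renaming vendor-specific fields to one shared schema. A minimal sketch, assuming a hypothetical mapping of a few Cloudflare HTTP-request fields to Elastic Common Schema names (the actual integration defines the authoritative, far more extensive mapping):</p>

```python
# Illustrative subset of a Cloudflare-to-ECS field mapping;
# the real Elastic integration's mapping is more extensive.
FIELD_MAP = {
    "ClientIP": "source.ip",
    "ClientRequestHost": "url.domain",
    "EdgeStartTimestamp": "@timestamp",
}


def normalize(record, field_map=FIELD_MAP):
    """Rename known vendor fields to their common-schema names,
    passing unknown fields through unchanged."""
    return {field_map.get(k, k): v for k, v in record.items()}


event = normalize({"ClientIP": "203.0.113.7", "ClientRequestHost": "example.com"})
print(event)  # {'source.ip': '203.0.113.7', 'url.domain': 'example.com'}
```

<p>Once every source emits <code>source.ip</code> instead of its own spelling, one query or alert rule covers them all.</p>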
    <div>
      <h2>What’s new on the dashboards</h2>
      <a href="#whats-new-on-the-dashboards">
        
      </a>
    </div>
    <p>One of the main assets of the integration is its out-of-the-box dashboards, tailored to each type of Zero Trust log. Let's explore some of these dashboards in more detail to see how they improve visibility.</p>
    <div>
      <h3>Gateway HTTP</h3>
      <a href="#gateway-http">
        
      </a>
    </div>
    <p>This dashboard focuses on HTTP traffic and allows for monitoring and analyzing HTTP requests passing through Cloudflare's <a href="https://www.cloudflare.com/zero-trust/products/gateway/">Secure Web Gateway</a>.</p><p>Here you can identify traffic patterns, detect potential threats, and better understand how resources are being used within the network.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2C5VeJ6U4MfZjn7cmHgAPn/0e2600c2f5cfdd83d9f9713d60454cc0/image2-10.png" />
            
            </figure><p>Every visualization on the dashboard is interactive: the whole dashboard adapts to the enabled filters, and filters can be pinned across dashboards for pivoting. For instance, clicking one of the sections of the donut chart showing the different actions automatically applies a filter on that value, and the whole dashboard reorients around it.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5oHgZ74rXxV1we32WHqsye/ae9d1d99546257b6a6140e0a94947ca8/image1-9.png" />
            
            </figure>
    <div>
      <h3>CASB</h3>
      <a href="#casb">
        
      </a>
    </div>
    <p>Offering a different perspective, the <a href="https://www.cloudflare.com/learning/access-management/what-is-a-casb/">CASB (Cloud Access Security Broker)</a> dashboard provides visibility into the cloud applications your users access. Its visualizations are designed to detect threats effectively, aiding risk management and regulatory compliance.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/79LR83kaKlJg7kzZS5ewTq/5e86a9bcf83db0940d14aef082c7fdde/image4-5.png" />
            
            </figure><p>These examples illustrate how dashboards in the integration between Cloudflare and Elastic offer practical and effective data visualization for Zero Trust. They enable us to make data-driven decisions, identify behavioral patterns, and proactively respond to threats. By providing relevant information in a visual and accessible manner, these dashboards strengthen security posture and allow for more efficient risk management in the Zero Trust environment.</p>
    <div>
      <h2>How to get started</h2>
      <a href="#how-to-get-started">
        
      </a>
    </div>
    <p>Setup and deployment are simple. Use the Cloudflare dashboard or API to create Logpush jobs with all fields enabled for each dataset you’d like to ingest into Elastic. There are eight account-scoped datasets available today: Access requests; Audit logs; CASB findings; Gateway DNS, HTTP, and Network logs; and Zero Trust session logs.</p><p>Set up <a href="https://developers.cloudflare.com/logs/get-started/enable-destinations/elastic/">Logpush jobs</a> to your Elastic destination via one of the following methods:</p><ul><li><p><b>HTTP Endpoint mode</b> - Cloudflare pushes logs directly to an HTTP endpoint hosted by your Elastic Agent.</p></li><li><p><b>AWS S3 polling mode</b> - Cloudflare writes data to S3 and Elastic Agent polls the S3 bucket by listing its contents and reading new files.</p></li><li><p><b>AWS S3 SQS mode</b> - Cloudflare writes data to S3, S3 pushes a new object notification to SQS, Elastic Agent receives the notification from SQS, and then reads the S3 object. Multiple Agents can be used in this mode.</p></li></ul>
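<p>Creating one Logpush job per dataset is easy to script. A minimal sketch of the request bodies; the dataset identifiers and the endpoint URL here are assumptions to verify against the Logpush documentation, not values from this post:</p>

```python
# Dataset identifiers as assumed for the Logpush API (check the
# Cloudflare Logpush docs for the authoritative list); the endpoint
# URL is a placeholder for your Elastic Agent's HTTP endpoint.
ZERO_TRUST_DATASETS = [
    "access_requests",
    "audit_logs",
    "casb_findings",
    "gateway_dns",
    "gateway_http",
    "gateway_network",
    "zero_trust_network_sessions",
]


def job_body(dataset, endpoint):
    """Build the JSON body for one account-scoped Logpush job.
    (Field selection is omitted for brevity.)"""
    return {"dataset": dataset, "destination_conf": endpoint, "enabled": True}


jobs = [
    job_body(d, "https://agent.example.com:9563/cloudflare")
    for d in ZERO_TRUST_DATASETS
]
print(len(jobs))  # 7
```

<p>Each body would then be POSTed to the account-scoped Logpush jobs endpoint, one request per dataset.</p>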
    <div>
      <h3>Enabling the integration in Elastic</h3>
      <a href="#enabling-the-integration-in-elastic">
        
      </a>
    </div>
    <ol><li><p>In Kibana, go to Management &gt; Integrations.</p></li><li><p>In the integrations search bar, type Cloudflare Logpush.</p></li><li><p>Click the Cloudflare Logpush integration from the search results.</p></li><li><p>Click the Add Cloudflare Logpush button to add the integration.</p></li><li><p>Enable the integration with the HTTP Endpoint, AWS S3 input, or GCS input.</p></li><li><p>Under the AWS S3 input, there are two types of inputs: using an AWS S3 bucket or using SQS.</p></li><li><p>Configure Cloudflare to send logs to the Elastic Agent.</p></li></ol>
    <div>
      <h2>What’s next</h2>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>As organizations increasingly adopt a Zero Trust architecture, understanding your organization’s security posture is paramount. The dashboards provide the tools needed to build a robust security strategy centered on visibility, early detection, and effective threat response. By <a href="https://www.cloudflare.com/learning/security/what-is-siem/">unifying data</a>, normalizing fields, facilitating search, and enabling the creation of custom dashboards, this integration becomes a valuable asset for any cybersecurity team aiming to strengthen its security posture.</p><p>We’re looking forward to continuing to connect Cloudflare customers with our community of technology partners, to help in the adoption of a Zero Trust architecture.</p><p>Explore this new integration today.</p>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Zero Trust]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Elastic]]></category>
            <category><![CDATA[Partners]]></category>
            <guid isPermaLink="false">6amHiWxrNpxWRyQhTWFUSu</guid>
            <dc:creator>Corey Mahan</dc:creator>
            <dc:creator>Gavin Chen</dc:creator>
            <dc:creator>Andrew Meyer</dc:creator>
            <dc:creator>Chema Martínez (Guest Author)</dc:creator>
        </item>
        <item>
            <title><![CDATA[Integrating Network Analytics Logs with your SIEM dashboard]]></title>
            <link>https://blog.cloudflare.com/network-analytics-logs/</link>
            <pubDate>Tue, 17 May 2022 15:46:30 GMT</pubDate>
            <description><![CDATA[ We’re excited to announce the availability of Network Analytics Logs for maximum visibility into L3/4 traffic and DDoS attacks ]]></description>
            <content:encoded><![CDATA[ <p></p><p>We’re excited to announce the availability of Network Analytics Logs. <a href="https://www.cloudflare.com/magic-transit/">Magic Transit</a>, <a href="https://www.cloudflare.com/magic-firewall/">Magic Firewall</a>, <a href="https://www.cloudflare.com/magic-wan/">Magic WAN</a>, and <a href="https://www.cloudflare.com/products/cloudflare-spectrum/">Spectrum</a> customers on the Enterprise plan can feed packet samples directly into storage services, <a href="https://www.cloudflare.com/network-services/solutions/network-monitoring-tools/">network monitoring tools</a> such as Kentik, or their <a href="https://www.cloudflare.com/learning/security/what-is-siem/">Security Information Event Management (SIEM)</a> systems such as Splunk to gain near real-time visibility into network traffic and <a href="https://www.cloudflare.com/learning/ddos/what-is-a-ddos-attack/">DDoS attacks</a>.</p>
    <div>
      <h2>What’s included in the logs</h2>
      <a href="#whats-included-in-the-logs">
        
      </a>
    </div>
    <p>By creating a Network Analytics Logs job, Cloudflare will continuously push logs of packet samples directly to the HTTP endpoint of your choice, including Websockets. The logs arrive in JSON format which makes them easy to parse, transform, and aggregate. The logs include packet samples of traffic dropped and passed by the following systems:</p><ol><li><p>Network-layer DDoS Protection Ruleset</p></li><li><p>Advanced TCP Protection</p></li><li><p>Magic Firewall</p></li></ol><p>Note that not all mitigation systems are applicable to all Cloudflare services. Below is a table describing which mitigation service is applicable to which Cloudflare service:</p>
<table>
<thead>
  <tr>
    <th rowspan="2"><span>Mitigation System</span></th>
    <th colspan="3"><span>Cloudflare Service</span></th>
  </tr>
  <tr>
    <th><span>Magic Transit</span></th>
    <th><span>Magic WAN</span></th>
    <th><span>Spectrum</span></th>
  </tr>
</thead>
<tbody>
  <tr>
    <td><span>Network-layer DDoS Protection Ruleset</span></td>
    <td><span>✅</span></td>
    <td><span>❌</span></td>
    <td><span>✅</span></td>
  </tr>
  <tr>
    <td><span>Advanced TCP Protection</span></td>
    <td><span>✅</span></td>
    <td><span>❌</span></td>
    <td><span>❌</span></td>
  </tr>
  <tr>
    <td><span>Magic Firewall</span></td>
    <td><span>✅</span></td>
    <td><span>✅</span></td>
    <td><span>❌</span></td>
  </tr>
</tbody>
</table><p>Packets are processed by the mitigation systems in the order outlined above. Therefore, a packet that passed all three systems may produce three packet samples, one from each system. This can be very insightful when troubleshooting and wanting to understand where in the stack a packet was dropped. To avoid overcounting the total passed traffic, Magic Transit users should only take into consideration the passed packets from the last mitigation system, Magic Firewall.</p><p>An example of a packet sample log:</p>
            <pre><code>{"AttackCampaignID":"","AttackID":"","ColoName":"bkk06","Datetime":1652295571783000000,"DestinationASN":13335,"Direction":"ingress","IPDestinationAddress":"(redacted)","IPDestinationSubnet":"/24","IPProtocol":17,"IPSourceAddress":"(redacted)","IPSourceSubnet":"/24","MitigationReason":"","MitigationScope":"","MitigationSystem":"magic-firewall","Outcome":"pass","ProtocolState":"","RuleID":"(redacted)","RulesetID":"(redacted)","RulesetOverrideID":"","SampleInterval":100,"SourceASN":38794,"Verdict":"drop"}</code></pre>
            <p>All the available log fields are documented here: <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/network_analytics_logs/">https://developers.cloudflare.com/logs/reference/log-fields/account/network_analytics_logs/</a></p>
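<p>Because each log line is a JSON object, avoiding the overcounting described above is straightforward when post-processing: count passed samples only from the final system, Magic Firewall. A minimal Python sketch; field values other than those shown in the sample above are hypothetical:</p>

```python
import json
from collections import Counter


def summarize(lines):
    """Tally packet samples per (MitigationSystem, Outcome), and count
    passed traffic only from magic-firewall, the last system in the
    processing order, so a packet isn't counted once per system."""
    outcomes = Counter()
    passed_packets = 0
    for line in lines:
        rec = json.loads(line)
        outcomes[(rec["MitigationSystem"], rec["Outcome"])] += 1
        if rec["MitigationSystem"] == "magic-firewall" and rec["Outcome"] == "pass":
            # Each sample stands for roughly SampleInterval real packets.
            passed_packets += rec["SampleInterval"]
    return outcomes, passed_packets


# Two samples produced by the same packet on its way through the stack
# ("ddos-ruleset" is an illustrative name for the first system):
lines = [
    '{"MitigationSystem":"ddos-ruleset","Outcome":"pass","SampleInterval":100}',
    '{"MitigationSystem":"magic-firewall","Outcome":"pass","SampleInterval":100}',
]
outcomes, passed = summarize(lines)
print(passed)  # 100, not 200
```

<p>The per-system tallies remain useful for troubleshooting where in the stack a packet was dropped; only the aggregate passed-traffic figure needs the last-system rule.</p>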
    <div>
      <h2>Setting up the logs</h2>
      <a href="#setting-up-the-logs">
        
      </a>
    </div>
    <p>In this walkthrough, we will demonstrate how to feed the Network Analytics Logs into Splunk via <a href="https://www.postman.com/">Postman</a>. At this time, it is only possible to set up Network Analytics Logs via API. Setting up the logs requires three main steps:</p><ol><li><p>Create a Cloudflare API token.</p></li><li><p>Create a Splunk Cloud HTTP Event Collector (HEC) token.</p></li><li><p>Create and enable a Cloudflare Logpush job.</p></li></ol><p>Let’s get started!</p>
    <div>
      <h3>1) Create a Cloudflare API token</h3>
      <a href="#1-create-a-cloudflare-api-token">
        
      </a>
    </div>
    <ol><li><p>Log in to your Cloudflare account and navigate to <b>My Profile.</b></p></li><li><p>On the left-hand side, in the collapsing navigation menu, click <b>API Tokens.</b></p></li><li><p>Click <b>Create Token</b> and then, under <b>Custom token</b>, click <b>Get started.</b></p></li><li><p>Give your custom token a name, and select an account-scoped permission to edit Logs. You can scope the token to a specific account, a subset of your accounts, or all of them.</p></li><li><p>At the bottom, click <b>Continue to summary</b>, and then <b>Create Token</b>.</p></li><li><p><b>Copy</b> and save your token. You can also test your token with the provided snippet in Terminal.</p></li></ol><p>When you're using an API token, you don't need to provide your email address as part of the API credentials.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4HD1OtGie9s6CbcVPA7KUx/2f103d64615a7d44e9d3adebc1e12a5b/image5-17.png" />
            
            </figure><p>Read more about creating an API token on the Cloudflare Developers website: <a href="https://developers.cloudflare.com/api/tokens/create/">https://developers.cloudflare.com/api/tokens/create/</a></p>
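<p>Step 6 above mentions testing the token. As a sketch, the request that test sends can be built with Python's standard library; the URL is Cloudflare's token-verification endpoint, and the token value is a placeholder:</p>

```python
import urllib.request

VERIFY_URL = "https://api.cloudflare.com/client/v4/user/tokens/verify"


def build_verify_request(token):
    """Build the GET request used to test an API token: tokens are sent
    as a Bearer Authorization header, with no email header needed."""
    return urllib.request.Request(
        VERIFY_URL,
        headers={"Authorization": f"Bearer {token}"},
    )


req = build_verify_request("your-api-token")  # placeholder token
print(req.get_header("Authorization"))  # Bearer your-api-token
# urllib.request.urlopen(req) returns a success payload for a valid token
```

<p>The same Bearer header is what you will attach to the Logpush API calls in step 3 of this walkthrough.</p>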
    <div>
      <h3>2) Create a Splunk token for an HTTP Event Collector</h3>
      <a href="#2-create-a-splunk-token-for-an-http-event-collector">
        
      </a>
    </div>
    <p>In this walkthrough, we’re using a Splunk Cloud free trial, but <a href="https://developers.cloudflare.com/logs/get-started/enable-destinations/">you can use almost any service that can accept logs over HTTPS</a>. In some cases, if you’re using an on-premises SIEM solution, you may need to allowlist <a href="https://www.cloudflare.com/ips/">Cloudflare IP addresses</a> in your firewall to be able to receive the logs.</p><ol><li><p>Create a Splunk Cloud account. I created a trial account for the purpose of this blog.</p></li><li><p>In the Splunk Cloud dashboard, go to <b>Settings</b> &gt; <b>Data Input.</b></p></li><li><p>Next to <b>HTTP Event Collector</b>, click <b>Add new.</b></p></li><li><p>Follow the steps to create a token.</p></li><li><p>Copy your token and your allocated Splunk hostname and save both for later.</p></li></ol>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3n4C580G6FxELEaOqPO7Wz/58c6f8fa136f71413b008ace676c3ca9/image2-41.png" />
            
            </figure><p>Read more about using Splunk with Cloudflare Logpush on the Cloudflare Developers website: <a href="https://developers.cloudflare.com/logs/get-started/enable-destinations/splunk/">https://developers.cloudflare.com/logs/get-started/enable-destinations/splunk/</a></p><p>Read more about creating an HTTP Event Collector token on Splunk’s website: <a href="https://docs.splunk.com/Documentation/Splunk/8.2.6/Data/UsetheHTTPEventCollector">https://docs.splunk.com/Documentation/Splunk/8.2.6/Data/UsetheHTTPEventCollector</a></p>
    <div>
      <h3>3) Create a Cloudflare Logpush job</h3>
      <a href="#3-create-a-cloudflare-logpush-job">
        
      </a>
    </div>
    <p>Creating and enabling a job is very straightforward: it requires only one API call to Cloudflare.</p><p>To send the API calls I used <a href="https://www.postman.com/">Postman</a>, a user-friendly API client that was recommended to me by a colleague. It allows you to save and customize API calls. You can also use Terminal/CMD or any other API client or script of your choice.</p><p>One thing to note is that Network Analytics Logs are <b>account</b>-scoped. The API endpoint is therefore a tad different from what you would normally use for zone-scoped datasets such as HTTP request logs and DNS logs.</p><p>This is the endpoint for creating an account-scoped Logpush job:</p><p><code>https://api.cloudflare.com/client/v4/accounts/{account-id}/logpush/jobs</code></p><p>Your account identifier is a unique, 32-character string of numbers and letters. If you’re not sure what your account identifier is, log in to Cloudflare, select the appropriate account, and copy the string at the end of the URL.</p><p><code>https://dash.cloudflare.com/{account-id}</code></p><p>Then, set up a new request in Postman (or any other API client/CLI tool).</p><p>To successfully create a Logpush job, you’ll need the HTTP method, URL, Authorization token, and request body (data). The request body must include a destination configuration (<code>destination_conf</code>), the specified dataset (<code>network_analytics_logs</code>, in our case), and the token (your Splunk token).</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1SlAhohD7nH7Ge6ODVPiWT/132cb3aa127ba2cc4596b2fbc2f2c6e8/image1-48.png" />
            
            </figure><p><b>Method</b>:</p><p><code>POST</code></p><p><b>URL</b>:</p><p><code>https://api.cloudflare.com/client/v4/accounts/{account-id}/logpush/jobs</code></p><p><b>Authorization</b>: Define a Bearer authorization in the <b>Authorization</b> tab, or add it to the header, and add your Cloudflare API token.</p><p><b>Body</b>: Select <b>Raw</b> &gt; <b>JSON</b></p>
            <pre><code>{
"destination_conf": "{your-unique-splunk-configuration}",
"dataset": "network_analytics_logs",
"token": "{your-splunk-hec-tag}",
"enabled": "true"
}</code></pre>
            <p>If you’re using Splunk Cloud, then your unique configuration has the following format:</p><p><code>{your-unique-splunk-configuration}=splunk://{your-splunk-hostname}.splunkcloud.com:8088/services/collector/raw?channel={channel-id}&amp;header_Authorization=Splunk%20{your-splunk-hec-token}&amp;insecure-skip-verify=false</code></p><p>Definition of the variables:</p><p><code><b>{your-splunk-hostname}</b></code> = Your allocated Splunk Cloud hostname.</p><p><code><b>{channel-id}</b></code> = A unique ID that you choose to assign to the channel.</p><p><code><b>{your-splunk-hec-token}</b></code> = The token that you generated for your Splunk HEC.</p><p>An important note is that customers should have a valid <a href="https://www.cloudflare.com/application-services/products/ssl/">SSL/TLS certificate</a> on their Splunk instance to support an encrypted connection.</p><p>After you’ve done that, you can create a GET request to the same URL (no request body needed) to verify that the job was created and is enabled.</p><p>The response should be similar to the following:</p>
            <pre><code>{
    "errors": [],
    "messages": [],
    "result": {
        "id": {job-id},
        "dataset": "network_analytics_logs",
        "frequency": "high",
        "kind": "",
        "enabled": true,
        "name": null,
        "logpull_options": null,
        "destination_conf": "{your-unique-splunk-configuration}",
        "last_complete": null,
        "last_error": null,
        "error_message": null
    },
    "success": true
}</code></pre>
            <p>Shortly after, you should start receiving logs to your Splunk HEC.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/45iJNwBrNbf0ptNsRdc4N/7a2464708c75292b4f704a89a20220f2/image4-27.png" />
            
            </figure><p>Read more about enabling Logpush on the Cloudflare Developers website: <a href="https://developers.cloudflare.com/logs/reference/logpush-api-configuration/examples/example-logpush-curl/">https://developers.cloudflare.com/logs/reference/logpush-api-configuration/examples/example-logpush-curl/</a></p>
    <div>
      <h2>Reduce costs with R2 storage</h2>
      <a href="#reduce-costs-with-r2-storage">
        
      </a>
    </div>
    <p>Depending on the volume of logs that you read and write, the cost of third-party cloud storage can skyrocket — forcing you to decide between managing a tight budget and being able to properly investigate networking and security issues. However, we believe that you shouldn’t have to make those trade-offs. With <a href="/logs-r2/">R2’s low costs</a>, we’re making this decision easier for our customers. Instead of feeding logs to a third party, you can reap the cost benefits of <a href="/logs-r2/">storing them in R2</a>.</p><p>To learn more about the <a href="https://www.cloudflare.com/developer-platform/r2/">R2 features and pricing</a>, check out the <a href="/r2-open-beta/">full blog post</a>. To enable R2, contact your account team.</p>
    <div>
      <h2>Cloudflare logs for maximum visibility</h2>
      <a href="#cloudflare-logs-for-maximum-visibility">
        
      </a>
    </div>
    <p><a href="https://www.cloudflare.com/plans/enterprise/">Cloudflare Enterprise</a> customers have access to detailed logs of the metadata generated by our products. These logs are helpful for troubleshooting, identifying network and configuration adjustments, and generating reports, especially when combined with logs from other sources, such as your servers, firewalls, routers, and other appliances.</p><p>Network Analytics Logs joins Cloudflare’s family of products on Logpush: <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/dns_logs/">DNS logs</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/firewall_events/">Firewall events</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/http_requests/">HTTP requests</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/nel_reports/">NEL reports</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/zone/spectrum_events/">Spectrum events</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/audit_logs/">Audit logs</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_dns/">Gateway DNS</a>, <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_http/">Gateway HTTP</a>, and <a href="https://developers.cloudflare.com/logs/reference/log-fields/account/gateway_network/">Gateway Network</a>.</p><p>Not using Cloudflare yet? <a href="https://dash.cloudflare.com/sign-up">Start now</a> with our Free and <a href="https://www.cloudflare.com/plans/pro/">Pro plans</a> to protect your websites against DDoS attacks, or <a href="https://www.cloudflare.com/magic-transit/">contact us</a> for comprehensive <a href="https://www.cloudflare.com/ddos/">DDoS protection</a> and <a href="https://www.cloudflare.com/learning/cloud/what-is-a-cloud-firewall/">firewall-as-a-service</a> for your entire network.</p> ]]></content:encoded>
            <category><![CDATA[DDoS]]></category>
            <category><![CDATA[Data]]></category>
            <category><![CDATA[Magic Transit]]></category>
            <category><![CDATA[Spectrum]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Network]]></category>
            <category><![CDATA[SIEM]]></category>
            <guid isPermaLink="false">7J0cgdiD9dX3Xb9q1OaN3f</guid>
            <dc:creator>Omer Yoachimik</dc:creator>
            <dc:creator>Kyle Bowman</dc:creator>
        </item>
        <item>
            <title><![CDATA[More products, more partners, and a new look for Cloudflare Logs]]></title>
            <link>https://blog.cloudflare.com/logpush-ui-update/</link>
            <pubDate>Tue, 22 Jun 2021 13:00:04 GMT</pubDate>
            <description><![CDATA[ Customers can now use our dashboard to push HTTP, Spectrum, Firewall, and NEL Events directly to Datadog, Sumo Logic, Splunk, or an S3-compatible storage provider.
 ]]></description>
            <content:encoded><![CDATA[ <p>We are excited to announce a new look and new capabilities for <a href="https://developers.cloudflare.com/logs/">Cloudflare Logs</a>! Customers on our Enterprise plan can now configure Logpush for Firewall Events and Network Error Logs Reports directly from the dashboard. Additionally, it’s easier to send Logs directly to our analytics partners Microsoft Azure Sentinel, Splunk, Sumo Logic, and Datadog. This blog post discusses how customers use Cloudflare Logs and how we’ve made it easier to consume logs, and gives a tour of the new user interface.</p>
    <div>
      <h3>New data sets for insight into more products</h3>
      <a href="#new-data-sets-for-insight-into-more-products">
        
      </a>
    </div>
    <p>Cloudflare Logs are almost as old as Cloudflare itself, but we have a few big improvements: new datasets and new destinations.</p><p>Cloudflare has a large number of products, and nearly all of them can generate Logs in different <i>data sets</i>. We have “HTTP Request” Logs, or one log line for every L7 HTTP request that we handle (whether cached or not). We also provide connection Logs for Spectrum, our proxy for any TCP or UDP based application. Gateway, part of our Cloudflare for Teams suite, can provide Logs for <a href="/export-logs-from-cloudflare-gateway-with-logpush/">HTTP and DNS</a> traffic.</p><p>Today, we are introducing two new data sets:</p><p><b>Firewall Events</b> gives insight into malicious traffic handled by Cloudflare. It provides detailed information about everything our WAF does. For example, Firewall Events shows whether a request was blocked outright or whether we issued a CAPTCHA challenge.  <a href="/stream-firewall-events-directly-to-your-siem/">About a year ago</a> we introduced the ability to send Firewall Events directly to your <a href="https://www.cloudflare.com/learning/security/what-is-siem/">SIEM</a>; starting today, I’m thrilled to share that you can enable this directly from the dashboard!</p><p><a href="https://support.cloudflare.com/hc/en-us/articles/360050691831-Understanding-Network-Error-Logging"><b>Network Error Logging</b></a> <b>(NEL) Reports</b> provides information about clients that can’t reach our network. To enable NEL Reports for your zone and start seeing where clients are having issues reaching our network, reach out to your account manager.</p>
    <div>
      <h3>Take your Logs anywhere with an S3-compatible API</h3>
      <a href="#take-your-logs-anywhere-with-an-s3-compatible-api">
        
      </a>
    </div>
    <p>To start using logs, you need to store them first. Cloudflare has long supported AWS, Azure, and Google Cloud as storage destinations. But we know that customers use a huge variety of storage infrastructure, which could be hosted on-premise or with one of our <a href="https://www.cloudflare.com/bandwidth-alliance/">Bandwidth Alliance partners</a>.</p><p>Starting today, we support any storage destination with an <a href="https://www.cloudflare.com/developer-platform/solutions/s3-compatible-object-storage/">S3-compatible API</a>. This includes:</p><ul><li><p><a href="https://www.digitalocean.com/docs/spaces/">Digital Ocean Spaces</a></p></li><li><p><a href="https://www.backblaze.com/b2/docs/s3_compatible_api.html">Backblaze B2</a></p></li><li><p><a href="https://www.alibabacloud.com/help/doc-detail/64919.htm#title-37m-7gl-xy2">Alibaba Cloud OSS</a></p></li><li><p><a href="https://docs.jdcloud.com/en/object-storage-service/introduction-2">JD Cloud Object Storage Service</a></p></li><li><p><a href="https://docs.cloud.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm">Oracle Cloud Object Storage</a></p></li><li><p><a href="https://www.linode.com/products/object-storage/">Linode Object Storage</a></p></li></ul><p>And best of all, it’s super easy to get data into these locations using our new UI!</p><blockquote><p><i>"As always, we love that our partnership with Cloudflare allows us to seamlessly offer customers our easy, plug and play storage solution, Backblaze B2 Cloud Storage. Even better is that, as founding members of the Bandwidth Alliance, we can do it all with free egress."— </i><b><i>Nilay Patel</i></b><i>, Co-founder and VP of Solutions Engineering and Sales, Backblaze.</i></p></blockquote>
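<p>Under the hood, an S3-compatible destination is expressed much like our other storage destinations, with the provider’s endpoint passed as a query parameter. A minimal sketch of such a destination string — the bucket path, region, and endpoint below are hypothetical placeholders loosely following the Backblaze B2 naming style:</p>

```shell
#!/bin/sh
# Hypothetical bucket path, region, and endpoint -- substitute your provider's values.
BUCKET_PATH="my-log-bucket/cloudflare-logs"
REGION="us-west-000"
ENDPOINT="s3.us-west-000.backblazeb2.com"

# The endpoint query parameter points Logpush at the S3-compatible provider.
DESTINATION_CONF="s3://${BUCKET_PATH}?region=${REGION}&endpoint=${ENDPOINT}"
echo "$DESTINATION_CONF"
```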
    <div>
      <h3>Push Cloudflare Logs directly to our analytics partners</h3>
      <a href="#push-cloudflare-logs-directly-to-our-analytics-partners">
        
      </a>
    </div>
    <p>While many customers like to store Logs themselves, we’ve also heard that many customers want to get Logs into their analytics provider directly — without going through another layer. Getting high volume log data out of <a href="https://www.cloudflare.com/learning/cloud/what-is-object-storage/">object storage</a> and into an analytics provider can require building and maintaining a costly, time-consuming, and fragile integration.</p><p>Because of this, we now provide direct integrations with four analytics platforms: Microsoft Azure Sentinel, Sumo Logic, Splunk, and Datadog. And starting today, you can push Logs directly into Sumo Logic, Splunk and Datadog from the UI! Customers can add Cloudflare to Azure Sentinel using the <a href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps/azuresentinel.azure-sentinel-solution-cloudflare?tab=Overview">Azure Marketplace</a>.</p><blockquote><p><i>“Organizations are in a state of digital transformation on a journey to the cloud. Most of our customers deploy services in multiple clouds and have legacy systems on premise. Splunk provides visibility across all of this, and more importantly, with SOAR we can automate remediation. We are excited about the Cloudflare partnership, and adding their data into Splunk drives the outcomes customers need to modernize their security operations.”— </i><b><i>Jane Wong</i></b><i>, Vice President, Product Management, Security at Splunk</i></p></blockquote><blockquote><p><i>"Securing enterprise IT environments can be challenging - from devices, to users, to apps, to data centers on-premises or in the cloud. In today’s environment of increasingly sophisticated cyber-attacks, our mutual customers rely on Microsoft Azure Sentinel for a comprehensive view of their enterprise.  Azure Sentinel enables SecOps teams to collect data at cloud scale and empowers them with AI and ML to find the real threats in those signals, reducing alert fatigue by as much as 90%. 
By integrating directly with Cloudflare Logs we are making it easier and faster for customers to get complete visibility across their entire stack.”— </i><b><i>Sarah Fender</i></b><i>, Partner Group Program Manager, Azure Sentinel at Microsoft</i></p></blockquote><blockquote><p><i>"As a long time Cloudflare partner we've worked together to help joint customers analyze events and trends from their websites and applications to provide end-to-end visibility to improve digital experiences. We're excited to expand our partnership as part of the Cloudflare Analytics Ecosystem to provide comprehensive real-time insights for both observability and the security of mission-critical applications and services with our Cloud SIEM solution."— </i><b><i>John Coyle</i></b><i>, Vice President of Business Development for Sumo Logic</i></p></blockquote><blockquote><p><i>"Knowing that applications perform as well in the real world as they do in the datacenter is critical to ensuring great digital experiences. Combining Cloudflare Logs with Datadog telemetry about application performance in a single pane of glass ensures teams will have a holistic view of their application delivery."— </i><b><i>Michael Gerstenhaber</i></b><i>, Sr. Director of Product, Datadog</i></p></blockquote>
    <div>
      <h3>Why Cloudflare Logs?</h3>
      <a href="#why-cloudflare-logs">
        
      </a>
    </div>
    <p>Cloudflare’s mission is to help build a better Internet. We do that by providing a massive global network that protects and accelerates our customers’ infrastructure. Because traffic flows across our network before reaching our customers, we have a unique vantage point into that traffic. In many cases, we have visibility that our customers don’t have — whether we’re telling them about the performance of our cache, the malicious HTTP requests we drop at our edge, a spike in L3 data flows, the performance of their origin, or the CPU used by their serverless applications.</p><p>To provide this ability, we have analytics throughout our dashboard to help customers understand their network traffic, firewall, cache, load balancer, and much more. We also provide alerts that can tell customers when they see an <a href="https://support.cloudflare.com/hc/en-us/articles/360037465932-Preventing-site-downtime">increase in errors</a> or <a href="https://support.cloudflare.com/hc/en-us/articles/360053216191-Understanding-Cloudflare-DDoS-alerts">spike in DDoS activity</a>.</p><p>But some customers want more than what we currently provide with our analytics products. Many of our enterprise customers use SIEMs like Splunk and Sumo Logic or cloud monitoring tools like Datadog. These products can extend the capabilities of Cloudflare by showcasing Cloudflare data in the context of customers’ other infrastructure and providing advanced functionality on this data.</p><p>To understand how this works, consider a typical L7 DDoS attack against one of our customers. Very commonly, an attack like this might originate from a small number of IP addresses and a customer might choose to block the source IPs completely. 
After blocking the IP addresses, customers may want to:</p><ul><li><p>Search through their Logs to see all the past instances of activity from those IP addresses.</p></li><li><p>Search through Logs from all their <i>other</i> applications and infrastructure to see all activity from those IP addresses</p></li><li><p>Understand exactly what that attacker was trying to do by looking at the request payload <a href="/encrypt-waf-payloads-hpke/">blocked in our WAF</a> (securely encrypted thanks to HPKE!)</p></li><li><p>Set an alert for similar activity, to be notified if something similar happens again</p></li></ul><p>All these are made possible using SIEMs and infrastructure monitoring tools. For example, our customer <a href="https://www.nov.com/">NOV</a> uses Splunk to “monitor our network and applications by alerting us to various anomalies and high-fidelity incidents".</p><p>“One of the most valuable sources of data is Cloudflare,” said John McLeod, Chief Information Security Officer at NOV. “It provides visibility into network and application attacks. With this integration, it will be easier to get Cloudflare Logs into Splunk, saving my team time and money."</p>
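<p>As a concrete illustration of the first investigative step, a search for past activity from a blocked IP might look like the following. This is a hypothetical Splunk query sketch: it assumes your Cloudflare HTTP request logs are stored in an index named <code>cloudflare</code> and uses the standard HTTP request log field names (<code>ClientIP</code>, <code>ClientRequestHost</code>, <code>ClientRequestURI</code>, <code>EdgeResponseStatus</code>):</p>

```
index=cloudflare ClientIP="203.0.113.50"
| stats count by ClientRequestHost, ClientRequestURI, EdgeResponseStatus
| sort -count
```

The same filter, applied across your other application indexes, covers the second step of searching all activity from those addresses.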
    <div>
      <h3>A new UI for our growing product base</h3>
      <a href="#a-new-ui-for-our-growing-product-base">
        
      </a>
    </div>
    <p>With so many new data sets and new destinations, we realized that our existing user interface was not good enough. We went back to the drawing board to design a more intuitive user experience to help you quickly and easily set up Logpush.</p><p>You can still set up Logpush in the same place in the dashboard, in the Analytics &gt; Logs tab:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5AES3xPpjBsHstUB9GTNZ/9366b94403f1a7ccaffec49f4dc934e9/pasted-image-0.png" />
            
            </figure><p>The new UI first prompts users to select the data set to push. Here you’ll also notice that we’ve added support for Firewall Events and NEL Reports!</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2K4h9OgfdEV2N2XZeNBlri/bb434300e654ec0df3b412c0b7531568/pasted-image-0--1-.png" />
            
            </figure><p>After configuring details like which fields to push, customers can then select where the Logs are going. Here you can see the new destinations: S3-compatible storage, Sumo Logic, Datadog, and Splunk:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5zIqKJNKCX5xvDPlTYvHCT/cf085b64f93668bf17f7d0e5483729cf/pasted-image-0--2-.png" />
            
            </figure>
    <div>
      <h3>Coming soon</h3>
      <a href="#coming-soon">
        
      </a>
    </div>
    <p>Of course, we’re not done yet! We have more Cloudflare products in the pipeline and more destinations planned where customers can send their Logs. Additionally, we’re working on adding more flexibility to our logging pipeline so that customers can configure Logpush to send Logs for the entire account, or filter Logs to send only error codes, for example.</p><p>Ultimately, we want to make working with Cloudflare Logs as useful as possible — on Cloudflare itself! We’re working to help customers solve their performance and security challenges with data at massive scale. If that sounds interesting, please join us! We’re hiring <a href="https://boards.greenhouse.io/cloudflare/jobs/2103918?gh_jid=2103918">Systems Engineers</a> for the Data team.</p> ]]></content:encoded>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[SIEM]]></category>
            <guid isPermaLink="false">4F41qDZp0Rgt5waBQBrT67</guid>
            <dc:creator>Bharat Nallan Chakravarthy</dc:creator>
            <dc:creator>Jon Levine</dc:creator>
        </item>
        <item>
            <title><![CDATA[Export logs from Cloudflare Gateway with Logpush]]></title>
            <link>https://blog.cloudflare.com/export-logs-from-cloudflare-gateway-with-logpush/</link>
            <pubDate>Fri, 29 May 2020 11:00:00 GMT</pubDate>
            <description><![CDATA[ Automatically export DNS query logs from Cloudflare Gateway to your SIEM. ]]></description>
            <content:encoded><![CDATA[ <p>Like many people, I have spent a lot more time at home in the last several weeks. I use the free version of Cloudflare Gateway, part of Cloudflare for Teams, to secure the Internet-connected devices on my WiFi network. In the last week, Gateway has processed about 114,000 DNS queries from those devices and blocked nearly 100 as potential security risks.</p><p>I can search those requests in the Cloudflare for Teams UI. The logs capture the hostname requested, the time of the request, and Gateway’s decision to allow or block. This works fine for one-off investigations into a block, but does not help if I want to analyze the data more thoroughly. The last thing I want to do is click through hundreds or thousands of pages.</p><p>That problem is even more difficult for organizations attempting to keep hundreds or thousands of users and their devices secure. Whether they secure roaming devices with DoH or a static IP address, or keep users safe as they return to offices, deployments at that scale need a better option for auditing tens or hundreds of millions of queries each week.</p><p>Starting today, you can configure the automatic export of logs from Cloudflare Gateway to third-party storage destinations or security information and event management (SIEM) tools. Once exported, your team can analyze and audit the data as needed. The feature builds on the same robust <a href="/cloudflare-logpush-the-easy-way-to-get-your-logs-to-your-cloud-storage/">Cloudflare Logpush Service</a> that powers data export from Cloudflare’s infrastructure products.</p>
    <div>
      <h3>Cloudflare Gateway</h3>
      <a href="#cloudflare-gateway">
        
      </a>
    </div>
    <p>Cloudflare Gateway is one-half of Cloudflare for Teams, Cloudflare’s platform for securing users, devices, and data. With Cloudflare for Teams, our global network becomes your team’s network, replacing on-premise appliances and security subscriptions with a single solution delivered closer to your users - wherever they work.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1JWEZSrQR7ulD8aQN6c39H/30072a7998e10e7463681dad6ffee6cb/YwO4JZYUemcjuBKu4nT3Q-StBRocPnnsJb-yQk9t4NgVLdWoKpmrfmaKxJHM1i0m-7EdVRdjuJOkQggh8Y0nEhNjKEs8febz0nooNgRdyg5UCtHekla6aZRuVQiv.png" />
            
            </figure><p>As part of that platform, Cloudflare Gateway blocks threats on the public Internet from becoming incidents inside your organization. Gateway’s <a href="/protect-your-team-with-cloudflare-gateway/">first release</a> added DNS security filtering and content blocking to the world’s fastest DNS resolver, Cloudflare’s 1.1.1.1.</p><p><a href="https://developers.cloudflare.com/gateway/locations/setup-instructions/router/">Deployment</a> takes less than 5 minutes. Teams can secure entire office networks and segment traffic reports by location. For distributed organizations, Gateway can be deployed via MDM on networks that support IPv6 or using a dedicated IPv4 as part of a <a href="https://www.cloudflare.com/plans/enterprise/">Cloudflare Enterprise</a> account.</p><p>With secure DNS filtering, administrators can click a single button to block known threats, like sources of malware or phishing sites. Policies can be extended to block specific categories, like gambling sites or social media. When users request a filtered site, Gateway stops the DNS query from resolving and prevents the device from connecting to a malicious destination or hostname with blocked material.</p>
    <div>
      <h3>Cloudflare Logpush</h3>
      <a href="#cloudflare-logpush">
        
      </a>
    </div>
    <p>The average user makes about 5,000 DNS queries each day. For an organization with 1,000 employees, that produces 5M rows of data daily. That data includes regular Internet traffic, but also potential trends like targeted phishing campaigns or the use of cloud storage tools that are not approved by your IT organization.</p><p>The Cloudflare for Teams UI presents some summary views of that data, but each organization has different needs for audit, retention, or analysis. The best way to let you investigate the data in any way you need is to give you all of it. However, the volume of data and how often you might need to review it mean that API calls or CSV downloads are not suitable. A real logging pipeline is required.</p><p>Cloudflare Logpush solves that challenge. Cloudflare’s <a href="https://developers.cloudflare.com/logs/logpush/">Logpush Service</a> exports the data captured by Cloudflare’s network to storage destinations that you control. Rather than requiring your team to build a system to call Cloudflare APIs and pull data, Logpush routinely exports data with fields that you configure.</p><p>Cloudflare’s data team built the Logpush pipeline to make it easy to integrate with popular storage providers. Logpush supports AWS S3, Google Cloud Storage, Sumo Logic, and Microsoft Azure out of the box. Administrators can choose a storage provider, validate that they own the destination, and configure exports of logs that will send deltas every five minutes from that point onward.</p>
    <div>
      <h3>How it works</h3>
      <a href="#how-it-works">
        
      </a>
    </div>
    <p>When enabled, you can navigate to a new section of the Logs component in the Cloudflare for Teams UI, titled “Logpush”. Once there, you’ll be able to choose which fields you want to export from Cloudflare Gateway and the storage destination.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6Oprf36DuhfSZ8FgwLqisq/319316faa12fc11f3520f8fb9bde4c92/image2-9.png" />
            
            </figure><p>The Logpush wizard will walk you through validating that you own the destination and configuring how you want folders to be structured. When saved, Logpush will send updated logs every five minutes to that destination. You can configure multiple destinations and monitor for any issues by returning to this section of the Cloudflare for Teams UI.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6t415R9imvFz1P1j5DGcBx/6af644eb6ded4511a15da901b29f11fb/image1-9.png" />
            
            </figure>
    <div>
      <h3>What’s next?</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>Cloudflare’s Logpush Service is only available to customers on a contract plan. If you are interested in upgrading, please let us know. All Cloudflare for Teams plans include 30 days of data that can be searched in the UI.</p><p>Cloudflare Access, the other half of Cloudflare for Teams, <a href="/log-every-request-to-corporate-apps-no-code-changes-required/">also supports</a> granular log export. You can configure Logpush for Access in the Cloudflare dashboard that houses Infrastructure features like the WAF and CDN. We plan to migrate that configuration to this UI in the near future.</p> ]]></content:encoded>
            <category><![CDATA[Cloudflare Zero Trust]]></category>
            <category><![CDATA[Cloudflare Gateway]]></category>
            <category><![CDATA[SIEM]]></category>
            <category><![CDATA[Secure Web Gateway]]></category>
            <category><![CDATA[Product News]]></category>
            <category><![CDATA[Security]]></category>
            <category><![CDATA[Zero Trust]]></category>
            <guid isPermaLink="false">2RtDiCAAPHHmt84TrHLlnm</guid>
            <dc:creator>Sam Rhea</dc:creator>
        </item>
        <item>
            <title><![CDATA[Stream Firewall Events directly to your SIEM]]></title>
            <link>https://blog.cloudflare.com/stream-firewall-events-directly-to-your-siem/</link>
            <pubDate>Fri, 24 Apr 2020 11:00:00 GMT</pubDate>
            <description><![CDATA[ As of today, customers using Cloudflare Logs can create Logpush jobs that send only Firewall Events. These events arrive much faster than our existing HTTP requests logs: they are typically delivered to your logging platform within 60 seconds of sending the response to the client. ]]></description>
            <content:encoded><![CDATA[ <p>The highest trafficked sites using Cloudflare receive billions of requests per day. But only about 5% of those requests typically trigger security rules, whether they be “managed” rules such as our <a href="https://www.cloudflare.com/learning/ddos/glossary/web-application-firewall-waf/">WAF</a> and DDoS protections, or custom rules such as those configured by customers using our powerful Firewall Rules and Rate Limiting engines.</p><p>When enforcement is taken on a request that interrupts the flow of malicious traffic, a <a href="/updates-to-firewall-analytics/#event-based-logging">Firewall Event is logged with detail</a> about the request including which rule triggered us to take action and what action we took, e.g., challenged or blocked outright.</p><p>Previously, if you wanted to ingest all of these events into your <a href="https://www.cloudflare.com/learning/security/what-is-siem/">SIEM</a> or logging platform, you had to take the whole firehose of requests—good and bad—and then filter them client side. If you’re paying by the log line or scaling your own storage solution, this cost can add up quickly. And if you have a security team monitoring logs, they’re being sent a lot of extraneous data to sift through before determining what needs their attention most.</p><p>As of today, customers using Cloudflare Logs can create <a href="https://developers.cloudflare.com/logs/about">Logpush jobs</a> that send only Firewall Events. These events arrive much faster than our existing HTTP request logs: they are typically delivered to your logging platform within 60 seconds of sending the response to the client.</p><p>In this post we’ll show you how to use Terraform and Sumo Logic, an <a href="https://developers.cloudflare.com/logs/analytics-integrations/">analytics integration partner</a>, to get this logging set up live in just a few minutes.</p>
    <div>
      <h2>Process overview</h2>
      <a href="#process-overview">
        
      </a>
    </div>
    <p>The steps below take you through the process of configuring Cloudflare Logs to push security events directly to your logging platform. For purposes of this tutorial, we’ve chosen Sumo Logic as our log destination, but you’re free to use any of our <a href="https://developers.cloudflare.com/logs/analytics-integrations/">analytics partners</a>, or any logging platform that can read from cloud storage such as <a href="https://developers.cloudflare.com/logs/logpush/aws-s3/">AWS S3</a>, <a href="https://developers.cloudflare.com/logs/logpush/azure/">Azure Blob Storage</a>, or <a href="https://developers.cloudflare.com/logs/logpush/google-cloud-storage/">Google Cloud Storage</a>.</p><p>To configure Sumo Logic and Cloudflare, we make use of Terraform, a popular Infrastructure-as-Code tool from HashiCorp. If you’re new to Terraform, see <a href="/getting-started-with-terraform-and-cloudflare-part-1/">Getting started with Terraform and Cloudflare</a> for a guided walkthrough with best practice recommendations such as how to version and store your configuration in git for easy rollback.</p><p>Once the infrastructure is in place, you’ll send a malicious request towards your site to trigger the Cloudflare Web Application Firewall, and watch as the Firewall Events generated by that request show up in Sumo Logic about a minute later.</p>
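<p>For that final step, any request matching a WAF managed rule will do. A hedged sketch — <code>example.com</code> stands in for a zone you control with the XSS managed rules enabled, and the curl command is echoed so the script is safe to run as-is:</p>

```shell
#!/bin/sh
# Hypothetical target zone -- replace example.com with a site you control.
# A classic XSS probe in the query string should trigger a WAF managed rule.
PAYLOAD='<script>alert(1)</script>'
TARGET="https://example.com/?q=${PAYLOAD}"

# Echoed rather than executed; drop the echo to send the request for real.
echo curl -s -o /dev/null -w "%{http_code}" "$TARGET"
```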
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7HQ0Ky3j85kh2abquFBJf7/e0ee2a5d4d613bf597c4298e4be73467/image2-18.png" />
            
            </figure>
    <div>
      <h2>Prerequisites</h2>
      <a href="#prerequisites">
        
      </a>
    </div>
    
    <div>
      <h3>Install Terraform and Go</h3>
      <a href="#install-terraform-and-go">
        
      </a>
    </div>
    <p>First you’ll need to install Terraform. See our Developer Docs for <a href="https://developers.cloudflare.com/terraform/installing/">instructions</a>.</p><p>Next you’ll need to install Go. The easiest way on macOS to do so is with <a href="https://brew.sh/">Homebrew</a>:</p>
            <pre><code>$ brew install golang
$ export GOPATH=$HOME/go
$ mkdir $GOPATH</code></pre>
            <p><a href="https://golang.org/">Go</a> is required because the Sumo Logic Terraform Provider is a "community" plugin, which means it has to be built and installed manually rather than automatically through the Terraform Registry, as will happen later for the Cloudflare Terraform Provider.</p>
    <div>
      <h3>Install the Sumo Logic Terraform Provider Module</h3>
      <a href="#install-the-sumo-logic-terraform-provider-module">
        
      </a>
    </div>
    <p>The official installation instructions for installing the Sumo Logic provider can be found on their <a href="https://github.com/SumoLogic/sumologic-terraform-provider">GitHub Project page</a>, but here are my notes:</p>
            <pre><code>$ mkdir -p $GOPATH/src/github.com/terraform-providers &amp;&amp; cd $_
$ git clone https://github.com/SumoLogic/sumologic-terraform-provider.git
$ cd sumologic-terraform-provider
$ make install</code></pre>
            
    <div>
      <h2>Prepare Sumo Logic to receive Cloudflare Logs</h2>
      <a href="#prepare-sumo-logic-to-receive-cloudflare-logs">
        
      </a>
    </div>
    
    <div>
      <h3>Install Sumo Logic livetail utility</h3>
      <a href="#install-sumo-logic-livetail-utility">
        
      </a>
    </div>
    <p>While not strictly necessary, the <a href="https://help.sumologic.com/05Search/Live-Tail/Live-Tail-CLI">livetail tool</a> from Sumo Logic makes it easy to grab the Cloudflare Logs challenge token we’ll need in a minute, and also to view the fruits of your labor: seeing a Firewall Event appear in Sumo Logic shortly after the malicious request hit the edge.</p><p>On macOS:</p>
            <pre><code>$ brew cask install livetail
...
==&gt; Verifying SHA-256 checksum for Cask 'livetail'.
==&gt; Installing Cask livetail
==&gt; Linking Binary 'livetail' to '/usr/local/bin/livetail'.
livetail was successfully installed!</code></pre>
            
    <div>
      <h3>Generate Sumo Logic Access Key</h3>
      <a href="#generate-sumo-logic-access-key">
        
      </a>
    </div>
    <p>This step assumes you already have a Sumo Logic account. If not, you can sign up for a free trial <a href="https://www.sumologic.com/sign-up/">here</a>.</p><ol><li><p>Browse to <code>https://service.$ENV.sumologic.com/ui/#/security/access-keys</code> where <code>$ENV</code> should be replaced by <a href="http://help.sumologic.com/Send_Data/Collector_Management_API/Sumo_Logic_Endpoints">the environment</a> you chose on signup.</p></li><li><p>Click the "+ Add Access Key" button, give it a name, and click "Create Key"</p></li><li><p>In the next step you'll save the provided Access ID and Access Key as environment variables, so don’t close this modal until you do.</p></li></ol>
    <div>
      <h3>Generate Cloudflare Scoped API Token</h3>
      <a href="#generate-cloudflare-scoped-api-token">
        
      </a>
    </div>
    <ol><li><p>Log in to the <a href="https://dash.cloudflare.com/">Cloudflare Dashboard</a></p></li><li><p>Click on the profile icon in the top-right corner and then select "My Profile"</p></li><li><p>Select "API Tokens" from the nav bar and click "Create Token"</p></li><li><p>Click the "Get started" button next to the "Create Custom Token" label</p></li></ol><p>On the Create Custom Token screen:</p><ol><li><p>Provide a token name, e.g., "Logpush - Firewall Events"</p></li><li><p>Under Permissions, change Account to Zone, and then select Logs and Edit, respectively, in the two drop-downs to the right</p></li><li><p>Optionally, change Zone Resources and IP Address Filtering to restrict access for this token to specific zones or from specific IPs</p></li></ol><p>Click "Continue to summary" and then "Create token" on the next screen. Save the token somewhere secure, e.g., your password manager, as it'll be needed in just a minute.</p>
    <div>
      <h3>Set environment variables</h3>
      <a href="#set-environment-variables">
        
      </a>
    </div>
    <p>Rather than add sensitive credentials to source files (that might get committed to your source code repository), we'll set environment variables and have the Terraform modules read from them.</p>
            <pre><code>$ export CLOUDFLARE_API_TOKEN="&lt;your scoped cloudflare API token&gt;"
$ export CF_ZONE_ID="&lt;tag of zone you wish to send logs for&gt;"</code></pre>
            <p>We'll also need your Sumo Logic environment, Access ID, and Access Key:</p>
            <pre><code>$ export SUMOLOGIC_ENVIRONMENT="eu"
$ export SUMOLOGIC_ACCESSID="&lt;access id from previous step&gt;"
$ export SUMOLOGIC_ACCESSKEY="&lt;access key from previous step&gt;"</code></pre>
            
    <div>
      <h3>Create the Sumo Logic Collector and HTTP Source</h3>
      <a href="#create-the-sumo-logic-collector-and-http-source">
        
      </a>
    </div>
    <p>We'll create a directory to store our Terraform project in and build it up as we go:</p>
            <pre><code>$ mkdir -p ~/src/fwevents &amp;&amp; cd $_</code></pre>
            <p>Then we'll create the Collector and HTTP source that will store and provide Firewall Events logs to Sumo Logic:</p>
            <pre><code>$ cat &lt;&lt;'EOF' | tee main.tf
##################
### SUMO LOGIC ###
##################
provider "sumologic" {
    environment = var.sumo_environment
    access_id = var.sumo_access_id
}

resource "sumologic_collector" "collector" {
    name = "CloudflareLogCollector"
    timezone = "Etc/UTC"
}

resource "sumologic_http_source" "http_source" {
    name = "firewall-events-source"
    collector_id = sumologic_collector.collector.id
    timezone = "Etc/UTC"
}
EOF</code></pre>
            <p>Then we'll create a variables file so Terraform has credentials to communicate with Sumo Logic:</p>
            <pre><code>$ cat &lt;&lt;EOF | tee variables.tf
##################
### SUMO LOGIC ###
##################
variable "sumo_environment" {
    default = "$SUMOLOGIC_ENVIRONMENT"
}

variable "sumo_access_id" {
    default = "$SUMOLOGIC_ACCESSID"
}
EOF</code></pre>
            <p>With our Sumo Logic configuration set, we’ll initialize Terraform with <code>terraform init</code> and then preview what changes Terraform is going to make by running <code>terraform plan</code>:</p>
            <pre><code>$ terraform init

Initializing the backend...

Initializing provider plugins...

Terraform has been successfully initialized!

You may now begin working with Terraform. Try running "terraform plan" to see
any changes that are required for your infrastructure. All Terraform commands
should now work.

If you ever set or change modules or backend configuration for Terraform,
rerun this command to reinitialize your working directory. If you forget, other
commands will detect it and remind you to do so if necessary.</code></pre>
            
            <pre><code>$ terraform plan
Refreshing Terraform state in-memory prior to plan...
The refreshed state will be used to calculate this plan, but will not be
persisted to local or remote state storage.


------------------------------------------------------------------------

An execution plan has been generated and is shown below.
Resource actions are indicated with the following symbols:
  + create

Terraform will perform the following actions:

  # sumologic_collector.collector will be created
  + resource "sumologic_collector" "collector" {
      + destroy        = true
      + id             = (known after apply)
      + lookup_by_name = false
      + name           = "CloudflareLogCollector"
      + timezone       = "Etc/UTC"
    }

  # sumologic_http_source.http_source will be created
  + resource "sumologic_http_source" "http_source" {
      + automatic_date_parsing       = true
      + collector_id                 = (known after apply)
      + cutoff_timestamp             = 0
      + destroy                      = true
      + force_timezone               = false
      + id                           = (known after apply)
      + lookup_by_name               = false
      + message_per_request          = false
      + multiline_processing_enabled = true
      + name                         = "firewall-events-source"
      + timezone                     = "Etc/UTC"
      + url                          = (known after apply)
      + use_autoline_matching        = true
    }

Plan: 2 to add, 0 to change, 0 to destroy.

------------------------------------------------------------------------

Note: You didn't specify an "-out" parameter to save this plan, so Terraform
can't guarantee that exactly these actions will be performed if
"terraform apply" is subsequently run.</code></pre>
            <p>Assuming everything looks good, let’s execute the plan:</p>
            <pre><code>$ terraform apply -auto-approve
sumologic_collector.collector: Creating...
sumologic_collector.collector: Creation complete after 3s [id=108448215]
sumologic_http_source.http_source: Creating...
sumologic_http_source.http_source: Creation complete after 0s [id=150364538]

Apply complete! Resources: 2 added, 0 changed, 0 destroyed.</code></pre>
            <p>Success! At this point you could log into the Sumo Logic web interface and confirm that your Collector and HTTP Source were created successfully.</p>
    <div>
      <h2>Create a Cloudflare Logpush Job</h2>
      <a href="#create-a-cloudflare-logpush-job">
        
      </a>
    </div>
    <p>Before Cloudflare will start sending logs to your collector, you need to demonstrate the ability to read from it. This validation step prevents accidental (or intentional) misconfigurations from overrunning your logging platform.</p>
    <div>
      <h3>Tail the Sumo Logic Collector and await the challenge token</h3>
      <a href="#tail-the-sumo-logic-collector-and-await-the-challenge-token">
        
      </a>
    </div>
    <p>In a new shell window—you should keep the current one with your environment variables set for use with Terraform—we'll start tailing Sumo Logic for events sent from the <code>firewall-events-source</code> HTTP source.</p><p>The first time that you run livetail you'll need to specify your <a href="https://help.sumologic.com/APIs/General-API-Information/Sumo-Logic-Endpoints-and-Firewall-Security">Sumo Logic Environment</a>, Access ID and Access Key, but these values will be stored in the working directory for subsequent runs:</p>
            <pre><code>$ livetail _source=firewall-events-source
### Welcome to Sumo Logic Live Tail Command Line Interface ###
1 US1
2 US2
3 EU
4 AU
5 DE
6 FED
7 JP
8 CA
Please select Sumo Logic environment: 
See http://help.sumologic.com/Send_Data/Collector_Management_API/Sumo_Logic_Endpoints to choose the correct environment. 3
### Authenticating ###
Please enter your Access ID: &lt;access id&gt;
Please enter your Access Key &lt;access key&gt;
### Starting Live Tail session ###</code></pre>
            
    <div>
      <h3>Request and receive challenge token</h3>
      <a href="#request-and-receive-challenge-token">
        
      </a>
    </div>
    <p>Before requesting a challenge token, we need to figure out where Cloudflare should send logs.</p><p>We do this by asking Terraform for the receiver URL of the recently created HTTP source. Note that we modify the returned URL slightly, as Cloudflare Logs expects a <code>sumo://</code> scheme rather than <code>https://</code>.</p>
            <pre><code>$ export SUMO_RECEIVER_URL=$(terraform state show sumologic_http_source.http_source | grep url | awk '{print $3}' | sed -e 's/https:/sumo:/; s/"//g')

$ echo $SUMO_RECEIVER_URL
sumo://endpoint1.collection.eu.sumologic.com/receiver/v1/http/&lt;redacted&gt;</code></pre>
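<p>Under the hood this is just a scheme substitution on the URL string. Here's a standalone sketch of that step, using a made-up receiver path rather than a real endpoint:</p>

```shell
# Swap the https:// scheme for the sumo:// scheme Cloudflare Logs expects.
# The URL below is illustrative only, not a real receiver endpoint.
url='https://endpoint1.collection.eu.sumologic.com/receiver/v1/http/EXAMPLETOKEN'
sumo_url=$(printf '%s' "$url" | sed -e 's|^https:|sumo:|')
echo "$sumo_url"
```

<p>This is the same transformation the Terraform <code>replace()</code> call performs when the Logpush job is created later on.</p>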
            <p>With URL in hand, we can now request the token.</p>
            <pre><code>$ curl -sXPOST -H "Content-Type: application/json" -H "Authorization: Bearer $CLOUDFLARE_API_TOKEN" -d '{"destination_conf":"'''$SUMO_RECEIVER_URL'''"}' https://api.cloudflare.com/client/v4/zones/$CF_ZONE_ID/logpush/ownership

{"errors":[],"messages":[],"result":{"filename":"ownership-challenge-bb2912e0.txt","message":"","valid":true},"success":true}</code></pre>
            <p>Back in the other window where your livetail is running you should see something like this:</p>
            <pre><code>{"content":"eyJhbGciOiJkaXIiLCJlbmMiOiJBMTI4R0NNIiwidHlwIjoiSldUIn0..WQhkW_EfxVy8p0BQ.oO6YEvfYFMHCTEd6D8MbmyjJqcrASDLRvHFTbZ5yUTMqBf1oniPNzo9Mn3ZzgTdayKg_jk0Gg-mBpdeqNI8LJFtUzzgTGU-aN1-haQlzmHVksEQdqawX7EZu2yiePT5QVk8RUsMRgloa76WANQbKghx1yivTZ3TGj8WquZELgnsiiQSvHqdFjAsiUJ0g73L962rDMJPG91cHuDqgfXWwSUqPsjVk88pmvGEEH4AMdKIol0EOc-7JIAWFBhcqmnv0uAXVOH5uXHHe_YNZ8PNLfYZXkw1xQlVDwH52wRC93ohIxg.pHAeaOGC8ALwLOXqxpXJgQ","filename":"ownership-challenge-bb2912e0.txt"}</code></pre>
            <p>Copy the content value from above into an environment variable, as you'll need it in a minute to create the job:</p>
            <pre><code>$ export LOGPUSH_CHALLENGE_TOKEN="&lt;content value&gt;"</code></pre>
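<p>If you'd rather not copy the value by hand, you can pull the <code>content</code> field out of the JSON line with a bit of <code>sed</code>. A minimal sketch, using a stand-in payload (the token value below is fake):</p>

```shell
# Extract the "content" field from a livetail JSON line.
# The payload below is a stand-in; substitute your real livetail line.
payload='{"content":"FAKE_CHALLENGE_TOKEN","filename":"ownership-challenge-bb2912e0.txt"}'
export LOGPUSH_CHALLENGE_TOKEN=$(printf '%s' "$payload" | sed -e 's/.*"content":"\([^"]*\)".*/\1/')
echo "$LOGPUSH_CHALLENGE_TOKEN"
```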
            
    <div>
      <h3>Create the Logpush job using the challenge token</h3>
      <a href="#create-the-logpush-job-using-the-challenge-token">
        
      </a>
    </div>
    <p>With challenge token in hand, we'll use Terraform to create the job.</p><p>First you’ll want to choose the log fields that should be sent to Sumo Logic. You can enumerate the list by querying the dataset:</p>
            <pre><code>$ curl -sXGET -H "Authorization: Bearer $CLOUDFLARE_API_TOKEN" https://api.cloudflare.com/client/v4/zones/$CF_ZONE_ID/logpush/datasets/firewall_events/fields | jq .
{
  "errors": [],
  "messages": [],
  "result": {
    "Action": "string; the code of the first-class action the Cloudflare Firewall took on this request",
    "ClientASN": "int; the ASN number of the visitor",
    "ClientASNDescription": "string; the ASN of the visitor as string",
    "ClientCountryName": "string; country from which request originated",
    "ClientIP": "string; the visitor's IP address (IPv4 or IPv6)",
    "ClientIPClass": "string; the classification of the visitor's IP address, possible values are: unknown | clean | badHost | searchEngine | whitelist | greylist | monitoringService | securityScanner | noRecord | scan | backupService | mobilePlatform | tor",
    "ClientRefererHost": "string; the referer host",
    "ClientRefererPath": "string; the referer path requested by visitor",
    "ClientRefererQuery": "string; the referer query-string was requested by the visitor",
    "ClientRefererScheme": "string; the referer url scheme requested by the visitor",
    "ClientRequestHTTPHost": "string; the HTTP hostname requested by the visitor",
    "ClientRequestHTTPMethodName": "string; the HTTP method used by the visitor",
    "ClientRequestHTTPProtocol": "string; the version of HTTP protocol requested by the visitor",
    "ClientRequestPath": "string; the path requested by visitor",
    "ClientRequestQuery": "string; the query-string was requested by the visitor",
    "ClientRequestScheme": "string; the url scheme requested by the visitor",
    "Datetime": "int or string; the date and time the event occurred at the edge",
    "EdgeColoName": "string; the airport code of the Cloudflare datacenter that served this request",
    "EdgeResponseStatus": "int; HTTP response status code returned to browser",
    "Kind": "string; the kind of event, currently only possible values are: firewall",
    "MatchIndex": "int; rules match index in the chain",
    "Metadata": "object; additional product-specific information. Metadata is organized in key:value pairs. Key and Value formats can vary by Cloudflare security product and can change over time",
    "OriginResponseStatus": "int; HTTP origin response status code returned to browser",
    "OriginatorRayName": "string; the RayId of the request that issued the challenge/jschallenge",
    "RayName": "string; the RayId of the request",
    "RuleId": "string; the Cloudflare security product-specific RuleId triggered by this request",
    "Source": "string; the Cloudflare security product triggered by this request",
    "UserAgent": "string; visitor's user-agent string"
  },
  "success": true
}</code></pre>
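<p>Since <code>logpull_options</code> wants the fields as a single comma-separated <code>fields=</code> parameter, one way to build that string is to scrape the field names out of a saved copy of the response. A rough sketch with <code>awk</code> over a trimmed-down sample (real responses contain many more fields, and the descriptions below are abbreviated):</p>

```shell
# Build a comma-separated field list from a saved dataset response.
# fields_json is a trimmed, illustrative sample of the API output above.
fields_json='{
  "result": {
    "Action": "string; the code of the first-class action",
    "ClientIP": "string; the visitor IP address",
    "RayName": "string; the RayId of the request"
  },
  "success": true
}'
# Lines of the form "Name": "description" carry the field names;
# splitting on double quotes puts the name in field 2.
fields=$(printf '%s\n' "$fields_json" | awk -F'"' '/": "/ {print $2}' | paste -sd, -)
echo "fields=$fields"
```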
            <p>Then you’ll append your Cloudflare configuration to the <code>main.tf</code> file:</p>
            <pre><code>$ cat &lt;&lt;EOF | tee -a main.tf

##################
### CLOUDFLARE ###
##################
provider "cloudflare" {
  version = "~&gt; 2.0"
}

resource "cloudflare_logpush_job" "firewall_events_job" {
  name = "fwevents-logpush-job"
  zone_id = var.cf_zone_id
  enabled = true
  dataset = "firewall_events"
  logpull_options = "fields=RayName,Source,RuleId,Action,EdgeResponseStatus,Datetime,EdgeColoName,ClientIP,ClientCountryName,ClientASNDescription,UserAgent,ClientRequestHTTPMethodName,ClientRequestHTTPHost,ClientRequestPath&amp;timestamps=rfc3339"
  destination_conf = replace(sumologic_http_source.http_source.url,"https:","sumo:")
  ownership_challenge = "$LOGPUSH_CHALLENGE_TOKEN"
}
EOF</code></pre>
            <p>And add to the <code>variables.tf</code> file:</p>
            <pre><code>$ cat &lt;&lt;EOF | tee -a variables.tf

##################
### CLOUDFLARE ###
##################
variable "cf_zone_id" {
  default = "$CF_ZONE_ID"
}
EOF</code></pre>
            <p>Next we re-run <code>terraform init</code> to install the latest Cloudflare Terraform Provider Module. You’ll need to make sure you have at least version 2.6.0, as this is the version in which we added Logpush job support:</p>
            <pre><code>$ terraform init

Initializing the backend...

Initializing provider plugins...
- Checking for available provider plugins...
- Downloading plugin for provider "cloudflare" (terraform-providers/cloudflare) 2.6.0...

Terraform has been successfully initialized!

You may now begin working with Terraform. Try running "terraform plan" to see
any changes that are required for your infrastructure. All Terraform commands
should now work.

If you ever set or change modules or backend configuration for Terraform,
rerun this command to reinitialize your working directory. If you forget, other
commands will detect it and remind you to do so if necessary.</code></pre>
            <p>With the updated provider installed, we review the plan and then apply:</p>
            <pre><code>$ terraform plan
Refreshing Terraform state in-memory prior to plan...
The refreshed state will be used to calculate this plan, but will not be
persisted to local or remote state storage.

sumologic_collector.collector: Refreshing state... [id=108448215]
sumologic_http_source.http_source: Refreshing state... [id=150364538]

------------------------------------------------------------------------

An execution plan has been generated and is shown below.
Resource actions are indicated with the following symbols:
  + create

Terraform will perform the following actions:

  # cloudflare_logpush_job.firewall_events_job will be created
  + resource "cloudflare_logpush_job" "firewall_events_job" {
      + dataset             = "firewall_events"
      + destination_conf    = "sumo://endpoint1.collection.eu.sumologic.com/receiver/v1/http/(redacted)"
      + enabled             = true
      + id                  = (known after apply)
      + logpull_options     = "fields=RayName,Source,RuleId,Action,EdgeResponseStatus,Datetime,EdgeColoName,ClientIP,ClientCountryName,ClientASNDescription,UserAgent,ClientRequestHTTPMethodName,ClientRequestHTTPHost,ClientRequestPath&amp;timestamps=rfc3339"
      + name                = "fwevents-logpush-job"
      + ownership_challenge = "(redacted)"
      + zone_id             = "(redacted)"
    }

Plan: 1 to add, 0 to change, 0 to destroy.

------------------------------------------------------------------------

Note: You didn't specify an "-out" parameter to save this plan, so Terraform
can't guarantee that exactly these actions will be performed if
"terraform apply" is subsequently run.</code></pre>
            
            <pre><code>$ terraform apply --auto-approve
sumologic_collector.collector: Refreshing state... [id=108448215]
sumologic_http_source.http_source: Refreshing state... [id=150364538]
cloudflare_logpush_job.firewall_events_job: Creating...
cloudflare_logpush_job.firewall_events_job: Creation complete after 3s [id=13746]

Apply complete! Resources: 1 added, 0 changed, 0 destroyed.</code></pre>
            <p>Success! Last step is to test your setup.</p>
    <div>
      <h2>Testing your setup by sending a malicious request</h2>
      <a href="#testing-your-setup-by-sending-a-malicious-request">
        
      </a>
    </div>
    <p>The following step assumes that you have the Cloudflare WAF turned on. Alternatively, you can create a Firewall Rule to match your request and generate a Firewall Event that way.</p><p>First make sure that livetail is running as described earlier:</p>
            <pre><code>$ livetail "_source=firewall-events-source"
### Authenticating ###
### Starting Live Tail session ###</code></pre>
            <p>Then, in a browser, make the following request: <code>https://example.com/&lt;script&gt;alert()&lt;/script&gt;</code>. You should see the following returned:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4FOw7aQFIaaERMSbi2bRH3/3ca02fb0fa377956d28b444f4c5b2034/sqli-upinatoms.png" />
            
            </figure><p>And a few moments later in livetail:</p>
            <pre><code>{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"958052","Action":"log","EdgeColoName":"LHR","ClientIP":"203.0.113.69","ClientCountryName":"gb","ClientASNDescription":"NTL","UserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36","ClientRequestHTTPMethodName":"GET","ClientRequestHTTPHost":"upinatoms.com"}
{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"958051","Action":"log","EdgeColoName":"LHR","ClientIP":"203.0.113.69","ClientCountryName":"gb","ClientASNDescription":"NTL","UserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36","ClientRequestHTTPMethodName":"GET","ClientRequestHTTPHost":"upinatoms.com"}
{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"973300","Action":"log","EdgeColoName":"LHR","ClientIP":"203.0.113.69","ClientCountryName":"gb","ClientASNDescription":"NTL","UserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36","ClientRequestHTTPMethodName":"GET","ClientRequestHTTPHost":"upinatoms.com"}
{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"973307","Action":"log","EdgeColoName":"LHR","ClientIP":"203.0.113.69","ClientCountryName":"gb","ClientASNDescription":"NTL","UserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36","ClientRequestHTTPMethodName":"GET","ClientRequestHTTPHost":"upinatoms.com"}
{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"973331","Action":"log","EdgeColoName":"LHR","ClientIP":"203.0.113.69","ClientCountryName":"gb","ClientASNDescription":"NTL","UserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36","ClientRequestHTTPMethodName":"GET","ClientRequestHTTPHost":"upinatoms.com"}
{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"981176","Action":"drop","EdgeColoName":"LHR","ClientIP":"203.0.113.69","ClientCountryName":"gb","ClientASNDescription":"NTL","UserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36","ClientRequestHTTPMethodName":"GET","ClientRequestHTTPHost":"upinatoms.com"}</code></pre>
            <p>Note that for this one malicious request, Cloudflare Logs sent six separate Firewall Events to Sumo Logic. That's because the request triggered six different Managed Rules: #958051, #958052, #973300, #973307, #973331, and #981176.</p>
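<p>Because every event generated by the same request shares a RayName, grouping on that field is an easy way to collapse this kind of fan-out. A small sketch over a saved capture (the sample lines below are trimmed stand-ins for real livetail output, and the second RayName is invented for illustration):</p>

```shell
# Count Firewall Events per RayName to see how many rules each request hit.
# The events below are trimmed stand-ins for a saved livetail capture.
events='{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"958052","Action":"log"}
{"RayName":"58830d3f9945bc36","Source":"waf","RuleId":"981176","Action":"drop"}
{"RayName":"000000aaaabbbbcc","Source":"waf","RuleId":"973300","Action":"log"}'
# Pull out each RayName, then count occurrences, busiest first.
counts=$(printf '%s\n' "$events" | sed -n 's/.*"RayName":"\([^"]*\)".*/\1/p' | sort | uniq -c | sort -rn)
echo "$counts"
```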
    <div>
      <h2>Seeing it all in action</h2>
      <a href="#seeing-it-all-in-action">
        
      </a>
    </div>
    <p>Here's a demo of launching <code>livetail</code>, making a malicious request in a browser, and then seeing the result sent from the Cloudflare Logpush job:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5KtWEvKCr6QrbGKWZGzQ3a/1142fa4d2dddb6fd7cd7278d3273e0ee/fwevents-sumo-demo.gif" />
            
            </figure> ]]></content:encoded>
            <category><![CDATA[Firewall]]></category>
            <category><![CDATA[Logs]]></category>
            <category><![CDATA[Terraform]]></category>
            <category><![CDATA[SIEM]]></category>
            <guid isPermaLink="false">4JdLdzVCAsQGdNwMq7VgCa</guid>
            <dc:creator>Patrick R. Donahue</dc:creator>
        </item>
    </channel>
</rss>