How to Enable Logpush to Cloudflare R2


Cloudflare R2 lets developers store large amounts of unstructured data and access it whenever they need, with zero egress fees. R2's S3-compatible API makes seamless migrations and advanced integrations possible.

R2 fits multiple scenarios: cloud-based application storage, cloud storage for web content and podcast episodes, data lakes for analytics and big data, and storage for the output of large batch processes, such as machine-learning model artifacts or datasets.
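Because R2 exposes an S3-compatible API, existing S3 tooling can point at an account-scoped endpoint. As a minimal sketch (the account ID and bucket name below are placeholders, not real values):

```shell
# R2's S3-compatible endpoint is derived from your Cloudflare account ID.
# ACCOUNT_ID is a placeholder here, not a real account.
ACCOUNT_ID="0123456789abcdef"
R2_ENDPOINT="https://${ACCOUNT_ID}.r2.cloudflarestorage.com"
echo "$R2_ENDPOINT"

# With S3 credentials configured, S3 tools can then target R2, e.g.:
#   aws s3 ls s3://my-bucket --endpoint-url "$R2_ENDPOINT"
```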

Create a bucket and API tokens

Logs can be sent to R2 directly from the Cloudflare dashboard or via the API using Cloudflare Logpush.

Create R2 Bucket

  • Go to R2 and select ‘Create Bucket’.
  • Enter the bucket name and click ‘Create Bucket’.
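If you prefer the command line, the same bucket can also be created with Cloudflare's Wrangler CLI. This is a hedged alternative sketch; it assumes Wrangler is installed and authenticated, and the bucket name is just an example:

```shell
# Create an R2 bucket from the CLI (requires an authenticated wrangler).
npx wrangler r2 bucket create my-logpush-bucket
```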

Create an R2 API token

  • Open R2 and select ‘Manage R2 API Tokens’.
  • Click ‘Create API token’.
  • Under ‘Permissions’, select ‘Edit’ permissions for your token.
  • Copy the Access Key ID and Secret Access Key.

Make sure that you have the following permissions: R2 Write and Logshare Edit. Alternatively, you can create a Cloudflare API token with these permissions: Zone scope with Logs Edit permission, and Account scope with R2 Write permission.
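If you go the API-token route, it helps to sanity-check the token before wiring it into Logpush. A hedged sketch using Cloudflare's token verification endpoint (`<API_TOKEN>` is a placeholder to fill in):

```shell
# Verify that the API token is valid and active before using it.
curl -s -H "Authorization: Bearer <API_TOKEN>" \
  "https://api.cloudflare.com/client/v4/user/tokens/verify"
```

A healthy token returns a JSON response with `"status": "active"`.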

Enable Logpush via the Cloudflare dashboard

Now let’s enable Logpush to R2 using the dashboard.

  1. Log in to your Cloudflare account.
  2. Choose the domain or Enterprise account you wish to use with Logpush.
  3. Go to Analytics & Logs > Logs.
  4. Select Add Logpush job; a modal window will open.
  5. Choose the dataset to send to a storage service.
  6. Choose the data fields to include in your logs. You can add or remove fields later under Logs > Logpush settings.
  7. Select R2 and enter the destination details: bucket path, R2 Access Key ID, and Secret Access Key.
  8. Click Validate access.
  9. To finish enabling the Logpush job, click Save and Start Pushing.

Enable Logpush via the API

To create a job, send a POST request with the following fields to the Logpush jobs endpoint:

  • name: the job name; you can use your domain name (optional).
  • destination_conf: the log destination, consisting of the bucket path, account ID, R2 Access Key ID, and R2 Secret Access Key. To separate your logs into daily subfolders, we recommend adding the {DATE} parameter to destination_conf.
  • dataset: the log category you want to receive.

Example of a cURL request:

curl -X POST 'https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs' \
-H 'X-Auth-Key: <API_KEY>' \
-H 'X-Auth-Email: <EMAIL>' \
-H 'Content-Type: application/json' \
-d '{
"name": "<DOMAIN_NAME>",
"logpull_options": "fields=ClientIP,ClientRequestHost,ClientRequestMethod,ClientRequestURI,EdgeEndTimestamp,EdgeResponseBytes,EdgeResponseStatus,EdgeStartTimestamp,RayID&timestamps=rfc3339",
"destination_conf": "r2://<BUCKET_PATH>/{DATE}?account-id=<ACCOUNT_ID>&access-key-id=<R2_ACCESS_KEY_ID>&secret-access-key=<R2_SECRET_ACCESS_KEY>",
"dataset": "http_requests",
"enabled": true
}' | jq .
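The {DATE} placeholder in destination_conf is what produces the daily subfolders: Logpush substitutes the current date in YYYYMMDD form when it writes each batch. A small sketch of that substitution (the bucket path is an example, and expand_date is just an illustrative helper, not part of any Cloudflare tooling):

```shell
# Logpush replaces {DATE} with the current date in YYYYMMDD format,
# so each day's logs land in their own subfolder.
path_template="my-bucket/http_requests/{DATE}"

expand_date() {
  # $1 = path template, $2 = date in YYYYMMDD form
  printf '%s\n' "${1/\{DATE\}/$2}"
}

expand_date "$path_template" "$(date -u +%Y%m%d)"
```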

If you need any assistance, feel free to Get Assistance.

Also check: Fix Cloudflare Error 524


Written by actsupp-r0cks