Cloudflare Analytics gives you data, but the default dashboard is limited: you can't combine metrics from different time periods, create custom visualizations, or correlate traffic with business events. You're stuck with predefined charts instead of the specific insights you need to understand your audience and make data-driven decisions. The solution is building custom dashboards using Cloudflare's API and Ruby's rich visualization ecosystem.
Building effective dashboards requires thoughtful architecture. Your dashboard should serve different stakeholders: content creators need traffic insights, developers need performance metrics, and business owners need conversion data. Each needs different visualizations and data granularity.
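One lightweight way to capture those audiences in code is a view configuration that the rest of the pipeline can read. The sketch below is illustrative only; the role names and panel keys are assumptions, not anything Cloudflare defines:

# Illustrative sketch: map each stakeholder role to the metrics and granularity it cares about
DASHBOARD_VIEWS = {
  content_creators: { metrics: %w[pageViews visits], granularity: :daily, panels: %i[top_pages traffic_sources] },
  developers:       { metrics: %w[requests cachedRequests bytes], granularity: :hourly, panels: %i[cache_ratio response_statuses] },
  business_owners:  { metrics: %w[visits], granularity: :weekly, panels: %i[traffic_trend conversions] }
}.freeze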
The architecture has three layers: data collection (Cloudflare API + Ruby scripts), data processing (ETL pipelines in Ruby), and visualization (web interface or static reports). Data flows from Cloudflare to your processing scripts, which transform and aggregate it, then to visualization components that present it. This separation allows you to change visualizations without affecting data collection, and to add new data sources easily.
| Component | Technology | Purpose | Update Frequency |
|---|---|---|---|
| Data Collection | Cloudflare API + cloudflare gem | Fetch raw metrics from Cloudflare | Real-time to hourly |
| Data Storage | PostgreSQL + sequel gem | Store historical data for trends | On collection |
| Data Processing | Ruby scripts + daru gem | Calculate derived metrics, aggregates | On demand or scheduled |
| Visualization | Chartkick + sinatra/rails | Render charts and graphs | On page load |
| Presentation | HTML/CSS + bootstrap | User interface and layout | Static |
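Wired together, the layers can be as little as three objects called from one entry point. The sketch below only shows the shape of that wiring; `Collector`, `Processor`, and `Renderer` are placeholder names, not classes defined later in this article:

# Sketch of the three-layer pipeline; the class names are placeholders
class DashboardPipeline
  def run
    raw     = Collector.fetch_last_hour                      # data collection (Cloudflare API)
    metrics = Processor.aggregate(raw)                       # data processing (ETL)
    Renderer.write('public/dashboard/index.html', metrics)   # visualization
  end
end

DashboardPipeline.new.run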
Cloudflare's GraphQL Analytics API provides comprehensive data. The `cloudflare` gem handles the REST endpoints; the GraphQL Analytics endpoint itself is a single authenticated HTTPS POST, so the query below goes through `Net::HTTP`:
gem 'cloudflare'

require 'cloudflare'
require 'net/http'
require 'json'
require 'uri'

# REST client (zones, DNS records, etc.)
cf = Cloudflare.connect(
  email: ENV['CLOUDFLARE_EMAIL'],
  key: ENV['CLOUDFLARE_API_KEY']
)

# GraphQL Analytics endpoint
GRAPHQL_URI = URI('https://api.cloudflare.com/client/v4/graphql')

# Fetch zone analytics
def fetch_zone_analytics(start_time, end_time, metrics, dimensions = [])
  query = <<~GRAPHQL
    query {
      viewer {
        zones(filter: {zoneTag: "#{ENV['CLOUDFLARE_ZONE_ID']}"}) {
          httpRequests1mGroups(
            limit: 10000,
            filter: {
              datetime_geq: "#{start_time}",
              datetime_leq: "#{end_time}"
            },
            orderBy: [datetime_ASC]
          ) {
            dimensions {
              datetime
              #{dimensions.join("\n")}
            }
            sum {
              #{metrics.join("\n")}
            }
          }
        }
      }
    }
  GRAPHQL

  request = Net::HTTP::Post.new(GRAPHQL_URI)
  request['X-Auth-Email'] = ENV['CLOUDFLARE_EMAIL']
  request['X-Auth-Key']   = ENV['CLOUDFLARE_API_KEY']
  request['Content-Type'] = 'application/json'
  request.body = { query: query }.to_json

  response = Net::HTTP.start(GRAPHQL_URI.host, GRAPHQL_URI.port, use_ssl: true) do |http|
    http.request(request)
  end

  JSON.parse(response.body)
end
# Common metrics and dimensions
METRICS = [
'visits',
'pageViews',
'requests',
'bytes',
'cachedBytes',
'cachedRequests',
'threats',
'countryMap { bytes, requests, clientCountryName }'
]
DIMENSIONS = [
'clientCountryName',
'clientRequestPath',
'clientDeviceType',
'clientBrowserName',
'originResponseStatus'
]
Create a data collector service:
# lib/data_collector.rb
# DB is a Sequel connection with the pg_json extension loaded,
# e.g. DB = Sequel.connect(ENV['DATABASE_URL']); DB.extension :pg_json
class DataCollector
  def self.collect_hourly_metrics
    end_time = Time.now.utc
    start_time = end_time - 3600

    response = fetch_zone_analytics(
      start_time.iso8601,
      end_time.iso8601,
      METRICS,
      ['clientCountryName', 'clientRequestPath']
    )

    # The per-minute groups live several levels deep in the GraphQL response
    groups = response.dig('data', 'viewer', 'zones', 0, 'httpRequests1mGroups') || []

    # Store in database
    store_in_database(groups, :hourly_metrics, start_time, end_time)

    # Calculate aggregates
    calculate_aggregates(groups)
  end

  def self.store_in_database(groups, table, period_start, period_end)
    DB[table].insert(
      collected_at: Time.now,
      data: Sequel.pg_json(groups),
      period_start: period_start,
      period_end: period_end
    )
  end

  def self.calculate_aggregates(groups)
    # Traffic by country and by page
    by_country = groups.group_by { |g| g.dig('dimensions', 'clientCountryName') }
    by_page    = groups.group_by { |g| g.dig('dimensions', 'clientRequestPath') }

    # Store aggregates
    DB[:aggregates].insert(
      calculated_at: Time.now,
      top_countries: Sequel.pg_json(top_10(by_country)),
      top_pages: Sequel.pg_json(top_10(by_page)),
      total_visits: groups.sum { |g| g.dig('sum', 'visits').to_i }
    )
  end

  # Keep the ten largest groups, ranked by total visits
  def self.top_10(grouped)
    grouped
      .transform_values { |rows| rows.sum { |r| r.dig('sum', 'visits').to_i } }
      .sort_by { |_, visits| -visits }
      .first(10)
      .to_h
  end
end

# Run every hour
DataCollector.collect_hourly_metrics
Choose gems based on your needs:
gem 'chartkick'
# Simple usage (ActiveRecord models; group_by_day comes from the groupdate gem)
<%= line_chart Visit.group_by_day(:created_at).count %>
<%= pie_chart Visit.group(:country).count %>
<%= column_chart PageView.group(:browser).count %>
# With Cloudflare data: chart the hourly aggregates written by DataCollector
def traffic_over_time_chart
  data = DB[:aggregates]
    .select(:calculated_at, :total_visits)
    .order(Sequel.desc(:calculated_at))
    .limit(48)
    .all
    .reverse
  line_chart data.map { |r| [r[:calculated_at], r[:total_visits]] }
end
gem 'gruff'
# Create charts as images
def create_traffic_chart_image
g = Gruff::Line.new
g.title = 'Traffic Last 7 Days'
# Add data
g.data('Visits', visits_last_7_days)
g.data('Pageviews', pageviews_last_7_days)
# Customize
g.labels = date_labels_for_last_7_days # hash of index => label, e.g. {0 => 'Mon', 1 => 'Tue', ...}
g.theme = {
colors: ['#ff9900', '#3366cc'],
marker_color: '#aaa',
font_color: 'black',
background_colors: 'white'
}
# Write to file
g.write('public/images/traffic_chart.png')
end
gem 'daru'
gem 'daru-view' # For visualization
# Load Cloudflare data into dataframe
df = Daru::DataFrame.from_csv('cloudflare_data.csv')
# Analyze
daily_traffic = df.group_by([:date]).aggregate(visits: :sum, pageviews: :sum)
# Create visualization
Daru::View::Plot.new(
daily_traffic[:visits],
type: :line,
title: 'Daily Traffic'
).show
gem 'rails_charts'
# Even without Rails
class DashboardController
def index
@charts = {
traffic: RailsCharts::LineChart.new(
traffic_data,
title: 'Traffic Trends',
height: 300
),
sources: RailsCharts::PieChart.new(
source_data,
title: 'Traffic Sources'
)
}
end
end
Create dashboards that update in real-time:
# app.rb
require 'sinatra'
require 'json'
require 'cloudflare'
get '/dashboard' do
erb :dashboard
end
get '/stream' do
content_type 'text/event-stream'
stream do |out|
loop do
# Fetch latest data
data = fetch_realtime_metrics
# Send as SSE
out "data: #{data.to_json}\n\n"
sleep 30 # Update every 30 seconds
end
end
end
// JavaScript in the dashboard page (connects to the SSE stream above)
const eventSource = new EventSource('/stream');
eventSource.onmessage = (event) => {
const data = JSON.parse(event.data);
updateCharts(data);
};
# Generate the static dashboard every five minutes (scheduled via cron below)
namespace :dashboard do
desc "Generate static dashboard"
task :generate do
# Fetch data
metrics = fetch_all_metrics
# Generate HTML with embedded data
template = File.read('templates/dashboard.html.erb')
html = ERB.new(template).result(binding)
# Write to file
File.write('public/dashboard/index.html', html)
# Also generate JSON for AJAX updates
File.write('public/dashboard/data.json', metrics.to_json)
end
end
# Schedule with cron
# */5 * * * * cd /path && rake dashboard:generate
gem 'faye-websocket'
require 'faye/websocket'
require 'eventmachine'

# Rack app; run it under an EventMachine-based server such as Thin
App = lambda do |env|
if Faye::WebSocket.websocket?(env)
ws = Faye::WebSocket.new(env)
ws.on :open do |event|
# Send initial data
ws.send(initial_dashboard_data.to_json)
# Start update timer
timer = EM.add_periodic_timer(30) do
ws.send(update_dashboard_data.to_json)
end
ws.on :close do |event|
EM.cancel_timer(timer)
ws = nil
end
end
ws.rack_response
else
# Serve static dashboard
[200, {'Content-Type' => 'text/html'}, [File.read('public/dashboard.html')]]
end
end
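To serve that Rack app you need an EventMachine-based server such as Thin. A minimal `config.ru` might look like this; the file path is an assumption about your project layout:

# config.ru -- run with: thin start -R config.ru
require_relative 'lib/websocket_dashboard' # the file defining App above (path assumed)
run App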
Generate and distribute reports automatically:
# lib/reporting/daily_report.rb
class DailyReport
def self.generate
# Fetch data for yesterday (Date.yesterday and beginning_of_day come from ActiveSupport)
start_time = Date.yesterday.beginning_of_day
end_time = Date.yesterday.end_of_day
data = {
summary: daily_summary(start_time, end_time),
top_pages: top_pages(start_time, end_time, limit: 10),
traffic_sources: traffic_sources(start_time, end_time),
performance: performance_metrics(start_time, end_time),
anomalies: detect_anomalies(start_time, end_time)
}
# Generate report in multiple formats
generate_html_report(data)
generate_pdf_report(data)
generate_email_report(data)
generate_slack_report(data)
# Archive
archive_report(data, Date.yesterday)
end
def self.generate_html_report(data)
template = File.read('templates/report.html.erb')
html = ERB.new(template).result_with_hash(data)
File.write("reports/daily/#{Date.yesterday}.html", html)
# Upload to S3 for sharing
upload_to_s3("reports/daily/#{Date.yesterday}.html")
end
def self.generate_email_report(data)
html = render_template('templates/email_report.html.erb', data)
text = render_template('templates/email_report.txt.erb', data)
Mail.deliver do
to ENV['REPORT_RECIPIENTS'].split(',')
subject "Daily Report for #{Date.yesterday}"
html_part do
content_type 'text/html; charset=UTF-8'
body html
end
text_part do
body text
end
end
end
def self.generate_slack_report(data)
attachments = [
{
title: "📊 Daily Report - #{Date.yesterday}",
fields: [
{
title: "Total Visits",
value: data[:summary][:visits].to_s,
short: true
},
{
title: "Top Page",
value: data[:top_pages].first[:path],
short: true
}
],
color: "good"
}
]
# Post via the slack-notifier gem (webhook URL assumed in SLACK_WEBHOOK_URL)
Slack::Notifier.new(ENV['SLACK_WEBHOOK_URL'], channel: '#reports').post(
attachments: attachments
)
end
end
# Schedule with whenever
every :day, at: '6am' do
runner "DailyReport.generate"
end
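The report calls a `detect_anomalies` helper that isn't shown above. A minimal sketch, assuming the `aggregates` table written by `DataCollector` and a simple deviation-from-trailing-average rule, could look like this:

class DailyReport
  # Flag the period as anomalous if visits deviate from the trailing 7-day average
  # by more than the threshold (50% by default)
  def self.detect_anomalies(start_time, end_time, threshold: 0.5)
    period_visits = DB[:aggregates]
      .where(calculated_at: start_time..end_time)
      .sum(:total_visits).to_i

    baseline = DB[:aggregates]
      .where(calculated_at: (start_time - 7 * 86_400)...start_time)
      .sum(:total_visits).to_i / 7.0

    return [] if baseline.zero?

    change = (period_visits - baseline) / baseline
    change.abs > threshold ? [{ metric: 'visits', change_pct: (change * 100).round(1) }] : []
  end
end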
Make dashboards interactive:
# Backend API endpoint
get '/api/metrics' do
start_date = params[:start_date] || (Date.today - 7).to_s
end_date = params[:end_date] || Date.today.to_s
metrics = fetch_metrics_for_range(start_date, end_date)
content_type :json
metrics.to_json
end
# Click on a country to see regional data
<%= pie_chart traffic_by_country,
library: {
onClick: 'function(point) {
window.location.href = "/dashboard/country/" + point.name;
}'
}
%>
# Country detail page
get '/dashboard/country/:country' do
@country = params[:country]
@metrics = fetch_country_metrics(@country)
erb :country_dashboard
end
# Compare periods
def compare_periods(current_start, current_end, previous_start, previous_end)
current = fetch_metrics(current_start, current_end)
previous = fetch_metrics(previous_start, previous_end)
{
current: current,
previous: previous,
change: calculate_percentage_change(current, previous)
}
end
# Display comparison
Visits: <%= current[:visits] %>
(<%= change[:visits] %>%)
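`calculate_percentage_change` is just per-metric arithmetic; a small sketch, assuming both periods are hashes of numeric metrics keyed the same way, is:

# Percentage change per metric, e.g. { visits: 12.5, pageviews: -3.1 }
def calculate_percentage_change(current, previous)
  current.each_with_object({}) do |(metric, value), changes|
    prev = previous[metric].to_f
    changes[metric] = prev.zero? ? nil : (((value - prev) / prev) * 100).round(1)
  end
end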
Deploy dashboards efficiently:
# Cache dashboard data
def cached_dashboard_data
Rails.cache.fetch('dashboard_data', expires_in: 5.minutes) do
fetch_dashboard_data
end
end
# Cache individual charts
def cached_chart(name, &block)
Rails.cache.fetch("chart_#{name}_#{Date.today}", &block)
end
# Load initial data, then update incrementally
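One way to do that is to render recent history into the page, then have the client poll a small endpoint that returns only rows newer than a timestamp. The route below is a sketch against the `aggregates` table used earlier; the `since` parameter name is an assumption:

# Sketch: return only aggregates newer than the client's last-seen timestamp
get '/api/metrics/updates' do
  since = params[:since] ? Time.at(params[:since].to_i) : Time.now - 3600
  rows  = DB[:aggregates].where { calculated_at > since }.order(:calculated_at).all
  content_type :json
  rows.to_json
end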
# Export dashboard as static HTML
task :export_dashboard do
# Fetch all data
data = fetch_complete_dashboard_data
# Generate standalone HTML with embedded data
html = generate_standalone_html(data)
# Compress with gzip so the .gz file opens with standard tools
require 'zlib'
compressed = Zlib.gzip(html)
# Save
File.binwrite('dashboard_export.html.gz', compressed)
end
# Optimize database queries: group on the truncated hour and select only aggregated columns
def optimized_metrics_query(start_time, end_time)
  DB[:metrics]
    .select(
      Sequel.lit("DATE_TRUNC('hour', timestamp) AS hour"),
      Sequel.lit("SUM(visits) AS visits"),
      Sequel.lit("SUM(pageviews) AS pageviews")
    )
    .where(timestamp: start_time..end_time)
    .group(Sequel.lit("DATE_TRUNC('hour', timestamp)"))
    .order(Sequel.lit("hour"))
    .all
end
# Use materialized views for complex aggregations
DB.run(<<~SQL)
  CREATE MATERIALIZED VIEW daily_aggregates AS
  SELECT
    DATE(timestamp) AS date,
    SUM(visits) AS visits,
    SUM(pageviews) AS pageviews,
    COUNT(DISTINCT ip) AS unique_visitors
  FROM metrics
  GROUP BY DATE(timestamp)
SQL
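Materialized views are snapshots, so refresh them on the same cadence as collection, for example at the end of the hourly job:

# Refresh the snapshot after each collection run
DB.run('REFRESH MATERIALIZED VIEW daily_aggregates')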
Start building your custom dashboard today. Begin with a simple HTML page that displays basic Cloudflare metrics. Then add Ruby scripts to automate data collection. Gradually introduce more sophisticated visualizations and interactive features. Within weeks, you'll have a powerful analytics platform that gives you insights no standard dashboard can provide.