Dashboard
Last sync: 1 min ago · 🟢 Worker Live
📨
+5 today
47
Total Reports
↑12%
12
Processed Today
6 rules
3
Active Syncs
💰
+2 today
8
Sales Reports
💾
Firebase
284MB
Storage Used
⏰ Next Scheduled Pull · Manage →
Sales Report — next pull in
14:32
TRIGGER
⏱ Scheduled
Every 2 hours
Sales Report — Daily
subject:"Sales Report" · PDF · Every 2h
2h
Revenue Summary
subject:"Revenue" · XLSX · Every 4h
4h
Custom: Ops Report
subject:"Ops" · PDF, CSV · 8AM + 3PM
2×/day
Custom: Client Export
subject:"Client Export" · XLSX · Daily 9AM
Daily
📈 This Week — Reports Collected
📋 Live Activity · All →
📨
Gmail sync — 2 new files from revhitesh@gmail.com
just now · auto-scheduled
sales_report_mar07.pdf stored → R2 + Firestore
3 min ago · Worker
revenue_summary.xlsx uploaded via drop folder
21 min ago · Manual
Ops Report schedule triggered (1 file pulled)
3h ago · 8:00 AM cron
📊
Client Export — no new email found
5h ago · Daily 9AM cron
📂 By Report Type
Sales Reports (18)
Revenue (12)
Custom (11)
Manual (6)
📈 30-Day Pull Volume
🕐 Pulls by Hour of Day
💰 Sales Report Pull Frequency
Sales Report — pulled every 2 hours automatically. Manual trigger available via Sync Now.
📊 File Format Breakdown
All (14)
💰 Sales (6)
📊 Revenue (4)
🔧 Custom (4)
File Name · Report Type · Source / Trigger · Received · Size · Status · Actions
🏨 Hotel Properties & Report Types
Each property is a category — expand to manage report types inside
🗑️
Global Email Deletion Policy
Currently: emails are kept in the inbox after a pull. Toggle to delete all successfully pulled emails.
OFF
⚠️ Global delete is ON. After a successful pull, emails will be moved to Gmail Trash for all schedules unless overridden per-schedule. Only emails where every attachment was stored successfully will be deleted. Emails with failed or partial pulls are never deleted.
⏱ Active Schedules 0
No schedules yet — create one on the right →
➕ Add New Schedule
📧
Delete email after pull
Emails stay in inbox after attachment is saved
⚠️ Emails moved to Gmail Trash after all attachments confirmed stored. Failed pulls are never deleted.
☁️ Firebase Storage Upload
📂
Drop files or click to browse
Direct upload → Firebase Storage → Firestore metadata
PDF · XLSX · CSV
📨 Manual Gmail Pull
Connected
revhitesh@gmail.com
● OAuth active · Gmail API authorized
⚡ Manual Now
Runs immediately
⏱ Schedule
Set recurring rule
📂 Load Source File
Upload File
From Test Pull
From Reports
📄
Drop PDF, CSV, or Excel here — or click to browse
Supported: .pdf .csv .xlsx .xls
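The drop folder only accepts the four extensions listed above. A minimal sketch of the kind of client-side check the upload widget could run before sending a file (the helper name and implementation are assumptions, not the dashboard's actual code):

```javascript
// Hypothetical validation helper — the dashboard's real upload code is not shown.
const ALLOWED_EXTENSIONS = [".pdf", ".csv", ".xlsx", ".xls"];

// Returns true when the filename ends with a supported extension (case-insensitive).
function isSupportedReportFile(filename) {
  const lower = filename.toLowerCase();
  return ALLOWED_EXTENSIONS.some((ext) => lower.endsWith(ext));
}
```

Checking the extension client-side gives fast feedback; the Worker would still need to validate server-side, since extensions are trivially spoofable.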
📋 Loaded Sources 0
No files loaded yet — upload or select a source file
🧩 Extracted Fields 0
Load a file to see extracted data fields here — drag them to the board below
📋 Report Builder Board
📊 Data Fields 0
Drag fields here from the palette above
fx
🧮 Calculations & Results 0
Run calculations or drag fields here
📄 Report Output Preview
Add data fields and calculations above to build your custom report
📬 Scan Gmail Inbox · revhitesh@gmail.com
🔥
Firebase CONNECTED
repot-hub · Firestore + Storage
🟢 Live — reading from repot-hub
Create Firebase project
Enable Firestore Database
Enable Firebase Storage
SDK initialized in dashboard
Add service account key → Worker
Set Firestore security rules
☁️
Cloudflare
Workers + R2 + Pages
Create Cloudflare account
Create R2 bucket (drop folder)
Deploy Cloudflare Worker
Configure cron schedules
Deploy dashboard via Pages
Set CORS on R2 bucket
📨
Gmail API
OAuth + Polling
Enable Gmail API (GCP)
Create OAuth 2.0 credentials
Authorize revhitesh@gmail.com
Store refresh token in Worker
Configure per-report filters
Test end-to-end pipeline
⚙️ wrangler.toml — Cron Configuration
name = "reporthub-worker"
main = "worker.js"
compatibility_date = "2024-01-01"

[[r2_buckets]]
binding = "R2"
bucket_name = "reporthub-reports"   # Your drop folder

[vars]
FIREBASE_PROJECT = "repot-hub"      # ✅ Your Firebase project

# Secrets — run each command below in your terminal
# wrangler secret put GMAIL_CLIENT_ID
#   → 555493780491-5f3qtn6h399jgcqmsau8h2riupjr6a4f.apps.googleusercontent.com
# wrangler secret put GMAIL_CLIENT_SECRET
#   → 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
# wrangler secret put GMAIL_REFRESH_TOKEN
#   → 1//04erpD5YgbhaHCgYIARAAGAQSNwF-L9IrfBCy...
# wrangler secret put FIREBASE_SA_KEY
#   → paste contents of Firebase service account JSON

[triggers]
crons = [
  "0 */2 * * *",  # Sales Report — every 2 hours
  "0 */4 * * *",  # Revenue Summary — every 4 hours
  "0 8,15 * * *", # Custom Ops — 8:00 AM + 3:00 PM
  "0 9 * * *",    # Client Export — daily 9:00 AM
  "0 18 * * *",   # Analytics Dump — daily 6:00 PM
  "0 8 * * 1",    # Finance Export — Monday 8:00 AM
]
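With several cron expressions on one Worker, Cloudflare passes the expression that fired in `event.cron`, so a single `scheduled` handler can route each trigger to the right report. A minimal routing sketch — the schedule names mirror the dashboard, but the mapping and the `pullSchedule` helper are assumptions:

```javascript
// Map each cron expression from wrangler.toml to a schedule name (assumed names).
const CRON_SCHEDULES = {
  "0 */2 * * *":  "sales-report",
  "0 */4 * * *":  "revenue-summary",
  "0 8,15 * * *": "custom-ops",
  "0 9 * * *":    "client-export",
  "0 18 * * *":   "analytics-dump",
  "0 8 * * 1":    "finance-export",
};

// Resolve which schedule a firing cron belongs to (null if unknown).
function scheduleForCron(cron) {
  return CRON_SCHEDULES[cron] ?? null;
}

// In a real Worker this object would be the default export.
const worker = {
  async scheduled(event, env, ctx) {
    const schedule = scheduleForCron(event.cron); // event.cron = the expression that fired
    if (schedule) ctx.waitUntil(pullSchedule(env, schedule)); // pullSchedule: hypothetical stub
  },
};
```

`ctx.waitUntil` keeps the Worker alive until the pull finishes, even after the handler returns.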
🤖 Pipeline Architecture
Trigger
Cron fires or manual Sync Now click
Worker wakes on schedule (every 2h for Sales) OR immediately on manual button press from dashboard.
Auth
Gmail OAuth token auto-refresh
Stored refresh token → fresh access token. No manual re-auth ever needed.
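The refresh step above is a standard OAuth 2.0 exchange against Google's token endpoint. A sketch of what the Worker's `getToken` could look like — the endpoint and parameter names are Google's documented OAuth interface, while the function names and `env` secret names are assumptions tied to this project's wrangler.toml:

```javascript
// Google's standard OAuth 2.0 token endpoint.
const TOKEN_URL = "https://oauth2.googleapis.com/token";

// Build the x-www-form-urlencoded POST body (pure function, easy to test).
function buildRefreshBody(clientId, clientSecret, refreshToken) {
  return new URLSearchParams({
    client_id: clientId,
    client_secret: clientSecret,
    refresh_token: refreshToken,
    grant_type: "refresh_token",
  }).toString();
}

// Exchange the stored refresh token for a short-lived access token.
async function getAccessToken(env) {
  const res = await fetch(TOKEN_URL, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: buildRefreshBody(
      env.GMAIL_CLIENT_ID,
      env.GMAIL_CLIENT_SECRET,
      env.GMAIL_REFRESH_TOKEN
    ),
  });
  const { access_token } = await res.json();
  return access_token; // typically valid for about an hour
}
```

Because the refresh token is long-lived and stored as a Worker secret, each cron run can mint a fresh access token without any interactive login.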
Search
Query inbox per schedule rules
Searches unread emails: subject:"Sales Report" has:attachment is:unread
Extract
Download + deduplicate
Streams PDF/CSV/XLSX bytes. SHA-256 hash check against Firestore prevents duplicates.
Store
R2 upload + Firestore metadata
File bytes → Cloudflare R2. Name, size, type, source, timestamp → Firebase Firestore.
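`env.R2.put` is the Workers R2 binding API; the metadata write can go through Firestore's REST API, which requires explicitly typed field values. A sketch under assumptions — the `reports` collection name, the metadata shape, and the helper names are hypothetical:

```javascript
// Convert a plain metadata object into Firestore's typed REST format.
function toFirestoreFields(meta) {
  return {
    name:      { stringValue: meta.name },
    size:      { integerValue: String(meta.size) }, // Firestore integers are sent as strings
    type:      { stringValue: meta.type },
    source:    { stringValue: meta.source },
    timestamp: { timestampValue: meta.timestamp },  // RFC 3339, e.g. "2024-03-07T08:00:00Z"
  };
}

// Store the file bytes in R2, then record metadata in Firestore.
async function storeFile(env, fbToken, key, bytes, meta) {
  await env.R2.put(key, bytes); // file bytes → R2 bucket (binding from wrangler.toml)
  await fetch(
    `https://firestore.googleapis.com/v1/projects/${env.FIREBASE_PROJECT}` +
      `/databases/(default)/documents/reports`, // "reports" collection is an assumption
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${fbToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ fields: toFirestoreFields(meta) }),
    }
  );
}
```

Splitting bytes (R2) from metadata (Firestore) keeps the dashboard's queries cheap: it lists documents from Firestore and only touches R2 when a download is requested.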
Dashboard
Reports appear with download links
Dashboard polls Firestore. Signed R2 URLs generated on demand (1h expiry).
Cleanup (optional)
Delete source email from Gmail
If delete after pull is enabled for this schedule, the email is moved to Gmail Trash only after all attachments are confirmed stored. Partial or failed pulls are never deleted.
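The cleanup rule above can be captured as a small gate: trash only when the schedule opts in and every attachment was confirmed stored. The trash endpoint is Gmail's documented "move to Trash" call; the per-attachment result shape (`{ stored: boolean }`) is an assumption for illustration:

```javascript
// True only when the pull produced at least one attachment and all were stored.
function allStored(results) {
  return results.length > 0 && results.every((r) => r.stored === true);
}

// Move the source email to Gmail Trash, but only when it is safe to do so.
async function maybeTrashEmail(messageId, results, tok, schedule) {
  if (!schedule.deleteAfterPull || !allStored(results)) return false; // partial/failed pulls are never deleted
  await fetch(
    `https://gmail.googleapis.com/gmail/v1/users/me/messages/${messageId}/trash`,
    { method: "POST", headers: { Authorization: `Bearer ${tok}` } }
  );
  return true;
}
```

Note this moves the message to Trash rather than permanently deleting it, so a mistaken pull can still be recovered from Gmail for ~30 days.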
💻 worker.js · Full source saved in worker.js
// ReportHub — Cloudflare Worker v2
// CORS enabled · Date range filtering · Firebase Storage

// ── CORS Headers ─────────────────────────────────────
const CORS_HEADERS = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
  "Access-Control-Allow-Headers": "Content-Type",
};

export default {
  async fetch(req, env) {
    // Handle CORS preflight
    if (req.method === "OPTIONS")
      return new Response(null, { status: 204, headers: CORS_HEADERS });

    const url = new URL(req.url);

    // POST /sync — manual trigger with optional date range
    if (req.method === "POST" && url.pathname === "/sync") {
      const { subject, after, before, types } = await req.json().catch(() => ({}));
      const result = await pullAll(env, subject, after, before, types);
      return corsJson({ ok: true, ...result });
    }

    // GET /sync — sync all schedules
    if (url.pathname === "/sync") {
      const result = await pullAll(env);
      return corsJson({ ok: true, ...result });
    }

    return new Response("ReportHub Worker v2 ✅", { headers: CORS_HEADERS });
  }
};

async function pull(sched, tok, env, after, before) {
  // Build Gmail query with optional date range
  let q = `subject:"${sched.q}" has:attachment is:unread`;
  if (after)  q += ` after:${after.replace(/-/g, "/")}`;
  if (before) q += ` before:${before.replace(/-/g, "/")}`;

  const { messages = [] } = await fetch(
    `https://gmail.googleapis.com/...?q=${encodeURIComponent(q)}`,
    { headers: { Authorization: `Bearer ${tok}` } }
  ).then(r => r.json());

  let files = [];
  for (const { id } of messages)
    files = files.concat(await processMsg(id, sched, tok, env));
  return { found: messages.length, files };
}

// ... processMsg → Firebase Storage + Firestore
// ... getToken → OAuth refresh
// ... getFirebaseToken → Service Account JWT
// ... corsJson → JSON Response wrapped with CORS_HEADERS
//
// See full worker.js for complete source
Gmail Setup Progress
0 / 4 steps · In Progress
1
Enable Gmail API
2
OAuth Credentials
3
Get Refresh Token
4
Store in Worker
🔑
Step 1
Enable Gmail API
Enable the Gmail API in your Google Cloud Console project.
1
Go to Google Cloud Console
Open the APIs & Services dashboard for your project.
Open Google Cloud Console
2
Enable Gmail API
Search for "Gmail API" and click Enable.
3
Configure OAuth Consent Screen
Set app name to "ReportHub", add your Gmail address as a test user.
Edit Property
📄 PDF Analyzer
Click column headers to select — click cells to pick individual values
Selected fields appear here