
Overview

Draft & Goal helps data teams automate reporting, synchronize data between systems, and build data pipelines without code.

Common data workflows

Automated reporting

Generate reports on a schedule:
  • Pull data from multiple sources
  • Transform and aggregate
  • Generate insights with AI
  • Deliver via email or Slack
Nodes used: Google Analytics, BigQuery, LLM, Google Slides, Email
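
For orientation, here is a rough plain-Python sketch of the same pattern; in Draft & Goal the nodes above replace all of this code, and the sessions table, SMTP host, and email addresses below are hypothetical placeholders.

```python
# Rough plain-Python sketch of the scheduled-report pattern.
# The sessions table, SMTP host, and addresses are hypothetical placeholders.
import sqlite3
import smtplib
from email.message import EmailMessage

def build_report(db_path: str) -> str:
    """Pull raw data and aggregate it into a short text summary."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT channel, COUNT(*) FROM sessions GROUP BY channel"
    ).fetchall()
    con.close()
    lines = [f"{channel}: {count} sessions" for channel, count in rows]
    return "Weekly traffic summary\n" + "\n".join(lines)

def deliver_report(body: str, recipient: str) -> None:
    """Deliver the summary by email (the Email node's role in the workflow)."""
    msg = EmailMessage()
    msg["Subject"] = "Weekly report"
    msg["From"] = "reports@example.com"
    msg["To"] = recipient
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    deliver_report(build_report("analytics.db"), "team@example.com")
```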

Data synchronization

Keep systems in sync:
  • Export from source system
  • Transform data format
  • Load into destination
  • Verify and log results
Nodes used: Google Sheets, Loop, BigQuery Writer
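
The same export, transform, load, verify sequence can be sketched in plain Python; a CSV export and a local SQLite table stand in for the source and destination systems, and both are hypothetical.

```python
# Sketch of the sync pattern: export, transform, load, then verify and log.
# The CSV file, contacts table, and column names are hypothetical placeholders.
import csv
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)

def sync(source_csv: str, db_path: str) -> None:
    # Export: read rows from the source system (a CSV export stands in here).
    with open(source_csv, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalize into the format the destination expects.
    records = [(r["id"], r["email"].strip().lower()) for r in rows]

    # Load: upsert into the destination table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS contacts (id TEXT PRIMARY KEY, email TEXT)")
    con.executemany("INSERT OR REPLACE INTO contacts VALUES (?, ?)", records)
    con.commit()

    # Verify and log: confirm the destination row count after the load.
    count = con.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
    logging.info("Synced %d rows; destination now holds %d", len(records), count)
    con.close()
```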

Data enrichment

Add value to raw data:
  • Fetch records from database
  • Enrich with external data
  • Apply AI analysis
  • Write back enriched data
Nodes used: BigQuery, Web Scraper, LLM, BigQuery Writer
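
A minimal sketch of the enrichment loop in plain Python follows; it assumes the requests package, and the companies table, lookup endpoint, and industry field are hypothetical stand-ins for whatever external source you enrich from.

```python
# Sketch of the enrichment loop: fetch records, enrich via an external API,
# write the result back. Table, endpoint, and field names are hypothetical.
import sqlite3
import requests

def enrich(db_path: str) -> None:
    con = sqlite3.connect(db_path)
    companies = con.execute(
        "SELECT id, domain FROM companies WHERE industry IS NULL"
    ).fetchall()

    for company_id, domain in companies:
        # Enrich with external data (the Web Scraper / API nodes play this role).
        resp = requests.get(f"https://api.example.com/lookup?domain={domain}", timeout=10)
        industry = resp.json().get("industry") if resp.ok else None

        # Write the enriched value back to the source table.
        con.execute("UPDATE companies SET industry = ? WHERE id = ?", (industry, company_id))

    con.commit()
    con.close()
```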

Monitoring and alerts

Track metrics and alert on anomalies:
  • Query metrics regularly
  • Compare to thresholds
  • Detect anomalies
  • Send alerts
Nodes used: Google Analytics, Conditional, LLM, Email Sender
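
The compare-to-threshold step can be sketched in plain Python as below; the 5% error-rate threshold, the requests table, and the alert webhook are hypothetical, and in the workflow the Email Sender node handles the alerting step.

```python
# Sketch of threshold-based monitoring: query a metric, compare, alert.
# The table, threshold, and webhook URL are hypothetical placeholders.
import sqlite3
import requests

THRESHOLD = 0.05  # alert if the daily error rate exceeds 5%

def check_error_rate(db_path: str) -> None:
    con = sqlite3.connect(db_path)
    errors, total = con.execute(
        "SELECT SUM(is_error), COUNT(*) FROM requests WHERE day = DATE('now')"
    ).fetchone()
    con.close()

    rate = (errors or 0) / total if total else 0.0
    if rate > THRESHOLD:
        # Send the alert (the Email Sender node plays this role in the workflow).
        requests.post(
            "https://hooks.example.com/alerts",
            json={"text": f"Error rate {rate:.1%} exceeded {THRESHOLD:.0%}"},
            timeout=10,
        )
```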

Available data nodes

Google Suite

Node                      Description
Google Sheets             Read/write spreadsheets
Google BigQuery Reader    Query the data warehouse
Google BigQuery Writer    Insert data into BigQuery
Google Analytics          Web analytics data
Google Slides             Create presentations

Transformation

Node                  Description
JSON Path Extractor   Extract from JSON
Merge                 Populate templates with data
Find & Replace        Transform text
Loop                  Process arrays

External APIs

Node            Description
API Connector   Pre-built API integrations

Example: Weekly KPI dashboard

Automatically generate a weekly KPI dashboard:

What it does

  1. Fetch data from Analytics, BigQuery, and HubSpot
  2. Merge into a unified dataset
  3. Analyze with AI to generate insights
  4. Create slides with charts and commentary
  5. Email to stakeholders automatically
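
Step 2 (merging sources into a unified dataset) is usually the fiddly part. As a rough illustration, in plain Python it amounts to joining per-source records on a shared key; the input dictionaries and metric names below are hypothetical stand-ins for the Analytics, BigQuery, and HubSpot exports.

```python
# Sketch of merging per-source metrics into one dataset keyed by date.
# Each input dict is a hypothetical export mapping an ISO date to a value.
from collections import defaultdict

def merge_sources(analytics: dict, warehouse: dict, crm: dict) -> list[dict]:
    merged = defaultdict(dict)
    for name, source in (("sessions", analytics), ("revenue", warehouse), ("deals", crm)):
        for day, value in source.items():
            merged[day][name] = value
    # One row per day, with missing metrics left as None for later validation.
    return [
        {"date": day, **{k: row.get(k) for k in ("sessions", "revenue", "deals")}}
        for day, row in sorted(merged.items())
    ]

rows = merge_sources({"2024-01-01": 1200}, {"2024-01-01": 5400.0}, {"2024-01-02": 3})
print(rows)
```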

Time saved

Task               Manual       Automated
Data collection    2 hours      0
Analysis           3 hours      5 minutes
Report creation    2 hours      0
Distribution       30 minutes   0
Total              7.5 hours    5 minutes

Data pipeline patterns

ETL (Extract, Transform, Load)

Source → Extract → Transform → Load → Destination
Draft & Goal handles all three stages.

Event-driven pipelines

Webhook → Process → Update → Notify
React to events in real time.
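
As a rough sketch of this flow outside the platform, a webhook receiver might look like the following; it assumes the Flask and requests packages, and the /webhook path, payload fields, and notification URL are hypothetical.

```python
# Minimal webhook receiver: receive an event, process it, notify downstream.
# Assumes the Flask and requests packages; fields and URLs are hypothetical.
import requests
from flask import Flask, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def handle_event():
    event = request.get_json(force=True)        # Webhook: receive the event
    order_total = float(event.get("total", 0))  # Process: extract what you need
    # Update / Notify: forward a summary downstream in real time.
    requests.post(
        "https://hooks.example.com/orders",
        json={"text": f"New order received: ${order_total:.2f}"},
        timeout=10,
    )
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```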

Scheduled batch processing

Schedule → Fetch batch → Process → Output
Run heavy operations during off-hours.
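
In Draft & Goal the workflow's schedule trigger handles this; a plain-Python equivalent using the third-party schedule package might look like the sketch below, where the 02:00 run time and the job body are hypothetical.

```python
# Sketch of a nightly batch run using the third-party `schedule` package.
# The 02:00 run time and the job body are hypothetical placeholders.
import time
import schedule

def process_batch() -> None:
    # Fetch batch -> process -> output would happen here.
    print("Running nightly batch...")

schedule.every().day.at("02:00").do(process_batch)

while True:
    schedule.run_pending()   # Run any jobs whose scheduled time has passed
    time.sleep(60)           # Check once a minute
```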

Best practices

Error handling

Always handle data pipeline failures:
  • Log all operations
  • Alert on failures
  • Implement retries
  • Have fallback paths
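
A sketch of those four practices wrapped around a single pipeline step follows; load_primary, load_fallback, and send_alert are hypothetical callables standing in for whatever your pipeline actually does.

```python
# Sketch of retry + fallback + alerting around one pipeline step.
# load_primary, load_fallback, and send_alert are hypothetical placeholders.
import logging
import time

logging.basicConfig(level=logging.INFO)

def run_with_retries(load_primary, load_fallback, send_alert, retries: int = 3):
    for attempt in range(1, retries + 1):
        try:
            result = load_primary()
            logging.info("Load succeeded on attempt %d", attempt)   # Log all operations
            return result
        except Exception:
            logging.exception("Load failed on attempt %d", attempt)
            time.sleep(2 ** attempt)                                # Implement retries
    try:
        return load_fallback()                                      # Fallback path
    except Exception:
        send_alert("Pipeline failed after retries and fallback")    # Alert on failure
        raise
```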

Data validation

Validate data quality:
  • Check for nulls
  • Verify data types
  • Validate ranges
  • Flag anomalies
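
A minimal row-level validation pass covering those four checks is sketched below; the field names, types, and bounds are hypothetical examples.

```python
# Sketch of a row-level validation pass: nulls, types, ranges, anomaly flags.
# The field names, expected types, and bounds are hypothetical examples.
def validate_row(row: dict) -> list[str]:
    issues = []
    # Check for nulls in required fields.
    for field in ("id", "revenue", "date"):
        if row.get(field) is None:
            issues.append(f"missing {field}")
    # Verify data types.
    if not isinstance(row.get("revenue"), (int, float)):
        issues.append("revenue is not numeric")
    # Validate ranges.
    elif not (0 <= row["revenue"] <= 1_000_000):
        issues.append("revenue out of expected range")
    # Flag anomalies (e.g. suspiciously large jumps vs. the trailing average).
    elif row["revenue"] > 10 * row.get("trailing_avg", float("inf")):
        issues.append("revenue anomaly vs trailing average")
    return issues

print(validate_row({"id": 1, "revenue": 250.0, "date": "2024-01-01", "trailing_avg": 200.0}))
```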

Incremental processing

For large datasets:
  • Track last processed timestamp
  • Only fetch new records
  • Maintain state between runs
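
One common way to do this is timestamp-based incremental fetching with a small state file between runs, sketched below; the state path and events table are hypothetical.

```python
# Sketch of incremental processing: remember the last processed timestamp
# and only fetch newer records next run. Paths and tables are hypothetical.
import json
import sqlite3
from pathlib import Path

STATE_FILE = Path("pipeline_state.json")

def load_last_timestamp() -> str:
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_ts"]
    return "1970-01-01T00:00:00"

def run_incremental(db_path: str) -> None:
    last_ts = load_last_timestamp()
    con = sqlite3.connect(db_path)
    # Only fetch records newer than the last processed timestamp.
    rows = con.execute(
        "SELECT id, created_at FROM events WHERE created_at > ? ORDER BY created_at",
        (last_ts,),
    ).fetchall()
    con.close()
    if rows:
        # ... process the new rows here ...
        newest = rows[-1][1]
        STATE_FILE.write_text(json.dumps({"last_ts": newest}))  # Maintain state between runs
```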

Getting started

1. Connect your data sources: set up integrations for Google, databases, and APIs.
2. Define your output: decide where results should go (Sheets, BigQuery, etc.).
3. Build the pipeline: connect nodes to transform data.
4. Schedule execution: set up recurring runs.

Next steps