Production Setup Requirements

PostgreSQL Database Required

As of PR #128, the operations framework requires PostgreSQL for tracking operations and actions.

Why PostgreSQL?

The operations framework was migrated from BigQuery to PostgreSQL for:
  • Real-time updates: PostgreSQL enables live operation status tracking
  • Electric SQL integration: Supports real-time sync to client applications
  • Better performance: Row-level reads and writes instead of BigQuery's batch-oriented queries
  • Cost efficiency: No per-query costs for operation tracking

Components Requiring PostgreSQL

  1. Operations Framework (src/services/operations-service.ts)
    • Tracks user intent and workflow orchestration
    • Manages operation lifecycle and status
  2. Actions Framework (src/services/actions-service.ts)
    • Tracks external system API calls
    • Handles action status and retries
  3. Webhook Receiver (src/services/webhook-receiver-service.ts)
    • Processes inbound webhooks from sales agents
    • Updates action status based on external system responses
  4. ADCP Orchestrator (src/services/adcp-orchestrator.ts)
    • Orchestrates ADCP protocol interactions
    • Creates operations and actions for creative sync
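
All four components persist state through the same PostgreSQL tables. As a rough illustration of the pattern (a sketch, not the actual service code; the operations columns shown here are hypothetical), creating an operation record with node-postgres might look like:

import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Hypothetical columns; the real schema is defined in scripts/create-postgres-tables.sql.
async function createOperation(userId: string, intent: string): Promise<string> {
  const result = await pool.query(
    `INSERT INTO operations (user_id, intent, status, created_at)
     VALUES ($1, $2, 'pending', NOW())
     RETURNING id`,
    [userId, intent],
  );
  return result.rows[0].id; // actions created later reference this operation id
}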

Setting Up PostgreSQL for Cloud Run

Option 1: Cloud SQL for PostgreSQL

Advantages:
  • Fully managed by Google Cloud
  • Automatic backups and high availability
  • Integrates seamlessly with Cloud Run via Unix sockets
  • Scales automatically
Setup Steps:
  1. Create Cloud SQL instance:
gcloud sql instances create activation-api-db \
  --database-version=POSTGRES_15 \
  --tier=db-f1-micro \
  --region=us-central1 \
  --root-password=<secure-password>
  2. Create database:
gcloud sql databases create activation_api \
  --instance=activation-api-db
  3. Run migrations:
# Connect to the instance (opens a psql session)
gcloud sql connect activation-api-db --user=postgres

# Inside psql, apply the schema:
\i scripts/create-postgres-tables.sql
  4. Update Cloud Run service:
gcloud run services update activation-api \
  --region=us-central1 \
  --add-cloudsql-instances=bok-playground:us-central1:activation-api-db \
  --set-env-vars="DATABASE_URL=postgresql://postgres:<password>@/activation_api?host=/cloudsql/bok-playground:us-central1:activation-api-db"
Cost: ~$7-15/month for db-f1-micro (shared-core)
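
The DATABASE_URL in step 4 connects over a Unix socket that Cloud Run mounts when --add-cloudsql-instances is set. A minimal sketch of the equivalent explicit node-postgres configuration (assuming the service uses the pg library, which the pool settings documented below suggest):

import { Pool } from "pg";

// Explicit equivalent of the DATABASE_URL above. node-postgres treats a
// host beginning with "/" as a Unix socket directory, so no port is needed.
const pool = new Pool({
  host: "/cloudsql/bok-playground:us-central1:activation-api-db",
  user: "postgres",
  password: process.env.POSTGRES_PASSWORD,
  database: "activation_api",
});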

Option 2: Neon Serverless PostgreSQL

Advantages:
  • Serverless: scales to zero when not in use
  • Free tier available (0.5 GB storage, 100 compute hours/month)
  • Simple connection string setup
  • No infrastructure management
Setup Steps:
  1. Create Neon project at https://neon.tech
  2. Get connection string from dashboard
  3. Run migrations via Neon SQL Editor
  4. Update Cloud Run:
gcloud run services update activation-api \
  --region=us-central1 \
  --set-env-vars="DATABASE_URL=postgresql://user:password@ep-xxx.us-east-2.aws.neon.tech/activation_api?sslmode=require"
Cost: Free tier available, paid plans start at $19/month
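
Because Neon requires TLS (sslmode=require), it is worth verifying connectivity once the variable is set. A minimal check, assuming node-postgres, which honors sslmode in the connection string:

import { Pool } from "pg";

// DATABASE_URL should be the Neon string from step 4; node-postgres
// picks up sslmode=require and negotiates TLS automatically.
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function check(): Promise<void> {
  const { rows } = await pool.query("SELECT version()");
  console.log(rows[0].version);
  await pool.end();
}

check().catch(console.error);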

Option 3: Supabase PostgreSQL

Advantages:
  • Free tier with 500 MB database
  • Built-in API and real-time features
  • Good for development and small production workloads
Setup Steps:
  1. Create Supabase project at https://supabase.com
  2. Get connection string from project settings
  3. Run migrations via Supabase SQL Editor
  4. Update Cloud Run:
gcloud run services update activation-api \
  --region=us-central1 \
  --set-env-vars="DATABASE_URL=postgresql://postgres:password@db.xxx.supabase.co:5432/postgres"
Cost: Free tier available, paid plans start at $25/month

Database Schema Setup

Run the PostgreSQL schema migrations:
# If using local psql client
psql $DATABASE_URL < scripts/create-postgres-tables.sql

# Or via Cloud SQL proxy
cloud_sql_proxy -instances=bok-playground:us-central1:activation-api-db=tcp:5432 &
psql "host=127.0.0.1 port=5432 dbname=activation_api user=postgres" < scripts/create-postgres-tables.sql
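
If a psql client is not available, the same file can be applied with a short Node script. This sketch assumes the pg package is installed and DATABASE_URL is set:

import { readFileSync } from "node:fs";
import { Client } from "pg";

// Alternative to psql: apply the schema file with node-postgres.
async function migrate(): Promise<void> {
  const sql = readFileSync("scripts/create-postgres-tables.sql", "utf8");
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  try {
    await client.query(sql); // without parameters, pg runs multi-statement strings in one call
  } finally {
    await client.end();
  }
}

migrate().catch((err) => {
  console.error("migration failed:", err);
  process.exit(1);
});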

Environment Variables

Required:
  • DATABASE_URL: Full PostgreSQL connection string
Or set individual variables:
  • POSTGRES_HOST: Database host
  • POSTGRES_PORT: Database port (default: 5432)
  • POSTGRES_USER: Database user
  • POSTGRES_PASSWORD: Database password
  • POSTGRES_DATABASE: Database name (default: activation_api)
Optional:
  • POSTGRES_POOL_MAX: Maximum pool size (default: 20)
  • POSTGRES_IDLE_TIMEOUT: Idle connection timeout in ms (default: 30000)
  • POSTGRES_CONNECT_TIMEOUT: Connection timeout in ms (default: 10000)
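
How the service maps these variables internally is its own concern, but a plausible sketch (an assumption, not the actual src/services code) with node-postgres looks like this; max, idleTimeoutMillis, and connectionTimeoutMillis are standard pg pool options:

import { Pool, PoolConfig } from "pg";

// Prefer DATABASE_URL; fall back to the individual POSTGRES_* variables.
const config: PoolConfig = process.env.DATABASE_URL
  ? { connectionString: process.env.DATABASE_URL }
  : {
      host: process.env.POSTGRES_HOST,
      port: Number(process.env.POSTGRES_PORT ?? 5432),
      user: process.env.POSTGRES_USER,
      password: process.env.POSTGRES_PASSWORD,
      database: process.env.POSTGRES_DATABASE ?? "activation_api",
    };

config.max = Number(process.env.POSTGRES_POOL_MAX ?? 20);
config.idleTimeoutMillis = Number(process.env.POSTGRES_IDLE_TIMEOUT ?? 30000);
config.connectionTimeoutMillis = Number(process.env.POSTGRES_CONNECT_TIMEOUT ?? 10000);

export const pool = new Pool(config);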

Testing PostgreSQL Connection

Test the setup with the health endpoint:
curl https://activation-api-66ca3rk35a-uc.a.run.app/health
Should return:
{
  "service": "official-mcp-sdk-server",
  "status": "ok",
  "timestamp": "2025-10-04T20:00:00.000Z"
}
Test webhooks:
curl -X POST https://activation-api-66ca3rk35a-uc.a.run.app/webhooks/creative-status/1/test_creative_123 \
  -H "Content-Type: application/json" \
  -d '{"status":"approved","creative_id":"wonderstruck_123","reviewed_at":"2025-10-04T20:00:00Z"}'
Should return success or a specific application error (not a PostgreSQL connection error).

Troubleshooting

Error: connect ECONNREFUSED 127.0.0.1:5432
  • PostgreSQL is not configured
  • Set DATABASE_URL or the POSTGRES_* environment variables
Error: password authentication failed
  • Incorrect credentials in DATABASE_URL
  • Check password and username
Error: database "activation_api" does not exist
  • Database not created
  • Run: CREATE DATABASE activation_api;
Error: relation "operations" does not exist
  • Schema migrations not run
  • Execute scripts/create-postgres-tables.sql

Migration from BigQuery

If you need to migrate existing operations data from BigQuery to PostgreSQL:
  1. Export BigQuery tables to CSV
  2. Import into PostgreSQL using COPY command
  3. Update any references to old BigQuery tables
See docs/BIGQUERY_TO_POSTGRES_MIGRATION.md for detailed steps.
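
As an alternative to the CSV route, a one-off script can stream rows directly from BigQuery into PostgreSQL. In this sketch the dataset, table, and column names are placeholders; defer to the migration doc above for the authoritative steps:

import { BigQuery } from "@google-cloud/bigquery";
import { Pool } from "pg";

const bigquery = new BigQuery();
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Dataset, table, and column names are placeholders -- adjust to the real schema.
async function migrateOperations(): Promise<void> {
  const [rows] = await bigquery.query(
    "SELECT id, status, created_at FROM `my_dataset.operations`",
  );
  for (const row of rows) {
    await pool.query(
      `INSERT INTO operations (id, status, created_at)
       VALUES ($1, $2, $3)
       ON CONFLICT (id) DO NOTHING`, // assumes id is the primary key
      [row.id, row.status, row.created_at.value], // BigQuery TIMESTAMPs expose .value as an ISO string
    );
  }
  await pool.end();
}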

Other Production Requirements

Required Environment Variables

  • NODE_ENV=production
  • PORT=8080 (set automatically by Cloud Run)
  • WORKOS_AUTHKIT_DOMAIN=https://identity.scope3.com
  • SCOPE3_API_KEY=<api-key> (for GraphQL backend)
  • GEMINI_API_KEY (stored in Secret Manager: gemini-api-key)
  • DATABASE_URL (PostgreSQL connection string)

Secret Manager

Secrets are injected via Cloud Run secrets configuration:
gcloud run services update activation-api \
  --set-secrets="GEMINI_API_KEY=gemini-api-key:latest"

Service Account

Uses default compute service account:
  • <project-number>-compute@developer.gserviceaccount.com
Required permissions:
  • BigQuery Data Viewer (for campaigns/creatives data)
  • Cloud SQL Client (if using Cloud SQL)
  • Secret Manager Secret Accessor