GCS to BigQuery Manual Loader with SMTP


About This Architecture

Secure GCS-to-BigQuery pipeline with manual file uploads, identity-aware access control, and SMTP notifications orchestrated by Cloud Functions. Users authenticated via Cloud IAM upload CSV files to a restricted GCS bucket, triggering an Eventarc audit log event that invokes a 2nd Gen Cloud Function running modular Python logic for validation, loading, and notifications. The orchestrator coordinates validator.py (schema validation against config bucket), bq_loader.py (BigQuery insert with error routing to work buckets), and notifier.py (SMTP/OAuth2 alerts via SendGrid), while structured_logger.py streams all operations to Cloud Logging and Monitoring for compliance auditing. This architecture enforces least-privilege IAM, immutable audit trails, and decoupled notification delivery—ideal for regulated data ingestion workflows requiring traceability and manual control.
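The schema-validation step can be sketched as a small pure function. This is a minimal illustration only: the module name validator.py comes from the diagram, but the exact schema format and function signature below are assumptions; in the full pipeline the expected columns would be read from a JSON schema file in the config bucket.

```python
import csv
import io


def validate_csv_header(csv_text: str, expected_columns: list[str]) -> tuple[bool, list[str]]:
    """Check that an uploaded CSV's header matches the expected schema.

    Returns (is_valid, problems). In the pipeline described above,
    expected_columns would be loaded from the config bucket rather
    than passed in directly.
    """
    reader = csv.reader(io.StringIO(csv_text))
    try:
        header = next(reader)
    except StopIteration:
        return False, ["file is empty"]

    problems = []
    missing = [c for c in expected_columns if c not in header]
    extra = [c for c in header if c not in expected_columns]
    if missing:
        problems.append(f"missing columns: {missing}")
    if extra:
        problems.append(f"unexpected columns: {extra}")
    return not problems, problems


# Example: a file whose header matches the schema passes validation.
ok, issues = validate_csv_header("id,name\n1,alice\n", ["id", "name"])
# A file with a wrong column is rejected with a problem list.
bad, bad_issues = validate_csv_header("id,email\n1,a@b.com\n", ["id", "name"])
```

Rows that fail this check would then be routed to the error work bucket by the orchestrator, with the problem list included in the SMTP notification.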

People also ask

How do I build a secure, audited GCS-to-BigQuery pipeline with manual file uploads and email notifications on GCP?

This diagram shows a Cloud Functions-orchestrated ETL: Cloud IAM restricts uploads to authenticated @cmpc.com users, Eventarc triggers on GCS file creation, and a 2nd Gen Cloud Function runs modular Python (validator, loader, notifier, logger) to validate schemas, insert into BigQuery, route errors, and send SMTP alerts—all with Cloud Logging audit trails.
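The Eventarc trigger delivers a Cloud Audit Log event whose `protoPayload.resourceName` identifies the created object. A hedged sketch of the routing decision the function makes on receipt (the function name and the watched-bucket/CSV-only policy are illustrative assumptions, not part of the diagram):

```python
def should_process(event_data: dict, watched_bucket: str) -> tuple[bool, str]:
    """Decide whether a storage.objects.create audit-log event should
    trigger the load.

    event_data mirrors the audit-log payload Eventarc delivers, where
    protoPayload.resourceName looks like:
      'projects/_/buckets/<bucket>/objects/<object path>'
    Returns (process?, object_name).
    """
    resource = event_data.get("protoPayload", {}).get("resourceName", "")
    parts = resource.split("/")
    # Expected shape: projects/_/buckets/<bucket>/objects/<object path>
    if len(parts) < 6 or parts[2] != "buckets" or parts[4] != "objects":
        return False, ""
    bucket = parts[3]
    object_name = "/".join(parts[5:])
    # Policy (assumed here): only CSVs landing in the restricted bucket.
    if bucket != watched_bucket or not object_name.lower().endswith(".csv"):
        return False, object_name
    return True, object_name


# Example: a CSV created in the watched upload bucket is accepted.
event = {"protoPayload": {"resourceName": "projects/_/buckets/uploads/objects/data/file.csv"}}
process, name = should_process(event, "uploads")
```

In production this check would run inside the 2nd Gen Cloud Function entry point before handing off to validator.py, so non-matching events are acknowledged and dropped without side effects.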


Tags: GCP · intermediate · Cloud Functions · BigQuery · ETL · Cloud IAM · Eventarc
Domain: Data Engineering
Audience: GCP data engineers building secure, audited ETL pipelines with manual file ingestion

Created: March 27, 2026

Updated: March 27, 2026 at 4:44 PM

Type: architecture
