I am overhauling our analytics stack so every report and dashboard refreshes automatically, without anyone pressing "run" or "export." The goal is a fully automated, real-time pipeline that ingests raw data, validates its quality, and surfaces clean metrics directly to our BI layer.

Real-time processing is essential, and Google Cloud is our chosen environment. Expect to work primarily with Pub/Sub for event ingestion and Dataflow (or an equivalent managed service you recommend) for streaming transforms; a minimal sketch of the intended pipeline shape appears at the end of this brief. All configuration (service accounts, IAM roles, network rules) needs to be production-ready and documented.

Beyond moving bits from A to B, I need genuine process optimisation. Historical bottlenecks must be surfaced through analysis, then eliminated through automation, so analysts spend their time on insight instead of maintenance. Wherever data quality can break, scripts or declarative checks should be in place to stop bad records cold before they reach downstream models; a sketch of such a gate also follows below.

Deliverables
• Streaming pipeline on Google Cloud that updates analytics assets in near real time
• Automated data-quality gate: completeness, type conformity, and out-of-range detection
• Deployment artefacts (Terraform, Deployment Manager, or Cloud Build YAML) and clear runbooks
• Short report highlighting previous inefficiencies and how the new pipeline resolves them

A merge request that passes review and a live demo in our staging project will serve as the final acceptance criteria.
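
To make the Pub/Sub-to-Dataflow expectation concrete, here is a minimal Apache Beam streaming sketch of the kind of pipeline I have in mind: read events from a Pub/Sub topic, parse them, and append them to a BigQuery table. The project ID, topic name, table name, and schema are placeholders for illustration, not real resources; the actual schema and sinks are to be agreed during the engagement.

```python
# Minimal sketch: Pub/Sub -> parse -> BigQuery, run as a streaming job.
# "my-project", the topic, the table, and the schema are all placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(raw: bytes) -> dict:
    """Decode a Pub/Sub message payload (assumed UTF-8 JSON) into a dict."""
    return json.loads(raw.decode("utf-8"))


def run() -> None:
    # Runner, region, and service-account flags come from the CLI when
    # submitting to Dataflow; streaming=True keeps the job long-running.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/raw-events"
            )
            | "Parse" >> beam.Map(parse_event)
            | "WriteMetrics" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,value:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Submitted with the Dataflow runner, this shape gives the near-real-time refresh described above: BigQuery receives rows continuously, so any BI layer reading from it stays current without manual exports.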
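
And here is one possible shape for the data-quality gate, expressed as a Beam DoFn with tagged outputs: records that fail the completeness, type-conformity, or range checks are routed to a dead-letter output instead of the main stream. The required fields, the numeric bounds, and the output tag are illustrative assumptions, not a fixed specification.

```python
# Sketch of a data-quality gate: clean records continue downstream,
# failing records are tagged and diverted to a dead-letter output.
# REQUIRED_FIELDS and VALUE_RANGE are assumed values for illustration.
import apache_beam as beam
from apache_beam import pvalue

REQUIRED_FIELDS = ("event_id", "ts", "value")
VALUE_RANGE = (0.0, 10_000.0)  # assumed valid bounds for the metric


class ValidateEvent(beam.DoFn):
    BAD = "bad_records"  # tag for the dead-letter output

    def process(self, event: dict):
        # Completeness: every required field must be present and non-null.
        if any(event.get(f) is None for f in REQUIRED_FIELDS):
            yield pvalue.TaggedOutput(self.BAD, {"reason": "missing_field", "record": event})
            return
        # Type conformity: value must be numeric (bool excluded deliberately,
        # since bool is a subclass of int in Python).
        if isinstance(event["value"], bool) or not isinstance(event["value"], (int, float)):
            yield pvalue.TaggedOutput(self.BAD, {"reason": "bad_type", "record": event})
            return
        # Range: reject values outside the agreed bounds.
        lo, hi = VALUE_RANGE
        if not (lo <= event["value"] <= hi):
            yield pvalue.TaggedOutput(self.BAD, {"reason": "out_of_range", "record": event})
            return
        yield event  # clean record continues to the main output


# Wiring inside the pipeline, where `parsed` is a PCollection of dicts:
#   results = parsed | beam.ParDo(ValidateEvent()).with_outputs(
#       ValidateEvent.BAD, main="clean")
#   results.clean       -> BigQuery sink
#   results.bad_records -> dead-letter sink (e.g. a quarantine topic or table)
```

The dead-letter pattern matters here: bad records are stopped cold before they reach downstream models, but they are preserved for inspection, which feeds directly into the inefficiency report requested in the deliverables.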