Contract GCP Data Engineer

Client: AI | Published: 20.01.2026

I’m opening a 6-month (and likely longer) contract for a seasoned GCP data engineer who can work remotely from 1:30 PM to 9:00 PM IST. The immediate need is deep, hands-on mastery of DBT Cloud; every daily workflow revolves around building, testing, and deploying models in that environment.

Your core mandate will be end-to-end data transformation and quality control. You’ll orchestrate ELT pipelines that turn raw events into clean, analytics-ready datasets, then surface them in BigQuery for downstream teams. While DBT Cloud is the heart of the stack, familiarity with BigQuery itself, Cloud Functions, and BigQuery Sharing (Analytics Hub) will let you optimise performance and manage secure data exchange.

Day-to-day collaboration happens in Jira, Confluence, and GitLab, so comfort with agile delivery and CI/CD is expected. Python and advanced SQL are non-negotiable; an eye for AI/ML workflows is a bonus that helps us future-proof what we build.

Deliverables I’ll use to gauge success (illustrative sketches follow below):

• Robust DBT Cloud project structure with version-controlled models and tests
• Automated transformation jobs that meet agreed data-quality thresholds
• Documented lineage and run books in Confluence
• Performance-tuned BigQuery tables accessible to analysts and BI tools

You should bring at least four years of focused data-engineering experience on GCP. If the timing window works for you and you thrive on shipping reliable, analytics-grade data, let’s talk.
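For a concrete sense of the day-to-day modelling work, here is a minimal sketch of the kind of DBT Cloud incremental model this role would build and deploy against BigQuery. Every name in it (the raw.events source, stg_events, and its columns) is a hypothetical placeholder, not a detail of the actual project.

    -- models/staging/stg_events.sql (hypothetical model; all names are placeholders)
    -- Incremental model: turns raw events into a clean, deduplicated,
    -- analytics-ready table that downstream marts can build on.
    {{
      config(
        materialized='incremental',
        unique_key='event_id'
      )
    }}

    select
        event_id,
        user_id,
        lower(trim(event_type))            as event_type,
        timestamp_micros(event_ts_micros)  as event_ts,
        payload
    from {{ source('raw', 'events') }}
    where event_id is not null
    {% if is_incremental() %}
      -- on scheduled runs, only pick up events newer than the target table
      and timestamp_micros(event_ts_micros) > (select max(event_ts) from {{ this }})
    {% endif %}

With unique_key set, DBT merges re-delivered events instead of duplicating them, which is one common way to keep a model like this idempotent across scheduled runs.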
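The data-quality thresholds in the deliverables could be enforced with DBT tests. As one sketch, a singular test is just a SQL file under tests/ that selects rows violating a rule; the job fails if any rows come back. The rule and names below are again assumptions for illustration.

    -- tests/assert_no_future_event_timestamps.sql (hypothetical test)
    -- Returns any cleaned event stamped in the future, which would point
    -- at a broken upstream producer; a non-empty result fails the run.
    select
        event_id,
        event_ts
    from {{ ref('stg_events') }}
    where event_ts > current_timestamp()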
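On the performance-tuned BigQuery tables deliverable, partitioning and clustering are the usual levers, and the BigQuery adapter for DBT lets you declare both in the model config rather than in hand-written DDL. A sketch under the same hypothetical schema:

    -- models/marts/fct_daily_events.sql (hypothetical mart)
    -- Partitioning by date and clustering on common filter columns lets
    -- BigQuery prune scanned bytes for analyst and BI-tool queries.
    {{
      config(
        materialized='table',
        partition_by={'field': 'event_date', 'data_type': 'date'},
        cluster_by=['user_id', 'event_type']
      )
    }}

    select
        date(event_ts) as event_date,
        user_id,
        event_type,
        count(*)       as event_count
    from {{ ref('stg_events') }}
    group by 1, 2, 3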