I am ready to move several labour-intensive data-processing and back-office routines over to an LLM-powered, agentic architecture. The end-goal is straight-through automation: a set of production-grade workflows that ingest raw data, reason over it with retrieval-augmented generation (RAG), trigger decisions or downstream calls, and surface full telemetry through an observability stack.

Here is the landscape you’ll step into:

• Python remains the core language, and I need somebody with at least five years of hard-won backend experience shipping governed AI systems inside large organisations.
• Preferred toolchain is LangChain (or equivalent orchestration), MCP (Model Context Protocol) for tool and context integration, FastAPI for the service layer, and a GCP deployment footprint. Comparable expertise on AWS is fine, but the primary build will live on Google Cloud.
• Security, compliance, and auditability are non-negotiable; think healthcare- or finance-grade governance.
• Bonus points if you can weave in multi-agent collaboration patterns, vector search (e.g., FAISS, Pinecone, or Vertex AI Matching Engine), and policy enforcement.

Deliverables

1. Technical design: workflow diagrams, component spec, and governance plan.
2. First production slice on GCP (Cloud Functions / Cloud Run, Vertex AI, Pub/Sub, etc.) exposed through FastAPI.
3. Observability pack: metrics, structured logs, and alerting wired into Cloud Monitoring.
4. Deployment scripts or Terraform so the entire stack can be reproduced in another project.
5. Hand-off session with runbook and performance benchmarks.

Acceptance criteria

• Data-processing and routine admin tasks complete automatically with a manual fallback rate below 2 %.
• Latency under 1 s for 90 % of happy-path calls.
• Full traceability: prompts, model versions, and outputs stored for a minimum of 30 days.
• One-click redeploy passes CI/CD checks with unit-test coverage ≥ 85 %.
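To make the traceability criterion concrete, here is a minimal sketch of the kind of structured audit logging I have in mind. Field names and the example model identifier are illustrative only; the real schema would be agreed as part of the technical-design deliverable, and the JSON lines would flow into Cloud Logging with a 30-day retention policy.

```python
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("llm_audit")
logging.basicConfig(level=logging.INFO)

def log_llm_call(prompt: str, model_version: str, output: str) -> dict:
    """Emit one structured audit record per LLM call.

    Field names are illustrative; the production schema would be
    defined during the technical-design phase.
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "prompt": prompt,
        "output": output,
    }
    # JSON-structured logs are what Cloud Logging parses into
    # queryable fields for the 30-day traceability requirement.
    logger.info(json.dumps(record))
    return record

entry = log_llm_call("Summarise invoice", "example-model-v1", "Total: $420")
```

Every call leaves one self-describing line, so replaying an audit trail is a log query rather than a forensic exercise.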
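And to sketch the vector-search retrieval step: the toy embeddings below are invented for illustration, and brute-force cosine similarity stands in for a real index (FAISS, Pinecone, or Vertex AI Matching Engine) that would back the production RAG pipeline.

```python
import numpy as np

def top_k(query: np.ndarray, corpus: np.ndarray, k: int = 2) -> list:
    """Return indices of the k corpus rows most similar to the query.

    Brute-force cosine similarity here is a placeholder for a real
    vector index (FAISS, Pinecone, or Vertex AI Matching Engine).
    """
    corpus_norm = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    scores = corpus_norm @ query_norm          # cosine similarity per row
    return [int(i) for i in np.argsort(-scores)[:k]]

# Toy 3-dimensional "embeddings", purely for illustration.
docs = np.array([[1.0, 0.0, 0.0],
                 [0.9, 0.1, 0.0],
                 [0.0, 1.0, 0.0]])
hits = top_k(np.array([1.0, 0.0, 0.0]), docs)  # nearest documents first
```

In the real build, the retrieved documents would be fed into the LLM prompt as grounding context, with the whole exchange captured by the audit trail described above.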
If you thrive on owning design-to-deployment, love squeezing real value out of LLMs, and care about rock-solid, compliant engineering, let’s talk.