I’m building a new platform and need the backend foundation in place before I tackle any front-end work. The core requirement is an n8n workflow that can watch a storage location, pick up incoming Excel or CSV files, parse them, isolate the lab results, and push the cleaned data into a Supabase Postgres database, all without manual intervention.

The workflow must run unattended because uploads will be fully automated. I’m expecting somewhere between 100 and 1,000 user files in production, so solid error handling and retry logic are essential. Once each file is processed, the system should log successes or failures and prevent duplicate entries on re-uploads.

Key deliverables
• n8n workflow blueprint (JSON) that:
  – Detects a new spreadsheet in the chosen storage bucket or directory
  – Reads Excel/CSV and extracts only the lab result fields (test name, value, units, reference range, timestamp, patient identifier, etc.)
  – Transforms and validates the data, then upserts it into a Supabase table
• Supabase table schema, plus any row-level security policies or role permissions required for secure inserts
• Brief setup and deployment guide so I can import the workflow, connect credentials, and go live on my own instance

Please rely on native n8n nodes where possible, though custom JavaScript functions are fine for parsing quirks. If you’ve tackled n8n–Supabase pipelines before, I’d love to see an example of your work in action.
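
To make the extraction and deduplication requirements more concrete, here is a rough sketch of the kind of Code-node transform I have in mind after a native spreadsheet-reading node. The column names (patient_id, test_name, value, units, ref_range, collected_at) and the dedupe_key idea are placeholders, not a spec, so feel free to adapt or replace them:

```javascript
// Sketch only: assumes a prior native node has already parsed the Excel/CSV
// into items, and that column names below match the incoming files
// (they are placeholders -- the real files may differ).
const out = [];

for (const item of $input.all()) {
  const row = item.json;

  // Skip rows that are clearly not lab results (blank lines, header junk, etc.)
  if (!row.test_name || row.value === undefined || row.value === '') continue;

  const value = Number(row.value);
  if (Number.isNaN(value)) continue; // basic validation; could route to an error branch instead

  const record = {
    patient_id: String(row.patient_id).trim(),
    test_name: String(row.test_name).trim(),
    value,
    units: row.units ? String(row.units).trim() : null,
    reference_range: row.ref_range ? String(row.ref_range).trim() : null,
    collected_at: row.collected_at ? new Date(row.collected_at).toISOString() : null,
  };

  // Deterministic key so re-uploads of the same file upsert instead of duplicating.
  record.dedupe_key = [record.patient_id, record.test_name, record.collected_at].join('|');

  out.push({ json: record });
}

return out;
```

The rough idea is that the Supabase table would carry a unique constraint on dedupe_key (or on the patient/test/timestamp combination) so the insert step can upsert on conflict rather than create duplicates on re-upload; if you have a cleaner way to handle this, I’m very open to it.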