I’m running a Laravel-based property platform that already embeds about 50,000 images with CLIP/SigLIP and stores them in PostgreSQL through pgvector. Both prompt→image and image→image search are live, yet the results are loose: typing “swimming pool” still pulls in shots of the open sea. I need someone who has actually deployed vector search in production to step in and tighten the results.

Where I think we can get immediate wins:

• Re-examining our pgvector indexing strategy (IVFFlat vs HNSW, list counts, probe/ef_search settings) and rebuilding indexes where needed (illustrative sketch below).
• Profiling and rewriting the current SQL so distance ordering is mathematically correct and not masked by Laravel’s query builder quirks.
• Verifying the way we normalise, quantise, or otherwise pre-process CLIP/SigLIP embeddings before they ever hit the database (see the normalisation sketch below).
• Stress-testing the Python side that creates the vectors to be sure batch inference, precision, and model choice aren’t sabotaging recall (see the batching sketch below).

Acceptance criteria:

– Querying “swimming pool” returns pool images in the top 10 with no coastal scenes.
– End-to-end latency (Laravel → Postgres → JSON) stays under 300 ms across the 50k rows.
– The query plan shows the intended ANN index being used, not a sequential scan.
– All changes and parameters are reproducible via a short README plus migration/seed scripts.

You’ll get SSH access to a staging clone, the current Laravel repo, and the Python embedding script. Once accuracy is demonstrably better and the query path is lean, I’ll port the changes to production.
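For concreteness, here is the shape of change I expect on the indexing and query-plan side. This is a minimal sketch, not our schema: it assumes pgvector 0.5+ (for HNSW support), psycopg 3 with the pgvector adapter, and placeholder names (an "images" table with a 512-dimensional "embedding" column); swap in whatever staging actually uses.

```python
# Minimal sketch: rebuild the ANN index as HNSW and prove the planner uses it.
# Assumptions: pgvector >= 0.5, psycopg 3, pgvector-python adapter installed;
# "images" / "embedding" / 512 dims and the DSN are placeholders.
import numpy as np
import psycopg
from pgvector.psycopg import register_vector

conn = psycopg.connect("dbname=staging")  # placeholder DSN
register_vector(conn)  # lets psycopg send numpy arrays as pgvector values

# m / ef_construction below are common starting points, not tuned values.
conn.execute("DROP INDEX IF EXISTS images_embedding_idx")
conn.execute(
    "CREATE INDEX images_embedding_idx ON images "
    "USING hnsw (embedding vector_cosine_ops) "
    "WITH (m = 16, ef_construction = 64)"
)
conn.commit()

# Recall/latency knob: higher ef_search = better recall, slower queries.
conn.execute("SET hnsw.ef_search = 60")

query_vec = np.random.rand(512).astype(np.float32)  # stand-in for a real CLIP vector
query_vec /= np.linalg.norm(query_vec)

# ORDER BY must apply the distance operator directly to the column, or the
# planner cannot use the ANN index. <=> is cosine distance.
rows = conn.execute(
    "SELECT id FROM images ORDER BY embedding <=> %s LIMIT 10",
    (query_vec,),
).fetchall()

# Acceptance-criteria check: the plan must mention the index, not a Seq Scan.
vec_literal = "[" + ",".join(f"{x:.6f}" for x in query_vec) + "]"
plan = conn.execute(
    "EXPLAIN SELECT id FROM images "
    f"ORDER BY embedding <=> '{vec_literal}'::vector LIMIT 10"
).fetchall()
assert any("images_embedding_idx" in line[0] for line in plan), "ANN index not used"
```

If IVFFlat turns out to fit better, the equivalent knobs are "lists" at build time and "SET ivfflat.probes = N" per session; same recall-versus-latency trade-off, different mechanism.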
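On the pre-processing bullet: a classic failure mode is storing un-normalised vectors, because inner-product and L2 orderings then diverge from cosine and relevance degrades in exactly this “sea beats pool” way. A short numpy sketch (synthetic data) of the invariants I’d want enforced before any row is inserted:

```python
# Minimal sketch: L2-normalise every embedding once, before it hits Postgres.
import numpy as np

def l2_normalize(batch: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Scale each row to unit length; eps guards against zero vectors."""
    norms = np.linalg.norm(batch, axis=1, keepdims=True)
    return batch / np.maximum(norms, eps)

embeddings = np.random.rand(4, 512).astype(np.float32)  # stand-in batch
unit = l2_normalize(embeddings)

# Invariant 1: every stored vector has norm ~1.
assert np.allclose(np.linalg.norm(unit, axis=1), 1.0, atol=1e-5)

# Invariant 2: on unit vectors, squared L2 distance equals exactly twice the
# cosine distance, so pgvector's <->, <#>, and <=> all rank identically.
a, b = unit[0], unit[1]
cosine_dist = 1.0 - float(a @ b)
l2_sq = float(np.sum((a - b) ** 2))
assert np.isclose(l2_sq, 2.0 * cosine_dist, atol=1e-5)
```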
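And on the Python side, batching and precision: a sketch of how I’d expect image encoding to look. The model name and checkpoint here are placeholders, not necessarily what we run (use whatever the existing embedding script loads); the point is batched inference, fp32 features, and normalisation at encode time.

```python
# Minimal sketch: batched image embedding with open_clip. "ViT-B-32" /
# "laion2b_s34b_b79k" (512-dim) are placeholder choices, not our model.
import open_clip
import torch
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
model.eval()

def embed_images(paths: list[str], batch_size: int = 32) -> torch.Tensor:
    """Encode images in batches; returns unit-length fp32 embeddings."""
    chunks = []
    with torch.no_grad():
        for i in range(0, len(paths), batch_size):
            batch = torch.stack(
                [preprocess(Image.open(p).convert("RGB"))
                 for p in paths[i:i + batch_size]]
            )
            feats = model.encode_image(batch).float()  # keep fp32 end to end
            chunks.append(feats / feats.norm(dim=-1, keepdim=True))
    return torch.cat(chunks)
```

If fp16 or int8 quantisation is on the table for speed, I’d want a before/after recall comparison on a labelled sample (pools vs coastline) rather than taking it on faith.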