I need a reliable way to pull data from Facebook Marketplace seller pages at scale. The target platform is Facebook only; other marketplaces such as eBay, Amazon or Etsy are out of scope for this job.

Workflow: I paste one or more seller profile URLs into your script or small desktop app, and it crawls every public listing on those pages and exports the results to CSV or Google Sheets.

Fields I need per listing:
• item title
• price
• description
• photos (image URLs are fine)
• posting date
• item location
• the seller's profile link, so I can trace each record back to its source
If you can collect additional fields that Facebook exposes, even better; just keep everything clearly labelled.

Stack: no hard requirement. Python with BeautifulSoup/Selenium, Node with Puppeteer, Playwright, or a headless-browser solution all work for me, as long as it runs on Windows or a small Linux VPS and doesn't violate Facebook's ToS. Please build in reasonable throttling, login handling (cookie-based or mobile API, whichever is more stable) and a simple config file where I can tune delay settings or add new accounts.

Deliverables
• Scraper script or executable with setup instructions
• One example CSV pulled from three test seller URLs I'll provide
• Brief README explaining how to change credentials, schedules and the output path

Acceptance: I'll test by running the tool on my machine; if it captures every listing and the fields above without triggering blocks, we're done.