I have a list of 10 websites that showcase different tools, and I need every relevant detail on each product pulled into a clean, structured dataset. Expect the typical fields: name, description, any publicly visible pricing, user ratings or reviews where available, and the product page URL so I can click straight through for future checks.

You can work in Python with libraries such as Scrapy, BeautifulSoup, or Selenium; feel free to recommend the stack you're most comfortable with, as long as the final result is accurate and repeatable. I'd like the script delivered alongside the data so I can rerun it later if anything changes on the sites.

Deliverables
• An executable scraper (clearly commented)
• One consolidated data file in CSV, JSON, or Excel; let me know which of these is easiest given your pipeline
• A short README explaining how to run the scraper and any environment requirements

Accuracy is key: no missing fields, no duplicated rows, and all text properly encoded. If anti-bot measures pop up on any site, just flag it; we can decide together whether to add a delay, rotate proxies, or skip that site.

Looking forward to seeing how you'd tackle this.
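P.S. For reference, here is roughly the shape of pipeline I have in mind: a minimal requests + BeautifulSoup sketch. Everything site-specific in it is a placeholder: the example URLs, the CSS selectors, the User-Agent string, and the one-second delay are my assumptions, not requirements, and the real selectors will differ on each of the 10 sites.

```python
"""Minimal scraper sketch; a starting shape, not a final implementation.

All URLs and CSS selectors below are placeholders to be filled in
after inspecting each of the actual sites.
"""

import csv
import time

import requests
from bs4 import BeautifulSoup

# Placeholder list; swap in the 10 real product-page URLs.
URLS = [
    "https://example.com/tools/alpha",
    "https://example.com/tools/beta",
]

HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; product-research-bot)"}


def extract(url: str, html: str) -> dict:
    """Pull the fields we care about; selectors are per-site guesses."""
    soup = BeautifulSoup(html, "html.parser")

    def text_or_empty(selector: str) -> str:
        node = soup.select_one(selector)
        return node.get_text(strip=True) if node else ""

    return {
        "name": text_or_empty("h1"),                   # placeholder selector
        "description": text_or_empty(".description"),  # placeholder selector
        "price": text_or_empty(".price"),              # placeholder selector
        "rating": text_or_empty(".rating"),            # placeholder selector
        "url": url,
    }


def scrape(urls: list[str]) -> list[dict]:
    rows, seen = [], set()
    for url in urls:
        resp = requests.get(url, headers=HEADERS, timeout=30)
        if resp.status_code in (403, 429):
            # Likely anti-bot response: flag it and move on, per the brief.
            print(f"FLAG: possible anti-bot block ({resp.status_code}) at {url}")
            continue
        resp.raise_for_status()
        row = extract(url, resp.text)
        key = (row["name"], row["url"])
        if key not in seen:  # guard against duplicated rows
            seen.add(key)
            rows.append(row)
        time.sleep(1)  # polite fixed delay between requests (assumed, adjustable)
    return rows


def write_csv(rows: list[dict], path: str = "tools.csv") -> None:
    # utf-8-sig keeps non-ASCII text intact and readable in Excel.
    fields = ["name", "description", "price", "rating", "url"]
    with open(path, "w", newline="", encoding="utf-8-sig") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    write_csv(scrape(URLS))
```

Running it would produce tools.csv with one row per product; feel free to swap the csv module for pandas, or emit JSON instead, if that suits your pipeline better.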