Using Claude to Scrape a Website
LoseIt doesn’t make it easy to get detailed data out, especially data from the past. However, they have an old website that probably won’t be around much longer. I figured I should be able to ask Claude, specifically Cowork, to scrape data from this website and dump it into a spreadsheet.

After some coaxing, tweaking, and tuning, I finally got Cowork to periodically launch Chrome and navigate to LoseIt.com (where I was already logged in, so I didn’t have to share credentials with Claude). Using the calendar widget on their site, it goes to a specific date, clicks on each food item consumed that day, clicks the hyperlink on the item name to open a nutrition-breakdown popup, scrapes that, closes the popup, and moves on to the next item. Once all the items for a day are done, it updates a local spreadsheet and moves on to the next day. It keeps doing this until the tokens run out, usually after about 20 days’ worth. I’ve been running it on a schedule for the last few days, and it has mostly filled in the 6 months I didn’t have data for.
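Each scraped nutrition popup has to be flattened into spreadsheet columns before the sheet can be updated. Here is a minimal sketch of that parsing step, assuming the popup text comes out as simple "label value" lines; the field names, units, and layout here are assumptions for illustration, not LoseIt's actual markup:

```python
import re

def parse_nutrition_popup(popup_text):
    """Turn scraped popup text into a dict of nutrient -> number.

    Assumes lines like "Calories 250" or "Protein 12 g"; the real
    popup layout on LoseIt.com may differ.
    """
    row = {}
    for line in popup_text.splitlines():
        # Capture a text label followed by a numeric value; trailing
        # units like "g" are ignored.
        m = re.match(r"\s*([A-Za-z ]+?)\s+([\d.]+)", line)
        if m:
            row[m.group(1).strip()] = float(m.group(2))
    return row

# Example: one hypothetical popup scraped for a single food item.
popup = """Calories 250
Protein 12 g
Total Fat 8.5 g
Carbohydrates 30 g"""
print(parse_nutrition_popup(popup))
```

One dict like this per food item maps naturally onto one spreadsheet row per item per day.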
LoseIt also has an option to send an email with daily and/or weekly details on an ongoing basis. I have another scheduled job that scans my Gmail for emails from LoseIt and updates the same spreadsheet, so the sheet keeps growing going forward.
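That Gmail-scanning job boils down to: find the messages from LoseIt, parse the summary out of each, and append rows to the sheet. A sketch of just the filtering step, assuming messages arrive as (sender, subject) pairs and that the summaries come from an @loseit.com address with "summary" in the subject (both assumptions; the real job would read Gmail through its API or IMAP):

```python
def loseit_summaries(messages):
    """Filter (sender, subject) pairs down to LoseIt summary emails.

    The @loseit.com sender domain and the "summary" subject keyword
    are assumptions; the real sender and subject lines may differ.
    """
    hits = []
    for sender, subject in messages:
        if sender.lower().endswith("@loseit.com") and "summary" in subject.lower():
            hits.append((sender, subject))
    return hits

# Hypothetical inbox batch: two summaries and one unrelated message.
inbox = [
    ("noreply@loseit.com", "Your Daily Summary"),
    ("friend@example.com", "lunch?"),
    ("noreply@loseit.com", "Your Weekly Summary"),
]
print(loseit_summaries(inbox))
```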
So far, so good. Once you can have an AI control your browser through natural language, there are a lot of interesting (and, of course, scary) automation possibilities.
Pretty happy with what it has done for me so far.