Hey guys. I’m looking for a program that, once you give it a web address, will download everything: every page, all the content, and any shared/linked/hosted videos.
Specifically, I’d like one that could do this to a forum. For example, if I were a raving lunatic and actually wanted a copy of NYSpeed saved to my computer (every thread, every hosted picture, etc.), could I accomplish this?
I googled and found some amateur programs that seemed more shady than helpful.
Hopefully someone can point me in the right direction.
It would be simple with a static HTML page. But most websites run software that generates each page on the fly based on what the user requests, so scraping a whole site would be pretty complex, to say the least.
When I sold car parts, our supplier wouldn’t give us direct access to their live database. I had to hire a coder to write a program that would go to their public website, scrape their inventory, and update my website in real time.
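For anyone curious what that kind of scraper actually does, the first step is just pulling the links (or product rows) out of the page HTML. Here's a minimal sketch of that step using Python's standard-library parser; the canned HTML and the `/parts/...` URLs are made up for illustration, not the supplier's real site:

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect every href on a page -- the link-extraction half of a
    scraper. A real one would then fetch each link and repeat."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A canned page stands in for a real inventory listing:
sample = ('<html><body>'
          '<a href="/parts/123">Brake pads</a>'
          '<a href="/parts/456">Rotors</a>'
          '</body></html>')

scraper = LinkScraper()
scraper.feed(sample)
print(scraper.links)  # -> ['/parts/123', '/parts/456']
```

The catch, as noted above, is that a dynamic site can change its HTML layout at any time, and the scraper breaks until someone updates it.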
Thanks for the reply. I actually have a bunch of static pages (they just require a login) that all host movie clips ranging from 50 MB to 100 MB in size. I’d like to just rip them all down :lol:
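Since the pages are static and only sit behind a login, a short script can do it: log in once so the session cookie sticks, then stream each clip to disk. Here's a sketch with Python's standard library; the login URL, form field names (`user`/`pass`), and clip URLs are placeholders you'd replace after checking the real login form's source:

```python
import http.cookiejar
import urllib.parse
import urllib.request

def make_session():
    """Build an opener that keeps cookies between requests,
    so the login survives across fetches."""
    jar = http.cookiejar.CookieJar()
    return urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(jar))

def login(opener, login_url, username, password):
    """POST the login form; the session cookie lands in the jar.
    The field names 'user' and 'pass' are guesses -- check the
    page source for the real ones."""
    data = urllib.parse.urlencode(
        {"user": username, "pass": password}).encode()
    opener.open(login_url, data).close()

def clip_filename(url):
    """Derive a local filename from the last path segment of the URL."""
    return urllib.parse.urlparse(url).path.rsplit("/", 1)[-1]

def download(opener, url, chunk=64 * 1024):
    """Stream a clip to disk in chunks -- the files are 50-100 MB,
    so don't read them into memory in one go."""
    with opener.open(url) as resp, open(clip_filename(url), "wb") as out:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            out.write(block)

# Usage (placeholder URLs and credentials):
# s = make_session()
# login(s, "https://example.com/login.php", "me", "secret")
# download(s, "https://example.com/clips/clip01.mpg")
```

You'd collect the clip URLs from the pages first (by hand or with a link scraper) and loop `download` over the list.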
I’ll just manually go through the forum and save the threads I want.
Can’t you do this with IE and save the site for offline use? You can then select how many pages deep you want it to save. It used to be a feature, but I’m not sure it’s still there in 6 and 7.
In Firefox I use the ScrapBook add-on and it works well. Not sure how it would handle forums, though; it works great on DIY guides, how-tos, and other websites.