SiteSucker files

Does anyone have experience using SiteSucker?
I have some websites I want to archive from an old server and host on my Reclaim account. Once the site is “sucked”, where do I put the files? Should they go somewhere in cPanel, or should I create a WP site and somehow add the files so they appear on that site?

Any help would be greatly appreciated!

SiteSucker generates an HTML copy of the website by crawling each page. You should be left with a folder of HTML files, and you would upload those to the folder for the domain where you want them to load (no WP install involved). For guidance on which folder to put them in, see https://forums.reclaimhosting.com/t/uploading-folders-files/264 and https://forums.reclaimhosting.com/t/understanding-folder-structures-in-cpanel/295
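To illustrate the idea (folder and file names here are hypothetical, not from the actual crawl), a SiteSucker crawl leaves a plain file tree whose contents map one-to-one onto the domain's folder on the server:

```shell
# Hypothetical sketch: static crawl output maps directly onto the server folder.
#   local: wsharpe/index.html     ->  server: public_html/index.html
#   local: wsharpe/css/style.css  ->  server: public_html/css/style.css
# Build a tiny stand-in tree locally to show the shape:
mkdir -p wsharpe/css
echo '<html><body>home</body></html>' > wsharpe/index.html
echo 'body { color: #000; }' > wsharpe/css/style.css
ls -R wsharpe
```

As long as the relative structure is preserved on upload, links between the pages keep working.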

Thank you. I keep getting errors when trying to crawl this site with SiteSucker. Do you know why this is happening or how to get around it? William F. Sharpe

Servers can block crawlers that do this sort of thing; it’s quite possible that’s happening here. We have another thread with lots of discussion at Archiving WordPress Sites as Static HTML that covers a few other tools.

Are you able to successfully run SiteSucker on this website? That would at least help me eliminate the possibility that the error is coming from SiteSucker on my laptop.

Thanks for the links to the other archiving posts; I’ll reference those when figuring out how to host the files. Best

It would help if you were more specific about the errors.

I ran it successfully, limiting files to the web directory of the original, and am looking at a local copy.

Maybe it’s your settings? Here’s a link to the .suck file, which should open with the same settings.

Thanks for this file!

One problem: the files didn’t download with the right hierarchy. Everything looks scattered across separate top-level folders instead of stemming from one main folder. Any suggestions?

Another question, for a different site: I tried uploading the files to public_html using File Manager, but I get an error message saying something about a quota.

Surely there is a way to upload the entire folder structure rather than one file at a time! Do I need to use FTP? I’d like to stick with File Manager if possible.

Everything you need is in the ~wsharpe folder. Those other top-level folders come from stray file references: some of the CSS buried in the site refers to images stored outside the site’s path.

I can’t help with old, convoluted HTML.

You might need to review the archived content and decide whether it’s good enough.

I would FTP it, but you can also compress the directory as a zip, upload that into the target directory, and use the cPanel tool to decompress the archive. I’ve only done it once.

There’s nearly always stuff to tweak in this process.

Right on, I’ll give it a try. Thanks a million!