Does anyone have experience using SiteSucker?
I have some websites I want to archive from an old server and host on my Reclaim account. Once the site is “sucked”, where do I put the files? Should they go somewhere in cPanel, or should I create a WP site and somehow add the files so they appear on that site?
Servers can block crawlers that do this sort of thing, and it’s quite possible that’s what’s happening there. We have another thread with lots of discussion at Archiving Wordpress Sites as Static HTML that includes a few other tools.
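If you want a quick way to test whether the server is filtering crawler-style requests, here is a rough Python sketch (the URL and user-agent strings are placeholders, not from the thread): it compares a scripted request against a browser-like one, and a 403/406 on the first but a 200 on the second would point to User-Agent blocking.

```python
import requests

# Hypothetical URL of the old site you are trying to archive
URL = "http://example.com/"

# Compare a default/scripted request with a browser-like request
headers_list = {
    "default": {},
    "browser-like": {
        "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) Safari/537.36"
    },
}

for label, headers in headers_list.items():
    resp = requests.get(URL, headers=headers, timeout=10)
    print(f"{label}: HTTP {resp.status_code}, {len(resp.content)} bytes")
```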
Do you have the ability to successfully run SiteSucker on this website? That would at least help me eliminate the possibility that the error is coming from SiteSucker on my laptop.
One problem: the files didn’t download with the right hierarchy. Everything looks to be scattered across separate top-level folders instead of stemming from a main folder. Any suggestions?
Another question, for a different site: I tried moving the files to public_html using File Manager, but I get an error message saying something about a quota.
Surely there is a way to upload the entire folder structure rather than one file at a time! Do I need to use FTP? I’d like to stick with File Manager if possible.
Everything you need is in the ~wsharpe folder. Those other files seem to come from file references in some of the CSS buried in the site, which points to images stored outside the site’s path.
I can’t help with old convoluted HTML.
You might need to review the archive content and see if it’s good enough.
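One way to review it: a rough Python sketch (the archive folder name and file extensions here are assumptions) that scans the downloaded HTML/CSS for references pointing outside the archive, which is usually where those stray top-level folders come from.

```python
import re
from pathlib import Path

# Hypothetical location of the downloaded site; adjust to your archive folder
ARCHIVE = Path("~/Downloads/wsharpe").expanduser()

# Match href/src attributes in HTML and url(...) references in CSS
ref_pattern = re.compile(
    r'(?:href|src)=["\']([^"\']+)["\']|url\(["\']?([^)"\']+)["\']?\)', re.I
)

for page in list(ARCHIVE.rglob("*.html")) + list(ARCHIVE.rglob("*.css")):
    text = page.read_text(errors="ignore")
    for match in ref_pattern.finditer(text):
        ref = match.group(1) or match.group(2)
        # Flag references that climb out of the archive or use absolute paths
        if ref.startswith(("../", "/")) and not ref.startswith("//"):
            print(f"{page.relative_to(ARCHIVE)} -> {ref}")
```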
I would FTP it, but you can also compress the directory as a zip, upload that into the containing directory, and use the cPanel tool to decompress the archive. I’ve only done it once.
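For what it’s worth, here is a minimal sketch of the compress step using Python’s standard library (the folder path is just an example). After you upload the resulting zip through File Manager, its Extract option unpacks the whole hierarchy in place, so you avoid uploading one file at a time.

```python
import shutil
from pathlib import Path

# Hypothetical path to the folder SiteSucker produced
site_dir = Path("~/Downloads/wsharpe").expanduser()

# Create wsharpe.zip next to the folder, preserving the directory structure,
# so a single upload through File Manager carries the whole hierarchy
archive = shutil.make_archive(
    str(site_dir), "zip",
    root_dir=site_dir.parent, base_dir=site_dir.name
)
print("Created", archive)
```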
There’s nearly always stuff to tweak in this process.