this post was submitted on 19 Mar 2024

It's A Digital Disease!


This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Mhanz97 on 2024-03-18 17:00:31.


Hi everyone, like the title says: why the hell is it so hard to download a complete website for offline viewing?

I was trying to download a Fandom wiki (the entire wiki for a videogame), and I tried a lot of tools but always ran into problems.

I tried:

  • Wget: problems downloading images; a lot of them simply weren't downloaded (see the wget sketch after this list)
  • HTTrack: takes forever / super slow, doesn't download all the images either, and even with depth-level restrictions it keeps downloading useless outside-domain sites
  • Offline Explorer: maybe the worst, since everything was messed up after the download and not all the images were there
  • Cyotek WebCopy: same as Offline Explorer
  • WikiTeam's dumpgenerator.py: ultra messy, super hard to install, and it didn't work on my Windows machine
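
A minimal sketch of the kind of wget invocation that might catch the images, assuming the real problem is that Fandom serves them from a separate host (static.wikia.nocookie.net, as far as I can tell), so plain recursion never follows them. The start URL and the domain whitelist are placeholders to adapt:

```python
# Sketch only: wrap wget so that recursion is allowed to span onto the
# image host but nowhere else. The wiki URL and domain list are assumptions.
import subprocess

WIKI_URL = "https://somegame.fandom.com/wiki/Special:AllPages"  # placeholder start page

subprocess.run([
    "wget",
    "--mirror",              # recursive download with timestamping
    "--convert-links",       # rewrite links so the local copy browses offline
    "--adjust-extension",    # save pages with .html extensions
    "--page-requisites",     # fetch the images/CSS/JS each page needs
    "--span-hosts",          # allow leaving the wiki's own host...
    "--domains=fandom.com,wikia.nocookie.net",  # ...but only onto these domains
    "--wait=1",              # slow down so the site doesn't throttle or block
    "--directory-prefix=wiki-mirror",
    WIKI_URL,
], check=True)
```

No idea if that combination catches every missed image, but the span-hosts + domain-whitelist part is the bit a plain recursive wget run tends to lack.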

Basically the only thing that at least downloads all the text + images is Chrome's Ctrl+S (save page), but I have to manually load and save page by page... and when I read it in offline mode it's a bit messed up, but at least I have everything saved.
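
Since Fandom wikis run MediaWiki, the api.php endpoint can list every page and hand back the rendered HTML, which is basically the Ctrl+S page-by-page idea automated. A rough Python sketch, assuming the wiki exposes the standard API at /api.php (the base URL and output folder are placeholders, and it only grabs the HTML, not the image files themselves):

```python
# Rough sketch: enumerate all pages via the MediaWiki API and save each
# page's rendered HTML. BASE and OUT_DIR are placeholders to adapt.
import pathlib
import requests

BASE = "https://somegame.fandom.com/api.php"   # placeholder wiki
OUT_DIR = pathlib.Path("wiki-dump")
OUT_DIR.mkdir(exist_ok=True)

session = requests.Session()
params = {"action": "query", "list": "allpages", "aplimit": "500", "format": "json"}

while True:
    data = session.get(BASE, params=params).json()
    for page in data["query"]["allpages"]:
        title = page["title"]
        # action=parse returns the rendered HTML of one page
        html = session.get(BASE, params={
            "action": "parse", "page": title, "format": "json",
        }).json()["parse"]["text"]["*"]
        safe_name = title.replace("/", "_").replace(":", "_")
        (OUT_DIR / f"{safe_name}.html").write_text(html, encoding="utf-8")
    # follow the continuation token until the page list is exhausted
    if "continue" not in data:
        break
    params.update(data["continue"])
```

It's crude (no images, no link rewriting), but it at least removes the manual load-and-save loop.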
