Scrape and convert a website into static HTML?

Solution 1

Getleft is a nice Windows client that can do this. It is very configurable and reliable.

Wget can, too, with the --mirror option.
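
As a rough sketch (www.example.com here is only a placeholder for the real site), a wget invocation for a browsable offline copy would look something like:

    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://www.example.com/

--convert-links rewrites links to point at the downloaded files, --adjust-extension saves dynamically generated pages (e.g. .cfm URLs) with an .html extension, and --page-requisites also grabs the CSS, images, and scripts each page needs.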

Solution 2

Try using httrack (or webhttrack/winhttrack, if you want a GUI) to spider the web site. It's free, fast, and reliable. It's also much more powerful than primitive downloaders like wget; httrack is designed for mirroring web sites.
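
As a rough sketch of the command-line usage (www.example.com and the ./mirror output directory are just placeholders), it's along the lines of:

    httrack "https://www.example.com/" -O ./mirror "+*.example.com/*"

-O sets where the mirror is written, and the +*.example.com/* filter tells the spider which URLs are in scope; webhttrack and winhttrack walk you through the same options in a GUI.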

Be aware that converting a dynamic site to static pages will cost you a lot of functionality. It's also not always possible: a dynamic site can generate an effectively unlimited number of distinct static pages.

Solution 3

It's been a long time since I used it, but webzip was quite good.

It is not free, but for $35.00, I think your client won't go broke.

A quick google for offline browsers came up with this and this, which look good.

Comments

  • Kevin, almost 2 years ago

    I haven't done this in 3 or 4 years, but a client wants to downgrade their dynamic website into static HTML.

    Are there any free tools out there to crawl a domain and generate working HTML files to make this quick and painless?

    Edit: it is a ColdFusion website, if that matters.

  • strager, over 13 years ago
    I wouldn't call wget primitive.
  • Borealid, over 13 years ago
    @strager: Ok then, "relatively primitive". It's got a much more restricted feature-set when it comes to mirroring sites.
  • Pekka, over 13 years ago
    I'm not sure whether it can do everything httrack does, but don't underestimate wget --mirror! It can do a lot of things.