I want to implement a Java method that takes a URL as input and stores the entire webpage, including CSS, images, and JS (all related resources), on my disk. I have used the Jsoup HTML parser to fetch the HTML page. The only option I can think of is to get the page with Jsoup, parse the HTML content, convert relative paths to absolute paths, and then make additional GET requests for the JavaScript, images, etc. and save them to disk. I have also read about the HtmlCleaner and HtmlUnit parsers, but I think in all of these cases I still have to parse the HTML content to find the images, CSS, and JavaScript files.

Is my approach right, or is there an easier way to accomplish this task?
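For context, here is a minimal sketch of the "make another GET request and save it" step I have in mind. The helper name, the use of Jsoup for the download itself, and the target path are only illustrative, assuming Jsoup is on the classpath:

    import org.jsoup.Jsoup;

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    // Fetches a single resource (CSS, JS, image) by its absolute URL and
    // writes the raw bytes to the given file on disk.
    static void downloadResource(String absoluteUrl, Path target) throws IOException {
        byte[] bytes = Jsoup.connect(absoluteUrl)
                .ignoreContentType(true)  // the response is not HTML, so don't reject it
                .maxBodySize(0)           // don't truncate large files
                .execute()
                .bodyAsBytes();
        Files.createDirectories(target.getParent());  // make sure the directory exists
        Files.write(target, bytes);
    }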


1 Answer

Basically, you can do it with Jsoup:

    import org.jsoup.Jsoup;
    import org.jsoup.nodes.Document;
    import org.jsoup.nodes.Element;
    import org.jsoup.select.Elements;

    Document doc = Jsoup.connect("http://rabotalux.com.ua/vacancy/4f4f800c8bc1597dc6fc7aff").get();

    // Stylesheets (and other <link> resources) and scripts referenced by the page
    Elements links = doc.select("link");
    Elements scripts = doc.select("script");

    // absUrl() resolves each relative href/src against the page's base URL
    for (Element element : links) {
        System.out.println(element.absUrl("href"));
    }
    for (Element element : scripts) {
        System.out.println(element.absUrl("src"));
    }

And so on with images and all related resources.
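For example, here is a sketch of doing the same for images and then saving each one to disk; the class name, output directory, and file-naming scheme are just illustrative:

    import org.jsoup.Jsoup;
    import org.jsoup.nodes.Document;
    import org.jsoup.nodes.Element;

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class PageSaver {
        public static void main(String[] args) throws IOException {
            Document doc = Jsoup.connect("http://rabotalux.com.ua/vacancy/4f4f800c8bc1597dc6fc7aff").get();

            Path outDir = Paths.get("saved-page");   // illustrative output directory
            Files.createDirectories(outDir);

            // img[src] matches only images that actually carry a src attribute
            for (Element img : doc.select("img[src]")) {
                String absoluteUrl = img.absUrl("src");
                if (absoluteUrl.isEmpty()) {
                    continue;                        // could not be resolved to an absolute URL
                }
                // Derive a file name from the last path segment (good enough for a sketch)
                String fileName = absoluteUrl.substring(absoluteUrl.lastIndexOf('/') + 1);
                if (fileName.isEmpty()) {
                    fileName = "image-" + Math.abs(absoluteUrl.hashCode());
                }
                byte[] bytes = Jsoup.connect(absoluteUrl)
                        .ignoreContentType(true)     // the image bytes are not HTML
                        .execute()
                        .bodyAsBytes();
                Files.write(outDir.resolve(fileName), bytes);
            }
        }
    }

The same loop works for the stylesheet and script URLs printed above.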

BUT if your site creates some elements with JavaScript, Jsoup will skip them, as it can't execute JavaScript.

