I have the code below:

var request = require("request");
var cheerio = require("cheerio");
var async = require("async");

var MyLink = "www.mylink.com";
var TheUrl; // filled in by the first step, used by the second

async.series([
    function (callback) {
        request(MyLink, function (error, response, body) {
            if (error) return callback(error);
            var $ = cheerio.load(body);
            // Some calculations where I get the NewUrl variable...
            TheUrl = NewUrl;
            callback();
        });
    },
    function (callback) {
        for (var i = 0; i <= TheUrl.length - 1; i++) {
            var url = "www.myurl.com=" + TheUrl[i];
            request(url, function (error, resp, body) {
                if (error) return callback(error);
                var $ = cheerio.load(body);
                // Some calculations again...
                callback();
            });
        }
    }
], function (error) {
    if (error) return next(error);
});

Does anyone have a suggestion for how I can delay each iteration of the for loop, so that the code waits 10 seconds after each iteration completes? I tried setTimeout but couldn't get it to work.

1 Answer

Delaying multiple page fetches with async/await

I am a big fan of the async library and have used it for a long time. However, now there's async/await, and with it your code becomes much easier to read. For instance, this is what your main logic would look like:

const urls = await fetchUrls(INITIAL_URL);

for (const url of urls) {
    await sleep(10000);
    const $ = await fetchPage(url);
    // do stuff with cheerio-processed page
}

Much better, isn't it? Before I get into the details of how fetchPage() and fetchUrls() work, let's first answer your question of how to wait before fetching the next page. The sleep function is pretty straightforward:

async function sleep(millis) {
    return new Promise(resolve => setTimeout(resolve, millis));
}

You can get a full explanation of how it works in my other answer at https://stackoverflow.com/a/46720712/778272.

OK, back to the other functions. The request library has a promise-enabled version, request-promise-native, that you can use with async/await. Let's see how fetchPage() is implemented:

async function fetchPage(url) {
    return await request({
        url: url,
        transform: (body) => cheerio.load(body)
    });
}

Since request now returns a promise, we can await it. I also took the chance to use the transform option, which lets us transform the response body before the promise resolves; I'm passing it through Cheerio, just like you did in your code.
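
To illustrate what the resolved value looks like, here's a small sketch (the URL and selector are placeholders, and this would run inside some async function): the value you await is the cheerio-loaded document, so you can query it directly.

// Hypothetical usage: fetchPage() resolves to the cheerio-loaded page,
// so `$` can be queried right away. The selector is only an example.
const $ = await fetchPage("http://example.com");
console.log($("title").text()); // prints the page's <title> text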

Finally, fetchUrls() can just call fetchPage() and process the result to build your array of URLs before resolving its promise. Here's the full code:

const
    request = require("request-promise-native"),
    cheerio = require("cheerio");

const
    INITIAL_URL = "http://your-initial-url.com";

/**
 * Asynchronously fetches the page referred to by `url`.
 *
 * @param {String} url - the URL of the page to be fetched
 * @return {Promise} promise to a cheerio-processed page
 */
async function fetchPage(url) {
    return await request({
        url: url,
        transform: (body) => cheerio.load(body)
    });
}

/**
 * Your initial fetch, which will bring back the list of URLs you're looking for.
 *
 * @param {String} initialUrl - the initial URL
 * @return {Promise<string[]>} an array of URL strings
 */
async function fetchUrls(initialUrl) {
    const $ = await fetchPage(initialUrl);
    // process $ here and get urls
    return ["http://foo.com", "http://bar.com"];
}

/**
 * Clever way to do asynchronous sleep. 
 * Check this: https://stackoverflow.com/a/46720712/778272
 *
 * @param {Number} millis - how long to sleep in milliseconds
 * @return {Promise<void>}
 */
async function sleep(millis) {
    return new Promise(resolve => setTimeout(resolve, millis));
}

async function run() {
    const urls = await fetchUrls(INITIAL_URL);
    for (const url of urls) {
        await sleep(10000);
        const $ = await fetchPage(url);
        // do stuff with cheerio-processed page
    }
}

run();
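
The fetchUrls() body above just returns a placeholder array. As a rough sketch of the "process $ here" step (assuming the links you want are plain <a href> elements; the selector is hypothetical and depends on your markup), the extraction could look like:

async function fetchUrls(initialUrl) {
    const $ = await fetchPage(initialUrl);
    // Collect the href attribute of every matching anchor.
    // "a.some-link" is a placeholder selector for your actual page structure.
    return $("a.some-link")
        .map((i, el) => $(el).attr("href"))
        .get();
}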

To use request with promises, install it like this:

npm install request
npm install request-promise-native

And then require("request-promise-native") in your code, like in the example above.
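
One last note: your original code forwarded request errors to the series callback. With async/await, the equivalent is a try/catch around the awaited calls (or a .catch() on the promise returned by run()). A minimal sketch, mirroring the run() above:

async function run() {
    try {
        const urls = await fetchUrls(INITIAL_URL);
        for (const url of urls) {
            await sleep(10000);
            const $ = await fetchPage(url);
            // do stuff with cheerio-processed page
        }
    } catch (error) {
        // any rejected request or processing error ends up here
        console.error(error);
    }
}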

