I'm writing a script that makes a POST request to a server and then blocks: the server keeps sending responses whenever a specific event is triggered. I have to obtain a cookie from an earlier login request and pass it as data in the POST; each cookie lasts for 10 minutes, after which I have to send a keep-alive request.

Whenever an event is triggered I want to log it to a file. I tried async and unirest requests; they generate the POST request, but I don't get control over the output. I also tried sessions, to no avail. I want to do the following things, in this order:

1. Login (can be done only once)

2. Post the request to the server

3. Keep monitoring the output of step 2 forever; whenever there is output, log it to a file

4. Keep the session alive with another request to the server.

Let me know if you need more explanation.
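
For step 4, what I have in mind is roughly the sketch below (the endpoint, payload format and interval are placeholders, not my real values):

import threading
import requests

KEEPALIVE_URL = "https://my-server/endpoint"   # placeholder, not the real endpoint
KEEPALIVE_INTERVAL = 540                       # seconds, just under the 10-minute cookie lifetime

def keep_alive(cookie):
    # placeholder payload; the real keep-alive body depends on the server's API
    body = '<keepAlive cookie="%s" />' % cookie
    requests.post(KEEPALIVE_URL, data=body, verify=False, timeout=10)
    # re-arm the timer so the keep-alive repeats for as long as the script runs
    threading.Timer(KEEPALIVE_INTERVAL, keep_alive, args=(cookie,)).start()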

Below is my code; it does not work, though:

while True:
    try:
        xmldata = '<eventSubscribe cookie="%s" />' % self.cookie
        # stream=False blocks until the whole response body has arrived
        r = requests.post(post_url, data=xmldata, stream=False, verify=False, timeout=10)
        write_to_file('Ok', r.text)
        unsubevents()
        logout()
    except Exception as e:
        print(e)
        self.write_to_file('Ok', "")
    self.login()

So in the above code the POST call I make is blocking and continuous: it streams the output endlessly, so the call never really completes. It does receive output in XML format; the server sends one of these responses every time an event is triggered.

PS: I don't want to log out and log in again. This works in curl, where it keeps printing the output on stdout. I have to run this code against many servers, around 200.
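
For reference, my understanding is that curl behaves that way because it consumes the body incrementally; the rough equivalent in requests would be stream=True with iter_lines(), along the lines of the sketch below (the URL and cookie value are placeholders):

import requests

post_url = "https://my-server/endpoint"   # placeholder URL
cookie = "..."                            # value returned by the earlier login request
xmldata = '<eventSubscribe cookie="%s" />' % cookie

# stream=True makes requests return as soon as the headers arrive and keeps the
# connection open, so the event XML can be consumed as the server pushes it,
# much like curl printing each chunk on stdout
r = requests.post(post_url, data=xmldata, stream=True, verify=False)
for line in r.iter_lines():
    if line:                              # skip keep-alive newlines
        print(line.decode("utf-8", errors="replace"))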

1 Answer

I've fixed this problem with two levels of threading and by reading chunks instead of the whole content or read_lines().

1. The first thread is created; it spawns the second thread and runs the keep-alive request when the timeout hits.

2. The second thread subscribes to the event with the POST request and then keeps listening for chunks of size 1024; every time a response is received, it is parsed and the respective data is updated. Here I used requests with stream=True. This wasn't working for me earlier because the cookie would expire before the response was read and the session would close.
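
Roughly, the structure looks like the sketch below; the endpoint, payloads and the keep-alive interval are placeholders and have to be adapted to the actual server:

import threading
import requests

POST_URL = "https://my-server/endpoint"   # placeholder endpoint
KEEPALIVE_INTERVAL = 540                  # seconds, just under the 10-minute cookie lifetime

def listen_for_events(cookie, logfile):
    # Second-level thread: subscribe once, then read the open response in
    # 1024-byte chunks for as long as the server keeps the connection alive.
    xmldata = '<eventSubscribe cookie="%s" />' % cookie
    r = requests.post(POST_URL, data=xmldata, stream=True, verify=False)
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            with open(logfile, "a") as f:
                f.write(chunk.decode("utf-8", errors="replace"))

def monitor(cookie, logfile):
    # First-level thread: spawn the listener, then fire a keep-alive request
    # each time the interval (cookie timeout) is about to hit.
    listener = threading.Thread(target=listen_for_events, args=(cookie, logfile))
    listener.daemon = True
    listener.start()
    while listener.is_alive():
        listener.join(KEEPALIVE_INTERVAL)
        if listener.is_alive():
            # placeholder keep-alive payload; replace with the real one
            requests.post(POST_URL, data='<keepAlive cookie="%s" />' % cookie,
                          verify=False, timeout=10)

# one monitor thread per server, e.g. for ~200 servers:
# threading.Thread(target=monitor, args=(cookie, "events.log")).start()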

If someone has a better way to do this, please update here.

