I want to get a web document from 'https://www.fanfiction.net/s/5218118/1/', but I am sadly unable to replicate the behaviour of my browser - the server always sends me something along the lines of "please enable cookies" or "complete the Captcha". Is there a way to send requests like a browser does, so that the server delivers the same document it would deliver to a browser? I have already googled and tried to integrate cookies and a fake User-Agent. Here is my code:
import requests
from fake_useragent import UserAgent

url = 'https://www.fanfiction.net/s/5218118/1/'
ua = UserAgent()

session = requests.Session()
headers = {'User-Agent': str(ua.chrome)}  # spoof a Chrome User-Agent string

# First request: grab whatever cookies the server sets
res = session.get(url, headers=headers)
cookies = dict(res.cookies)

# Second request with the cookies attached (the Session already persists
# them, so passing cookies= again may be redundant)
response = session.get(url, headers=headers, cookies=cookies)
Thanks in advance! EDIT: I know that I could use Selenium, but I do not want to keep updating my chromedriver, and I also do not want the performance overhead of Selenium.
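For clarity, here is a rough sketch of what I imagine a more "browser-like" request would look like. The extra headers (Accept, Accept-Language, Referer) are just my guesses at what a real browser sends, copied from the kind of values Chrome's dev tools typically show - I do not know which of them, if any, the server actually checks:

import requests

url = 'https://www.fanfiction.net/s/5218118/1/'

# Guessed browser-like headers (values are assumptions, not verified requirements)
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                  'AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/87.0.4280.88 Safari/537.36',
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
    'Accept-Language': 'en-US,en;q=0.5',
    'Referer': 'https://www.fanfiction.net/',
    'Connection': 'keep-alive',
}

with requests.Session() as session:
    session.headers.update(headers)
    # Visit the front page first so the session picks up any cookies the server sets
    session.get('https://www.fanfiction.net/')
    # Then request the actual story page with those cookies attached
    response = session.get(url)
    print(response.status_code)

Even with something like this I still get the cookie/Captcha page, which is why I am asking whether plain requests can do it at all.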