So far I’m using this code, which seems to work fine:
```python
import logging
import requests

logger = logging.getLogger(__name__)

def expand_shortened_url(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:12.0) Gecko/20100101 Firefox/12.0",
    }
    params = {"th": "1", "psc": "1"}
    s = requests.Session()
    s.keep_alive = False
    try:
        if "amzn.to" in url:
            # For Amazon short links a HEAD request is enough:
            # we only need the final URL after the redirects
            response = s.head(url, headers=headers, params=params,
                              timeout=15, allow_redirects=True)
        else:
            response = s.get(url, headers=headers, params=params,
                             timeout=15, allow_redirects=True)
        return response.url
    except Exception as e:
        logger.error(str(e))
        return None
```
However I would like to use the ScrapeOps API to make these requests.
I tried to implement something like this for scraping:
```python
import os
from urllib.parse import urlencode

def scrapeops_url(url):
    payload = {'api_key': os.environ["API_KEY"], 'url': url, 'country': 'it'}
    return 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)
```
But it doesn't work for expanding the shortened URLs. Does anyone have an idea how to do this?
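Concretely, this is how I'm combining the two pieces (the API key and the amzn.to link below are placeholder values, and the key is passed as a parameter instead of read from `os.environ` just so the snippet is self-contained):

```python
from urllib.parse import urlencode

def scrapeops_url(url, api_key):
    # Wrap the target URL in a ScrapeOps proxy request;
    # api_key is a parameter here purely for illustration
    payload = {'api_key': api_key, 'url': url, 'country': 'it'}
    return 'https://proxy.scrapeops.io/v1/?' + urlencode(payload)

# 'MY_KEY' and the amzn.to link are placeholders
proxied = scrapeops_url('https://amzn.to/example', 'MY_KEY')
print(proxied)
# https://proxy.scrapeops.io/v1/?api_key=MY_KEY&url=https%3A%2F%2Famzn.to%2Fexample&country=it
```

When I then request `proxied` with my session, `response.url` is the ScrapeOps endpoint itself, not the expanded Amazon link; as far as I can tell the proxy follows the target's redirects server-side, so the final URL never appears in `response.url`. That is the part I can't figure out.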
I hope someone can give me suggestions for building my bot; I am a student and self-taught in Python, since it was never covered in my courses. I hope the code I wrote and the question I asked are clear enough. Thanks to anyone willing to help.