I’m facing an issue where my Python script using the requests library works fine outside of Docker, but inside Docker it doesn’t seem to use the proxy when making the request. It also doesn’t throw any errors.
I found just this post, which seems to describe the same issue but unfortunately has no responses. I couldn’t find anything else at all regarding this issue. I can’t imagine I’m the only one, since my setup is fairly simple.
I have also tried adding the proxies as ENV variables, but that didn’t work either.
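A minimal check like this (just a sketch, run inside the container) would show whether the proxy variables are visible at all and what requests resolves from them; requests.utils.getproxies() reads the same environment variables that requests falls back to when no proxies= argument is passed:

import os
import requests.utils

# Print the proxy-related environment variables the container actually sees.
for var in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy", "NO_PROXY", "no_proxy"):
    print(var, "=", os.environ.get(var))

# getproxies() reads the same environment variables that requests falls back to
# when no explicit proxies= argument is given.
print("Resolved proxies:", requests.utils.getproxies())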
Dockerfile:
FROM python:3.11
WORKDIR /app
COPY . .
RUN apt-get update && apt-get install -y ca-certificates openssl
RUN pip install --no-cache-dir -r requirements.txt
ENV HTTP_PROXY="http://USER:PASSWORD@HOST:PORT"
ENV HTTPS_PROXY="http://USER:PASSWORD@HOST:PORT"
ENTRYPOINT ["python", "main.py"]
Python:
import requests

def proxy_request():
    proxies = {
        "http": "http://USER:PASSWORD@HOST:PORT",
        "https": "http://USER:PASSWORD@HOST:PORT",
    }
    response = requests.get('https://httpbin.org/get', proxies=proxies)
    print("Status:", response.status_code)
    print("Content:", response.text)

if __name__ == "__main__":
    proxy_request()
Again, outside of Docker the response returns the IP of the proxy. However, inside Docker it returns my own IP.
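To make that comparison explicit, a small check like this can be used (PROXY_IP is just a placeholder for the proxy’s public address, same USER/PASSWORD/HOST/PORT placeholders as above):

import requests

proxies = {
    "http": "http://USER:PASSWORD@HOST:PORT",
    "https": "http://USER:PASSWORD@HOST:PORT",
}

# httpbin echoes the caller's public IP in the "origin" field of its JSON response.
origin = requests.get("https://httpbin.org/get", proxies=proxies).json()["origin"]
print("Origin reported by httpbin:", origin)

# "PROXY_IP" is a placeholder; outside Docker the origin matches the proxy,
# inside Docker it is my own IP.
print("Went through proxy:", origin == "PROXY_IP")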
I have tried multiple Python libraries: requests and httpx. They both have the same issue. aiohttp didn’t work at all due to an SSL issue.