
How do I get the IP address from an HTTP request using the requests library?


It turns out that it's rather involved.

Here's a monkey-patch for requests version 1.2.3:

It wraps the _make_request method on HTTPConnectionPool to store the result of socket.getpeername() on the HTTPResponse instance.

For me, on Python 2.7.3, that instance was available at response.raw._original_response.

```python
from requests.packages.urllib3.connectionpool import HTTPConnectionPool

def _make_request(self, conn, method, url, **kwargs):
    response = self._old_make_request(conn, method, url, **kwargs)
    sock = getattr(conn, 'sock', False)
    if sock:
        setattr(response, 'peer', sock.getpeername())
    else:
        setattr(response, 'peer', None)
    return response

HTTPConnectionPool._old_make_request = HTTPConnectionPool._make_request
HTTPConnectionPool._make_request = _make_request

import requests

r = requests.get('http://www.google.com')
print r.raw._original_response.peer
```

Yields:

('2a00:1450:4009:809::1017', 80, 0, 0)
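The four-element tuple is an IPv6 socket address: (host, port, flowinfo, scope_id). For an IPv4 peer, getpeername() returns a plain (host, port) pair. A minimal stdlib sketch over a loopback connection (no requests, no network access) shows the shape:

```python
import socket

# Set up a listening socket on an ephemeral loopback port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('127.0.0.1', 0))
server.listen(1)

# Connect to it, completing a real TCP handshake locally.
client = socket.create_connection(server.getsockname())
conn, _ = server.accept()

# For AF_INET sockets getpeername() yields a (host, port) pair;
# AF_INET6 sockets yield the 4-tuple shown above.
peer = client.getpeername()
print(peer)

client.close()
conn.close()
server.close()
```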

Ah, if there's a proxy involved or the response is chunked, the HTTPConnectionPool._make_request isn't called.

So here's a new version patching httplib.getresponse instead:

```python
import httplib

def getresponse(self, *args, **kwargs):
    response = self._old_getresponse(*args, **kwargs)
    if self.sock:
        response.peer = self.sock.getpeername()
    else:
        response.peer = None
    return response

httplib.HTTPConnection._old_getresponse = httplib.HTTPConnection.getresponse
httplib.HTTPConnection.getresponse = getresponse

import requests

def check_peer(resp):
    orig_resp = resp.raw._original_response
    if hasattr(orig_resp, 'peer'):
        return getattr(orig_resp, 'peer')
    return None
```

Running:

```python
>>> r1 = requests.get('http://www.google.com')
>>> check_peer(r1)
('2a00:1450:4009:808::101f', 80, 0, 0)
>>> r2 = requests.get('https://www.google.com')
>>> check_peer(r2)
('2a00:1450:4009:808::101f', 443, 0, 0)
>>> r3 = requests.get('http://wheezyweb.readthedocs.org/en/latest/tutorial.html#what-you-ll-build')
>>> check_peer(r3)
('162.209.99.68', 80)
```

Also checked running with proxies set; proxy address is returned.


Update 2016/01/19

est offers an alternative that doesn't need the monkey-patch:

```python
rsp = requests.get('http://google.com', stream=True)
# Grab the IP while you can, before you consume the body!
print rsp.raw._fp.fp._sock.getpeername()
# Consuming the body calls read(); after that the fileno is no longer available.
print rsp.content
```

Update 2016/05/19

From the comments (copied here for visibility): Richard Kenneth Niescior offers the following, confirmed working with requests 2.10.0 and Python 3.

```python
rsp = requests.get(..., stream=True)
rsp.raw._connection.sock.getpeername()
```

Update 2019/02/22

Python 3 with requests version 2.19.1:

```python
resp = requests.get(..., stream=True)
resp.raw._connection.sock.socket.getpeername()
```

Update 2020/01/31

Python 3.8 with requests 2.22.0:

```python
resp = requests.get('https://www.google.com', stream=True)
resp.raw._connection.sock.getpeername()
```
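Since the exact private attribute path has shifted between requests/urllib3 releases, a defensive helper that tries the paths from the updates above may be more robust. This is a hedged sketch (the attribute names are version-dependent internals, not a public API), exercised here against a local HTTP server so it needs no network access:

```python
import http.server
import threading
import requests

def get_peer_address(resp):
    """Best-effort (ip, port) of the remote end of a streamed response."""
    raw = resp.raw
    # Newer requests (>= 2.10): the connection stays attached while streaming.
    conn = getattr(raw, '_connection', None)
    sock = getattr(conn, 'sock', None)
    if sock is not None:
        # Some versions wrap the socket object; unwrap if a .socket exists.
        sock = getattr(sock, 'socket', sock)
        return sock.getpeername()
    # Fallback along the lines of the 2016 update (Python 3 attribute names).
    try:
        return raw._fp.fp.raw._sock.getpeername()
    except AttributeError:
        return None

# Local server on an ephemeral port so the example is self-contained.
server = http.server.HTTPServer(('127.0.0.1', 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

resp = requests.get('http://127.0.0.1:%d/' % server.server_port, stream=True)
peer = get_peer_address(resp)
print(peer)

resp.close()
server.shutdown()
```

Remember to pass stream=True and read the address before consuming the body; once the connection is released back to the pool, the socket is gone.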


Try:

```python
import requests

proxies = {
    "http": "http://user:password@10.10.1.10:3128",
    "https": "http://user:password@10.10.1.10:1080",
}

response = requests.get('http://jsonip.com', proxies=proxies)
ip = response.json()['ip']
print('Your public IP is:', ip)
```

Note that this returns your own public IP address as seen by jsonip.com (the proxy's address when a proxy is in use), not the address of the server you contacted.