Measure website load time with Python requests
Recent versions of requests have this functionality built in:
https://requests.readthedocs.io/en/latest/api/?highlight=elapsed#requests.Response.elapsed
For example:

```python
requests.get("http://127.0.0.1").elapsed.total_seconds()
```
As for your question, `elapsed` should cover the total time to:

- create the request object
- send the request
- receive the response
- parse the response (see the comment from Thomas Orozco)
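To see how `elapsed` relates to a plain wall-clock measurement around the whole call, here is a minimal sketch (the URL is a placeholder; substitute the site you want to measure):

```python
import time
import requests

url = "https://example.com"  # placeholder URL

start = time.perf_counter()  # wall-clock timer around the whole call
response = requests.get(url)
total = time.perf_counter() - start

# response.elapsed is measured by requests itself, from sending the
# request until the response arrived, so it is typically a little
# smaller than the wall-clock total, which also includes setting up
# the connection and reading the body.
print(f"elapsed: {response.elapsed.total_seconds():.3f}s")
print(f"wall-clock total: {total:.3f}s")
```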
Another way to measure a single request's load time is to use urllib:

```python
import time
import urllib.request  # Python 3; the bare urllib.urlopen was Python 2

url = "http://example.com"  # the page you want to time

start = time.time()
nf = urllib.request.urlopen(url)  # open the connection
page = nf.read()                  # read the full page body
end = time.time()
nf.close()

# end - start gives you the page load time
print(end - start)
```
`response.elapsed` returns a `timedelta` object with the time elapsed from sending the request to the arrival of the response. It is often used to stop waiting on a connection once a certain amount of time has elapsed.
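One common pattern (a sketch; the URL and the 2-second limit are placeholder values) is to compare `elapsed` against a threshold, while also passing `timeout` so the call itself cannot hang indefinitely:

```python
import requests
from datetime import timedelta

LIMIT = timedelta(seconds=2)  # placeholder threshold

response = requests.get("https://example.com", timeout=5)
if response.elapsed > LIMIT:
    print(f"slow response: {response.elapsed.total_seconds():.3f}s")
else:
    print(f"ok: {response.elapsed.total_seconds():.3f}s")
```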
```python
# import requests module
import requests

# Making a get request
response = requests.get('http://stackoverflow.com/')

# print response
print(response)

# print elapsed time
print(response.elapsed)
```
Output:

```
<Response [200]>
0:00:00.343720
```