
Why does cURL give the correct response but Scrapy does not?


It seems to work if you append a slash to your URL, i.e. the same Scrapy request, but with the URL changed to:

http://www.betvictor.com/sports/en/football/
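
As a concrete illustration, here is a minimal Scrapy sketch using the slash-terminated URL (the spider name and callback are illustrative, not from the original question):

import scrapy

class FootballSpider(scrapy.Spider):
    # Illustrative spider: the same GET request, but with the trailing '/'
    name = 'betvictor_football'
    start_urls = ['http://www.betvictor.com/sports/en/football/']  # note the trailing slash

    def parse(self, response):
        self.logger.info('Got %s from %s', response.status, response.url)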

Additional Example:

I had the same problem when testing another website: the page worked nicely with curl, but not with requests. After fighting with it for some time, this answer with the extra slash solved the problem.

import requests
import json

r = requests.get(r'https://bet.hkjc.com/marksix/getJSON.aspx/?sd=20190101&ed=20190331&sb=0')
pretty_json = json.loads(r.text)
print(json.dumps(pretty_json, indent=2))

returns this:

[
  {
    "id": "19/037",
    "date": "30/03/2019",
    "no": "15+17+18+37+39+49",
    "sno": "31",
    "sbcode": "",
    ...

The slash after .aspx is important: it doesn't work without it. Without the slash, the page returns a JavaScript challenge instead of the data.

import requests
import json

# no slash
r = requests.get(r'https://bet.hkjc.com/marksix/getJSON.aspx?sd=20190101&ed=20190331&sb=0')
print(r.text)

returns this:

<HTML><head>
<script>
Challenge=341316;
ChallengeId=49424326;
GenericErrorMessageCookies="Cookies must be enabled in order to view this page.";
</script>
<script>
function test(var1){
    var var_str=""+Challenge;
    var var_arr=var_str.split("");
    var LastDig=var_arr.reverse()[0];
    var minDig=var_arr.sort()[0];
    var subvar1 = (2 * (var_arr[2]))+(var_arr[1]*1);
    var subvar2 = (2 * var_arr[2])+var_arr[1];
    var my_pow=Math.pow(((var_arr[0]*1)+2),var_arr[1]);
    var x=(var1*3+subvar1)*1;
    var y=Math.cos(Math.PI*subvar2);
    var answer=x*y;
    answer-=my_pow*1;
    answer+=(minDig*1)-(LastDig*1);
    answer=answer+subvar2;
    return answer;
}
</script>
<script>
client = null;
if (window.XMLHttpRequest)
{
    var client=new XMLHttpRequest();
}
else
{
    if (window.ActiveXObject)
    {
        client = new ActiveXObject('MSXML2.XMLHTTP.3.0');
    };
}
if (!((!!client)&&(!!Math.pow)&&(!!Math.cos)&&(!![].sort)&&(!![].reverse)))
{
    document.write("Not all needed JavaScript methods are supported.<BR>");
}
else
{
    client.onreadystatechange = function()
    {
        if(client.readyState == 4)
        {
            var MyCookie=client.getResponseHeader("X-AA-Cookie-Value");
            if ((MyCookie == null) || (MyCookie==""))
            {
                document.write(client.responseText);
                return;
            }
            var cookieName = MyCookie.split('=')[0];
            if (document.cookie.indexOf(cookieName)==-1)
            {
                document.write(GenericErrorMessageCookies);
                return;
            }
            window.location.reload(true);
        }
    };
    y=test(Challenge);
    client.open("POST",window.location,true);
    client.setRequestHeader('X-AA-Challenge-ID', ChallengeId);
    client.setRequestHeader('X-AA-Challenge-Result',y);
    client.setRequestHeader('X-AA-Challenge',Challenge);
    client.setRequestHeader('Content-Type' , 'text/plain');
    client.send();
}
</script>
</head>
<body>
<noscript>JavaScript must be enabled in order to view this page.</noscript>
</body></HTML>


It turned out that the order of the parameters really mattered for this server (I guess because it was simulating a hidden form with ordered inputs, and this was an extra validation check). In Python requests, building the POST body as a string and URL-encoding it by hand (i.e. ':' --> '%3A') makes things work. So although the Wireshark packets are near enough identical, the only way they differ is the order of the parameter string, and indeed this is the key.
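
For illustration, here is a minimal sketch of that hand-encoding approach with requests. The TS644333_* field names come from the form data shown below; the target URL and the TS644333_75 value are placeholders, since the post does not include them:

from urllib.parse import quote

import requests

url = 'https://example.com/target-page'  # placeholder: the form's action URL
value = 'TOKEN_FROM_PAGE'                # placeholder for the TS644333_75 value

fields = (
    ('TS644333_id', '3'),
    ('TS644333_75', value),
    ('TS644333_md', '1'),
    ('TS644333_rf', '0'),
    ('TS644333_ct', '0'),
    ('TS644333_pd', '0'),
)

# Encode each pair by hand so the server sees exactly this order,
# with special characters escaped (e.g. ':' becomes '%3A').
body = '&'.join(f"{quote(k, safe='')}={quote(v, safe='')}" for k, v in fields)

# Passing a pre-encoded string as data= stops requests from re-encoding
# (and potentially re-ordering) the parameters.
r = requests.post(url, data=body,
                  headers={'Content-Type': 'application/x-www-form-urlencoded'})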

In Scrapy, passing a tuple like:

ot = (
    ('TS644333_id', '3'),
    ('TS644333_75', value),
    ('TS644333_md', '1'),
    ('TS644333_rf', '0'),
    ('TS644333_ct', '0'),
    ('TS644333_pd', '0'),
)

to formdata= rather than a dictionary, so that the order is preserved, works too.

Also the header {'Content-Type': 'application/x-www-form-urlencoded'} is required.
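
Putting both pieces together, a sketch of what this looks like in a spider (the URL and the TS644333_75 value are again placeholders):

import scrapy

class OrderedFormSpider(scrapy.Spider):
    # Sketch: formdata accepts an iterable of (key, value) tuples,
    # which preserves parameter order, unlike a dict.
    name = 'ordered_form'

    def start_requests(self):
        value = 'TOKEN_FROM_PAGE'  # placeholder for the TS644333_75 value
        yield scrapy.FormRequest(
            'https://example.com/target-page',  # placeholder URL
            formdata=(
                ('TS644333_id', '3'),
                ('TS644333_75', value),
                ('TS644333_md', '1'),
                ('TS644333_rf', '0'),
                ('TS644333_ct', '0'),
                ('TS644333_pd', '0'),
            ),
            headers={'Content-Type': 'application/x-www-form-urlencoded'},
            callback=self.parse,
        )

    def parse(self, response):
        self.logger.info('Got %s from %s', response.status, response.url)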

As anana noted in his answer, appending a trailing '/' to all request URLs also fixes things. In fact, if you do this you can get away with GET requests alone, with no JS simulation and no form POSTing!