I am still (from time to time) working on that project where I sometimes do not get the Content-Length (sometimes I get no headers at all) when I request files.
While I was reading the news (on paper), I got an idea, but I do not know if it is kosher.
The idea is to put a GetHeaders call in a loop and exit the loop only when I get a value in the returned headers.
The question is: is it kosher to do that?
(Can I be banned for doing that? *)
- Sometimes I get an error as text (HTML) telling me the server is too ‘hot’ (overloaded) to return the requested file (the traffic is too heavy).
- Sometimes I get another error as text (HTML data, not an HTTP status code like 404).
- Sometimes I get the file with a default file name (not the one from the InternetHeaders) and its size is 0.
Is asking for a file to download in a loop considered aggressive?
IMPORTANT: this project targets one (large) specific web site; it is not an all-purpose internet downloader. Also, when I manually ask for a missing file (one my project failed to download) with Firefox or Safari, the file usually exists. So the fault is on my side: my application.
I wouldn't do these things in a loop. If you must call multiple times, use a timer instead and leave a few seconds between requests. It wouldn't surprise me if the errors you are getting are due to the speed at which you are sending requests.
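A minimal sketch of that advice, assuming a classic Xojo Timer dropped on the window and the HTTPSecureSocket named Img_Socket from this thread (the names DownloadTimer, mRetryCount, the constants, and the URL are illustrative, not from the original posts):

```xojo
' Sketch: retry with a pause via a Timer instead of a tight loop.
Const kMaxRetries = 5

Sub StartDownload()
  mRetryCount = 0
  DownloadTimer.Period = 3000 ' wait 3 seconds before each attempt
  DownloadTimer.Mode = Timer.ModeSingle ' fire once; re-arm only on failure
End Sub

' Action event of DownloadTimer
Sub DownloadTimer_Action()
  Dim h As InternetHeaders = Img_Socket.GetHeaders("https://example.com/scan.pdf", 30)
  If h <> Nil And h.Value("Content-Length") <> "" Then
    ' Got usable headers; start the real download here.
  ElseIf mRetryCount < kMaxRetries Then
    mRetryCount = mRetryCount + 1
    DownloadTimer.Mode = Timer.ModeSingle ' try again after another pause
  End If
End Sub
```

Spacing requests out like this keeps the UI responsive and is usually enough to stop a busy server from treating the client as abusive.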
That said, are you aware that it's perfectly legal for an HTTP/1.1 response not to have a Content-Length header (for example, when the server uses chunked transfer encoding)?
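For completeness (my addition, not part of the reply): when the server answers with chunked transfer encoding there is simply no Content-Length to wait for, so a loop on GetHeaders would never terminate. A defensive sketch, reusing the Img_Socket name from this thread (the URL is an example):

```xojo
' Sketch: treat Content-Length as optional rather than guaranteed.
Dim h As InternetHeaders = Img_Socket.GetHeaders("https://example.com/scan.pdf", 30)
If h <> Nil Then
  Dim lengthValue As String = h.Value("Content-Length")
  If lengthValue = "" Then
    ' No Content-Length (legal in HTTP/1.1, e.g. chunked encoding):
    ' download anyway and measure the received data instead.
  End If
End If
```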
I suggest switching to the newer Xojo.Net.HTTPSocket.
Thank you Greg.
I get more errors with a slower connection speed.
OK, but I checked the files with a browser (Firefox / Safari) and got them. So they do something I don't.
Also: when I checked the headers (in a different window, with no download code), I got that header. I am quite sure (99.99%) that
Content-Length is always sent. Of course, in the absolute, it is possible that it is missing. Just like the file name. *
That means I have to buy a brand new license, but I cannot afford it.
BTW: I was wrong some days ago when I was fighting with the
200 OK error. I did not understand your advice (and forgot that I had already gone down that street and understood then what happened).
Before I forget, I use:
Img_Socket.ConnectionType = SSLSocket.TLSv12, if that makes a difference.
- I know a web site that offers newspaper scans as PDF (from 19xx through 197x, I think) and that cuts the download speed to 1 KB/s after a fixed cumulative download weight per day (not after a certain number of files, because some files, for a newspaper, are far smaller than others). Some years ago, that same site banned people who tried to download too many files per day.
So, now, for me, nearly everything is possible.
Can this occur because I do not send a User-Agent? If so, how can I do that using an HTTPSecureSocket?
It could be. Just set the User-Agent header before the request.
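With the classic HTTPSecureSocket this can be done with SetRequestHeader before calling Get; the User-Agent string below is just an example, not a required value:

```xojo
' Identify the client before requesting; some servers reject empty User-Agents.
Img_Socket.SetRequestHeader("User-Agent", "MyDownloader/1.0")
Img_Socket.Get("https://example.com/scan.pdf") ' asynchronous request, as before
```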
I have to check with a faster connection, but it does not change anything on a slow access.
Ideas come at strange times (usually when I am not thinking about the problem) ;).