First off, I will say that I appreciate that the headers are based on how others have done it.
I am curious about a few things with it though.
X-RateLimit-Reset is basically returning a time. For me to be confident in that time, I have to parse the Date HTTP response header to see what time the server thinks it is. Once I know that, I can compare it to the time my machine thinks it is and work out the difference between the two, if there is any. There shouldn't be, but you never can tell.
Always makes me feel as though returning a time span in seconds or milliseconds would be better. (Not asking for a change here, just an observation.)
So, the question: 1 second is quite a long time. When an API request is made and the reset time is set, is that time rounded up? Down to the nearest second? Or left as is, in which case we don't have an accurate idea?
e.g. I took the server time and the reset time, worked it all out, put in an appropriate delay in seconds, but was a hair's breadth out and got a few 429s back. Now, I could add 1 second of extra delay, but that is 10% of the window, which seems quite extreme.
So if I make my first request and it comes in at 16.999 seconds past, does the reset happen at 26.000? 27.000? Or 26.999?
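Since the rounding behaviour is unknown, the defensive assumption is that the server truncated the reset time down to a whole second, so the true reset could be anywhere up to 1 second later than advertised. A small sketch of that padding logic (the function name and epoch-seconds representation are my own invention, not anything from the API):

```python
def safe_wait(reset_epoch: float, now_epoch: float, skew: float = 0.0) -> float:
    """Seconds to sleep before retrying.

    Assumes the server may have truncated the reset time down to a
    whole second, so we pad by the full 1-second uncertainty.
    `skew` is (local time - server time), so subtracting it from the
    local clock converts local time to the server's clock.
    """
    server_now = now_epoch - skew
    wait = (reset_epoch + 1.0) - server_now
    return max(wait, 0.0)
```

That 1-second pad is exactly the 10% overhead complained about above, which is why knowing the rounding convention matters.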
Basically, I am left wondering if it is better to ignore the headers and just measure the 10 seconds on the machine itself. Even then, though, I am left wondering whether there is a variance of up to 1 second.