11-13-2020, 02:32 PM
cycle 14, day off #3:
Too many things are happening at the same time. That can be seen in a bad way, or it can be seen in a good way.
I am in the process of changing mailing services, which by itself is a big move.
My web hosting company forced me to migrate my web servers (and my trading server) to a newer OS.
And the exchange upgraded their API protocol yesterday, which broke my trading client. My client stopped working because part of the exchange server upgrade was to start accepting the WebSocket compression extension that my client has always offered; until yesterday, the server had always declined the offer.
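For context, this is the standard permessage-deflate negotiation from RFC 7692: the client offers the extension in its upgrade request, and the server either echoes it back in the 101 response (accepted) or omits it (declined). A sketch of the two exchanges (hypothetical host and path):

```
Client offer (always sent):
    GET /ws HTTP/1.1
    Host: api.example-exchange.com
    Upgrade: websocket
    Connection: Upgrade
    Sec-WebSocket-Extensions: permessage-deflate

Server response, before the upgrade (offer declined — no extension header):
    HTTP/1.1 101 Switching Protocols
    Upgrade: websocket
    Connection: Upgrade

Server response, after the upgrade (offer accepted — frames now compressed):
    HTTP/1.1 101 Switching Protocols
    Upgrade: websocket
    Connection: Upgrade
    Sec-WebSocket-Extensions: permessage-deflate
```

A client that offers the extension but mishandles the accepted case will break exactly when the server starts saying yes, which matches what happened here.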
I was forced to look into the WebSocket library that I am using (after having pushed the envelope with the JSON library, events are pushing me to do the same for WS). I found a bug in it and fixed it.
But this also gave me the opportunity to find a major flaw in my own code, which was reading incoming messages in 16-byte chunks (I increased the maximum chunk size to 1536 bytes). It also let me invalidate an assumption my code was making about the WS messages: I was thinking that they weren't NUL terminated.
This had a big impact on how the JSON parsing code was handling its input.
It had to check at every byte that it hadn't reached the end of the input. Knowing the messages are NUL terminated will let me remove that extra condition from the code.
That doesn't seem like a lot, but it is, considering that I receive a million updates every 15 minutes. If the condition is evaluated for every byte of those millions of messages, by the end of the day that adds up to a lot of checks that I can now remove.
I haven't removed the condition yet... but simply increasing the maximum chunk size from 16 to 1536 bytes, combined with receiving a compressed network stream, already makes a huge difference in CPU usage...
These are improvements that I wouldn't have made if it weren't for the problems that the server upgrade caused...