Subliminal Talk

Full Version: lano1106 BASE v2.1 journal
Stage 4, day 25:

I had a hard time focusing on work today.

During the daily Covid press conference, our politicians used words that stirred anger inside the AM6 me. They said something like "Continue to obey us and be docile". I have been venting on Twitter about that with like-minded people who are starting to be skeptical about the true gravity of the virus.

IOW, I'm still in the planning phase of what I discussed in the last entry (procrastinating?).

Besides, I have been hit by another race condition during an order execution. When I understood what happened, I noticed that it was the exact mirror of something I had fixed previously. I originally got hit by the race condition when cancelling orders, and I missed the fact that the same issue was present when adding new orders.
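To illustrate the symmetry (a minimal sketch with hypothetical names, not my actual engine code): whatever lock discipline protects the cancel path has to protect the add path too, because both touch the same shared order map.

```cpp
#include <cassert>
#include <cstddef>
#include <mutex>
#include <string>
#include <unordered_map>

// Hypothetical sketch: the same mutex must guard BOTH code paths that
// touch the shared order map. Fixing only the cancel path leaves the
// mirror race alive in the add path.
class OrderBook {
public:
    // Returns false if an order with this id already exists.
    bool addOrder(const std::string& id, double qty) {
        std::lock_guard<std::mutex> lock(mtx_);   // same mutex as cancel
        return orders_.emplace(id, qty).second;
    }
    // Returns false if the order was already gone (e.g. filled first).
    bool cancelOrder(const std::string& id) {
        std::lock_guard<std::mutex> lock(mtx_);   // same mutex as add
        return orders_.erase(id) == 1;
    }
    std::size_t openOrders() {
        std::lock_guard<std::mutex> lock(mtx_);
        return orders_.size();
    }
private:
    std::mutex mtx_;
    std::unordered_map<std::string, double> orders_;
};
```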

It took me some time to come up with a solution. The best way to fix it wasn't obvious, but I got it now. Yeah! My execution engine got more robust. 1 step closer to perfection.

The first 24h run isn't good. On average, I lost about 1% of my capital over 3 trades. Markets were very quiet yesterday. Volatility appears to be back today! I raised the trigger bar. I have no other choice, even if it means fewer trades. That is the only reasonable thing to do until I have the planned execution improvements in place.
Stage 4, day 26:

Yesterday wasn't a good day in terms of profitability. On average, each trade lost about 1%. The last one was the last nail in the coffin: -46%! I have pulled the plug on testing until the improvements are in, but trading volume is increasing. I'm now up to ~$8,000.

Fortunately, upon closer inspection, I found out that the calculation wasn't considering the possibility that the planned order could go through only partially. After fixing this corner case, I turned this huge loss into a small profit.
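A toy sketch of that corner case (hypothetical names, not the real accounting code): the PnL has to be computed on the quantity that actually went through, not on the planned quantity, or else a partial fill gets valued as if the whole plan had traded.

```cpp
#include <cassert>

// Hypothetical sketch: only the filled portion of a planned execution
// realizes the spread. Valuing the unfilled remainder as if it traded
// is the kind of bug that turns a small profit into a phantom -46%.
struct Execution {
    double plannedQty;
    double filledQty;   // may be < plannedQty on a partial fill
    double buyPrice;
    double sellPrice;
};

double realizedPnl(const Execution& e) {
    return e.filledQty * (e.sellPrice - e.buyPrice);
}
```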

I'm just stunned by the number of nuances that you need to keep track of to keep the accounting accurate.

It is one more small step toward the perfect trading system... The journey is still ongoing...
I think I mentioned it on one occasion inside my AM6 journal: my marketing biz has a dreaded glass ceiling on income. Since I started the biz, I have always struggled to break the $1000/day mark.

It happened once last year, right after getting rid of something that was hindering my AM6 journey. I took that as some form of confirmation that I was on the right path.

Since then, it has only very seldomly happened that I break that ceiling again. It happened yesterday, practically while the business was on auto-pilot, since most of my time is dedicated to my new trading project...

I also connected with a new girl on a dating app. Honestly, I rarely get matches, and when I do, I don't make the first move, so it goes nowhere and nothing happens. Since I started BASE, I have been absolutely satisfied with this status quo. However, I did match with that girl. She initiated the convo and she seems extremely interested in meeting me. We chatted a lot yesterday and tonight. On one hand, I'm very flattered by the female attention she is giving me. From the way the interaction is going, my confidence that things would go sexual very fast if we met is pretty high, which is a possibility that is pleasing and exciting at the same time. OTOH, this is a distraction from my main goal... I feel a strong urgency to stay focused, as the current health/political/economic situation is kinda uncertain... I'm living in a large urban region... I would want to have the financial means to leave fast to protect my family if things turn sourer than they currently are...
Stage 4, day 28:

Unbelievable, stage 4 is ending in 4 days!

One of the ideas I had to improve execution is to reuse a piece of code from my GUI application, which is built on top of my trading lib and which I haven't used for a long time, since I started running my ARB CLI app.

The lib API kinda diverged and broke the GUI app. That must have happened sometime around end of February/early March. I remember that when I started to test the ARB app, I went to the movie theater to watch 1917 (oh boy, those were the good old times!). I kinda miss my GUI app, which gave me visual insight into the markets.

So 2 days ago, the plan was to fix the GUI app because it is the app using the code that I wanted to reuse to improve ARB trade execution. After 2 days of work, my GUI app is now running (and I'm very happy to have retrieved it!), but I realize that the reasoning behind fixing the GUI app was flawed. I didn't have to fix it to reuse the piece of code that I want. I could have just studied the code and started using the piece that I needed...

Oh well, this is something that I wanted/needed to do eventually. It just maybe wasn't the task to do right now to reach the end goal in the fastest possible way.

Am I starting to sabotage myself with this minor distraction?

idk for sure...

I was searching on Amazon for a book on arbitrage, to perhaps get new ideas on how to solve my current issues. I haven't seen anything obviously similar to what I'm trying to achieve. It is very hard to believe that no one else is doing what I do, but if I am creating a novel strategy, well, good for me, as being the first to trade in a new way is a very good thing.

Anyway, I stumbled onto a book: High-Frequency Trading by Irene Aldridge. The book talks about arbitrage strategies (apparently different types than what I do), but I did come away with interesting info from it:

Developing computerized trading systems requires an up-front investment that is costly in terms of labor and time. One successful trading system takes on average 18 months to develop... Initial development is both risky and pricey...

So based on that, given the time spent so far, I'm ahead of the curve by many months!
Stage 4, day 29:

In terms of technology, I'm an old school dog. I kinda despise any feature of my main programming language that isn't in C++03.

It is funny: in 2014, I met the language's inventor at one of his public speaking events. I brought my copy of his book to have it signed by him, and he noticed and commented about me not having the latest edition ;-)

To me, this is possibly the most absurdly complex programming language, and to keep adding more complexity on top of the existing one is simply madness. It is easy to just drown in the complexity and forget what the initial goal was: to focus on solving a problem.

This is a recurring theme for me: getting lost in complex but intellectually interesting and stimulating details. This is why, after having read dozens of books on that language and spent years mastering it, I decided to just pull the plug by saying: this is it. No more new language features for me. I have a decent baseline from which I can pretty much do anything I want.

However, a little by accident, I have been forced to use a C++11 feature: the lambda function. I must admit that it is pretty slick and useful. Ok, that feature passed the test, and from now on, I'm going to integrate it into my toolset. It's funny: despite being familiar with the concept from having used closures in other languages such as Javascript and Perl, it never occurred to me before that it could actually be useful and fun in C++. I have been seduced by its elegance today...
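For illustration, the kind of place where a lambda shines (a made-up helper, not my actual code): passing a small predicate inline to a standard algorithm, with the threshold captured from the enclosing scope, exactly like a closure in Javascript or Perl.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical example: count quotes above a threshold. Before C++11,
// this needed a separate functor class; the lambda keeps the logic at
// the call site and captures `threshold` by value.
int countAbove(const std::vector<double>& quotes, double threshold) {
    return static_cast<int>(std::count_if(quotes.begin(), quotes.end(),
        [threshold](double q) { return q > threshold; }));
}
```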
Stage 4, day 31:

It took me a day to understand why my GUI client initially works perfectly well and then ends up simply freezing after a few hours.

It is the classic case of the sinkhole being smaller than the faucet throughput. Even if the sinkhole is only a tiny bit smaller than the faucet throughput, it is only a matter of time before the bath spills over.
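One classic remedy for that faucet/sinkhole mismatch, sketched here with hypothetical names (single-threaded for brevity; a real one would need a mutex): coalesce pending updates per pair, so the backlog can never grow beyond one entry per key and a slow consumer skips stale ticks instead of drowning.

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <string>

// Hypothetical sketch: instead of an unbounded FIFO of ticks, keep only
// the latest value per market pair. A consumer that falls behind then
// processes at most one entry per key, so the backlog cannot grow
// without bound until the GUI freezes.
class CoalescingQueue {
public:
    void push(const std::string& pair, double price) {
        latest_[pair] = price;            // overwrite, never accumulate
    }
    // Pops one (pair, price) entry; returns false when empty.
    bool pop(std::string& pair, double& price) {
        if (latest_.empty()) return false;
        auto it = latest_.begin();
        pair = it->first;
        price = it->second;
        latest_.erase(it);
        return true;
    }
    std::size_t size() const { return latest_.size(); }
private:
    std::map<std::string, double> latest_;
};
```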

I'm a bit unhappy with the Qt framework. The CLI version takes 5% CPU. The GUI takes 40% at startup and slowly ramps up above 100%, where it stops being real-time.

So the bottleneck is the framework. Actually, to be fair, it is either the framework or the actual GUI work.

I made 2 optimizations in an attempt to squeeze out enough performance to get the app back to real-time. One optimization is in how memory is allocated for the updates (over 1,000,000/hour of them are received), and the server code will benefit from that improvement too.

Despite all that, the performance issue isn't resolved. I'm stuck. Qt doesn't allow GUI work to be done from multiple threads.

I have a bigger box lying around somewhere that could handle it, but I haven't used it for so long that I'm not even sure I can just upgrade the software on it. I might need to reinstall the whole thing from scratch. I could do that, I guess. I'll seriously consider that option.

The last resort option would be to create a 'slowpc' option. Instead of collecting data on every available pair, keeping it for 24h, and making it instantly available with a single click, I could collect data on an as-needed basis; that is, only collect data for the displayed pairs. That means that if you open an additional pair, you wouldn't have instant access to its past data.

Not being able to do multithreaded GUI work is a serious Qt framework limitation...

It starts to get hairy again. I'd better stop thinking about it and just replace my PC with a faster one...
Stage 4, day 32:

Last Stage 4 day. My project progressed quite a lot in the last month. Beside that, I don't know what else to say, because since this is the second month of lockdown, it feels very much like the first month (aka stage 3). I hope things will get back to normal soon. It is not very fun. Seeing how USA citizens fight for their civil rights and liberties makes me hopeful. US citizens have all my esteem for being courageous and fighting for what they think is right instead of doing what they are told. They are liberty champions who will hopefully be followed all around the world.

I have completed 1 of the 2 improvements in trade execution that I wanted to make. I launched a test run and captured 1 execution. 2 conclusions from the results:

1. The new feature worked correctly
2. In this particular execution, it didn't help improve the yield because the spread was very narrow.

I'm much more confident that with the second element in place, it will become a winner. I just addressed the low hanging fruit first. The second element is a little bit more complex to implement. I will roll up my sleeves and build it one block at a time.

Realistically, it will possibly take 2 days to complete.

Concerning my client performance issue: I put it on hold, but one more option popped into my mind yesterday. Usually, when that happens, it is a very good sign and means that I might be on the trail of a good solution.

I remember from when I was reading the framework code: they use a Model/View pattern supported by a set of classes that my client uses, and the Qt framework table view class is pretty dumb. The pattern interface permits the Model to be very granular. It can report to the view that a single cell in the whole table changed, and my code uses that granularity fully by reporting single row or column changes. The framework-provided table view simply invalidates the whole table and fully redraws it. THIS must be incredibly expensive and dumb.
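To illustrate the difference (a plain C++ sketch with made-up names, not Qt's actual classes): a view that tracks the dirty cells the model reports does repaint work proportional to what changed, instead of proportional to rows times columns on every tick.

```cpp
#include <cassert>
#include <set>
#include <utility>

// Hypothetical sketch of a "clever" table view: it records the cells
// the model reports as changed, and a repaint pass touches only those,
// not the whole table.
class TableView {
public:
    void onCellChanged(int row, int col) { dirty_.insert({row, col}); }
    // Returns how many cells were actually repainted this pass.
    int repaint() {
        int painted = static_cast<int>(dirty_.size());
        dirty_.clear();      // cost proportional to changes,
        return painted;      // not to rows * columns
    }
private:
    std::set<std::pair<int, int>> dirty_;
};
```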

I'll take some time to see if it would be easy to replace the dumb framework table view with a clever and efficient one. I feel like it is very possible to achieve. That wouldn't be the first time that I replace a dumb Qt class with one of mine...
Stage 5, day 1:

I tried to improve my client performance by implementing the idea that I wrote about in the last post. Performance appears to be slightly improved, but not enough to escape the fatal fate of freezing due to processing saturation.

It seems like something takes more and more time to perform as time goes on... but oh well, I think I'm going to throw in the towel on this one. I'm not sure continuing would get me anywhere on that particular issue. I'll simply upgrade my desktop to a bigger machine and be done with that annoying issue.

If a good profiler existed, I could maybe give it a try to get some insight into what is dragging the client down... Right now, I'm just shooting in the dark... but I'm not absolutely sure anymore that the problem comes from the table rendering...

Update:
A new idea popped up. I have 11 time series/arrays per pair. There are currently 155 pairs, so that makes 1705 arrays. Each array can reach a maximum size of the number of seconds in a day (86400).

I'm going to check if the memory for those arrays is pre-allocated. If not, every time an element is appended (this happens every second), the reallocation would trigger a lot of memory copies (up to 1705*86400 copies!). Yes, I'm pretty sure I have found my performance-dragging bastard...
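The pre-allocation fix, sketched with a hypothetical structure (to be fair, std::vector grows geometrically, so copies are amortized, but with 1705 large arrays even occasional reallocations hurt; reserving the worst case up front removes them entirely):

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch: reserve the worst-case size up front (86400
// samples, one per second over 24h), so appending a sample never
// triggers a reallocation-and-copy of the whole array.
struct TimeSeries {
    std::vector<double> samples;
    TimeSeries() { samples.reserve(86400); }   // one slot per second
    void append(double v) { samples.push_back(v); }
};
```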

Update #2:
My last intuition was good. CPU usage of the client is now stable! I cannot think of a better way to start a new BASE stage!
Stage 5, day 2:

After a long period of silence (over 24h), I got a burst of trading opportunities coming my way this morning.

Things didn't go well. I'm not done analyzing the logs yet, but so far I must have discovered 4-5 new bugs that made things behave badly.

I guess at the end of the day, I'll spin these events as positive, as with every newly fixed bug, I get closer to success. But right now, I feel like I am receiving a ton of bricks on the head, as I thought the system was in better shape... It feels like playing a Whack-A-Mole game instead of focusing on new tasks to get closer to the final vision. I wasn't expecting to put those tasks on hold, but there is no other option. I must make sure at all costs that new stages are built on a solid foundation, or else my system will be as robust as a house of cards.
Stage 5, day 7:

Wow. Time has passed very fast. I haven't seen it pass. Already 5 days since the last update.

Ok, so finally, yesterday I rolled up my sleeves and started the project of replacing my current desktop with a more powerful one.

I turned it on and connected to it remotely. It had been years since the last time I used it. Simply upgrading the packages has been painful. The package manager was too old: it couldn't uncompress packages in the new compression format (zst file format). I had to manually recompile a few packages. I thought the first thing to upgrade should be the manager itself. Error. The new manager couldn't run anymore: it had a dependency on a newer openssl version. I compiled openssl manually. Then I needed a new glibc, but I couldn't install it with the manager. I found out that untarring a package from the root is essentially how you install a package manually without the manager, and that is what I did for glibc.

That alone took me about 2 hours. After that, I was able to fully update that PC to the latest software versions.

The final touch would be to install my current env onto the new PC, and I thought I would need a temporary workplace to do that. I have done so much remotely that I might be able to get away without that step, but my temporary working place is currently occupied by a second server that I built many years ago (maybe 2 years) to create a NAS. It was meant to replace a small embedded Linksys NAS device that I bought at BestBuy but which ended up breaking. That thing was so slow... Making copies was horribly slow. The key to a fast NAS is to actually build it with a headless Linux box. That is what I did. While completing the NAS server installation, I realized that I already had a server (an old 32-bit Pentium 3) that I use as my proxy DNS cache/DHCP/IMAP archive server. Now that my NAS is a Linux box, I could merge those 2 servers into one.

In the last 2 years, I never did it, because there is not much incentive to complete a task when the status quo works perfectly. I always had more important stuff to do, but the end result was that my temporary PC installation workplace was occupied by the NAS box. So that was my second task yesterday. I finally did it. I turned off server #1, and server #2 took over the first server's responsibilities.

TBH, I felt a big relief and a big feeling of satisfaction. A task that I have dragged along with me is finally done. As an extra bonus, DNS lookups have become much faster with this new fast server.

As if it wasn't painful enough, I had to reboot my current desktop machine. A Linux system is very robust and can tolerate a bunch of failures and still be responsive, but there is an Achilles' heel: having pending NFS requests on a gone NFS server. IDK why, but this particular situation kinda creates a kernel deadlock, and the system stops working properly. It has been like that for as long as I have been using NFS mounts...

So I had to reboot my desktop. Once shut down, it refused to reboot. Power reaches the motherboard (I can tell by the lit LEDs on it), but pressing the power button has zero effect. This is a known issue. I suspect that the CPU or MB is in the process of failing. I have known for months that at some point I would have to upgrade it with my spare box, or minimally upgrade the desktop by replacing the CPU/MB/memory trio. Every time I boot this machine, I need to press the power button for many minutes until it eventually boots. (Then I keep the computer on for as long as I can. Many months in a row!) Every time I have to boot it, I wonder if it is going to be the last time I'm able to, as each time it takes more and more time before booting.

This morning, I was pretty sure that was it. The box was finally dead. So, as I was calling a computer store to see if I could buy spare parts, I learned from the employee that due to the lockdown, spare parts were extremely hard to find, as everything is out of stock everywhere. According to him, it could take up to 3 weeks before I could get hold of the parts required to repair my broken box. But by some incredible luck, as I was speaking on the phone with the computer guy, my machine decided to boot at least one last time...

But that is not all. Because I have an old Nvidia card that isn't supported anymore by my distro's packagers, the X server refused to start. The graphics drivers have to be recompiled every time the kernel is upgraded... I had to do that task myself, since prebuilt packages aren't available anymore. OMG, running this desktop is becoming an epic herculean effort every time!

It took 2 days to take care of everything... I knew the task would be painful... That knowledge has probably played a role in me pushing it back for as long as possible... I'm currently writing this entry on my resurrected machine ;-)... I'm almost done with the desktop transfer... I think I'll be extremely happy with the result. Simply from interacting with the spare desktop remotely, I could feel how much faster it is than my current machine... That means that in the long run, it should make me more productive...

The aim is to write my next journal entry with the new more powerful desktop!
Stage 5, day 8:

Mission accomplished! Transition is completed.

A stream of non-stop challenges kept coming at me... but finally, I'm done. I have this brand new super fast computer. It has about 5x more memory and 2x faster cores.

It is a completely new ball game. My GUI client is running and, at the same time, I'm watching a video and using my browser. Attempting that on the old desktop, everything would stutter. Now, not at all, and it feels like I still have a lot of slack to make this beast do much more before I reach a limit...

Ok, so now, it is time to continue developing the project.
Stage 5, day 9:

It is amazing how fast you get used to a new fast desktop. After only 2 days of usage, I'm already used to its speed, and it has become my new normal.

But with 32 GB of RAM (vs 6 GB), it is still blissful to never have to worry about running out of memory, and this opens up new possibilities to find ways to fill it up.

I'm discovering all the small corners that escaped my attention during the transition. I had to install the gnome keyring and gvfs, and I just realized that I forgot to install the mini SMTP client that lets my program send report emails. All those additions were done after I declared the transition over... I think I'm really, really done now. If I'm still missing a few elements, I guess there will not be many more of them. 1, 2 or 3 max.

Update:
Oh, and about the daily report email that my trading server sends me: the one I got this morning told me that it made a small profit on 3 trades.

2 were positive and one made a small loss... Things are starting to work.
Oh, and my marketing biz is doing fine as well... Yesterday morning, without me having done anything yet, it had already generated $650 of income. I thought I was heading toward a blissful $1000 income day. Unfortunately, my luck stopped at around $850...

Oh well, maybe it will come again soon...
Stage 5, day 12:

Since the last update, I have been making daily progress. Not as much as I would like, but still. After completing my desktop migration and returning to writing code, the first task I wanted to address was quite fuzzy in my mind. Now the mist and smog are pretty much gone, replaced by a clear picture of how things have to be done. Having clarity of ideas is very useful for going in the desired direction.

That being said, I feel like I have been distracted by many things. I have developed a mild addiction to Twitter. I cannot explain how it happened. I have had an account for years and never really understood how this particular social media works until I started to be preoccupied by the pandemic situation. I started to interact with like-minded people and bing... you start to be fed stuff that interests you. You express opinions and people react to them and share your ideas. It is going to be a challenge to keep this growing addiction under control. I guess this could be some procrastination creeping its ugly face out again, but I cannot help, in these uncertain times, seeking answers, finding the hidden truth and influencing some people toward what I think is right. That is a good thing. I just have to manage my time.

Also, I got a bad surprise. Remember, I mentioned that once my desktop is up and running, I rarely reboot; in fact, this is a pretty cool thing that is doable with Linux... Well, the exception is when I install new stuff. You need to reboot. The last reboot goes back 2 days, and when I did it, I forgot to restart VLC, which plays the BASE audio file in a loop. That means I unintentionally skipped 1 day of programming. Oh crap. No big deal. I guess I'll add an extra day to stage 5 to catch up on the missed day.

I wonder if this missed day of programming could have contributed to this feeling of being easily distracted away from my work goal...
Stage 5, day 13:

I spent most of my day doing tech support. I'm kinda the sysadmin at home, and my recent server merge had some undesirable side effects.

1. The smart doorbell doesn't connect to the network anymore. That might be totally unrelated to the merge (but I did change the position of the router, so maybe the signal strength dropped too low for the Ring to keep connecting). So the first thing I did was optimize the router location by placing it on top of the server box instead of behind it.

Nah, that didn't do any good. The second option is to press the setup button on the doorbell to reinitialize it. To do that, I need to find the security screwdriver to open the bell box. I only used it once, when I installed the thing, and I simply could not find it. I guess it is safe to declare it lost.

They sell spare ones on Amazon at $7.95, but with S&H the total goes up to $35. It's pretty expensive, but I feel I have no other option. So I ordered one and put this task on hold until I receive the screwdriver.

2. The scanner doesn't work anymore. I had this nice setup where I created a Samba share on my NAS server, so you could scan a document and access the scanned file from anywhere in the house through the NAS. When I merged the servers, the Samba share's IP address changed. I thought that simply changing the IP address setting in the printer would fix the problem. It didn't. I have no idea what made this setup, which worked perfectly for years, suddenly stop working when the server moved. I struggled for pretty much the whole afternoon on that problem. There is nothing I haven't tried, including upgrading the printer firmware. Nothing fixed it, and the only info the printer gives is that it is unable to reach the SMB share.

I paused for a moment and questioned why I used Samba in the first place. A Samba share can be useful in a heterogeneous environment that includes Windows machines. Since I'm anti-Microsoft and anti-Bill Gates, there are absolutely zero Windows machines at my place, and the printer also supports FTP to transfer scans.

Update: Now I remember. It is simply a legacy leftover. When I had my old Linksys NAS (and still had a few Windows machines around), it supported SMB shares, which is something my printer could accept, and it worked out of the box on the first attempt. I didn't think about this choice any further, despite the Linksys box possibly also supporting FTP (I associate FTP with being archaic and insecure; Telnet is another protocol in that mental category). When I replaced the Linksys box with a Linux box, I didn't question the original choice. I just brought Samba support to the Linux box to keep the printer working as-is.

I removed the Samba software and replaced it with an FTP server. It didn't work out of the box. I had minor issues with PAM authentication and the info contained in the /etc/passwd file. But this operation has again demonstrated that MS software is of bad quality, bloated and slow. To be fair, Samba probably offers much more functionality than FTP, but from my point of view, for my needs, they are essentially equivalent. The Samba software was 150MB big, and it has been replaced by a 0.5MB piece of software that does the exact same thing!

Scanner problem fixed.

Those problems dragged me away big time. I hope that this time I am 100% done with that damn migration. That is why I avoid those types of moves: when you initiate them, it can become the equivalent of opening a can of worms.

Fortunately, and I'm happy about that, I also had enough time to fix an elusive SEGV bug still inside my trading library. The problem occurs when I reset my router or pull out a network cable somewhere. Part of the difficulty of finding the problem is that the generated core dump has become so big, since the program now uses a few GB of RAM, that the core is truncated. Since the stacks are located at the end of the process's virtual memory, this crucial info is missing, making it pretty much impossible to know what the program was doing.

I changed /proc/sys/kernel/core_pattern to request that the core be dumped somewhere with sufficient disk space to avoid truncation. I got my core. At first, it wasn't clear what the problem was, but I finally figured it out and, as a result, my code is now fixed in that regard.