
AI (deep learning model) at work

Some nice AI results

Here is the current development phase of the AI models. A lot of data has passed across my desk since the last update; this AI work started back in 2017.
Would it be nice to get better than 50% prediction accuracy for the next bar's direction? It is easier to get than you might think. Below is an example of a model that indicates the next bar direction with up to 67% precision (NQ future, validation side). These are not optimized results, just a single run (training), so better could be achieved. Yes, I know that if something looks too good it probably is, but in this case there should be no data leakage from the future, which is the typical way to get a wrong result. And even if you know this, the information may not be as useful as you might first think.

Background of the model used:

8 indicators as features, with the 7 most recent values of each (a sketch of the resulting input windows is shown below)
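
As a rough sketch of what that feature layout means in practice (the function name and array shapes here are my own illustration, not the actual feature code), each sample is a small window over the indicator matrix:

import numpy as np

def make_windows(features, direction, window=7):
    # features: (n_bars, 8) array with the 8 indicator values per bar
    # direction: (n_bars,) array where direction[i] = 1 if bar i+1 closes up, else 0
    X, y = [], []
    for i in range(window - 1, len(features) - 1):
        X.append(features[i - window + 1:i + 1])  # last 7 values of every indicator
        y.append(direction[i])                    # direction of the next bar
    return np.array(X), np.array(y)               # X shape: (samples, 7, 8)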

Some screenshots of the current development:

But here is the important thing: the model is used with PVBars, which are NOT virtual bars but basically minute bars with a changing timeframe, so they are as reliable as standard minute bars. (There is no reason to use virtual bars for anything other than visualization. I have not used VR tick bars but VR minute bars, as it is easier to get enough data; VRT gives about the same result, and some may prefer those!)

Training (learning) phase:



Results, equity curve (simple model):

Look at the validation side, as that data is not used in training. Yes, training should have been stopped at about 200–400 epochs, but more epochs were used here to show where over-optimization sets in.
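
That stopping point can also be handled automatically; a minimal sketch with Keras' EarlyStopping callback (the patience value is just an assumption, not what was used in these runs):

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor="val_binary_accuracy",  # watch the validation side
                           mode="max",
                           patience=50,                     # assumed value, tune as needed
                           restore_best_weights=True)

# model.fit(X_train, y_train, validation_data=(X_val, y_val),
#           epochs=1000, callbacks=[early_stop])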

15 MIN Bars:

The same with regular 15-minute bars (63% max): not bad either, but with a much higher drawdown in a simple entry/exit model based on this prediction.



And the results:



Data: NQ from 01/01/2022 to 04/30/2024 (roughly 40–50 k bars used, to give you some idea of the data size).

Keras Model:

Some info about that "secret" deep learning model, i.e. the hidden model behind all this. In this simplest form I usually predict direction up to 4 bars ahead (1st 67%, 2nd 60%, 3rd 58%, 4th 56%; after that it usually has no value anymore). Conv1D layers are the basic blocks used, with some additional increments.


metrics = "binary_accuracy"
lossfunc = "binary_crossentropy"
optimizer = "adam"          # "sgd" as an alternative
activationfunc = "sigmoid"  # last layer ("tanh" as an alternative); relu for the other layers
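
Put together, those settings roughly translate into a small Conv1D classifier along these lines. This is only a minimal sketch assuming the (7, 8) input windows described above; the layer counts and sizes are placeholders, not the actual model:

from tensorflow.keras import layers, models

def build_model(window=7, n_features=8):
    model = models.Sequential([
        layers.Input(shape=(window, n_features)),
        layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
        layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(16, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability that the next bar closes up
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["binary_accuracy"])
    return model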

So if I have some extra time and not too much customer work, I will make use of this. It is hard to package this as a simple indicator for NT, as it uses a Python/Keras model inside.

Okay, here is more info on the data and features used (not all of them are used, as they were reduced as described above; that can be done from the training program, but this is the initial data). The program that writes the data to CSV files, modified from the initial MLWriteData, is available from me, as it can be very useful:
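
To give an idea of how the exported CSV can be picked up on the Python side (the file name and column names below are made up; MLWriteData's actual output may look different):

import pandas as pd

# Hypothetical file and column names -- adjust to the real MLWriteData output.
df = pd.read_csv("NQ_features.csv", parse_dates=["Time"])

feature_cols = [c for c in df.columns if c not in ("Time", "Direction")]
features = df[feature_cols].to_numpy()
direction = df["Direction"].to_numpy()

# Chronological split: earlier data for training, the rest kept as validation.
split = int(len(df) * 0.8)
train_feat, val_feat = features[:split], features[split:]
train_dir,  val_dir  = direction[:split], direction[split:]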


A peek into the trades:

This is actually how one model predicts trades, taken from a random place on the validation side; the + marks (add-ons) show the direction predicted after the initial green or red arrow, in the same way.
(No, I don't have that kind of money to add on at every bar, so this is for illustration purposes.)

The equity curve of that. Usually only the validation side is interesting! The data to the left of that line was used for training.




See the underlying data in the window behind the chart; the ticks come from there. Not bad.

A zoom into the trades, from a random place on the validation side. I think this is a very interesting part; maybe useful in manual trading, too?! FYI: no stop loss or targets, just entry and exit based on the predicted direction and/or exit at the last bar of the session (so not a "valid" model).
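
The logic behind those arrows is essentially "hold in the predicted direction, bar by bar". A bare-bones sketch of that idea (ignoring sessions, costs and position sizing; the names are illustrative only):

import numpy as np

def direction_equity(close, prob_up, threshold=0.5):
    # close: validation-side closing prices, prob_up: model output per bar
    position = np.where(prob_up > threshold, 1, -1)  # long if up predicted, otherwise short
    bar_move = np.diff(close)                        # point change to the next bar
    pnl = position[:-1] * bar_move                   # held in the predicted direction
    return np.cumsum(pnl)                            # simple equity curve in points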

