Hi,
I'm currently working with LEAN locally on high-resolution (minute, sometimes second) data, and I'm wondering how I can make my backtests run faster. I have a fairly fast computer, but I'm not sure it's living up to its potential. LEAN is currently processing 5-15k data points/second, and I'd like to increase that substantially without changing my code significantly. How can I dedicate more of my computer's resources to LEAN, or otherwise make it run faster? Jared mentioned in a different thread that he's seen 500k+ data points/second, so I'm wondering how much further I can push my machine and how to do it.
Thanks,
AS
Josh Morris
Also, I'm not willing to decrease my resolution.
Josh Morris
I am also curious, but I have a slow computer :P
JayJayD
Have you tried building the engine in Release configuration?
I ran a basic algorithm over two years of minute data and it processed 55k points/second in the Debug configuration and 108k points/second (yes, almost twice as fast) in Release.
It's the only way I can see to improve the speed without changing the code significantly.
Hope it helps.
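For reference, here's roughly what switching to Release looks like from the command line. The exact commands and output paths depend on your platform and LEAN version (older builds use MSBuild/Mono, newer ones use the dotnet CLI), so treat this as a sketch:

```
# Older (Mono / .NET Framework) LEAN: build the solution in Release mode
msbuild QuantConnect.Lean.sln /p:Configuration=Release

# Newer (.NET Core / .NET 6+) LEAN: use the dotnet CLI
dotnet build QuantConnect.Lean.sln -c Release

# Then run the Launcher from the Release output folder instead of Debug
# (the exact path under Launcher/bin/Release/ varies by version)
dotnet Launcher/bin/Release/QuantConnect.Lean.Launcher.dll
```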
Jared Broad
- Running on Windows is about 2x faster
- Running on dense data (ticks, etc.) increases the points/sec but doesn't increase the overall speed.
The bottleneck is synchronizing the data, which is CPU limited. Even on water-cooled, overclocked CPUs it maxes out at about 200k/sec on Windows. We'll need a different way to sort the data to break past that boundary.
Nate Betz
This is probably a dumb question, but how would you even do backtesting with LEAN in local mode against high-resolution data? Where do you get the high-resolution data?
JayJayD
In order to backtest with LEAN in local mode against high-resolution data, you must have the data first.
Maybe the fastest way is to use QC data; there you can find Forex and CFD data at tick resolution, and let the engine download the data itself as it needs it using the ApiDataProvider.
If you already have data, you can import it into Lean.
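For anyone searching later, here's roughly what the ApiDataProvider setup looks like in the Launcher's config.json. Keys can change between LEAN versions and the credential values below are placeholders, so treat this as a sketch rather than a definitive configuration:

```
// Launcher/config.json (excerpt) -- credential values are placeholders
{
  // local data folder where downloaded/imported data lives
  "data-folder": "../../../Data/",

  // fetch missing data from QuantConnect on demand instead of the default local-only provider
  "data-provider": "QuantConnect.Lean.Engine.DataFeeds.ApiDataProvider",

  // your QuantConnect account id and API token (from your account page)
  "job-user-id": "12345",
  "api-access-token": "your-api-token-here"
}
```

If you bring your own data instead, it goes under the same data-folder in LEAN's standard layout (e.g. Data/equity/usa/minute/spy/ as zipped CSVs); the ToolBox projects in the LEAN repo include converters that can help get it into that format.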