I'm really new here, so I could be doing something wrong, but when I try to import minute data from 2010 to 2018, the kernel dies. I've tried importing it in two parts (2010 to 2015, then 2015 to 2018), and that works, but when I process the data after appending the dataframes, it dies again. Is it a memory problem, and if I upgrade my plan will it stop occurring?
Thanks.
Michael Manus
Hmmm, that sounds like you're holding all the data in a list or something like that. Start with a very simple algo.
Like this one:
After that, you should take a look at rolling windows and at consolidating data from minute resolution into 3-5 minute bars, or start using daily data (there's a sketch of the consolidation idea after the example links below).
Rolling Window example from the quantconnect team:
Multiple-asset consolidation with a rolling window example:
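In case those attachments don't come through, here is a minimal sketch of the consolidation idea (my own illustration, not the linked examples): a TradeBarConsolidator compresses minute bars into 5-minute bars, and a RollingWindow keeps only the most recent ones in memory. The symbol, dates, and window size are placeholders.

```python
from datetime import timedelta

class ConsolidationExample(QCAlgorithm):
    def Initialize(self):
        self.SetStartDate(2010, 1, 1)
        self.SetEndDate(2018, 1, 1)
        self.SetCash(100000)
        self.symbol = self.AddEquity("SPY", Resolution.Minute).Symbol

        # Compress minute bars into 5-minute bars so raw minute data is never stored
        consolidator = TradeBarConsolidator(timedelta(minutes=5))
        consolidator.DataConsolidated += self.OnFiveMinuteBar
        self.SubscriptionManager.AddConsolidator(self.symbol, consolidator)

        # Keep only the most recent 100 consolidated bars
        self.window = RollingWindow[TradeBar](100)

    def OnFiveMinuteBar(self, sender, bar):
        self.window.Add(bar)
        if not self.window.IsReady:
            return
        # Work with self.window here instead of a full history dataframe
```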
Rodrigo Nader
My model really needs 1-minute data. I thought it would be possible to import it for 2010 to 2018, but it really seems that Research can't handle that without dying.
Jared Broad
Sorry, that would be a few gigabytes of memory usage, Rodrigo; it's not possible to import such a large chunk. You would need to run a backtest to analyse a larger span of time like that.
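For a rough sense of scale (my arithmetic, not Jared's figures): eight years of US equity minute bars is roughly 786,000 rows per symbol, and pandas dataframe overhead multiplies the raw size several times over, so a request with several symbols can plausibly run into gigabytes.

```python
# Back-of-envelope memory estimate; the ~8x pandas overhead factor is my assumption
years, trading_days, minutes_per_day = 8, 252, 390
rows_per_symbol = years * trading_days * minutes_per_day   # 786,240 bars per symbol
raw_bytes = rows_per_symbol * 6 * 8                        # OHLCV + timestamp, 8 bytes each
print(f"raw per symbol: {raw_bytes / 1e6:.0f} MB")                   # ~38 MB
print(f"with ~8x dataframe overhead: {raw_bytes * 8 / 1e6:.0f} MB")  # ~300 MB per symbol
```

On those assumptions, ten symbols at minute resolution already sit in the low gigabytes once the data lands in a dataframe.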
Huy Hoang
This is interesting. Is it possible to do a walk-forward analysis on 1-2 years of data at a time, then dump the historical data and retain the statistical results for use in the algo later? If yes, can anyone give me an example algo?
If not, what is the optimal way to manage memory, since we don't have much of it to burn?
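One possible shape for that, sketched under the assumption that summary statistics are all you need to retain (the per-year volatility here is just a stand-in for whatever you actually want to keep): loop over the years in a Research notebook, pull one year of minute history at a time, reduce it to a statistic, and delete the raw dataframe before the next iteration.

```python
from datetime import datetime

qb = QuantBook()
symbol = qb.AddEquity("SPY").Symbol

yearly_stats = {}
for year in range(2010, 2018):
    # Pull one year of minute bars at a time
    df = qb.History([symbol], datetime(year, 1, 1),
                    datetime(year + 1, 1, 1), Resolution.Minute)
    # Keep only the summary statistic, then drop the raw data
    yearly_stats[year] = df['close'].pct_change().std()
    del df

print(yearly_stats)
```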
Rodrigo Nader
Jared Broad, even if I choose one year of minute historical data with a few symbols, this happens. Is there any alternative in the Notebook, or is this temporary?
Michael Manus
As I described above, you have to process the data while it gets streamed through your code/OnData. There's nothing more anyone can do here.
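A minimal sketch of that streaming style (the running mean is just an example statistic; any incremental computation works the same way): each minute bar is examined in OnData as it arrives and is never stored.

```python
class StreamingStatsAlgo(QCAlgorithm):
    def Initialize(self):
        self.SetStartDate(2010, 1, 1)
        self.SetEndDate(2018, 1, 1)
        self.SetCash(100000)
        self.symbol = self.AddEquity("SPY", Resolution.Minute).Symbol
        self.count = 0
        self.mean_close = 0.0

    def OnData(self, data):
        if not data.ContainsKey(self.symbol):
            return
        # Update a running statistic incrementally; the bar itself is discarded
        close = data[self.symbol].Close
        self.count += 1
        self.mean_close += (close - self.mean_close) / self.count
```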