It has become increasingly apparent that data is growing and changing faster than human-driven machine learning can keep up with, which makes enhancing existing analyses with deep learning all the more necessary. Our team invested heavily in developing a trading model that could continue […]
Category: FX Analysis
Real Net Profit: 150% in just 4 Months
Developing a post-commission-profitable currency trading model using Pivot Billions and R. Needle, meet haystack: searching for the right combination of features to build a consistent trading model is difficult and takes many iterations. By incorporating Pivot Billions and R into my research process, I was able to dramatically improve the […]
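The teaser above describes iterating over feature combinations; a minimal R sketch of that kind of search, using hypothetical feature names and simulated data rather than the post's actual dataset, might look like this:

```r
# Illustrative sketch only: brute-force search over feature combinations,
# scoring each candidate logistic model on held-out rows. The feature names,
# target, and simulated data below are hypothetical stand-ins for the post's
# actual dataset.
set.seed(1)
n   <- 5000
dat <- data.frame(
  ret_1h   = rnorm(n),            # hypothetical engineered features
  ret_4h   = rnorm(n),
  range_1d = rnorm(n),
  pivot_d  = rnorm(n),
  up_next  = rbinom(n, 1, 0.5)    # hypothetical target: next bar up (1) or down (0)
)
train <- dat[1:4000, ]
test  <- dat[4001:n, ]
features <- setdiff(names(dat), "up_next")

results <- list()
for (k in seq_along(features)) {
  for (combo in combn(features, k, simplify = FALSE)) {
    f    <- reformulate(combo, response = "up_next")
    fit  <- glm(f, data = train, family = binomial())
    pred <- predict(fit, newdata = test, type = "response") > 0.5
    results[[paste(combo, collapse = "+")]] <- mean(pred == test$up_next)
  }
}
head(sort(unlist(results), decreasing = TRUE))   # best-scoring combinations
```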
Pivot Billions and Deep Learning-enhanced trading models achieve 30% net profit
Deep learning has revolutionized image classification, personal assistants, competitive board game play, and many other fields, yet its adoption in the financial currency markets has been surprisingly slow. In our efforts to create a profitable and accurate trading model, we came upon a question: what if financial currency data could be represented as an image? The […]
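The teaser only poses the question, but the general idea of encoding a price series as an image can be sketched in a few lines of R; the window size, resolution, and simulated prices below are assumptions for illustration, not the post's actual pipeline:

```r
# Minimal sketch of the general idea: turn a rolling window of prices into a
# small binary matrix ("image") that a convolutional model could consume.
# All names, sizes, and the simulated series are illustrative assumptions.
set.seed(1)
price  <- cumsum(rnorm(512)) + 100     # hypothetical close prices
window <- 64                           # bars per image
rows   <- 32                           # vertical resolution of the image

to_image <- function(x, rows) {
  x   <- (x - min(x)) / (max(x) - min(x))         # scale the window to [0, 1]
  img <- matrix(0, nrow = rows, ncol = length(x))
  for (j in seq_along(x)) {
    img[round(x[j] * (rows - 1)) + 1, j] <- 1     # mark the price level as a pixel
  }
  img
}

imgs <- lapply(seq_len(length(price) - window + 1),
               function(i) to_image(price[i:(i + window - 1)], rows))
dim(imgs[[1]])   # 32 x 64 matrix for the first window
```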
Powering Insight Through Massive Optimization
Data comes with a price. Accuracy comes with an even greater price. And the two together can demand enormous resources. That’s why it is important to achieve the greatest efficiency in your research process and make use of any tools that can help you. This is particularly true if you are trying to develop a […]
Streamlining EDA (exploratory data analysis) with Pivot Billions enhances workflows in R.
Incorporating Pivot Billions into your R analysis workflow can dramatically shorten the research cycle and improve your ability to get results. R is a great statistical tool that a wide range of analysts use to explore and model data, but it is limited by how much data it can load into memory on your machine and tends […]
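The teaser cuts off before showing the workflow, but the pattern it describes, pushing the heavy filtering and aggregation outside R and loading only the summarized result, looks roughly like the sketch below; the export file name and its columns are hypothetical:

```r
# Sketch of the workflow described above, under stated assumptions: the heavy
# reduction happens outside R (for example, exported from Pivot Billions as a
# reduced CSV), and R only ever loads the summarized result.
library(data.table)

# Hypothetical export from the external aggregation step: one row per
# (symbol, hour) instead of millions of raw ticks.
hourly <- fread("pb_export_hourly_summary.csv")

# EDA on the already-reduced table stays fast and fits comfortably in memory.
hourly[, .(mean_spread  = mean(spread),
           total_volume = sum(volume),
           hours        = .N),
       by = symbol]
```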
Blazing Fast Financial Backtesting from R
As a data scientist developing and testing financial models in R, I have consistently run into data-size limitations, the need for large or distributed compute clusters, and long waits for my results to be processed and returned. That’s why I was genuinely impressed with how our recently released Docker image of Pivot Billions, […]
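The excerpt ends before the backtesting details, but a stripped-down example of the kind of in-R backtest it refers to, run on data that has already been reduced upstream, might look like the following; the signal rule, cost figure, and simulated prices are illustrative assumptions, not the author's model:

```r
# Minimal backtest sketch in plain R, assuming the heavy data reduction has
# already happened upstream (for example, inside the Pivot Billions container)
# and R receives one row per bar.
set.seed(1)
bars <- data.frame(close = cumsum(rnorm(1000, sd = 0.3)) + 100)
bars$ret <- c(0, diff(log(bars$close)))          # log return of each bar

# Toy signal: long when price is above its 20-bar moving average.
ma20 <- stats::filter(bars$close, rep(1 / 20, 20), sides = 1)
pos  <- ifelse(!is.na(ma20) & bars$close > ma20, 1, 0)

# Hold the position with a one-bar delay and charge a per-trade cost.
cost   <- 0.0001
trades <- c(0, abs(diff(pos)))
pnl    <- c(0, pos[-length(pos)]) * bars$ret - trades * cost
sum(pnl)          # net (post-cost) log return of the toy strategy
```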