Kx Systems, the leader in high-performance database and time-series analysis, announced impressive new test results from the Securities Technology Analysis Center (STAC) for benchmark testing on time-series data using Kx's kdb+, in conjunction with HP and Intel. Using HP's Intel-based industry-standard servers, with 64 high-performance cores and 1 terabyte of memory, STAC ran its M3 vendor-independent market data benchmark suite for large time-series datasets (tick databases).
Daryan Dehghanpisheh, Global Sales Director, Financial Services Institutions at Intel, said: "The latest set of benchmarks shows substantial improvements in calculation speed. The advances in technology for this space continue to surprise. Five years ago a system with the same specifications as the one used in the STAC benchmarks would have cost four times more, and would not deliver the performance demonstrated. The STAC team has found that optimized configurations can bring firms major performance improvements. This provides an opportunity to apply this new level of performance to other areas, such as risk management. With companies like Kx using all of the available system features Intel provides, financial institutions can gain a significant competitive advantage."
The benchmarks were run using a year of daily NYSE TAQ-like data, approximately 5 terabytes in total. A series of up to 20 complex queries was used in the benchmarking; the queries were defined by financial institutions to reflect real business requirements. The benchmarks included calculating the NBBO (National Best Bid and Offer) for a specific day, calculating volume curves over 20 days, and computing a theoretical P&L, among others. The results showed very significant improvements compared to the last set of benchmarks published by STAC in April 2011. For example, the NBBO calculation saw a 50% improvement on the previous test results.
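To give a sense of what an NBBO-style query computes, here is a minimal Python sketch: for each timestamp, take the best (highest) bid and best (lowest) offer across all venues. The quote records and venue names below are hypothetical illustrations; the actual benchmarks ran such queries in kdb+'s q language over terabytes of tick data.

```python
from collections import defaultdict

# Hypothetical quote records: (timestamp, exchange, bid, ask).
# Real TAQ-scale data would stream from disk, not sit in a list.
quotes = [
    ("09:30:00", "NYSE", 100.10, 100.50),
    ("09:30:00", "ARCA", 100.20, 100.40),
    ("09:30:01", "NYSE", 100.30, 100.60),
    ("09:30:01", "ARCA", 100.10, 100.70),
]

def nbbo(quotes):
    """Return {timestamp: (best_bid, best_offer)} across all venues."""
    best = defaultdict(lambda: (float("-inf"), float("inf")))
    for ts, _venue, bid, ask in quotes:
        bb, bo = best[ts]
        # Best bid is the highest bid; best offer is the lowest ask.
        best[ts] = (max(bb, bid), min(bo, ask))
    return dict(best)

print(nbbo(quotes))
# {'09:30:00': (100.2, 100.4), '09:30:01': (100.3, 100.6)}
```

The aggregation itself is simple; the benchmark's difficulty lies in applying it, and the 20-day volume-curve and P&L queries, across billions of quote records per day with low latency.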
Simon Garland, Kx Chief Strategist, said: "The STAC tests achieved very impressive results. The ability to access and analyze vast quantities of data very quickly has always been important to financial institutions. However, the recent market volatility and record volumes make this an even more pressing requirement. In August we saw NYSE TAQ daily record volumes break through 2bn a day, and reach a new high of 2.4bn, something the markets were not expecting to see for quite a while yet." He added: "While only a year ago top-of-the-range commodity hardware offered something like 32 cores and 256 gigabytes of memory off the shelf, now 64 cores and 1 terabyte of memory are becoming standard."
The benchmarks demonstrate that financial institutions can test trading strategies, manage their risk, and back-test new trading algorithms very quickly indeed. Regulatory imperatives mean that institutions must have very efficient methods for storing, retrieving and analyzing data. The STAC report shows that, given the right hardware and configuration, institutions can be assured that affordable solutions exist for analyzing market data, despite unprecedented volumes.