Robert Barr, VP Data Grid Engineering Lead at Barclays, will deliver a keynote titled “In Memory Computing for Financial Services: Past, Present and Future” at the In-Memory Computing Summit, May 23-24 in San Francisco. (A limited number of seats are still available. Register today at http://imcsummit.org.)
Robert’s presentation addresses the major data processing challenges the financial services industry faces with regard to performance, scalability and security. He was kind enough to answer a few questions, giving us a high-level look at what he will cover in his talk.
What new or evolving challenges do you foresee for the financial sector, and how does in-memory computing address those challenges?
Regulatory requirements are on the increase. Emerging regulations such as the Markets in Financial Instruments Directive (MiFID), the Protection of Client Assets and Money (CASS) and the Fundamental Review of the Trading Book (FRTB), to name but a few, affect not only reporting requirements but also mandate how financial services organizations are structured and operate. The constantly evolving challenge is to streamline operations, reduce running costs and comply with regulations while maintaining and improving market position. Traditional models of technology use are no longer sufficient to operate in today’s economy, and financial services firms are turning to emerging tools such as in-memory computing, Big Data and cloud computing to drive down costs while increasing the scale of operations.
To what extent are regulatory requirements driving the adoption of in-memory computing technologies in the financial industry?
Regulatory requirements are probably the major driver of the way financial services firms operate, and consequently of the technology being adopted. Regulators are constantly seeking to improve consumer protections, simplify complex products and reduce the risk inherent in financial services operations. The direct consequence is an increase in the amount and complexity of regulatory reporting being performed, and in the rate at which those reports must be delivered to the regulators. Large-scale in-memory computing is increasingly being used to meet those challenges of volume and speed.
Assuming a world where the spinning disk is obsolete and all business applications run in-memory, where do you think we are in that adoption process?
In my experience, we’re in the infancy of this process. Applications are adopting in-memory products to help meet their performance requirements but still rely heavily on spinning disk for their scalability and persistence needs. As technologies like NVDIMM and DSSD become more prevalent, and as the convergence of relational, Big Data and NoSQL technologies continues, we’ll start to see a shift away from disk-first and toward memory-first applications. As technology improves and the price gap between memory and disk narrows, disk will become less and less relevant.
Attend Robert’s keynote, along with talks from other thought leaders in in-memory computing and more than 30 deep-dive presentations from developers and decision makers pushing the limits of performance and scalability in big data systems, at the In-Memory Computing Summit in San Francisco, May 23-24. Register today.