Less risk, more reward: expanding the borders of financial analysis

Using lessons learned from the 2008 stock-market collapse, financial firms ramp up the use of big data and high-performance computing to provide faster, more accurate information in the event of another economic crisis.
Written by Matthew Malone, Contributor

Imagine it's September 2008, you’re the chief executive of a financial firm, and your CFO walks into your office and announces that a trading partner, Lehman Brothers, may go bankrupt. Insurance company AIG, which sold you piles of credit default swaps, may be next. And, by the way, that mountain of mortgage-backed securities the firm’s sitting on? It’s not clear that anyone, even your mother, will buy them.

After chewing a fistful of Xanax, you’d probably want to know the breadth and depth of your exposure. Quickly. Fortunately, there’s plenty of data, terabytes and terabytes of it, stored in the company’s data warehouse just begging to tell the full story.

Unfortunately, there isn’t really a prayer of hearing it. Certainly not before dinner. Maybe even tomorrow’s breakfast.

Yes, it’s true: despite the ever-declining cost of data storage, the exponential growth of computing speed defined by “Moore’s Law,” and the more than $3 trillion in reported global IT spending in 2007, some managers spent the frantic early hours and days of the financial crisis flying blind. Calculating traditional risk measures like value-at-risk (the maximum loss a portfolio is expected to suffer over a given period at a given confidence level) and the liquidity of a complex, multi-billion-dollar portfolio could take nearly a day. The separate systems many firms use to capture and store data in different business units could also make aggregating risk from a single client a time-consuming, and ultimately inaccurate, undertaking. And, given the volatility at the height of the crisis, a calculation of the firm’s cash position that took hours would be like getting the weather report after the hurricane blows out the windows.
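
To make the measure concrete, here is a deliberately simplified historical-simulation VaR calculation in Python. The portfolio size and return series are invented for illustration; a real firm-wide run would revalue millions of positions across many risk factors, which is where the hours came from.

```python
import numpy as np

# Toy historical-simulation VaR. All figures are invented for illustration.
rng = np.random.default_rng(42)
portfolio_value = 5_000_000_000                      # hypothetical $5B portfolio
daily_returns = rng.normal(0.0003, 0.02, size=750)   # ~3 years of simulated daily returns

confidence = 0.99
# VaR is the loss exceeded on only about 1% of days -- the 1st percentile of the
# historical return distribution, expressed in dollars.
worst_return = np.quantile(daily_returns, 1 - confidence)
var_one_day = -worst_return * portfolio_value
print(f"1-day 99% VaR: ${var_one_day:,.0f}")
```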

“We saw the aggregation of risk across asset classes -- that one asset class affects another,” said Neil McGovern, who oversees the capital markets group at enterprise software maker SAP. “Yet [divisions within firms] all had their own data and storage needs. There was a fracturing of the risk calculation.”

Getting it all together

Fortunately, that nerve-wracking reality may soon be history. Just four years after the start of the financial crisis, the rapid evolution of data, software and computing resources has made real-time, firm-wide analytics more reality than dream. Crises will come again, of course, and the quality of analysis will still depend on the competence of the people building the models that produce it. But the borders of possibility have shifted. Emerging technologies like in-memory data processing and robust integration software now allow managers to measure risk and capitalize on exponential data growth in fundamentally new ways, and often in real time.

“What’s different today is that the time dimension is almost out of the picture,” said Jim Davis, chief marketing officer at SAS, the business analytics software maker. “It used to be that people put the constraints in and then got the results in the morning. The latency associated with this stuff is disappearing.”

In the world of finance, the competitive implications of real-time computations and data analysis, whether to assess risk before making a trade or as the whole global financial system blows up, are only beginning to become clear.

What’s your legacy?

For decades, most information systems have shared basic properties. Real-time transactional data is fed into the system, organized in a relational database and stored on massive hard drives. Performing analysis directly on the daily operational data has largely been impractical; the computing resources necessary could bog down the whole system and cause intolerable delays in conducting regular business.

Instead, the data is copied and stored on additional drives for analysis, which means reading it back off physical disks. That's fine when you're talking about opening a Word document while surfing the Web, but when it comes to running complex calculations on what can be billions of data points, reading from disk slows computations and can make swift, informed decision making impossible.
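
The gap is easy to demonstrate at toy scale. The sketch below (dataset, column names and sizes all invented) runs the same aggregation twice: once by re-reading a copy from disk, as the legacy pattern above requires, and once against data already held in memory. The absolute times are tiny at this size; the point is the ratio, which widens as the data grows and the calculations get more complex.

```python
import time
import numpy as np
import pandas as pd

# Toy-scale illustration of the disk-versus-memory gap. Real portfolios run to
# billions of records; everything here is invented.
n = 2_000_000
df = pd.DataFrame({
    "desk": np.random.choice(["rates", "credit", "equities", "fx"], size=n),
    "exposure": np.random.normal(1_000_000, 250_000, size=n),
})
df.to_csv("trades.csv", index=False)            # the copy kept on separate drives

start = time.perf_counter()
pd.read_csv("trades.csv").groupby("desk")["exposure"].sum()   # re-read from disk
print(f"from disk: {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
df.groupby("desk")["exposure"].sum()                          # already in memory
print(f"in memory: {time.perf_counter() - start:.2f}s")
```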

So as the volumes of data and the complexity of the desired analysis have grown, speed has become a more and more precious commodity. Complicating the issue is the emergence of so-called big data, the ever-increasing pool of information that, in both its size and its form, is more difficult to capture and analyze in traditional structured databases. For example, information like a street address and ZIP code is easily organized into neat tables and rows, while e-mails and videos are not.

In particular, unlocking the value of tweets, status updates, phone-based location services and blog posts that now flow from every corner of the globe demands that businesses process the information on the fly -- it does little good to know that someone’s shopping in your store five minutes after they leave.

The memory game

Major hardware and software players like SAS, SAP, HP, IBM and Oracle are making rapid progress developing these next-generation systems. In the spring of 2010, SAP introduced its HANA (High-performance Analytic Appliance) platform, a database program combined with third-party hardware that stores data in the computer’s memory, where the computations occur, rather than on separate physical disks. While the notion of such “in-memory” computing has been around for a while, the cost of memory relative to hard disk space had long been prohibitive.

SAS’s high-performance analytics platform also takes advantage of in-memory computing. In an early demonstration of the technology, SAS reduced the time it typically took for a value-at-risk calculation on one server from 18 hours to under three minutes. SAS’s products, and others like them, can also integrate the piles of data sprinkled throughout an organization and analyze them quickly, without the need to overhaul an entire system -- something particularly important for any organization that’s spent many millions on existing IT infrastructure.
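
The integration piece can be pictured in miniature, too. The sketch below is not any vendor's product; it simply pulls invented per-division extracts into a single in-memory table and totals exposure by counterparty, the kind of firm-wide question that fractured systems struggled to answer quickly in 2008.

```python
import numpy as np
import pandas as pd

# Hypothetical sketch: combine per-division exposure extracts into one in-memory
# view to answer "what is our total exposure to counterparty X?" Division names,
# counterparties and amounts are all invented.
def division_extract(division, seed, n=1000):
    rng = np.random.default_rng(seed)
    return pd.DataFrame({
        "division": division,
        "counterparty": rng.choice(["Counterparty A", "Counterparty B", "Counterparty C"], size=n),
        "exposure_usd": rng.normal(2_000_000, 500_000, size=n),
    })

divisions = ["fixed_income", "equities", "derivatives"]
extracts = [division_extract(d, seed=i) for i, d in enumerate(divisions)]
firmwide = pd.concat(extracts, ignore_index=True)    # one view instead of three silos

# Firm-wide exposure per counterparty, aggregated across the former silos.
print(firmwide.groupby("counterparty")["exposure_usd"].sum().sort_values(ascending=False))
```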

That type of performance and capability has people like Ming Soong, chief risk officer of the United Overseas Bank in Singapore, lobbying to implement high-performance computing throughout the organization. In 2008, the bank hoped to establish an enterprise risk management platform, with the goal of conducting stress testing on data housed in different silos across the organization.

While Soong said the platform has increased the speed and sophistication with which the bank can analyze its balance sheet, that analysis can be run only at company headquarters. Risk officers at any of its three regional subsidiaries or its many branches can't do similar calculations without grinding the whole system to a halt.

“The platform we were working on was extremely slow, to the point that when the results are finally available, no one would have been interested in them anymore,” Soong said.

With its current platform, value and liquidity risk calculations still take a few hours, and measuring its economic capital -- the amount of money the firm would need to cover all its risks in a worst-case scenario -- takes several days. The bank has defined limits on the exposure its traders can have to particular trading partners, but those calculations can’t be done in real time.

As a result, the bank imposes artificial buffers on trading activity. “Such buffers consume scarce capital,” Soong explained. “Higher capital consumption, all things being equal, leads to lower returns on equity. Impending regulations will impose even greater capital requirements. The organization that can be more capital-efficient will be the organization that attracts the attention of investors.”
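
The buffer logic Soong describes can be sketched in a few lines. This is a hypothetical illustration, not UOB's actual rule: when exposure figures can't be refreshed in real time, trades are checked against a limit shrunk by a safety margin, and that margin is the scarce capital the buffers consume.

```python
# Hypothetical pre-trade check. The limit, buffer and amounts are invented.
COUNTERPARTY_LIMIT = 500_000_000   # exposure cap for one trading partner
STALENESS_BUFFER = 0.20            # 20% headroom because exposure data lags

def trade_allowed(current_exposure, proposed_trade):
    """Approve only if the (possibly stale) exposure plus the new trade stays
    under the buffered limit."""
    effective_limit = COUNTERPARTY_LIMIT * (1 - STALENESS_BUFFER)
    return current_exposure + proposed_trade <= effective_limit

# Blocked by the buffer, even though the true limit would accommodate it:
print(trade_allowed(current_exposure=350_000_000, proposed_trade=60_000_000))   # False
# With trustworthy real-time exposure, the same trade fits under the real limit:
print(350_000_000 + 60_000_000 <= COUNTERPARTY_LIMIT)                           # True
```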

Behold, big data

It isn’t hard to grasp the potential value of the cascade of text, video, web logs, clickstream and other unstructured data that makes up much of the world’s digital information (about 80 percent, according to IBM). While technologies like in-memory computing and in-house analytic software provide one option for handling it, countless startups are offering their own platforms for filtering things like social data, with the goal of providing real-time insight to inform investing decisions. One firm, DataMinr, notified its financial clients of the death of Osama bin Laden based on its analysis of just 19 tweets, transmitting the information 20 minutes before the news broke in major outlets. Such insight would have been of great value to, say, an oil trader, who could have set up positions in advance of the short-term drop in oil futures that followed the later announcements in mainstream media.
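
Real systems weigh source credibility, location and how quickly a rumor propagates, but the basic shape of keyword-triggered filtering over a stream fits in a few lines. The sketch below is a toy, not DataMinr's method; the trigger terms, alert threshold and sample posts are invented.

```python
import re

# Toy keyword filter over a stream of posts. Everything here is illustrative.
TRIGGERS = re.compile(r"\b(bin laden|abbottabad|helicopter crash)\b", re.IGNORECASE)

def alert_stream(posts, threshold=3):
    """Yield alerts once enough posts have matched the trigger terms."""
    matches = 0
    for post in posts:
        if TRIGGERS.search(post):
            matches += 1
            if matches >= threshold:
                yield f"ALERT after {matches} matching posts: {post}"

sample = [
    "Helicopter crash reported near Abbottabad",
    "Nothing to see here, just dinner",
    "Rumor: operation against bin Laden underway?",
    "Abbottabad residents reporting loud explosions",
]
for alert in alert_stream(sample):
    print(alert)
```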

Plan before you plunge

For all its promise, experts warn that before diving into high-powered computing -- or merely dipping a toe into big-data analysis -- fundamental issues still need to be worked out, particularly the influence of old habits and old ways of thinking.

“The legacy systems of our brains may not be well adapted to a future where data is so large and available in such array,” said Dr. Michael Rappa, the founder of the Institute for Advanced Analytics at North Carolina State University. “You can eat up a lot of effort to clean up and harmonize data to make it useful. Long before you get to data mining or sophisticated methodology, you have to deal with these essential issues.”

Rappa and several others also suggest re-evaluating business processes and data analytics that are rooted in traditional technologies. Real-time processing and flexible storage open up new possibilities for using data to gain business insight.

“I don’t think it’s about implementing technology, but business processes that technology allows,” said George Westerman, a data scientist at MIT. “You want to question your assumptions. If you assume that [data from different] business units can’t 'talk,' test whether that's based on a reality that’s 20 to 30 years old.”

This post was originally published on Smartplanet.com
