A faltering economy requires trimming the budget and slashing expenditures on frills like software and hardware, right? Maybe not.

The same big banks that were harbingers of the credit crisis – and are now caught in the maelstrom of the meltdown – are spending on sophisticated software and expertise to help them better identify and manage risk. And they can do this while also highlighting previously overlooked business opportunities.

“Because of the problems in the financial industry, they are turning to us even more now to help implement better risk solutions,” says Dr. Jim Goodnight, CEO of SAS, a North Carolina-based analytics provider, adding that poor risk management itself was the source of many of the problems. “Someone in the risk analytics groups at these banks should have been alerting the higher-ups that what they were investing in was shaky stuff.”

What SAS and other vendors such as Oracle, IBM, SAP (and to some extent Microsoft) are selling is not just a combination of software, hardware and network capacity that lets enterprise managers know in real time what’s going on in their business. They are also offering a much deeper form of predictive analytics.

A digital crystal ball

Jeanne Harris, director of research at consulting firm Accenture’s Institute for High Performance Business, co-authored Competing on Analytics: The New Science of Winning (Harvard Business School Press, 2007) with Thomas H. Davenport, professor of Information Technology and Management at Babson College in the United States. She notes that when companies cross-reference seemingly trivial sets of data, interesting things emerge.

Progressive Casualty Insurance Company, for example, noted there was a relationship between someone’s credit rating and their propensity to get into car accidents.
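The kind of cross-referencing Harris describes can be sketched in a few lines: take two seemingly unrelated columns and measure how strongly they move together. The figures below are invented for illustration, not real insurer data.

```python
# Hypothetical illustration of cross-referencing two data sets --
# credit scores and accident counts -- in the spirit of the Progressive
# example.  All numbers here are made up for the sketch.
from statistics import mean

credit_scores = [520, 580, 610, 650, 700, 720, 760, 790]
accidents_5yr = [4,   3,   3,   2,   2,   1,   1,   0]

def pearson(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(x), mean(y)
    cov  = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sdx  = sum((a - mx) ** 2 for a in x) ** 0.5
    sdy  = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sdx * sdy)

r = pearson(credit_scores, accidents_5yr)
print(f"correlation: {r:.2f}")  # strongly negative: higher score, fewer accidents
```

A correlation near −1 on this toy data is what flags the relationship worth investigating; real analytics work then has to rule out confounders before acting on it.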
And Wal-Mart, which Harris and Davenport maintain collects more consumer data than anyone in the private sector, buys from more than 17,400 suppliers in 80 countries and makes some of that data available to them. Wal-Mart, in turn, runs more than 21 million queries a year searching for what it might better sell in which stores.

Harris and Davenport say that clear trends start to emerge when sales data are overlaid on real-time local events. For example, just before hurricanes, customers stock up on non-perishable, no-cook food. And their food of choice? Strawberry Pop-Tarts. Naturally, Wal-Mart now bumps up its orders for Pop-Tarts in areas under hurricane watches.

It’s the little things that make the difference in the long term, and that goes beyond business to public policy. From mapping the human genome, to bringing drugs to market faster, to cutting costs in servicing ATMs, to better predicting hurricanes, the need to capture more data and compare it against a wider range of variables is growing exponentially.

Don Campbell, chief technology officer at Cognos, now part of IBM, says it’s about adding a third dimension to aid decision making. Data mining first started as a tool to learn from past events and grew to allow a real-time look at what was happening in the moment; this third dimension looks forward, rather than through a rear-view mirror.

It sounds simple, but there’s usually one big hurdle: too many companies have too many databases scattered over legacy systems – one for sales, another for inventory and so on, says Harris.

Also, data are often captured on spreadsheets by department, and by the time they’re analyzed it’s not only probably too late to be useful, it’s most likely wrong, she says. “One academic suggests between 20% and 40% of user-created spreadsheets contain errors,” she says.
“And those spreadsheets create multiple versions of the truth.”

Cleaning up the database and standardizing categories and descriptions is mission-critical for any company looking to leverage business analytics and move forward. Text-heavy databases are even more problematic, since the English language has many words for the same thing.

Business analytics for enterprises of all sizes

And that may be where SMBs have a distinct advantage over their larger competitors: they’re less likely to be handcuffed to legacy databases and probably have less data to clean up.

“But we have to trust the data,” Campbell says. “It has to be as pure as possible.”

Cleaning up the data often requires a major investment of resources, though as Campbell notes, there are some programs that will mine the data for anomalies.

“We had that ourselves in looking at where we should open our next call centre, and then we found something that wasn’t aligned,” he says. “When we looked at it, it was a simple AM and PM thing, so we were off in volumes by 12 hours.”

Going forward – though it can also be applied retroactively – a form of data tagging called the Predictive Model Markup Language (PMML) is gaining traction as a standardized way of tagging data. It also allows modeling across different systems, says Campbell.

“Vendor A may be better at pulling information, Vendor B may be better at aggregating it and Vendor C may be better at presenting it,” he says, noting that the standard allows users to select ‘best of breed’ across the board and not get locked into a ‘soup to nuts only’ option from a single vendor.

And that’s what customers want, he says: the flexibility to leverage their own strategy and competitively differentiate themselves from their rivals.

“It used to be that data mining was a nerdy thing, with some analyst in a dark room somewhere,” he says. “We’ve progressed to the point where many end users don’t even realize the technology behind it.
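Why does a tagging standard enable that vendor mix-and-match? Because a PMML model is just XML, so any tool can read what another tool produced. Below is a minimal sketch: a hand-written toy fragment (not output from any real modeling product) parsed with Python’s standard library to recover the model’s field names.

```python
# A minimal sketch of PMML interoperability: a model exported by one
# vendor's tool is plain XML, so another vendor's tool can read it.
# The document below is a toy hand-written fragment for illustration.
import xml.etree.ElementTree as ET

PMML_DOC = """\
<PMML version="4.1" xmlns="http://www.dmg.org/PMML-4_1">
  <DataDictionary numberOfFields="2">
    <DataField name="credit_score" optype="continuous" dataType="double"/>
    <DataField name="accidents_5yr" optype="continuous" dataType="integer"/>
  </DataDictionary>
</PMML>
"""

ns = {"pmml": "http://www.dmg.org/PMML-4_1"}
root = ET.fromstring(PMML_DOC)

# List the declared data fields -- what a 'best of breed' presentation
# tool would need in order to consume a model built elsewhere.
fields = [f.get("name")
          for f in root.findall("pmml:DataDictionary/pmml:DataField", ns)]
print(fields)  # ['credit_score', 'accidents_5yr']
```

Real PMML documents also carry the model itself (regression coefficients, tree splits and so on) in the same vendor-neutral form.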
For example, you run up to the airline counter to get a last-minute flight and the system will determine whether you or the guy next to you gets that upgrade ticket – there’s only one seat, and it should go to the best customer. It makes the best guess based on the data.”

What predictive mining is getting good at, he says, is identifying customer segments in real time and working out how those segments might respond to a different set of offers.

“Casinos want to track betting among their loyalty program customers, and what the bets on the tables are by limit,” he says. “And also what happens when the football game is over and all that traffic spills out onto the main floor.”

Casinos also have multiple properties, he adds, and they want to cross-promote them. Predictive data modeling can help them do that based on past and current behaviour, instead of having to rely on older data that don’t reflect current market conditions.

And it’s all become much more accessible, says Jim Davis, SAS senior vice-president and chief marketing officer. Davis points out that the convergence of powerful hardware, sophisticated software, large networks and a deeper understanding of how data can be efficiently mined for predictive modeling has created a framework that companies can customize for their own purposes.

Beyond business intelligence

“Business intelligence as we know it is dead as a definition,” says Davis. “What BI represented was a technology that allowed end users to access data and report on that data as a differentiator in the way business is run. Today, businesses are not interested in buying technology; they’re interested in solving business problems. So what we have is a framework rather than a platform. And what we don’t want people to think is that it’s a giant undertaking like an enterprise resource planning rollout. It does not cost millions of dollars and take two to five years before you see results.
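The airline-counter decision described above boils down to scoring customers against loyalty data and picking the top score. Here is a toy sketch; the customer records and the weighting scheme are invented, not any airline’s actual rules.

```python
# A toy sketch of the one-seat upgrade decision: score each standby
# customer from loyalty data and give the seat to the highest scorer.
# Records and weights below are invented for illustration.
standby = [
    {"name": "Chen",   "tier": 3, "miles_ytd": 48_000, "missed_upgrades": 2},
    {"name": "Okafor", "tier": 2, "miles_ytd": 61_000, "missed_upgrades": 0},
    {"name": "Silva",  "tier": 3, "miles_ytd": 35_000, "missed_upgrades": 5},
]

def upgrade_score(c):
    # Weight elite tier most heavily, then miles flown this year, plus a
    # small nudge for customers recently bumped from upgrades.
    return c["tier"] * 10_000 + c["miles_ytd"] * 0.5 + c["missed_upgrades"] * 1_000

winner = max(standby, key=upgrade_score)
print(winner["name"])  # Chen: highest tier plus substantial miles
```

In production, the scoring function would itself be a predictive model trained on past behaviour rather than hand-picked weights, but the decision step is the same.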
We can see results in a matter of months. If it takes longer, it has not been scoped properly.”

In fact, says Davis – who obviously has some bias here – tough economic times are exactly when companies should be spending on analytics, to ensure they’re not missing out on undiscovered markets or revenue opportunities.

It’s a complex endeavour, however, and behind that combative stance there’s a lot happening beyond hardware and software. Much of what is possible in business analytics is the result not just of powerful computers, but of a way to harness them as a team, much as the early pioneers’ horses worked in concert to haul wagons across the plains.

Grid computing, as it’s called, harnesses processors in much the same way. Companies with large server farms can set up a query and leave it running overnight, after the business of the day has been done. Working together, the multiple processors crunch data at a level that once required supercomputers.

“These databases are huge,” says Goodnight. “They are literally millions of columns across and millions of rows down.”

The size of Wal-Mart’s database, for example, is 583 terabytes – roughly 30 times the size of the Library of Congress.

At the same time, the brainpower required to write the queries is in high demand. Math PhDs and astrophysicists are finding work in business, because they have a strong enough grasp of numbers and algorithms to string together the kinds of queries that elicit useful results.

As Harris notes, those hot new jobs are the result of business moving analytics from the back rooms to the boardrooms.

Academic institutions such as Dalhousie University in Halifax are already responding with new programs that teach data mining as part of an MBA curriculum. But there’s already some suggestion that as demand increases, the function will be outsourced to countries with a higher per capita ratio of PhDs in the sciences.

So take a good look at your data – you might be surprised what it tells you about the future.
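The grid idea is simple to sketch: partition one large query into chunks, farm the chunks out to workers, and combine the partial results. A real grid spreads the chunks across many machines; to keep the sketch portable it uses local threads, and the “fact table” is just synthetic numbers.

```python
# A minimal sketch of grid-style data crunching: split one big query
# into chunks, run the chunks on workers in parallel, merge the results.
# A real grid uses many machines; here, local threads stand in for them.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker crunches only its own slice of the rows.
    return sum(chunk)

def grid_sum(rows, workers=4):
    step = -(-len(rows) // workers)  # ceiling division: chunk size
    chunks = [rows[i:i + step] for i in range(0, len(rows), step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

rows = list(range(1_000_000))        # stand-in for millions of database rows
assert grid_sum(rows) == sum(rows)   # partitioned result matches the direct sum
print(grid_sum(rows))
```

The same split/merge shape scales from one machine’s cores to an overnight run across a whole server farm; only the dispatch layer changes.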