SAS preps in-memory analytic grids
23 June, 2010 20:32
SAS Institute is joining the newly hot area of in-memory processing, developing a series of high-performance analytics systems tuned for specialized tasks.
The first product will focus on risk management and become available late this year. Products for retailers and other verticals will follow.
SAS is working with Hewlett-Packard on the hardware side: the grids will be built from HP BladeSystem technology along with Insight Control management and automation software. The HP partnership is not exclusive, but it is the only one SAS has struck so far.
In-memory processing, which loads data into RAM, delivers a performance boost compared to reading and writing data on disk. SAS's new technology can dramatically speed up processing jobs, cutting run times from hours to minutes, according to David Wallace, global marketing manager for solutions and industries.
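The basic trade-off is easy to demonstrate. The toy benchmark below (a minimal Python sketch, not SAS code; the file path and data are illustrative) sums the same column of numbers once by re-reading and re-parsing it from disk and once from data already resident in memory:

```python
import os
import tempfile
import time

# Toy benchmark: summing a column of numbers from disk vs. from RAM.
# Illustrative only -- not how SAS's grid technology is implemented.

N = 1_000_000
values = list(range(N))  # the in-memory copy of the data

# Write the data to a temporary file, one value per line (the "disk" case).
path = os.path.join(tempfile.gettempdir(), "inmem_demo.txt")
with open(path, "w") as f:
    f.write("\n".join(str(v) for v in values))

# Disk pass: every query must re-read and re-parse the file.
t0 = time.perf_counter()
with open(path) as f:
    disk_total = sum(int(line) for line in f)
disk_time = time.perf_counter() - t0

# In-memory pass: the data is already parsed and resident in RAM.
t0 = time.perf_counter()
ram_total = sum(values)
ram_time = time.perf_counter() - t0

assert disk_total == ram_total
print(f"disk pass: {disk_time:.4f}s, in-memory pass: {ram_time:.4f}s")
os.remove(path)
```

Even at this trivial scale the in-memory pass is markedly faster; on analytic workloads that scan billions of rows repeatedly, the gap is what turns hours into minutes.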
SAS decided against releasing the in-memory technology as a horizontal tool, Wallace said. "Instead, what we're doing is focusing our efforts on solving complex problems in specific industries."
The risk management product will help banks better determine how to allocate capital and find out how turmoil in the financial markets will affect their positions, Wallace said.
One area of focus for retail is "markdown optimization," the process of ensuring that revenues and profits on season-specific items are maximized as the products start to age.
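In essence, markdown optimization searches for the discount that maximizes expected revenue given how demand responds to price. The sketch below is a deliberately simplified illustration of that idea; the linear demand model, the elasticity figure, and the prices are all assumptions for the example, not SAS's model:

```python
# Toy "markdown optimization": choose the discounted price for an aging
# seasonal item that maximizes expected revenue, under an assumed linear
# demand response. All numbers and the demand model are illustrative.

def expected_revenue(price, base_price, base_demand, elasticity):
    """Units sold rise linearly as price drops below the base price."""
    demand = base_demand * (1 + elasticity * (base_price - price) / base_price)
    return price * max(demand, 0.0)

def best_markdown_price(base_price, base_demand, elasticity, steps=100):
    """Grid-search candidate discounts from 0% to 99% for the revenue peak."""
    candidates = [base_price * (1 - i / steps) for i in range(steps)]
    return max(candidates,
               key=lambda p: expected_revenue(p, base_price,
                                              base_demand, elasticity))

# Example: an $80 item with baseline demand of 100 units and elasticity 2.0.
price = best_markdown_price(base_price=80.0, base_demand=100, elasticity=2.0)
print(f"revenue-maximizing price: ${price:.2f}")  # -> $60.00 under this model
```

A real system solves this jointly across thousands of items, stores, and weeks, which is exactly the kind of repeated large-scale computation that benefits from an in-memory grid.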
In addition, SAS is working with early customers, including Macy's and United Overseas Bank, to determine optimal sizes for the private grid clusters, Wallace said.
SAS's products are emerging as in-memory processing gains a higher profile in part due to recent announcements from SAP. The applications vendor is developing appliances that employ its new in-memory database, and will gain more in-memory technology from the pending acquisition of Sybase.
While in-memory processing is far from new, in the past the cost of RAM and hardware limited the amount of data that could be processed at one time. The advent of low-priced multicore servers containing large amounts of RAM has mitigated this problem.
Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris's e-mail address is Chris_Kanaracus@idg.com.