Is In-Memory Computing the Holy Grail for Big Data Analytics?


The industry is awash with articles on big data. Big data news is no longer confined to the technical press; you can read about it in Forbes and The Economist, for example.

Each week the technical media reports on breakthroughs, startups, funding rounds, and customer use cases.

Whatever your source of information on big data, one thing all these reports have in common is the observation that the amount of information an organization must deal with only ever increases; this is what drives the ‘big data’ concept. Organizations rely on growing volumes of information from a variety of sources, including text, images, audio, and video, to analyze, improve, and execute their operations.

Big data is happening in industries such as financial services, healthcare, retail, and communications, to name a few. These industries have been collecting data for years, and with the advent of the Internet of Things that data is growing exponentially. The challenge is how to make sense of it and turn it into business value through analytics and predictive analytics.

An example of big data at work is the fraud detection and security requirements of the banking and financial services industry. These organizations need to prevent fraud, so they leverage analytics, machine learning, and big data technology to gain a holistic view of their customers, identify patterns buried in the data, and distinguish fraudulent activity from normal activity.
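To make the pattern-detection idea concrete, here is a deliberately simple sketch (a toy illustration, not any bank's actual system): flag transactions whose amount deviates sharply from the rest of a customer's history, using a plain z-score test. The function name, data, and threshold are all invented for illustration.

```python
# Toy fraud-signal sketch: flag transaction amounts that sit far from
# the mean of the batch, measured in standard deviations (z-score).
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Return the indices of amounts more than `threshold` standard
    deviations from the mean -- a crude stand-in for the pattern
    detection described above."""
    mu = mean(amounts)
    sigma = stdev(amounts)
    if sigma == 0:
        return []  # all amounts identical: nothing stands out
    return [i for i, a in enumerate(amounts)
            if abs(a - mu) / sigma > threshold]

# Six ordinary purchases and one suspicious outlier at index 6.
history = [42.0, 38.5, 51.0, 44.2, 39.9, 47.3, 4999.0]
print(flag_anomalies(history))  # → [6]
```

Real fraud systems use far richer features (merchant, location, timing) and learned models rather than a single statistic, but the shape of the problem is the same: separate rare anomalous events from a mass of normal activity, fast.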

Many new solutions are being developed to churn through and make sense of this data. Perhaps an old solution is getting its second or third wind: big memory, really big memory, and the ability to trawl through it with high bandwidth and low latency.

Why will this be important?

First, memory is getting cheaper and cheaper. Second, by storing the database in memory one avoids the traditional IO bottleneck, even compared with SSD storage. Third, some really big memory systems are now available, with up to 64TB of shared memory in some cases and growing. These tera-scale memory systems can sift through masses of data, enabling OLTP and OLAP in a single system and creating Hybrid Transactional/Analytical Processing (HTAP) solutions that do not require data duplication.
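The HTAP idea above can be illustrated in miniature (a sketch only, nothing like a tera-scale system): SQLite's in-memory mode keeps the entire database in RAM, so transactional writes and analytical reads operate on the same single copy of the data with no disk IO and no duplication into a separate warehouse. The table and values are invented for the example.

```python
# Minimal HTAP-style illustration: one in-memory store serves both
# transactional inserts (OLTP) and an analytical aggregate (OLAP).
import sqlite3

conn = sqlite3.connect(":memory:")  # the whole database lives in RAM
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")

# "OLTP" side: transactional inserts as they arrive.
rows = [(1, "EMEA", 120.0), (2, "APAC", 75.5), (3, "EMEA", 310.0)]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
conn.commit()

# "OLAP" side: an analytical query over the very same in-memory data,
# with no ETL step copying rows into a separate analytics store.
for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM orders "
        "GROUP BY region ORDER BY region"):
    print(region, total)  # APAC 75.5 / EMEA 430.0
```

Production HTAP systems add columnar layouts, parallel scans, and durability across terabytes of shared memory, but the core benefit is the same: one copy of the data, queried both ways.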

In-memory computing is not new. For several decades the supercomputing, or High Performance Computing (HPC), industry has squeezed out every bit of performance it could with in-memory solutions. Now we are seeing those in-memory HPC techniques applied to everyday big data. HPC to the rescue of the enterprise, one more time.

Steve Campbell
Steve brings a unique blend of expertise to the technology industry. He has held senior VP positions in sales & marketing for HPC and Enterprise vendors including Sun Microsystems and Hitachi. He’s also worked with early stage technology and Internet start-ups helping to raise over $300M for clients.
