Matthew Cook - Software Money Pit Blog
Trends & Technologies

What Is In-Memory Computing?

June 5, 2015 by Matt Cook No Comments

Image: Memory Bus by ARendle, CC license.

In-memory computing, often paired with massively parallel processing, consists of two things: 1) huge amounts of RAM; and 2) huge amounts of processing power.

In-memory computing is another technology leapfrogging the traditional data warehouse. An in-memory architecture uses data that is in the main memory (also known as Random Access Memory, or RAM) of a computer, rather than data on a hard disk.

Data retrieval from a disk is the slowest part of any analytical query: the software has to “find and fetch” the data you want, and a query that touches a very large amount of data simply can’t finish in a feasible amount of time.

You’ve probably already experienced this. I work with people who launch SAP queries that take an hour or more to run. They would like to query even larger amounts of data but don’t bother trying, because they know SAP might stop in midstream or take so long that the information isn’t worth the effort.

An in-memory setup eliminates “find and fetch” because the data isn’t stored on a disk at all; it sits in the application’s main memory, ready for selection and use in your query.
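
To make the difference concrete, here is a minimal sketch in Python using SQLite’s built-in “:memory:” mode. The table, column names, and row counts are invented for illustration, and the gap you measure on one laptop will be far smaller than on warehouse-scale data, but the pattern is the same: the on-disk database has to find and fetch pages, while the in-memory one reads straight from RAM.

import os
import sqlite3
import time

# Build the same one-million-row table twice: once on disk, once in RAM.
# Table and column names are hypothetical.
def build_db(conn):
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    rows = [("EAST" if i % 2 else "WEST", i * 0.01) for i in range(1_000_000)]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

def timed_query(conn, label):
    start = time.perf_counter()
    total = conn.execute(
        "SELECT SUM(amount) FROM sales WHERE region = 'EAST'"
    ).fetchone()[0]
    print(f"{label}: {time.perf_counter() - start:.3f}s (sum={total:,.0f})")

if os.path.exists("sales_demo.db"):
    os.remove("sales_demo.db")           # start fresh on each run
disk = sqlite3.connect("sales_demo.db")  # data lives on disk
mem = sqlite3.connect(":memory:")        # data lives entirely in RAM
for conn in (disk, mem):
    build_db(conn)

timed_query(disk, "on-disk")
timed_query(mem, "in-memory")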

It also means that the way you collect, sort, analyze, chart, use and interpret data should change dramatically – from a fixed and limited process to a more natural and iterative process. The in-memory technology makes it possible to gather information in a way that is a lot like your normal thought process.

Your brain is like an in-memory computer. To make a decision, you first start with the information you have in your head. Then you gather what is missing, using the web, asking questions, reading the newspaper. Your brain immediately processes each new piece of information and sometimes in seconds you’ve made your decision.

This new paradigm – massive data storage connected to super fast computing power – will change what we ask for. No longer will we ask for a report on sales by customer, by date, by region, by product. Instead we will want every single piece of data related to any sale of anything to anyone, say, for the past two years–every single invoice, credit, return, price, discount, the person who sold it, the commission paid on it, the color of the product, the shipment date, delivery data, invoice payment amount, date of payment – everything. This will become the expectation in all areas of an enterprise.
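
As a sketch of what that “ask for everything” style looks like in practice, the Python snippet below loads sale-level detail into memory once, then answers several different questions without going back to disk. The file name and column names are hypothetical, not from any system mentioned here.

import pandas as pd

# Load two years of sale-level detail into RAM once (hypothetical file
# and columns), then explore it iteratively.
sales = pd.read_csv("all_sales_detail.csv",
                    parse_dates=["ship_date", "payment_date"])

# Each question is one in-memory operation, not a new batch report:
top_sellers = sales.groupby("salesperson")["invoice_amount"].sum().nlargest(5)
returns = sales[sales["transaction_type"] == "RETURN"]
slow_payers = sales[sales["payment_date"] >
                    sales["ship_date"] + pd.Timedelta(days=60)]

print(top_sellers)
print(len(returns), "return transactions")
print(len(slow_payers), "invoices paid more than 60 days after shipment")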

Amazon Web Services (AWS) is one place to secure this type of environment. The cost of 20 to 40 terabytes of storage is about the same as the monthly rent of a Manhattan apartment.

Trends & Technologies

Tell Me Again Why I Should Care About Hyperscale Computing?

May 2, 2015 by Matt Cook No Comments

Photo: “Trails in the Sand,” Dubai, by Kamal Kestell, CC license

If “Humanscale” computing is managing bags of sand, “Hyperscale” computing is managing each individual grain of sand in every bag.

“Hyperscale” computing (HC) is the processing of data, messages or transactions on a scale orders of magnitude larger than traditional computing.  HC is becoming a necessity for many businesses.  Why?

Consider a company that sells bottled water.  Its main business used to be selling truckloads full of cases of water to big grocery chains.  It has 25 different products, or Stock Keeping Units (SKUs).  The big grocery chains then distributed the cases of water to their stores, which numbered 20,000.  The data requirements for the water company’s computers were manageable, even as the company grew rapidly.

Now, the company wants to analyze the performance of its products on store shelves by measuring things like velocity (how fast the product turns), price compared to competing products, and out-of-stocks.  Its customers — the big grocery chains — are offering to supply data from their systems on every scan of every product in every store, because they too want to improve the performance of products on the shelf.

In one month during the summer, about 3.5 billion bottles of water are sold.  A data file from just one big grocery chain runs to 3 million lines.  How and where will you process this data?  Traditional databases will be too slow.  You will need superfast databases that hold data in memory and distribute the work across many servers (in-memory, massively parallel computing).  This is an example of hyperscale computing.
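
As a small illustration of that parallel pattern (not a production design), the Python sketch below splits a scan file into chunks and aggregates each chunk on a separate worker process, then combines the partial results.  The file and column names are hypothetical; at true hyperscale, the same split-aggregate-combine idea runs across many servers with a distributed engine such as Spark, not across one machine’s cores.

import pandas as pd
from multiprocessing import Pool

# Sum units scanned per SKU within one chunk of the file.
def velocity_per_sku(chunk: pd.DataFrame) -> pd.Series:
    return chunk.groupby("sku")["units"].sum()

def main():
    # Stream the (hypothetical) multi-million-line scan file in chunks
    # and farm each chunk out to a worker process.
    chunks = pd.read_csv("store_scans.csv", chunksize=500_000)
    with Pool(processes=8) as pool:
        partials = pool.map(velocity_per_sku, chunks)
    # Combine the per-chunk aggregates into one answer.
    totals = pd.concat(partials).groupby(level=0).sum()
    print(totals.nlargest(10))  # the ten fastest-moving SKUs

if __name__ == "__main__":
    main()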

Other examples where you would need HC: selling direct to consumers through their smartphones, where you might have to process millions of transactions, say, during the Christmas holiday season; gathering machine data every second to monitor a machine’s performance (a General Electric turbofan jet engine generates 5,000 data points per second, which amounts to 30 terabytes every 30 minutes); and managing millions of product-attribute combinations.

The computing tools for hyperscale will not be found in your ERP system.  Trying to engineer your existing systems to handle hyperscale data and transactions will be a costly failure.  But there are tools on the market today, many of them in cloud applications and application hosting providers.

Cloud application and hosting vendors usually have much larger data processing capabilities, including automatic failover and redundant servers, and you can take advantage of that capacity.  For example, a leading application hosting provider can supply 30 terabytes of storage and a massively parallel computing environment for less than the monthly rent of an apartment in New York City.

My advice:

  • Identify areas of your business that are significantly under-scaled, where processing capability falls well short of business needs;
  • Pick one and design a pilot project (many vendors are willing to do this with you at very low cost);
  • Measure results and benefits, and if beneficial, expand the solution to other parts of your business.

It’s probably not OK to ignore this trend.  Even if you don’t need HC today, think about the future and where commerce is going.  If you don’t gain the capability for hyperscale computing, one or more of your competitors probably will.

 


© 2017 Copyright Matthew David Cook // All rights reserved