Matthew Cook - Software Money Pit Blog
Trends & Technologies

An Intro to Analytics Vendors

June 20, 2016 by Matt Cook

Image by David Bleasdale, CC license

Analytics is one of the top buzzwords in business software today. Analytics software is often marketed as a tool for business intelligence, data mining or insights. It’s the crystal ball software: tell me things I don’t already know, and show me ah-hahs or other exciting revelations that, if acted on, will increase sales, cut costs or produce some other benefit.

The essential elements for analytics are:

1) A design for your ‘stack,’ which is just a term for layers: usually a data layer at the bottom, a translation layer above it, and some kind of user interface layer on top. The translation and user interface layers are usually provided by the analytics vendor; you provide a place for data storage.

2) A way to send the data to your data storage, automatically, which is usually referred to as “ETL” or extract, transform, and load. SnapLogic and Informatica are two vendors who offer these tools.

3) Some way to “harmonize” the data, which means defining each data element and how it will be used in analytics: “Sales” will mean such and such, “Gross Margin” will be defined in this way, and so on.

These three components can be on-premise in your building or in a cloud hosted by a vendor.
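To make that stack a little more concrete, here is a minimal Python sketch of pieces 2 and 3: a tiny extract-transform-load job that pulls rows out of a transaction system, applies one harmonization rule (“sales” is defined once, in one place), and loads the result into the storage layer an analytics tool would sit on top of. The table names, fields, and in-memory databases are stand-ins invented for illustration, not any vendor’s API.

```python
# Minimal ETL-plus-harmonization sketch (illustrative only).
# Source/target schemas and field names are assumptions, not a vendor API.
import sqlite3
import pandas as pd

source = sqlite3.connect(":memory:")   # stand-in for a transaction system
target = sqlite3.connect(":memory:")   # stand-in for the data (storage) layer

# Seed the source with a few example order rows.
source.execute("CREATE TABLE orders (order_id TEXT, gross REAL, discounts REAL, returns REAL)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)",
                   [("A1", 100.0, 5.0, 0.0), ("A2", 250.0, 0.0, 25.0)])
source.commit()

# Extract
df = pd.read_sql_query("SELECT * FROM orders", source)

# Transform / harmonize: define once what "sales" means everywhere.
df["sales"] = df["gross"] - df["discounts"] - df["returns"]

# Load into the storage layer the analytics tool will sit on top of.
df[["order_id", "sales"]].to_sql("harmonized_sales", target, index=False)

print(pd.read_sql_query("SELECT * FROM harmonized_sales", target))
```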

SAS, based in North Carolina, has long pioneered this space, and now many business software firms claim to provide “robust analytics.” The problem: what constitutes “analytics”? Canned reports are not analytics. So you’ll need to shop this category knowing that the most serious applications will probably come from firms that are dedicated to analytics.

International Data Corporation (IDC) reports that the business analytics software market is projected to grow at a 9.8% annual rate through 2016. IDC describes the market as dominated by giants Oracle, SAP and IBM, with SAS, Teradata, Informatica and Microstrategy rounding out the top 10 in terms of sales revenue. Although the top 10 account for 70% of the market, IDC reports that “there is a large and competitive market that represents the remaining 30%…hundreds of ISVs (Independent Software Vendors) worldwide operate in the 12 segments of the business analytics market…some provide a single tool or application, others offer software that spans multiple market segments.”

Here are some other interesting analytics or business intelligence (BI) products: Qliktech provides easy-to-develop dashboards with graphical representations as well as tabular and exportable reports. Its Qlikview software is an “in-memory” application, which means that it stores data from multiple sources in RAM, allowing the user to see multiple views of the data, filtered and sorted according to different criteria.

Information Builders (IB) is a software company classified by advisory firm Gartner as a leader in BI applications. IB’s main application, WebFocus, is a flexible, user-friendly tool that is popular with sales teams because salespeople use it while visiting customers to enhance their selling messages with facts and visual interpretations of data.

WebFocus has a “natural language” search capability, making it useful for monitoring and analyzing social media.

Birst, named by Gartner as a challenger in the BI space, is a cloud-based (SaaS) application that offers “self-service BI,” deployment to mobile devices, adaptive connectors to many different types of data sources, in-memory analytics, drill-down capabilities, and data visualization. The Birst tool also has a data management layer, allowing users to link data, create relationships and indexes, and load data into a data store. Tableau is a similar vendor.

It’s useful to start small and experiment with analytics. People in your organization with good quantitative skills and imagination can experiment with tools, usually at very low cost. Soon you will see some interesting results and will want to do more. But make sure to put in place some rules about what constitutes sanctioned and official “analytics” in your organization, to prevent uncontrolled proliferation of unvalidated information.

Trends & Technologies

What Is Data Visualization?

December 20, 2015 by Matt Cook

A data visualization of LinkedIn connections. Image by Luc Legay, CC license

Frank Luntz is a professional pollster who uses the visualization of data to show the sentiments of viewers as they watch political ads. The technique uses a moving second-by-second graph to show when exactly during an ad viewers felt positive or negative toward the content of the ad. Viewers use a handheld device with buttons for positive and negative, and press each one according to their sentiment as they view the ad.

Mr. Luntz could have simply had each viewer fill out a questionnaire about the ad – what did they like and what didn’t they like? You would then see numeric totals and percentages related to each question, but you wouldn’t see exactly when during the ad viewers had positive or negative feelings. The second-by-second gathering of data draws a much clearer picture.

That is what data visualization is about.
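To make the idea tangible, here is a small sketch that plots invented second-by-second sentiment data with Python and matplotlib; the numbers are made up for the example, not Mr. Luntz’s actual results.

```python
# Plot invented second-by-second audience sentiment for a 30-second ad.
# The data are made up for illustration; only the technique matters here.
import matplotlib.pyplot as plt
import random

random.seed(1)
seconds = list(range(30))
# Net sentiment per second: positive presses minus negative presses.
net_sentiment = [random.randint(-10, 10) for _ in seconds]

plt.plot(seconds, net_sentiment, marker="o")
plt.axhline(0, color="gray", linewidth=0.8)
plt.xlabel("Second of ad")
plt.ylabel("Net sentiment (positive minus negative presses)")
plt.title("Second-by-second audience reaction")
plt.show()
```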

Many software vendors offer products in this category, and many of those vendors are start-ups. Some claim the ability to merge all kinds of data, including Twitter feeds, into a coherent picture. That may be the case, but my advice is to treat this area as formative – i.e., not yet mature and therefore somewhat experimental.

I think technology will make it easy to index every single event in your enterprise and to display in real time a visual interpretation of all of those events interacting with one another. Executives and managers will no longer look at static tables of numbers or even graphs or charts; they will be able to “watch” their business in real time and see a future visualized picture of their business, much like a weather forecast is shown in graphical terms.

Some advice, if you want to experiment in this area:

  • Find an area of your business where there is complete mystery, and where a vivid picture holds promise for a breakthrough development;
  • Make sure you have a way of capturing the data;
  • Try and buy: vendors will often conduct a pilot for you at little or no cost.
Trends & Technologies

Data Virtualization vs. Data Visualization

August 26, 2015 by Matt Cook

Image: visualization of data showing places in New York City frequented by tourists (red) and locals (blue); by Eric Fisher; Creative Commons license.

Another emerging segment of the analytics software market is data virtualization (DV), referred to by some as Information-as-a-Service (IaaS), which enables access to multiple data sources, usually in real time, without the time and expense of traditional data warehousing and data extraction methods.

Forrester Research defines DV as solutions that “provide a virtualized data services layer that integrates data from heterogeneous data sources and content in real-time, near-real-time or batch as needed to support a wide range of applications and processes.” Data Visualization, on the other hand, refers to methods of displaying data in a highly visual way, with the purpose of finding a display mechanism that reveals more insight than traditional reporting methods (see “What Is Data Visualization?”).

Traditional BI or analytics methods rely on some form of data warehousing, in which pieces of data are extracted, usually from transaction systems, transformed or “normalized” (i.e., “formatted”), and stored in tables according to some type of schema. “Customer Account Number,” for example, may belong in the “Customer” table, and so on. As covered in the book, building a data warehouse and getting it to work right can take years, and require substantial technical skills that even many mid-sized to large companies just don’t have.

Data Virtualization aims to overcome this disadvantage by leaving data in their original sources and letting you view and manipulate them inside the DV tool or layer to build your analysis. In simple terms, a DV tool is supposed to let you “see” sources of data in different applications and databases, and to “select” data from those sources for your queries or analysis.
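Stripped of any vendor specifics, you can picture the idea as one query interface sitting in front of several live sources, with nothing copied into a warehouse. The toy sketch below fakes two sources as in-memory functions and joins them at query time; real DV products do the same thing against remote systems through their own connectors, and every name in the sketch is an invention for illustration.

```python
# Toy sketch of a "virtual" data layer: query two separate sources on demand
# and join them at read time, with no warehouse copy. The sources, fields,
# and join key are invented for illustration.
import pandas as pd

def read_erp_orders():
    # Stand-in for a live connection to a transaction system.
    return pd.DataFrame({"customer_id": [1, 2, 3],
                         "order_total": [120.0, 75.5, 310.0]})

def read_crm_accounts():
    # Stand-in for a live connection to a CRM application.
    return pd.DataFrame({"customer_id": [1, 2, 3],
                         "region": ["East", "West", "East"]})

def virtual_view():
    # The "virtualized data services layer": fetch both sources at query time.
    orders = read_erp_orders()
    accounts = read_crm_accounts()
    return orders.merge(accounts, on="customer_id")

# Ask a question across both systems without building a warehouse first.
print(virtual_view().groupby("region")["order_total"].sum())
```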

While it’s feasible to connect directly to external applications and other data sources, whoever owns or manages a given application or data source may prevent you from connecting directly: for security reasons, to avoid overloading the database, to avoid corrupting the data, or simply because the data is proprietary and the provider allows access only through an environment external to the data source. These are some of the barriers I have encountered.

Forrester estimates an $8 billion market for DV software and notes that the current market is dominated by big companies such as SAP, Oracle, Informatica, Microsoft and Red Hat, along with specialized firms like Composite Software, Denodo Technologies and Radiant Market.

Experimenting on a small scale is a good idea here. Vendors are willing to show you capabilities and run small pilots to prove out the concept you are considering the software for.

Trends & Technologies

Internet of Things: Three Practical Uses

June 2, 2015 by Matt Cook

Yes, your new fridge can be an internet-enabled Thing, and you can text it to check the beer supply, possibly avoiding a stop on the way home (although, is it possible to have too much beer?)

Alas, which of life’s many hardships will technology eliminate next? The smart fridge is cool, but it’s about as necessary as a lawn ornament (no offense to lawn ornament fans).

What about the breakthrough, make-the-world-a-better-place uses for IoT?

In business, I see three promising areas:

Inventory: Good, cheap, durable sensors attached to inventory could cut losses and improve accuracy. RFID isn’t good enough in many cases, although that is changing: Airbus uses RFID tags to track thousands of airplane seats and life vests, and a major Japanese clothing retailer has applied RFID tags to everything in its stores, including inventory, hangers, and merchandising displays.

Retail: Already some stores are starting to use sensors to detect when inventory on the shelf is low. If the trend continues and accuracy is good, this could be a revolution in retail inventory tracking, which is currently done by scanning UPC codes. As the cost of sensors drops, more and more (lower value) products can be included in this type of solution. Some hotel mini-bars now sense when items are consumed, eliminating the need to count, write down, and key in how many drinks and snacks a hotel guest had.

Machinery diagnostics: For complex production lines that are difficult to keep running at top performance for long periods, IoT sensors could continually measure and transmit machine parameters, output, speed, consistency of cycles, and other variables to create a visual record of performance that could then be correlated with unplanned downtime; cause and effect could more easily be determined and machine performance improved.
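As a sketch of what that diagnostics idea looks like in practice, the code below correlates invented cycle-time readings with downtime flags to ask a first, simple question: do slowing cycles tend to precede stoppages? The sensor values and threshold are assumptions made up for the example.

```python
# Correlate invented machine cycle times with downtime events (illustrative).
import pandas as pd

# One row per minute: cycle time reported by a line sensor, and whether the
# line went down in the following minute. All values are made up.
readings = pd.DataFrame({
    "cycle_seconds": [4.1, 4.0, 4.2, 4.1, 5.0, 5.6, 6.2, 4.0, 4.1, 5.9],
    "downtime_next": [0,   0,   0,   0,   0,   1,   1,   0,   0,   1],
})

# A simple first pass: do longer cycles tend to precede downtime?
print(readings.corr().loc["cycle_seconds", "downtime_next"])

# Flag minutes where cycle time drifts well above its normal level.
threshold = readings["cycle_seconds"].mean() + readings["cycle_seconds"].std()
print(readings[readings["cycle_seconds"] > threshold])
```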

PINC Solutions, Inc. markets connected sensors and software for managing truck fleets at plants and distribution centers. It is a straightforward, practical application of IoT: trucks carry RFID sensors that uniquely identify them, and software links each truck to delivery numbers, dock doors, destinations, and other information on a giant virtual whiteboard.

The benefits here are easy to understand: measure & reduce wait times at pickup and delivery points, reduce idling and searching in a yard full of trucks for the one you need, and provide real-time on-the-road status and ETA.
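The underlying measurement is simple. Here is a hedged sketch that computes dwell time per truck from gate-in and gate-out timestamps; the event format is an assumption made up for illustration, not PINC’s actual data model.

```python
# Compute per-truck dwell time from gate events (invented data and format;
# not PINC's actual schema).
from datetime import datetime

gate_events = [
    ("TRUCK-101", "in",  "2015-06-02 08:05"),
    ("TRUCK-101", "out", "2015-06-02 09:40"),
    ("TRUCK-202", "in",  "2015-06-02 08:30"),
    ("TRUCK-202", "out", "2015-06-02 11:15"),
]

arrivals = {}
for truck, kind, stamp in gate_events:
    ts = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
    if kind == "in":
        arrivals[truck] = ts
    else:
        dwell = ts - arrivals[truck]
        print(f"{truck}: {dwell.total_seconds() / 3600:.1f} hours in the yard")
```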

For more on this topic, check out this IoT primer published by Goldman Sachs.

For more articles like this, visit my site at softwaremoneypit.com.

Trends & Technologies

Big Data 101

May 10, 2015 by Matt Cook

Image: “Data Center” by Stan Wlechers, CC license

So what is Big Data, particularly Big Data analytics? Why all the hype?

Big Data is what it implies: tons of data. We’re talking millions or billions of rows here – way too much for standard query tools accessing data on a disk.

What would constitute “tons” of data? Every bottle of “spring,” “purified” or “mineral” water that was scanned at a grocery store checkout during the month of July 2011; the brand, the price, the size, the name and location of the store, and the day of the week it was bought. That’s six pieces of data, multiplied by the estimated 3.3 billion bottles of water sold monthly in the United States – roughly 20 billion data points for a single month.

Big Data analytics is the process of extracting meaning from all that data.

The analysis of big data is made possible by two developments:

1) The continuation of Moore’s law; that is, computer speed and memory have multiplied exponentially. This has enabled the processing of huge amounts of data without retrieving that data from disk storage; and

2) “Distributed” computing structures such as Hadoop have made it possible for the processing of large amounts of data to be done on multiple servers at once.
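To make the second point concrete, here is a minimal PySpark sketch that spreads an aggregation across whatever workers a cluster has available. The file and column names are placeholders invented for the example; a Hadoop MapReduce job would express the same idea in a different form.

```python
# Minimal distributed aggregation sketch with PySpark (column and file names
# are placeholders; any Hadoop/Spark cluster would run the same logic).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("big-data-101").getOrCreate()

# Read a large extract; Spark partitions the work across executors.
sales = spark.read.csv("water_sales_july.csv", header=True, inferSchema=True)

# Aggregate billions of rows without ever pulling them to one machine.
by_brand = sales.groupBy("brand").agg(
    F.count("*").alias("bottles_sold"),
    F.avg("price").alias("avg_price"),
)
by_brand.show()
```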

The hype you read about Big Data may be justified. Big data does have potential and should not be ignored. With the right software, a virtual picture of the data can be painted with more detail than ever before. Think of it as a photograph, illustration or sketch – with every additional line of clarification or sharpening of detail, the picture comes more into focus.

Michael Malone, writing in The Wall Street Journal, says that some really big things might be possible with big data:

“It could mean capturing every step in the path of every shopper in a store over the course of a year, or monitoring every vital sign of a patient every second for the course of his illness….Big data offers measuring precision in science, business, medicine and almost every other sector never before possible.”

But should your enterprise pursue Big Data analytics? It may already have. If your company processes millions of transactions or has millions of customers, you have a lot of data to begin with.

You need three things to enable Big Data analytics:

  1. A way to get the data, whether out of your transaction systems or from external sources, and into a database. Typically this is done with ETL or Extract, Transform, and Load software tools such as Informatica. Jobs are set up and the data is pulled every hour, day, etc., put into a file and either pushed or pulled into a storage environment.
  2. Superfast data processing. Today, an in-memory database (a database with enormous amounts of RAM and massively parallel processing) can be acquired and used on a software-as-a-service basis from Amazon Web Services at a very reasonable cost.
  3. User interface analytics tools that present the data in the visual form you prefer. Vendors include Oracle, Teradata, Tableau, Information Builders, Qlikview, Hyperion, and many others. The market here is moving toward data visualization via low-cost, software-as-a-service tools that allow you to aggregate disparate sources of data (internal and external systems, social media, and public sources like weather and demographic statistics).
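On a laptop-sized slice of data, the three steps collapse into a few lines. The sketch below stands in pandas and matplotlib for the ETL tool, the in-memory engine, and the visualization layer; the extract file and its columns are invented for illustration.

```python
# Toy version of the three steps: pull an extract, aggregate it in memory,
# and present it visually. File and column names are invented stand-ins for
# the ETL, in-memory, and visualization layers described above.
import pandas as pd
import matplotlib.pyplot as plt

# 1) "ETL": read an extract that a scheduled job dropped into storage.
sales = pd.read_csv("bottled_water_extract.csv")   # brand, store, price, size

# 2) "In-memory processing": aggregate the rows.
by_brand = sales.groupby("brand")["price"].agg(["count", "mean"])

# 3) "User interface": present the result visually.
by_brand["count"].plot(kind="bar", title="Bottles sold by brand")
plt.ylabel("Units")
plt.show()
```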

© 2017 Copyright Matthew David Cook // All rights reserved