Matthew Cook - Software Money Pit Blog
Trends & Technologies

Business Software for Finance 101

June 9, 2016 by Matt Cook

Image by reynermedia, CC license

Finance and accounting functions were among the first to be automated through software. The sheer volume of numbers and calculations, reporting requirements, tax filings and payroll mechanics, plus the fact that nearly every business has to engage in these activities, made the area perfect for software.

When just these basic functions are needed, not much distinguishes one finance application from another. They all post transactions to a cost center and sub-ledger account, they all capture sales and costs and calculate required P&L and balance sheet data, and they all provide reports. They might distinguish themselves in ease of use, report writing, bank account integration, cash management, or some other aspect.
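
To make this concrete, here is a minimal Python sketch of the core bookkeeping mechanic all of these packages share: posting a balanced journal entry against a cost center and account. The names are illustrative, not taken from any particular product.

    from collections import defaultdict

    # Running balances, keyed by (cost center, account).
    ledger = defaultdict(float)

    def post_entry(lines):
        """Post a journal entry; debits and credits must net to zero."""
        if round(sum(amount for _, _, amount in lines), 2) != 0:
            raise ValueError("Unbalanced journal entry")
        for cost_center, account, amount in lines:
            ledger[(cost_center, account)] += amount

    # Record a $500 sale: debit cash, credit revenue.
    post_entry([("1000", "cash", 500.00), ("1000", "revenue", -500.00)])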

Many finance applications are simply bookkeeping systems; if you want real analysis you’ll need to extract data to Excel, Business Objects, or another analysis and reporting tool. My own experience with both Oracle and SAP bears this out: even these leading finance packages are mostly concerned with accounting and financial reporting, not management reporting.

Oracle and SAP both have what they call “business intelligence” capabilities, but they are contained in separate modules that must be purchased and integrated with the core software. So companies can easily spend millions implementing SAP or Oracle, and still find themselves extracting data into Excel spreadsheets for basic business analysis.

My experience is that most finance applications lack budgeting and financial modeling capabilities. It is one thing to know that your prior month results were over budget because of rising fuel prices, and quite another to project the future profit impact of different oil price scenarios. At what point would it make sense to switch to alternative fuels, to pass on some of these increased costs, or to buy oil futures as a hedge? A typical finance application won’t help you answer these questions, because it mostly records and categorizes costs based on what already happened, not what might happen in the future.

Yes, there are “what if” modeling applications available on the market, but as stand-alone applications they aren’t very useful, since you have to enter all of your data manually, as if you were using an Excel spreadsheet. A modeling application needs integration with your ERP to be most effective. Your ERP is the source of all kinds of data needed for financial modeling: production costs, formulas, material costs, transportation costs, revenue by product, as well as cost standards and budget information. This data changes frequently based on business conditions, competition, labor costs, and many other factors.
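
As a simple illustration of the kind of what-if modeling I mean, here is a Python sketch. All of the figures and variable names are hypothetical; in a real model the inputs (fuel volumes, costs, baseline profit) would be pulled from the ERP rather than typed in:

    # Hypothetical what-if model: profit impact of fuel price scenarios.
    annual_gallons = 1_200_000   # fuel consumed by the delivery fleet
    base_price = 3.10            # current price per gallon, in dollars
    baseline_profit = 8_000_000  # current annual profit

    for scenario_price in (3.10, 3.75, 4.50, 5.25):
        extra_cost = annual_gallons * (scenario_price - base_price)
        projected = baseline_profit - extra_cost
        print(f"${scenario_price:.2f}/gal -> projected profit ${projected:,.0f}")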

MicroStrategy, Oracle Hyperion, and Cognos are leading names in the financial modeling and analytics areas, but other, smaller firms are emerging. NetSuite, the ERP-in-the-cloud vendor, offers an add-on financial modeling application. NetSuite’s website states that the modeling application features these capabilities:
• Dynamic formulas and assumptions
• “Actuals” data incorporated into new forecasts
• Workflow management
• Planning of full financial statements
• Unlimited versions for “what-if” analysis
• Multi-dimensional models for complex sales and product planning
• Multiple currency budgeting
• Graphic drag-and-drop report builder
• Multi-version variance reporting (vs. budget, vs. plan, vs. forecast; see the sketch below)
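
To make the last item concrete, here is what multi-version variance reporting boils down to, in a minimal Python sketch with made-up numbers (the general idea, not NetSuite’s implementation):

    # Compare actuals against several plan versions, line by line.
    actuals  = {"revenue": 10_400_000, "cogs": 6_600_000}
    versions = {
        "budget":   {"revenue": 10_000_000, "cogs": 6_200_000},
        "forecast": {"revenue": 10_500_000, "cogs": 6_500_000},
    }

    for name, version in versions.items():
        for line, actual in actuals.items():
            print(f"{line} vs {name}: {actual - version[line]:+,}")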

A3 Solutions is another, smaller firm offering financial modeling applications, either on-premise or as Software-as-a-Service. A3 uses the Excel spreadsheet as the user interface, claiming it is the friendliest environment for creating what-if scenarios, and provides tools, through its Spreadsheet Automation Server, to link multiple sources of corporate data and manage modeling versions dynamically. A3 claims McDonald’s, Honda, Toyota, T. Rowe Price, and American Airlines as clients. Simplicity, speed of implementation, and low cost are A3’s main selling points.

Once you have the “system of record” stabilized in a strong finance application, as well as good controls over product, customer, and sales data, you can start to think about these higher-level analytical tools. Define a standard model for delivering analytics, put someone in charge of the data, and tightly control the “official” analyses that are produced.

Trends & Technologies

Are Your Apps Too Hard to Use?

May 24, 2016 by Matt Cook

You’ve heard the complaints: your systems are clunky and slow, have too many steps, and take too long to execute everyday transactions.

The dialogue probably plays out hundreds of times a day in offices throughout the world: users complain about too-hard-to-use systems, and their IT departments tell them they just don’t know the right way to use them.

This can be a big problem, but its costs and other impacts are not easy to measure. A rough estimate can be had by extrapolating the lost time per user across the enterprise. A 15% hit to people’s productivity because the systems they use slow down their work means you need 1/0.85, or about 1.176, people to do the work of one person.

Extrapolating, work that 500 fully productive people could do will take about 588 people (500 × 1.176), meaning 88 extra heads exist only because you have sub-optimal systems. As convincing as this seems, it’s hard to get the money to improve systems based on this argument alone: with perfectly efficient systems you couldn’t actually cut 88 positions, because the wasted time is spread in small slices across all 500 people.
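
Here is the back-of-the-envelope math spelled out in Python, assuming the 15% hit applies uniformly:

    # A 15% productivity hit means each person delivers 85% of a
    # fully productive person's output.
    hit = 0.15
    headcount = 500

    people_per_person = 1 / (1 - hit)                  # ~1.176
    extra = headcount * people_per_person - headcount  # ~88
    print(f"{people_per_person:.3f} people to do one person's work")
    print(f"~{extra:.0f} extra people across a {headcount}-person org")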

What do you do? Two relatively low-cost options are user interface (UI – what you see when you look at the screen) tools and mobile applications.

UI Tools: There is an active market for these, which are intended to be used with widely-deployed ERP systems like SAP and Oracle. These solutions modify or enhance the system’s UI for simplified navigation and a more intuitive feel, and may combine several steps in a transaction or query into one, like an Excel macro.
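
As a rough illustration of the “macro” idea, here is a Python sketch: one wrapper that drives the several screens of an ERP transaction as a single step. The ErpSession class and its methods are hypothetical stand-ins, not any vendor’s actual API:

    # Hypothetical session object standing in for a real ERP connection.
    class ErpSession:
        def open_transaction(self, name): print(f"open {name}")
        def set_field(self, field, value): print(f"{field} = {value}")
        def submit(self): print("submitted")

    def quick_goods_receipt(session, po_number, qty):
        """Collapse a multi-screen goods-receipt flow into one call."""
        session.open_transaction("goods_receipt")
        session.set_field("po_number", po_number)
        session.set_field("quantity", qty)
        session.submit()

    quick_goods_receipt(ErpSession(), "PO-4711", 10)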

One company marketing UI solutions (Winshuttle) claims to “turn everyday SAP users into heroes who transform the way their companies work.”

Solutions like this are only relevant for those companies that have full control over their systems environments – companies that own their own “instance” of the ERP system, versus those who use a SaaS ERP or one that is shared across many different business units. This is because you’ll need access “under the hood” to configure these tools.

Mobile: Mobile applications are sometimes a shortcut to simplified ERP transactions. A mobile application, out of necessity, must have minimal steps involving minimal data entry. No one wants a Windows version of the ERP system on their 5-inch smartphone screen.

This forces the software to consolidate steps in the transaction and pre-populate fields with user data and settings. If a given ERP transaction involves 5 or 6 steps on a desktop, it will likely require only 2 or 3 steps on a mobile device.
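
A sketch of why that works: most of what the desktop flow asks for can be pre-populated from the user’s profile and defaults. All field names here are hypothetical:

    # Defaults pulled once from the user's profile.
    user_defaults = {"plant": "US01", "cost_center": "4100",
                     "approver": "jsmith", "currency": "USD"}

    def mobile_purchase_order(item, qty):
        """Two inputs from the user; everything else is defaulted."""
        po = dict(user_defaults)       # pre-populated fields
        po.update({"item": item, "qty": qty})
        return po

    print(mobile_purchase_order("toner-cartridge", 12))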

Several of the large ERP vendors already have mobile versions of the most frequently used transactions, such as purchase orders and purchase order approvals.

You can always design your own mobile applications (there’s no shortage of people creating new smartphone apps), and doing so can lead to some very creative results that have a huge impact on user morale.

Trends & Technologies

Big Data: Correlations, Not Cause-and-Effect

February 18, 2016 by Matt Cook

Image by Marcos Gasparutti, CC license

In their recently published book, “Big Data: A Revolution That Will Transform How We Live, Work, and Think,” Viktor Mayer-Schönberger and Kenneth Cukier say that big data will provide a lot of information that can be used to establish correlations, not necessarily precise cause and effect.

But that might be good enough to extract the value you need from big data.

Three examples from their book:

  1. Walmart discovered a sales spike in Pop-Tarts when storms were in the forecast. The correlation was also true of flashlights, but selling more flashlights made sense; selling more Pop-Tarts didn’t.
  2. Doctors in Canada can now act to prevent severe fevers in premature infants, because of a link between a period when the baby’s vital signs are unusually stable and, 24 hours later, a severe fever.
  3. Credit scores can be used to predict which people need to be reminded to take a prescription medicine.

Why did the people involved in the above examples compare such different sets of data? One possible reason: because they could, relatively quickly and at low cost, thanks to superfast data processing and cheap memory. If you can mash together all kinds of data in large volumes, and do so relatively cheaply, why wouldn’t you keep going until you found some correlations that looked interesting?

You can begin experimenting with Big Data, a process I endorse. You need three basic components:

  1. A way to get the data, whether out of your transaction systems or from external sources, and into a database.
  2. Superfast data processing (a database with enormous amounts of RAM and massively parallel processing). This can be had on a software-as-a-service basis from Amazon and other vendors.
  3. Analytics tools that present the data in the visual form you want (a minimal sketch follows this list). Vendors include Oracle, Teradata, Tableau, Information Builders, QlikView, Hyperion, and many others.
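
For a taste of what step 3 looks like in practice, here is a minimal sketch using Python’s pandas library; the file names and columns are hypothetical:

    import pandas as pd

    # Hypothetical daily files: sales by store, and weather by store.
    sales = pd.read_csv("daily_store_sales.csv")  # store_id, date, pop_tart_units, ...
    weather = pd.read_csv("daily_weather.csv")    # store_id, date, storm_warning, ...

    # Mash the two data sets together and see what moves with storms.
    df = sales.merge(weather, on=["store_id", "date"])
    print(df.corr(numeric_only=True)["storm_warning"].sort_values())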

Correlations are usually easier to spot visually. And visualization is where the market seems to be going, at least in terms of hype and vendor offerings. New insights are always welcome, so we shall see what sells and what doesn’t.

The assessment from Gartner seems about right to me at this point: big data is both 1) currently in the phase they call the “trough of disillusionment” and 2) promising enough that its use in BI will grow sharply.

Trends & Technologies

Big Data 101

May 10, 2015 by Matt Cook

Image: “Data Center” by Stan Wiechers, CC license

So what is Big Data, particularly Big Data analytics? Why all the hype?

Big Data is what it implies: tons of data. We’re talking millions or billions of rows here – way too much for standard query tools accessing data on a disk.

What would constitute “tons” of data? Every bottle of “spring,” “purified” or “mineral” water that was scanned at a grocery store checkout during the month of July 2011; the brand, the price, the size, the name and location of the store, and the day of the week it was bought. That’s six pieces of data, multiplied by the estimated 3.3 billion bottles of water sold monthly in the United States: roughly 20 billion data points for a single month.

Big Data analytics is the process of extracting meaning from all that data.

The analysis of big data is made possible by two developments:

1) The continuation of Moore’s law; that is, computer processing speed and memory capacity have grown exponentially. This has enabled huge amounts of data to be processed in memory, without repeatedly retrieving it from disk storage; and

2) “Distributed” computing structures such as Hadoop, which have made it possible to process large amounts of data on many servers at once.
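
The distributed idea is simple to illustrate. This toy Python sketch splits a data set four ways, counts in parallel, and then combines the partial results; a real Hadoop cluster does the same thing across many machines:

    from multiprocessing import Pool

    def count_water(chunk):
        """Count water-bottle rows in one shard of the data."""
        return sum(1 for row in chunk if row["product"] == "water")

    if __name__ == "__main__":
        rows = ([{"product": "water"}] * 1_000_000
                + [{"product": "soda"}] * 500_000)
        shards = [rows[i::4] for i in range(4)]        # split the data 4 ways
        with Pool(4) as pool:
            partials = pool.map(count_water, shards)   # "map" step
        print(sum(partials))                           # "reduce" step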

The hype you read about Big Data may be justified. Big data does have potential and should not be ignored. With the right software, a virtual picture of the data can be painted with more detail than ever before. Think of it as a photograph, illustration or sketch – with every additional line of clarification or sharpening of detail, the picture comes more into focus.

Michael Malone, writing in The Wall Street Journal, says that some really big things might be possible with big data:

“It could mean capturing every step in the path of every shopper in a store over the course of a year, or monitoring every vital sign of a patient every second for the course of his illness….Big data offers measuring precision in science, business, medicine and almost every other sector never before possible.”

But should your enterprise pursue Big Data analytics? It may already have. If your company processes millions of transactions or has millions of customers, you have a lot of data to begin with.

You need three things to enable Big Data analytics:

  1. A way to get the data, whether out of your transaction systems or from external sources, and into a database. Typically this is done with ETL (Extract, Transform, and Load) software tools such as Informatica; a minimal sketch follows this list. Jobs are set up and the data is pulled every hour, day, etc., put into a file, and either pushed or pulled into a storage environment.
  2. Superfast data processing. Today, an in-memory database (a database with enormous amounts of RAM and massively parallel processing) can be acquired and used on a software-as-a-service basis from Amazon Web Services at a very reasonable cost.
  3. User interface analytics tools that present the data in the visual form you prefer. Vendors include Oracle, Teradata, Tableau, Information Builders, QlikView, Hyperion, and many others. The market here is moving toward data visualization via low-cost, software-as-a-service tools that allow you to aggregate disparate sources of data (internal and external systems, social media, and public sources like weather and demographic statistics).
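
To ground step 1, here is a self-contained Python sketch of the extract-transform-load pattern, using SQLite in memory as a stand-in for both the source ERP and the analytics store; all table and column names are made up:

    import sqlite3

    # Toy "ERP" source so the example runs on its own.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE orders (sku TEXT, order_date TEXT, quantity INT)")
    src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [("WTR-16OZ", "2016-07-01", 24), ("WTR-1L", "2016-07-01", 6)])

    # Toy analytics store.
    dst = sqlite3.connect(":memory:")
    dst.execute("CREATE TABLE sales_fact (sku TEXT, day TEXT, units INT)")

    # Extract from the source, then load into the store. (A real job would
    # also transform: clean dates, map codes, filter bad rows.)
    rows = src.execute("SELECT sku, order_date, quantity FROM orders").fetchall()
    dst.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)", rows)
    dst.commit()
    print(dst.execute("SELECT * FROM sales_fact").fetchall())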

© 2017 Copyright Matthew David Cook // All rights reserved