Throwback Thursday: Avoiding the Two Biggest Mistakes in TCO Analysis
For this week’s Throwback Thursday, we’re headed WAAAAAY back to June 17, 2012.
Why back that far? Simple. The lessons in this post ring true even today.
Most TCO analyses are flawed. What's perceived as common sense blinds people to what's missing from their analysis.
Read this blog entry by Evan McDonnell, and learn the two common mistakes to avoid when doing a total cost of ownership analysis.
I hope you find value in this post!
An important factor in the buying decision is determining the “total cost of ownership.” For example, in purchasing a new car, you take into consideration not just the selling price, but the gas mileage, expected breakdown frequency, average cost of repairs, and so on!
The same general approach applies when evaluating big tech investments that you hope will be the backbone of your operations for the next 10 or 15 years. But the factors to consider are very different in technology purchases. It's imperative to get this process right if you want a system that fits your needs, provides the highest benefit, and has the lowest lifecycle cost of ownership.
BUT… there are two common mistakes in TCO analysis that everyone should note and avoid.
Mistake #1 – Using Only Hard Cost Input Numbers
Choosing a technology because its costs are easier to define and appear lower, without regard to fit, can lead to a fiasco.
A TCO model for a large software purchase almost always ends with an Excel spreadsheet comparing well-crunched numbers side by side. With that end in mind, buyers set off with a focus on hard numbers they can enter into their models. The first mistake they make is limiting scope to upfront cost, annual maintenance, and expected upgrade purchase costs from the vendor.
Why? It leaves so much out of the picture!
For starters, how well does the software package fit the needs of the intended users? If it is commercial off-the-shelf (COTS), how easily can it be adjusted to meet new requirements? Software that enables faster work, more collaboration, and better management visibility can have a dramatic impact on a company’s bottom line, yielding benefits that are often many multiples of the cost.
Evaluating how software can make an organization more effective is straightforward in some cases (e.g. new production management software that increases daily output by 10%). But it is frequently not a hard-number exercise. Compare two software packages: one that follows traditional design and keeps project information within a department, versus another with built-in context around process events that can break down silos. How much more productive would your organization be if decision times could be cut in half (as CME Group experienced)? What's the value of the competitive advantage that would create? While specific numbers may be hard to generate, these benefits have to be included in an evaluation. Forward-thinking business and government leaders are driving their organizations to think this way.
Mistake #2 – Not Planning Ahead for Unexpected Change
One of the things I love about financial analysis in Excel is that everything lines up neatly. The sense of order that comes from seeing factors displayed and calculated builds comfort and satisfaction. But often this is a false sense of comfort. Reality is full of disruptions and unplanned events. It's hard to fit those neatly into rows, columns, and cells, so analysts tend to ignore them, at the organization's peril. The longer the time horizon for the analysis, the greater the impact of this mistake.
While working as a business strategy consultant much earlier in my career, I distinctly remember a colleague telling me he just heard about this thing called the “world wide web.” Talk about a disruption! Any financial model I created for a client at that time that went out more than five years was flawed because it didn’t account for the explosive growth of the internet. The same is true for any financial analysis built within the past several years that didn’t account for the explosive growth of mobile communication and social networks.
How do you incorporate unplanned events into your TCO model? History teaches us that disruptive events in your business will happen. You will either incur the costs of adapting your software or incur the organizational costs of not having software that meets your needs. (It's not pretty when organizations find their software isn't flexible. It spawns manual processes outside the application, which hurts every type of metric.) Recognize that your software will have to be adapted to meet new conditions. The longer the planning horizon, the more adaptations you'll require.
Ask these questions: Can this particular software be adapted? By whom? At what cost? Probe for example costs both for changes that seem easy and for ones involving core logic changes.
The Right Way to Build a TCO Model
Here’s what a TCO model that avoids these mistakes and accounts for all costs and benefits looks like:
Application start-up costs:
- Software licenses expense – be sure to estimate any overage charge if the license isn’t a per-user model
- Server and infrastructure costs – required only if the application is not in the cloud
- Required customization to meet initial needs – this is easy to scope as part of an evaluation process (but you must account for a shortened application lifecycle if customizations prevent upgrades to the base product)
- Professional services for installation – just to get the software functional in your environment
- Cost of training users – this can be high if the application logic doesn’t match your organization’s work logic
- Costs to enable users if access requires anything other than a standard web browser (e.g. a thin client on the desktop)
On-going application support costs:
- Annual software maintenance costs to the vendor
- Internal costs of supporting the application
- Expected costs of full vendor upgrades to get enhanced functionality (usually every 2-3 years)
Costs of adapting the software to meet changed conditions:
- Who can adapt the application – you the client, a class of third-party developers, or only the software vendor?
- Expected frequency of changes – both minor (e.g. workflow, input variables) and major (changes to core logic)
- Expected cost per change (be sure to ask vendors for rough orders of magnitude for a range of changes)
Benefits of the application:
- Improved productivity of direct users (e.g. 20% more transactions processed per day)
- Better insights for management through easily tailored reporting
- Built-in mobile apps to allow process participation from any mobile device
- Improved communication and decision making across departmental boundaries (only applies if the application has built-in collaboration around process event feeds)
- Speeding of information and decisions across departmental boundaries
- Improved management control
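To see how the buckets above combine, here is a minimal sketch of the spreadsheet math in Python. Every figure, category name, and the 10-year planning horizon are hypothetical assumptions for illustration, not numbers from this post; in practice you would plug in your own estimates, including ranges for the soft benefits discussed under Mistake #1.

```python
# Hypothetical TCO sketch -- all dollar amounts, frequencies, and the planning
# horizon below are illustrative assumptions, not real vendor figures.

HORIZON_YEARS = 10  # assumed planning horizon

# One-time application start-up costs
startup = {
    "software_licenses": 250_000,
    "server_and_infrastructure": 40_000,   # zero if the app is in the cloud
    "initial_customization": 60_000,
    "installation_services": 30_000,
    "user_training": 25_000,
}

# On-going annual support costs
annual = {
    "vendor_maintenance": 50_000,
    "internal_support": 35_000,
}

# Periodic full vendor upgrades (assumed every 3 years)
upgrade_cost, upgrade_every_n_years = 80_000, 3

# Costs of adapting the software to changed conditions
changes_per_year, avg_cost_per_change = 2, 15_000

# Estimated annual value of the benefits (productivity, insight, speed)
annual_benefit = 250_000

def total_cost(horizon: int = HORIZON_YEARS) -> int:
    """Sum start-up, recurring, upgrade, and adaptation costs over the horizon."""
    cost = sum(startup.values())
    cost += sum(annual.values()) * horizon
    cost += upgrade_cost * (horizon // upgrade_every_n_years)
    cost += changes_per_year * avg_cost_per_change * horizon
    return cost

def net_value(horizon: int = HORIZON_YEARS) -> int:
    """Benefits minus total cost of ownership over the horizon."""
    return annual_benefit * horizon - total_cost(horizon)

print(f"{HORIZON_YEARS}-year TCO:       ${total_cost():,}")
print(f"{HORIZON_YEARS}-year net value: ${net_value():,}")
```

Note that the adaptation line (change frequency times cost per change, scaled by the horizon) is exactly the factor Mistake #2 warns against omitting: over a long horizon it can rival the license cost itself.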
There's one more factor that defies quantification in a TCO model but that experienced managers find a way to account for: the trajectory of the vendor behind the software application. If your planning cycle requires you to view this purchase as a 10-to-15-year investment, then part of your purchase is a bet on the vendor's continued innovation and expansion of the core application. Be sure to reflect this in your analysis.
Failed IT investments are common news in the business press; the Federal government has had several that recently gained a lot of publicity. Many factors can derail big-scope, long-term projects, but the seeds of failure are often planted early. A complete and proper TCO analysis can help avoid these early sources of project failure.
Vice President of Solutions