Tuesday, May 1, 2012

The Art of the Possible with Business Analytics


It has been established beyond doubt that data and its analysis can have a huge impact on an organization’s top line and bottom line. Business Analytics helps organizations deliver better business performance in two ways: by optimizing business processes and by helping them innovate. Optimization makes organizations efficient and effective by taking inefficiencies out of business processes and focusing on high-impact opportunities. Innovation, on the other hand, helps organizations uncover new customer segments, product categories, markets, business models, and more.

The styles of analyzing data are manifold, from answering questions like “what is going on?” to “why are things the way they are?” to “what will happen if I do X or Y?” to “what does the future look like?” Broadly speaking, the styles of analytics can be classified into three categories:

·         Exploratory Analysis: The objective of exploratory or investigative analysis is the exploration and analysis of complex and varied data, whether structured or unstructured, for information discovery. This style of analysis is particularly useful when the questions aren’t well formed or the value and shape of the data isn’t well understood.

·         Descriptive Analytics: The objective of this style of analysis is to answer historical or current questions like “what is going on?” and “why are things the way they are?” This is the most common style of analysis, and here the questions as well as the value and shape of the data are well understood.

·         Predictive Analysis: Predictive analysis aims to paint a picture of the future with some reasonable certainty.

So, what’s the art of the possible with business analytics? It’s the application of the above three styles of analytics to a business scenario for better insights, decisions, and results. Let’s explain this with an example. Consider the following scenario:

You are a financial services firm, e.g. a large bank, trying to improve profitability. You read Larry Selden’s book “Angel Customers and Demon Customers,” agree with its finding that the top 20% of your customers bring in 80% of the profits, and would like to manage your business as a portfolio of customers as opposed to a portfolio of products. So, how do you do that? The answer is business analytics.

You can start by using descriptive analytics techniques like operational reports, ad-hoc query, and dashboards on data collected from different sources like sales and customer service to determine the profitability of each customer. You can then use predictive analysis techniques like data mining and statistical analysis to further enrich your customer data into profitability segments: high, medium, low, and loss-making customers. Finally, you can choose different customer service channels like a personal banker, phone, or ATM to cost-effectively serve your customers; e.g. a high-profitability customer can be served by a personal banker free of charge, but if a loss-making customer wants a personal banker, there will be a charge. Once you have implemented such programs, you can use exploratory analysis to gauge sentiment across social media channels like Facebook and Twitter to see if the programs are working as desired. Better yet, you may come up with innovative new business models like mobile banking or online-only banking to improve profitability.
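
To make the flow above concrete, here is a minimal sketch in Python using pandas. All customer data, profit thresholds, and channel mappings are hypothetical, invented purely for illustration; a real bank would derive segments from a data mining or statistical model rather than fixed cutoffs.

```python
import pandas as pd

# Descriptive step: combine revenue and cost-to-serve per customer (toy data).
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "revenue":     [12000, 900, 4500, 150],
    "cost":        [3000, 1200, 2500, 400],
})
customers["profit"] = customers["revenue"] - customers["cost"]

# Segmentation step: bucket customers into profitability segments.
# (A real model would use data mining / statistical techniques, not fixed bins.)
bins   = [float("-inf"), 0, 1000, 5000, float("inf")]
labels = ["loss-making", "low", "medium", "high"]
customers["segment"] = pd.cut(customers["profit"], bins=bins, labels=labels)

# Prescriptive step: map each segment to a cost-effective service channel.
channel = {"high": "personal banker", "medium": "phone",
           "low": "ATM/online", "loss-making": "ATM/online (fee for banker)"}
customers["service_channel"] = customers["segment"].map(channel)

print(customers[["customer_id", "profit", "segment", "service_channel"]])
```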

That’s the art of the possible, powered by business analytics. Stay tuned; I intend to publish more examples from different industries to show the art of the possible with business analytics.



Thursday, March 29, 2012

Does your analytic solution tell you what questions to ask?

Analytic solutions exist to answer business questions. Conventional wisdom holds that if you can answer business questions quickly and accurately, you can make better business decisions, achieve better business results, and outperform the competition. Most business questions are well understood (read: structured), so they are relatively easy to ask and answer. Questions like “what were the revenues, cost of goods sold, and margins?” and “which regions and products outperformed or underperformed?” are relatively well understood, and as a result most analytics solutions are well equipped to answer them.
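
A quick sketch of how routine such structured questions are: margin by region reduces to one aggregate. The table and column names below are made up for illustration.

```python
import pandas as pd

# Hypothetical sales records; in practice this would come from a warehouse.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "revenue": [100.0, 250.0, 300.0, 80.0],
    "cogs":    [60.0, 150.0, 210.0, 70.0],
})

# One aggregate answers "which regions outperformed/underperformed?"
summary = sales.groupby("region", as_index=False)[["revenue", "cogs"]].sum()
summary["margin_pct"] = 100 * (summary["revenue"] - summary["cogs"]) / summary["revenue"]
print(summary)
```
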
Things get really interesting when you are looking for answers but don’t know what questions to ask in the first place. That’s like an explorer looking to make new discoveries by exploration. An example of this scenario is the Centers for Disease Control and Prevention (CDC) in the United States trying to find the vaccine for the latest strain of the swine flu virus. The researchers at the CDC may try hundreds of options before finally discovering the vaccine. The exploration process is inherently messy and complex: it is fraught with false starts, one question or hunch leads to another, and the final result may look entirely different from what was envisioned in the beginning. Speed and flexibility are key; speed so that hundreds of possible options can be explored quickly, and flexibility because almost everything about the problem, the solutions, and the process is unknown.
Come to think of it, most organizations operate in an increasingly unknown or uncertain environment. Business leaders have to make decisions based on a largely unknown view of the future. And since the value proposition of analytic solutions is to help business leaders make better decisions, for best results consider adding information exploration and discovery capabilities to your analytic solution. Such exploratory analysis capabilities will help business leaders perform even better by empowering them to refine their hunches, ask better questions, and make better decisions. That’s your analytic system not only answering questions but also suggesting what questions to ask in the first place.
Today, most leading analytic software vendors offer exploratory analysis products as part of their analytic solution offerings. So, what characteristics should be top of mind while evaluating the various solutions? The answer is quite simply the same characteristics that are essential for exploration and analysis: speed and flexibility. Speed is required because the system has to be agile enough to handle hundreds of different scenarios with large volumes of data across large user populations. Exploration happens at the speed of thought, so make sure your system is capable of operating at the speed of thought. Flexibility is required because the exploration process is full of unknowns from start to finish: unknown questions, answers, and hunches. So, make sure the system is capable of managing and exploring all relevant data, structured or unstructured (databases, enterprise applications, tweets, social media updates, documents, texts, emails, and so on), and provides a flexible, Google-like user interface to explore it quickly.
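
To make “one flexible index over data of any shape” concrete, here is a toy sketch, not any vendor’s product: a single keyword index over records from mixed sources, queried the way an analyst chases a hunch. All records and the indexing scheme are invented for illustration.

```python
from collections import defaultdict

# Hypothetical records from very different sources: a tweet, an email, a table row.
records = [
    {"type": "tweet", "text": "Mobile banking app keeps crashing"},
    {"type": "email", "text": "Customer asked about mobile banking fees"},
    {"type": "row",   "text": "region=West product=mobile-banking revenue=1.2M"},
]

index = defaultdict(set)                  # term -> ids of records containing it
for i, rec in enumerate(records):
    for term in rec["text"].lower().replace("=", " ").split():
        index[term].add(i)

def explore(term):
    """Return every record mentioning the term, regardless of source or shape."""
    return [records[i] for i in sorted(index.get(term.lower(), []))]

print(explore("banking"))                 # one hunch leading to the next...
```
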
Getting Started
You can help business leaders become “Decision Masters” by augmenting your analytic solution with information discovery capabilities. For best results, make sure the solution you choose is enterprise class and allows advanced, yet intuitive, exploration and analysis of complex and varied data, including structured, semi-structured, and unstructured data. You can learn more about Oracle’s exploratory analysis solutions by clicking here.

Wednesday, February 22, 2012

5 Facts that SAP won't tell you about HANA


SAP has been touting HANA as an innovative, breakthrough technology and the next “big thing,” aspiring to ride the HANA hype to the #2 spot among database vendors. Well, it’s time to dig deeper and bring forward 5 facts that SAP won’t tell you about HANA.

#1: HANA is an in-memory database. So, where’s the innovation?
SAP has positioned HANA as its newest, most innovative, category-defining offering. However, HANA is an in-memory database, which may be a new category for SAP but has existed in the market for years. A quick Wikipedia search reveals that in-memory databases have been around since the 1990s, and today there are 40+ independent offerings, of which HANA is one. Oracle alone has three in-memory database offerings with successful products like TimesTen, Berkeley DB, and MySQL. Introduced in the 1990s, Oracle’s TimesTen remains an early innovator and a leader in this space. HANA, introduced in 2011, is the youngest member of the group.
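
For readers unfamiliar with the category: an in-memory database keeps the working set in RAM rather than on disk. A minimal illustration of the concept using Python’s built-in sqlite3 module follows; TimesTen, HANA, and the like are full-fledged products, so this shows only the basic idea.

```python
import sqlite3

conn = sqlite3.connect(":memory:")        # the database lives entirely in RAM
conn.execute("CREATE TABLE kv (k TEXT PRIMARY KEY, v TEXT)")
conn.execute("INSERT INTO kv VALUES ('hello', 'world')")
print(conn.execute("SELECT v FROM kv WHERE k = 'hello'").fetchone())
conn.close()                              # contents vanish with the process
```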

#2: HANA adoption is growing rapidly. So, where’s the growth?
SAP will show numbers like FY 2011 revenues of $200M and 100+ customers to underscore HANA’s rapid customer adoption. Putting these numbers in perspective: Vertica, the largest independent in-memory database vendor before being acquired by HP in 2011, was on track to deliver revenues of around $100M with 200+ customers and an over 100% year-over-year growth rate. Oracle remains the leader in the data warehouse platform market, with FY 2010 revenues of close to $3B and thousands of customers. Given Oracle’s and Vertica’s impressive performance, HANA’s numbers, while good, are hardly “rapid.”

#3: HANA is enterprise ready. So, where’s the manageability and reliability?
It takes years to develop and perfect a product as complex as a database management system. The Oracle database has been perfected over 30+ years and billions of dollars in R&D investment. TimesTen has been around for 15 years and is still being aggressively developed and perfected. SAP would like you to believe that HANA is enterprise ready from day one, but dig deeper and you’ll find that HANA lacks basic features like clustering, high availability, file system persistence, and ACID-style transaction integrity support. HANA lacks referential integrity support, so there is NO means to ensure the integrity of data stored in a HANA database. HANA’s support for locks and transaction isolation is primitive, so multi-user concurrency is an issue. Hopefully you get the picture: HANA is an immature version 1 DBMS that is far from ready to support mission-critical enterprise applications.
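
What referential integrity buys you, for readers who haven’t hit the problem: the database itself refuses rows that reference data which doesn’t exist. A toy demonstration with sqlite3 as a stand-in (the schema is hypothetical; this is not HANA or TimesTen code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")          # enforce referential integrity
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id))""")
conn.execute("INSERT INTO customers VALUES (1)")

try:
    # Orphan row: no customer 99 exists, so the database rejects the insert.
    conn.execute("INSERT INTO orders VALUES (1, 99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)                         # FOREIGN KEY constraint failed

conn.close()
```

Without such a constraint, nothing stops the orphan row from landing in the table, which is the integrity gap the paragraph above describes.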

#4: HANA is non-disruptive. So, where’s the plug and play?
HANA has limited support for standard ANSI SQL. In fact, HANA requires applications to be custom written for it using non-standard SQL. In my view this is a major showstopper. In this day and age, when every vendor is working diligently to support openness and application integration via service-oriented architecture and web services, in comes HANA with SAP’s age-old vision of a closed system with no access to the underlying data structures. HANA takes vendor lock-in to new levels by limiting your choice of applications and reporting and analysis tools to the few offered by SAP.

#5: HANA is an appliance. So, where’s ease and speed of deployment?
Wikipedia defines a computer appliance as hardware and software pre-integrated and pre-configured before delivery to the customer, providing a “turn-key” solution to a particular problem. The benefits of appliances include ease and speed of deployment with lower risk and faster time to value. With HANA, you buy hardware, software, networking switches, and storage from different vendors. There isn’t a single point of support, and with different vendors having markedly different development and upgrade cycles, it’s excruciatingly hard to test, configure, certify, and update the joint solution.

In conclusion, SAP’s larger-than-life HANA definitely underscores the strategic importance of data management and analysis to organizations, but due to the limitations highlighted above, it is far from ready to support mission-critical enterprise applications. Customers should consider mature technologies like the TimesTen-based Oracle Exalytics and Oracle Exadata for their in-memory analytics needs.

Tuesday, February 7, 2012

Big Data Analytics – The Journey from Transactions to Interactions


Big Data Defined

Enterprise systems have long been designed around capturing, managing, and analyzing business transactions, e.g. marketing, sales, and support activities. Lately, however, with the evolution of automation and Web 2.0 technologies like blogs, status updates, and tweets, there has been explosive growth in machine- and consumer-generated data. Defined as “Big Data,” this data is characterized by attributes like volume, variety, velocity, and complexity, and essentially represents machine and consumer interactions.

Case for Big Data Analysis

Machine and consumer interaction data is forward looking in nature. This data, available from sensors, web logs, chats, status updates, tweets, etc., is a leading indicator of system and consumer behavior. It is therefore the best indicator of a consumer’s decision process, intent, and sentiment, and of system performance. Transactions, on the other hand, are lagging indicators of system or consumer behavior. By definition, leading indicators are more speculative and less reliable than lagging indicators; however, to predict the future with any confidence, a combination of both leading and lagging indicators is required. That’s where the value of big data analysis comes in: by combining system and consumer interactions with transactions, organizations can better predict the consumer decision process, intent, sentiment, and future system performance, leading to revenue growth, lower costs, better profitability, and better-designed systems.
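
A minimal sketch of the leading-plus-lagging idea: past transactions (lagging) and interaction signals like sentiment and web activity (leading) feed one predictive model. The data, feature names, and model choice below are all hypothetical, chosen only to illustrate combining the two kinds of indicators.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Per customer: [past_purchases (lagging), sentiment, site_visits (both leading)]
X = np.array([[5,  0.9, 30],
              [0,  0.8, 25],
              [4, -0.6,  2],
              [1, -0.4,  1]])
y = np.array([1, 1, 0, 0])                # did the customer buy again next quarter?

model = LogisticRegression().fit(X, y)
# Predicted purchase probability for a new customer, using both signal types.
print(model.predict_proba([[2, 0.7, 20]])[0][1])
```

Note how the two loyal-looking transaction histories alone (rows 1 and 3) would be ambiguous; the leading signals are what separate the classes in this toy set.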

So, which business areas will benefit from big data analysis? Think of areas where decision-making under uncertainty is required. Areas like new product introduction, risk assessment, fraud detection, advertising and promotional campaigns, demand forecasting, inventory management, and capital investments will particularly benefit from having a better read on the future.

Figure 1: Combination of big data and transactional data delivers better insights and business results


Big Data Analytics Lifecycle
The big data analytics lifecycle includes three steps: acquire, organize, and analyze. Big data, or consumer interaction data, is characterized by attributes like volume, velocity, and variety, and common sources include web logs, status updates, and tweets. The analytics process starts with data acquisition. The structure and content of big data can’t be known upfront and are subject to change in flight, so data acquisition systems have to be designed for flexibility and variability: no predefined data structures; dynamic structures are the norm. The organize step entails moving the data into well-defined structures so relationships can be established and data across sources can be combined to get a complete picture. Finally, the analyze step completes the lifecycle by providing rich business insights for revenue growth, lower costs, and better profitability. Flexibility being the norm, the analysis systems should be discovery-oriented and explorative as opposed to prescriptive.
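
The lifecycle on a toy scale, as a hedged sketch: records arrive with varying shape (acquire), get flattened into a common structure (organize), then ordinary aggregation applies (analyze). All field names and records below are invented for illustration.

```python
import json
import pandas as pd

# Acquire: schema varies record to record, as with web logs or status updates.
raw = [
    '{"user": "a", "event": "view",  "product": "loan"}',
    '{"user": "b", "event": "tweet", "text": "love the new mobile app"}',
    '{"user": "a", "event": "view",  "product": "card"}',
]
events = [json.loads(line) for line in raw]

# Organize: impose a common structure; fields missing from a record become NaN.
df = pd.DataFrame(events)

# Analyze: relational-style analysis now applies across all the sources at once.
print(df.groupby("event").size())         # activity mix across sources
```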

Getting Started
Oracle offers the broadest and most integrated portfolio of products to help you acquire and organize these diverse data sources and analyze them alongside your existing data to find new insights and capitalize on hidden relationships. Learn how Oracle helps you acquire, organize, and analyze your big data by clicking here.


Figure 2: Oracle’s engineered system solution for big data analytics

Tuesday, January 24, 2012

Oracle Exalytics Pricing explained - a wonderful product at a wonderful price


Warren Buffett famously said, and I quote (with some edits), “It’s far better to buy a wonderful company (product) at a fair price than a fair company (product) at a wonderful price.” Conventional wisdom has it that quality doesn’t come cheap. Well, in this day and age, where conventions are broken every day, it’s time to think differently. What if the best analytics solution in the world were available at a bargain-basement price?

Oracle recently announced the pricing for Exalytics, the industry’s first in-memory analytics machine, and as conventional wisdom would have it, a number of articles were published putting Exalytics pricing in the millions-of-dollars range. But once again, continuing with the glowing tradition of “why not,” Oracle is out to prove the conventional wisdom wrong. Drum roll, please… NOW YOU CAN GET EXALYTICS FOR MUCH LESS THAN THE MILLION-DOLLAR MARK. No gimmicks, no discounts, all based on the list price.
Exalytics includes three components: hardware, software, and support.
Hardware Cost:
1)      The list price for Exalytics hardware is $135,000.
Software Cost:
Exalytics includes two software components:
2)      TimesTen In-Memory Database for Exalytics: priced at $300 per named user (100-user minimum) or $34,500 per processor.
3)      Oracle BI Foundation Suite: priced at $3,675 per named user (100-user minimum) or $450,000 per processor.
Support Cost:
4)      Annual support covers Exalytics hardware ($29,700), TimesTen In-Memory Database for Exalytics ($66 per user or $7,590 per processor), and Oracle BI Foundation Suite ($808.50 per user or $99,000 per processor).

So, what’s the total cost of deploying an analytic system with 100 users?
Exalytics cost for a 100-user system = hardware (item 1) + 100 × TimesTen (item 2) + 100 × BI Foundation Suite (item 3) + annual support (item 4) = $135,000 + (100 × $300) + (100 × $3,675) + ($29,700 + 100 × $66 + 100 × $808.50) = $135,000 + $30,000 + $367,500 + $117,150 = $649,650
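
The same 100-user list-price math, spelled out in a few lines of Python. The figures are the list prices quoted in this post, not current Oracle pricing.

```python
users    = 100
hardware = 135_000                          # Exalytics hardware list price
timesten = 300 * users                      # TimesTen: $300 per named user
bi_suite = 3_675 * users                    # BI Foundation: $3,675 per named user
support  = 29_700 + 66 * users + 808.50 * users   # hardware + per-user support

total = hardware + timesten + bi_suite + support
print(f"${total:,.2f}")                     # $649,650.00
```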

Now, discounts of up to 50% are quite common in the software world. I don’t know how much Oracle discounts, but assuming a conservative 50% discount rate, we are looking at a 100-user system powered by 1 TB of RAM, 40 CPU cores, and market-leading BI and in-memory database technology for roughly $325K. That’s about $3,250 per user. Compare this to the recurring $3K per user per year Salesforce charges for its Sales Cloud, or the $5K per month pricing offered by a cloud-based BI provider.

In this day and age, where we are moving away from “whys” to “why nots,” I think Oracle Exalytics definitely proves the conventional wisdom wrong by delivering the best value at the best price. This might even make Warren Buffett revise his quote: “a wonderful company (product) at a wonderful price.”