The Big Data Phenomenon

Posted: February 14th, 2013 | Author: Juhana Enqvist | Filed under: Industry Insights

When the Big Data buzz started to surface a few years ago, my first thought was “what’s the big fuss about this?” Many of the use cases were very simple compared to what Comptel has been doing for its customers for decades. Data aggregation plus advanced analytics is a combination Comptel is well used to.

Maybe the reason was that these use cases were new to businesses outside the telco world, where massive volumes of events go through very complex real-time processing precisely because general enterprise-scale systems cannot cope with such amounts of data. Traditional data warehouse systems are ill-equipped, and very cost-inefficient, for workloads like these.

Comptel’s bread and butter in the mediation business has always been doing complex things under high-volume, real-time requirements for enterprise systems that cannot do so cost-efficiently. And lately, as you’ve probably noticed, by joining forces with top-class analytics expertise, we are already tackling the problem of Big Data quite adequately!

Even today, in terms of web-scale data processing needs, Comptel is in the same league as the big players. Twitter’s data volumes and processing complexity look like a small-scale operation compared to what Comptel does with its real-time event processing engine. On top of that, every single event is potentially tied to a billable item, which raises the accuracy and auditing requirements to a completely different level.

Why, then, the sudden, booming buzz around Big Data? The answer: growing volumes of behavioral data, and the ability to analyze them.

Promise of Big Data

There are numerous drivers that have created both the need and the opportunity for Big Data.

•    Improved customer experience and contextual intelligence, by connecting emotionally with customers at the right time
•    Cloud computing and the improving cost-performance ratio of computation and storage
•    The smartphone and mobile app phenomenon, creating more data that can be turned into information
•    The ubiquity of social media, where users create content that can be turned into information
•    Advanced analytics reaching the level of Artificial Intelligence

Lots of seemingly unrelated, already over-repeated buzzwords. “Is this guy trying to bore me to death?”, you might ask. Well, yes I am, but that’s not the point here.

What is really going on here is, of course, the ability to offer previously unseen levels of personalized service at very low cost. This is achieved with the next level of automation – the automation of decision-making. The promise is the same as with the previous automation cycles:
•    Increased throughput or productivity.
•    Improved quality, or increased predictability of quality.
•    Improved robustness (consistency) of processes or products.

Looking back a couple of centuries, the first automation wave concentrated on manufacturing and hard labour, then moved on to the automation of simple services, like telephone switching and bank clerks’ work. Similar examples within Comptel’s world include telecom network engineers who manually activated new subscribers, or invoicing accountants who read the impulse meter counters, calculated the difference and created invoices from that. And now, decision-making is next in line.

Advances in the cost of computing, smartphones, apps and the cloud have together enabled completely new types of services and lowered the barrier to entry. With smartphones and app stores, anyone can theoretically create a service that reaches a billion paying customers. Cloud IaaS providers like Amazon EC2 keep your entry barrier low, while letting your service scale rapidly with the growth of your business.

With the massive scale of software deployment in the mobile world (‘a billion downloads of Angry Birds already in 2012!’, ‘Temple Run 2 was downloaded 50 million times in less than two weeks’), it is actually a very valid revenue model to give the software away for free and then rely on upselling (the freemium model) or advertising as the source of income. Average revenue per Facebook user was $4.34 in 2011 – without the user paying anything directly to Facebook. To further optimize revenue, customer behavior analysis is required to figure out the optimal prices and the optimal contextual timing of offers – the right offer at the right time.
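To make “figuring out the optimal price” a bit more concrete, here is a toy sketch in Python. The demand curve and the candidate price points are invented for illustration – a real system would learn the purchase-probability model from behavioral data – but the principle is simply: pick the price that maximizes expected revenue per user.

```python
# Toy price optimization: choose the offer price that maximizes
# expected revenue per user. The purchase-probability model is a
# made-up logistic curve, standing in for one learned from real
# behavioral data.
import math

def purchase_probability(price):
    """Hypothetical demand model: willingness to buy drops with price."""
    return 1.0 / (1.0 + math.exp(2.0 * (price - 2.0)))

def expected_revenue(price):
    return price * purchase_probability(price)

candidate_prices = [0.99, 1.99, 2.99, 4.99, 9.99]
best = max(candidate_prices, key=expected_revenue)
print("best price: $%.2f, expected revenue per user: $%.2f"
      % (best, expected_revenue(best)))
```

The same structure extends to the timing side: replace the price grid with candidate offer moments, and the demand model with a context-dependent response model.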

The real game changer is that, for the first time ever, usage and behavioral data is available at a scientifically quantifiable, individual, very granular level – and customers give the information willingly. This makes it possible to analyze individual behavior and offer personalized services based on what the individual really wants. Intelligence at every touchpoint creates an emotional connection, and the feeling of being cared for has a strong influence on customer loyalty. Many are willing to switch to a service provider who offers this kind of personalized service, even at a higher cost.

Mr. Orwell might have a say about this kind of intelligence, though. With such a dramatic shift in data exploitation and inference capabilities, it is a very welcome fact that privacy laws and regulations exist.

But it doesn’t end there. By combining behavioral predictions with IT and network topology and capabilities, it is possible to automate the configuration and management of the whole service platform. Imagine automated capacity balancing across data centers within a country, a continent or even the globe, according to predictions of usage, weather and so on. Predicted capacity growth could trigger automatic network and hardware expansions through an intelligent Fulfillment solution, including Resource Inventory and Order Management. Software Defined Networking is a very interesting ingredient in all this, allowing new, software-driven dynamics in network configuration.
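As a rough illustration of what prediction-triggered expansion could look like – the site names, numbers and naive trend forecast below are invented stand-ins, not an actual Comptel interface – consider a loop that compares forecast peak load against each data center’s capacity and emits expansion orders for a fulfillment system to execute:

```python
# Hypothetical forecast-driven capacity expansion: if the predicted
# peak load approaches a data center's capacity, emit an expansion
# order. A real deployment would use a proper forecasting model
# instead of this naive linear extrapolation.
HEADROOM = 0.8  # start expanding once forecast exceeds 80% of capacity

def forecast_peak(daily_peaks, days_ahead=30):
    """Naive linear trend over recorded daily peak loads."""
    growth_per_day = (daily_peaks[-1] - daily_peaks[0]) / (len(daily_peaks) - 1)
    return daily_peaks[-1] + growth_per_day * days_ahead

def expansion_orders(sites):
    orders = []
    for name, capacity, daily_peaks in sites:
        predicted = forecast_peak(daily_peaks)
        if predicted > HEADROOM * capacity:
            orders.append("expand %s: forecast %.0f vs capacity %.0f"
                          % (name, predicted, capacity))
    return orders

sites = [("helsinki-1", 100000, [62000, 64500, 67200, 70100]),
         ("kuala-lumpur-1", 80000, [30000, 30200, 30100, 30400])]
print(expansion_orders(sites))  # only helsinki-1 triggers an order
```

In a real setting, the emitted orders would feed into Order Management, which would work out the concrete delivery and configuration steps.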

Soon, the entire supply chain – component manufacturing, device assembly, delivery, installation, configuration – could be completely automated, without any human intervention.

Welcome, Skynet!

Comptel for Big Data

In all this, Comptel is in a unique position.
1.    With Comptel’s massive experience in data stream shaping – efficiently processing, enriching, aggregating and correlating massive amounts of structured and unstructured data into meaningful business value – creating scalable, real-time, actionable insights out of Big Data is business as usual for us, alongside the standard analytics toolset of typical Data Warehouse/Business Intelligence tools (see the first sketch after this list).
2.    With the world-class predictive analytics algorithms of Comptel Social Links, Comptel can create very accurate, actionable predictions about individual subscribers from the data already flowing through the Comptel EventLink platform. This saves an enormous amount of time and effort by reducing integration costs. As anyone involved in mediation or data warehousing projects can tell you, the cost of integrating data into a single place is very high. To support fast, easy and cost-efficient integration, Comptel Social Links includes a suite of ready-made, productized use cases, which utilize Social Network Analytics and machine learning for predicting customer attributes and behavior, and for automatically optimizing decisions in both customer- and network-facing business systems.
3.    The Comptel Fulfillment product suite is already geared towards automated, data-driven process execution. The fundamental idea behind Catalog- and Inventory-driven Order Management is that, instead of using static process definitions, the software figures out the optimal way of fulfilling the order. This makes it possible to introduce predictive analytics based on service and product usage trends, such as order periodization and execution optimization, and hardware capacity management.
4.    Using the insights from usage and network quality data, combined with an understanding of network capacity, Comptel can control the perceived network quality through Predictive Policy Control and the analysis of other quality-of-service data. The predictive control can be used, for example, to temporarily provide exceptional QoE for customers with high expected future value, when those customers need it most – optimizing revenue within the limits of finite network resources (sketched below).
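To make point 1 above more concrete, here is a minimal, hypothetical sketch of stream shaping – validating, enriching and aggregating usage events in a single streaming pass. The field names and the enrichment lookup are invented for illustration; this is not Comptel EventLink’s actual API.

```python
# Hypothetical stream shaping: validate, enrich and aggregate raw
# usage events in one pass. In production this would run continuously
# against a live event stream, not an in-memory list.
from collections import defaultdict

# Stand-in for a reference-data lookup (e.g. CRM or inventory).
SUBSCRIBER_SEGMENT = {"358401234567": "premium", "358407654321": "basic"}

def shape(raw_events):
    """Normalize, enrich and aggregate a stream of raw usage records."""
    usage = defaultdict(int)  # (subscriber, segment) -> total bytes
    for event in raw_events:
        if event.get("bytes", 0) <= 0:                           # validate
            continue
        subscriber = event["msisdn"]
        segment = SUBSCRIBER_SEGMENT.get(subscriber, "unknown")  # enrich
        usage[(subscriber, segment)] += event["bytes"]           # aggregate
    return dict(usage)

events = [{"msisdn": "358401234567", "bytes": 1200},
          {"msisdn": "358401234567", "bytes": 800},
          {"msisdn": "358407654321", "bytes": -5},  # malformed, dropped
          {"msisdn": "358407654321", "bytes": 300}]
print(shape(events))
```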
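And a similarly simplified sketch of the Predictive Policy Control idea from point 4: under congestion, grant temporary QoS boosts to the subscribers with the highest predicted future value, within a limited budget. The value scores and thresholds are invented for illustration.

```python
# Toy predictive policy control: when a cell is congested, pick the
# subscribers with the highest predicted future value for a temporary
# QoS boost, up to a fixed number of boosted slots.
def boost_candidates(value_scores, cell_load,
                     max_boosts=2, congestion_threshold=0.9):
    """value_scores maps subscriber id -> predicted future value
    (e.g. a churn-weighted lifetime-value score from an analytics model)."""
    if cell_load < congestion_threshold:
        return []  # no congestion, no need to prioritize anyone
    ranked = sorted(value_scores, key=value_scores.get, reverse=True)
    return ranked[:max_boosts]

scores = {"sub-a": 950.0, "sub-b": 120.0, "sub-c": 610.0}
print(boost_candidates(scores, cell_load=0.95))  # ['sub-a', 'sub-c']
```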

Juhana Enqvist is Chief Architect at Comptel