The 3 Horsemen of the Legacy System Apocalypse

Solutions Review Interview with David Flower

April 24, 2018

The world is speeding up; people expect customized information and services immediately. In these tumultuous times, some companies cling to their legacy data infrastructure as a security blanket. However, traditional RDBMSes simply cannot provide the massive scale, edge distribution, and virtual or cloud deployments that modern applications require. In particular, three market drivers spell the end of legacy systems: 5G, the Internet of Things (IoT), and machine learning (ML).

5G: Not an Evolution — a Revolution

There was a lot of hoopla surrounding 4G, and one could be forgiven for thinking that the transition to 5G would be similar. But make no mistake: the leap from 4G to 5G is far larger than the leap from 3G to 4G. 5G requires network slicing, multiple edge deployments, and much lower latencies than 4G. With 5G, communications service providers (CSPs) can massively expand application possibilities and capacity. However, these possibilities come with stringent requirements: for each call, the system has to know who the caller is, where they are, what the caller's policy is, whether they have credit, and more, all within milliseconds. Legacy systems simply cannot keep up.
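
To make the per-call requirement concrete, here is a minimal Python sketch of the kind of admission check a CSP might run for every call. The subscriber record, blocked-cell list, and latency budget are all invented for illustration; a production system would resolve these against an in-memory data platform rather than local dictionaries.

    import time

    # Illustrative in-memory stand-ins for subscriber, policy, and balance data.
    SUBSCRIBERS = {"alice": {"plan": "prepaid", "credit": 12.50}}
    BLOCKED_CELLS = {"cell-99"}        # cells excluded by policy, for example
    LATENCY_BUDGET_MS = 10             # invented per-call budget, not a 5G spec value

    def admit_call(caller_id: str, cell_id: str) -> bool:
        start = time.perf_counter()

        sub = SUBSCRIBERS.get(caller_id)                 # who the caller is
        if sub is None:
            return False
        allowed_here = cell_id not in BLOCKED_CELLS      # where they are / what policy allows
        has_credit = sub["credit"] > 0.0                 # do they have credit

        elapsed_ms = (time.perf_counter() - start) * 1000
        return allowed_here and has_credit and elapsed_ms <= LATENCY_BUDGET_MS

    print(admit_call("alice", "cell-42"))   # True: known caller, allowed cell, credit left

The point of the sketch is the shape of the decision, not the data: identity, location, policy, and balance all have to be resolved within a few milliseconds, for every single call.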

Internet of Things – Analytics Powering Decisions

Analyst firm Gartner predicted that around 6.4 billion IoT devices would be in use worldwide in 2016, and a McKinsey report estimates that IoT could represent around 11% of the world's economy by 2025. That is roughly 0.84 devices per person worldwide, with explosive growth all but assured. Most legacy infrastructures simply do not meet the stringent requirements of IoT applications: you will have to deal with millions, if not billions, of sensors sending data to your data centers every second.

Processing all that data is one challenge; making it useful is an entirely different mountain. The most successful IoT applications will not only ingest the deluge of data but also make decisions on it in real time. Consider smart meters that monitor usage and conditions such as weather and time of day, and then adjust the price of service accordingly. Since usage and conditions can change at any instant, you need real-time monitoring and decision-making to get the most out of each meter.
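
As a rough illustration of the smart-meter example, the following sketch prices a single reading from current usage and conditions. The base rate, peak hours, and multipliers are made up for this example, not taken from any real tariff.

    from dataclasses import dataclass

    @dataclass
    class MeterReading:
        usage_kwh: float       # usage reported in the current interval
        temperature_c: float   # weather condition at the meter
        hour: int              # time of day, 0-23

    BASE_RATE = 0.12           # invented base price per kWh

    def price_per_kwh(reading: MeterReading) -> float:
        rate = BASE_RATE
        if 17 <= reading.hour <= 21:       # evening peak demand
            rate *= 1.5
        if reading.temperature_c > 30:     # hot weather drives cooling load
            rate *= 1.2
        if reading.usage_kwh > 5.0:        # heavy usage in this interval
            rate *= 1.1
        return round(rate, 4)

    print(price_per_kwh(MeterReading(usage_kwh=6.2, temperature_c=33.0, hour=19)))  # 0.2376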

Another example is using IoT in manufacturing or high-performance computing. Sensors monitor the performance and condition of the hardware and its environment, and the system makes adjustments to achieve the best result. This use case requires accurate real-time decisions that reflect conditions now, not what would have been best five minutes ago.
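
A toy version of that feedback loop might look like the sketch below, which adjusts a fan speed from the latest temperature reading. The setpoint, dead band, and step size are all illustrative; the takeaway is that each decision acts on the newest reading rather than a stale one.

    def adjust_fan_speed(current_rpm: int, temp_c: float,
                         target_c: float = 65.0, step_rpm: int = 100) -> int:
        """Return a new fan speed based on the most recent temperature reading."""
        if temp_c > target_c + 2.0:
            return current_rpm + step_rpm              # running hot: spin up
        if temp_c < target_c - 2.0:
            return max(0, current_rpm - step_rpm)      # running cool: spin down
        return current_rpm                             # within the band: hold

    speed = 1200
    for temp in (63.0, 68.5, 70.1, 64.2):              # stream of recent readings
        speed = adjust_fan_speed(speed, temp)
    print(speed)                                       # 1400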

Machine Learning — Acting on Knowledge Now

Most companies are looking at how they can leverage ML to get ahead of their competition, and they should: ML is going to cause massive waves in almost every industry. However, while there is plenty of talk and action around parts of ML, most people are missing the crucial final component: implementation. You can create the most sophisticated model possible, but it doesn't actually help you unless you have a way to use it. You can use your classic data lake or warehouse to train and update your models, but unless you can apply those models in real time, what's the point?

Legacy infrastructure simply cannot keep up with ML applications. For example, an ML-powered fraud detection system must ingest and process potentially hundreds of data points per card swipe, with thousands of card swipes per second. In this use case the importance of accuracy is evident: false positives mean angry customers, and undetected fraud hurts the bottom line. The value of real-time implementation is just as evident: you can stop fraud instead of fixing it after the damage is done.
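
To illustrate the shape of such a system, here is a hedged sketch of per-swipe scoring in Python. The weights stand in for a model trained offline (for instance in a data lake or warehouse), and the features and decline threshold are invented for the example.

    import math

    # Pretend these weights were learned offline; in practice they come from the trained model.
    WEIGHTS = {"amount": 0.004, "foreign_country": 1.8, "minutes_since_last": -0.01}
    BIAS = -3.0
    DECLINE_THRESHOLD = 0.9

    def fraud_probability(swipe: dict) -> float:
        z = BIAS + sum(WEIGHTS[k] * swipe[k] for k in WEIGHTS)
        return 1.0 / (1.0 + math.exp(-z))              # logistic score in [0, 1]

    def decide(swipe: dict) -> str:
        return "decline" if fraud_probability(swipe) >= DECLINE_THRESHOLD else "approve"

    swipe = {"amount": 950.0, "foreign_country": 1, "minutes_since_last": 3}
    print(decide(swipe), round(fraud_probability(swipe), 3))   # decline 0.929

The design point is the split of work: training and retraining happen offline, while only the lightweight scoring runs on the hot path, once per swipe, within the transaction.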

Another use case where real time is essential to ML implementation is hyper-personalization. When a user opens an app or loads a website, they want to see content that is personalized to them. Unpersonalized, or worse, improperly personalized, content harms user experience and retention. Non-real-time personalization also means you could be missing offers users would be willing to take in that moment. Making correct personalization decisions in real time with ML keeps users engaged with your content and offers.
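
A minimal sketch of that decision, assuming an invented offer catalog and a simple tag-overlap score over the user's most recent session events:

    from collections import Counter

    # Invented offer catalog; tags describe what each offer is about.
    OFFERS = {
        "travel-deal":   {"flights", "hotels"},
        "sports-pass":   {"football", "streaming"},
        "reader-bundle": {"news", "longform"},
    }

    def pick_offer(recent_events: list) -> str:
        """Pick the offer whose tags best overlap the user's recent session events."""
        seen = Counter(recent_events)
        return max(OFFERS, key=lambda offer: sum(seen[tag] for tag in OFFERS[offer]))

    print(pick_offer(["football", "football", "news", "streaming"]))   # sports-pass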

Staying Ahead of the Tide

Legacy systems are dying, and replacing them with real-time alternatives lets you use the most current data to provide the best service possible. Markets are moving faster than ever, and you need to stay ahead of the latest trends and disruptors to ensure success. Like most disruptors, 5G, IoT, and ML interact with one another to create new possibilities and challenges.

To learn more about how market leaders such as Openet, Huawei, and FT.com are using Volt Active Data to achieve these use cases (and more), watch this Solutions Review interview with our CEO, David Flower.
