Legacy application modernization is a continuous, pragmatic approach to digital
transformation, based on a data mesh architecture and operational model.
Digital business transformation doesn't happen overnight. It’s a continuous cycle designed to improve existing business models, build new ones, and discover and use emerging technologies. The challenge for IT is to be agile and responsive enough to stay in sync with, or better yet ahead of, digital business demand.
Yet legacy systems, which run so much of a company’s mission-critical business, hold their data and logic “captive”, making it costly, risky, and slow to participate in, and add value to, digital business transformation.
A pragmatic approach to address these obstacles is continuous application modernization founded on a data mesh operational model and data architecture. This paper explains why and how.
To support the agility needed to do business digitally, an enterprise needs to adopt a continuous delivery methodology for its application portfolio. Legacy applications, while integral to the business, often create obstacles and delays that negatively impact IT responsiveness. Application leaders would be wise to incrementally transform these legacy systems into the basis of a platform for digital business.
Since “lift and shift” projects (i.e., lifting apps from on-premises environments and shifting them to the cloud) are often too costly, risky, and time-consuming, consider an iterative approach, enabled by a modern data integration platform that supports legacy modernization.
Continuous legacy application modernization is a systematic and iterative methodology, greatly enabled by taking a data product approach, with a focus on quick time to value and minimized risk.
Put simply, continuous software delivery requires continuous legacy application modernization.
There’s a wide variety of legacy application modernization methods to choose from, based on different levels of cost, complexity, risk, and, ultimately, business value.
The “7Rs” framework provides various migration options. As you review them, keep in mind that the degree of modernization ranges from none (e.g., rehosting, replacing) to complete (e.g., refactoring, reimagining):
Retire the applications you don’t need any more.
Retain on-premises applications that are too complex, or costly, to migrate, but back up their data to the cloud.
Rehost applications quickly in the cloud.
Re-platform applications that need to run on a different operating system in the cloud.
Replace applications for which better, and/or cheaper, SaaS solutions are available.
Refactor applications that need significant code rework for the cloud, decoupling from other systems as needed.
Reimagine business processes in the cloud by redefining and enhancing core value propositions.
Before embarking on an application modernization program, ask yourself a few basic questions: Why modernize at all? What does legacy application modernization entail? What are its benefits and challenges?
Most large enterprises have invested heavily in their application mix, both from a financial and operational perspective. While “legacy” typically has a negative connotation in the application world, legacy systems are often among an organization’s most mission-critical and reliable applications.
Few enterprises would be willing to retire such applications and start from scratch. The financial and productivity losses would just be too substantial. So, application modernization has become the most efficient and cost-effective way for enterprises to benefit quickly from newer, more flexible architectures, frameworks, software platforms, and tools.
Legacy applications are often siloed and monolithic, making them difficult to update and integrate, and expensive to scale. Because a legacy application is often self-contained and non-modular, a single enhancement can cause drastic regression issues across the entire system.
They’re expensive to scale for similar reasons. If even one software component has performance issues, it may be necessary to scale up the entire system just to serve that single component – making the cost excessive.
When a system is modernized to a microservices architecture, for example, components are smaller and more loosely coupled, and can be deployed and scaled independently of one another. While this approach has its own set of challenges, it is where much of the core value of modernization lies.
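To make the contrast concrete, here’s a minimal, hypothetical sketch of one capability (order lookup) carved out of a monolith into its own small service. The Flask framework, route, and sample data are assumptions chosen for illustration, not a prescribed implementation; the point is that this component can now be deployed and scaled on its own.

```python
# Hypothetical "order lookup" capability, extracted from a monolith and
# exposed as an independently deployable, independently scalable service.
# Flask is used here purely for illustration.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# In the monolith, this lookup shared a process (and a database) with
# billing, inventory, and UI code; here it owns only its own concern.
ORDERS = {
    "1001": {"order_id": "1001", "status": "shipped", "total": 149.90},
}

@app.route("/orders/<order_id>", methods=["GET"])
def get_order(order_id: str):
    order = ORDERS.get(order_id)
    if order is None:
        abort(404)  # the regression surface is limited to this one service
    return jsonify(order)

if __name__ == "__main__":
    # Scale by running more instances of just this service,
    # rather than scaling up the entire monolith.
    app.run(port=8080)
```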
When it comes to applications, business teams are looking for good fit, value, and agility. Most legacy apps don’t support digital business requirements, and can’t keep up with the pace of change.
On the other hand, IT teams are busy balancing complexity, cost, and risk. For them, legacy systems represent high total cost of ownership (TCO) in terms of enhancing, maintaining, operating, and scaling them. In many cases, the technology, integration, and documentation are simply outdated – resulting in a shortage of skilled manpower. All of this puts business continuity at risk.
An application modernization program benefits business and IT alike: the new apps deliver value sooner, allow for quicker, more agile development and maintenance cycles, and are simpler and less costly to operate.
Legacy application modernization entails updating older (e.g., legacy, siloed, mainframe, etc.) software to more modern computing approaches, including newer computing languages, frameworks, and infrastructure.
It’s sort of like renovating an old house in order to take advantage of improvements in structural integrity, safety, efficiency, and so forth. Rather than retiring an existing system, or replacing it outright, legacy modernization extends the lifespan of enterprise applications, while also benefitting from emerging technologies along the way.
Much of the yes/no debate surrounding application modernization is focused on monolithic, on-premises applications – generally updated and maintained via waterfall development processes – and how those systems can be migrated to the cloud, to benefit from modern service-oriented applications and data.
The benefits of legacy application modernization programs can be summed up as:
Increasing the speed of new feature delivery
Exposing the functionality of existing apps for consumption via APIs or other services
Re-platforming systems from on-premises to cloud environments, for easier scaling and enhanced performance
The 3 main challenges of legacy application modernization are cost, complexity, and risk. Moving a system from on-premises to cloud, without any thought of ROI, makes no sense. On the other hand, there may be apps that can truly benefit from re-platforming, but because they are so heavily coupled to their data and other systems, the complexity of modernization outweighs the benefits.
The key to success in legacy application modernization is in choosing the right strategy – and in pinpointing only those projects that lead to improved customer experience and ROI.
For an enterprise, the ultimate goal is to deliver high-quality software more quickly, and with greater frequency, to enable the business to become more agile and responsive. To do this, enterprises must transform their culture, tools, and processes.
That’s where continuous legacy application modernization comes in. The move to the cloud requires several important steps, such as creating a pilot and foundation for the migration, and then assuring and optimizing it. However, the only way to do digital business is to modernize your legacy applications first.
In the cloud migration process, application modernization drives digital business outcomes.
Digital business transformation is not something that happens once. It’s a continuous process of testing and improving new business models and technologies. Technical teams are doing their best to provide high-quality, timely support for digital business ventures. However, each initiative comes with its own special demands, forcing IT to become more agile and responsive in the process.
To support this process, enterprises should take an iterative, continuous application modernization approach, as illustrated below.
The blue infinity symbol illustrates a continuous agile software delivery cycle.
The gray cycle indicates the legacy modernization steps taken during the Model and Build stages.
However, for all this to happen, you need contextual data about your applications and infrastructure, and their impact on your customers and business. Without this data, you’re entering the modernization process blind, unable to accurately assess, predict, and confirm results, avoid mistakes, or minimize risk.
That’s where a platform based on data products comes into play.
By putting data (instead of applications) in the center, and treating it as a reusable asset, we can say goodbye to application and data silos.
Treating data as a product enables us to apply agile software practices to data, including iterative product definition, engineering, deployment, and adoption. By doing so, we can ensure that data is being used to deliver the desired business outcomes on an ongoing basis.
Once data is productized, applications can be decoupled from their data, with legacy app logic and UI iteratively modernized. The modernized components are based on data products.
A data product is quickly created for specific data consumers, for a specific use case. It can take on different forms, based on the business domain or workload it’s meant to serve. It typically relates to an individual business entity – such as a customer, vendor, invoice, payment, service or device – and can be used in both operational and analytical workloads.
The data for the data product often originates in different source systems, some of which are legacy, and comes in different formats, structures, and technologies.
The data product contains everything a data consumer needs to generate value from the data. This includes its metadata:
Schema, of all database tables
Logic, for processing the raw ingested data
Access methods, such as CDC, JDBC, SQL, streaming, web services, etc.
Synchronization rules, defining sync schedules with source systems
Orchestrated data flows, making the data ready for delivery
Lineage, to any and all source systems
Access controls, such as authentication and credential checking
And its data, which is:
Instantiated, according to the data product's metadata
Unified, cleansed, and masked
Enriched, with offline and real-time insights
Persisted, cached, or virtualized
Audited, by a log that tracks all changes to the data
The data product's data and definition are managed separately: a data product has a single definition, but multiple instances of its data (one for each individual business entity instance, such as each customer).
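As a rough illustration of this separation, the sketch below models a data product’s single definition (the metadata listed above) and its per-entity data instances in plain Python. All class and field names are assumptions made for this example; they don’t represent any particular platform’s API.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DataProductDefinition:
    """One definition per data product: the metadata described above."""
    name: str                                   # e.g., "customer"
    schema: Dict[str, str]                      # schema of all database tables/fields
    processing_logic: Callable[[dict], dict]    # logic for processing raw ingested data
    access_methods: List[str]                   # e.g., CDC, JDBC, SQL, streaming, web services
    sync_rules: Dict[str, str]                  # sync schedules with source systems
    data_flows: List[str]                       # orchestrated flows that ready data for delivery
    lineage: List[str]                          # source systems the data originates from
    access_controls: List[str]                  # e.g., authentication, credential checking

@dataclass
class DataProductInstance:
    """One instance per individual business entity (e.g., per customer)."""
    definition: DataProductDefinition
    entity_id: str
    data: dict = field(default_factory=dict)            # unified, cleansed, masked, enriched data
    audit_log: List[str] = field(default_factory=list)  # tracks all changes to the data

# A single "customer" definition can back many instances -- one per customer.
customer_def = DataProductDefinition(
    name="customer",
    schema={"customer_id": "string", "name": "string", "balance": "decimal"},
    processing_logic=lambda raw: {k: v for k, v in raw.items() if v is not None},
    access_methods=["CDC", "JDBC", "SQL", "web service"],
    sync_rules={"crm": "on change", "billing": "every 15 minutes"},
    data_flows=["ingest", "cleanse", "mask", "enrich"],
    lineage=["crm", "billing", "support"],
    access_controls=["authentication", "credential checking"],
)
customer_42 = DataProductInstance(definition=customer_def, entity_id="42")
```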
The data product is created by applying a product lifecycle philosophy to data. Data product delivery adheres to agile principles, providing quick, incremental value to data consumers via short, iterative cycles.
Definition and design
Data product requirements are formulated “outside-in”, on the basis of the data consumers’ business objectives, current data asset inventories, and data privacy and governance constraints. The design defines how the data will be structured and componentized as a product for consumption via services.
Engineering
Data products are engineered by identifying, integrating, and unifying the data from its sources, and then employing data masking tools as required. Web service APIs are built to give consuming applications authorized access to the data product, and pipelines are created to deliver it securely to its consumers.
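For example, the unify-and-mask step of that engineering work might look something like the following sketch; the source records, field names, and hashing-based masking rule are illustrative assumptions only.

```python
import hashlib

SENSITIVE_FIELDS = {"ssn", "credit_card"}   # assumed fields requiring masking

def mask(value: str) -> str:
    """Replace a sensitive value with a deterministic, non-reversible token."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def unify_and_mask(crm_record: dict, billing_record: dict) -> dict:
    """Unify one entity's data from two sources, then mask sensitive fields."""
    unified = {**crm_record, **billing_record}
    return {
        key: mask(str(value)) if key in SENSITIVE_FIELDS else value
        for key, value in unified.items()
    }

# The unified, masked record is what the data product's API would serve.
record = unify_and_mask(
    {"customer_id": "42", "name": "Ada", "ssn": "123-45-6789"},
    {"customer_id": "42", "balance": 250.0, "credit_card": "4111111111111111"},
)
```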
Quality assurance
All data is tested and validated to make sure it’s complete, compliant, and fresh – and that it can be consumed safely by applications at scale.
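In practice, such quality gates can be expressed as automated checks. The sketch below shows one hypothetical way to validate completeness, compliance (masking), and freshness; the required fields and the 24-hour freshness threshold are assumptions for the example.

```python
from datetime import datetime, timedelta, timezone

def validate_data_product(record: dict, masked_fields: set, max_age_hours: int = 24) -> list:
    """Return a list of quality issues covering completeness, compliance, and freshness."""
    issues = []

    # Completeness: no required field may be missing or empty.
    for field_name in ("customer_id", "name", "balance", "last_synced"):
        if not record.get(field_name):
            issues.append(f"missing required field: {field_name}")

    # Compliance: sensitive fields must not appear unmasked in the served record.
    for field_name in masked_fields:
        if field_name in record:
            issues.append(f"unmasked sensitive field present: {field_name}")

    # Freshness: data must have been synced recently enough.
    last_synced = record.get("last_synced")
    if last_synced and datetime.now(timezone.utc) - last_synced > timedelta(hours=max_age_hours):
        issues.append("data is stale")

    return issues

# An empty list means the record is safe to serve to consuming applications.
issues = validate_data_product(
    {"customer_id": "42", "name": "Ada", "balance": 250.0,
     "last_synced": datetime.now(timezone.utc)},
    masked_fields={"ssn", "credit_card"},
)
```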
Support and maintenance
Data product performance, reliability, and usage are constantly monitored, so that any problems can be quickly flagged and resolved.
Data product management
In the same way that software delivery is assured by a software product manager – who’s responsible for defining and prioritizing user needs, and collaborating with development and QA teams – a similar role is required for the successful delivery of data products. The data product manager is responsible for ensuring that business value and ROI are derived from the data product.
By continually and incrementally modernizing your legacy applications, you assure business continuity, reduce dependence on IT, and improve the performance and value of each application.
Here’s a practical approach to continuous legacy application modernization:
Determine your modernization goals.
Understand your legacy applications, inside and out.
Prioritize continuous modernization based on the business capabilities poorly served by your legacy applications.
Productize your data around the key business entities that your legacy systems use.
Modernize the selected components of the legacy systems, leveraging the relevant data products.
Measure results against goals – before, during, and after modernization.
Iterate, over and over again.
Having discussed the 2 pillars of legacy modernization – continuous application modernization, and a data product approach – let’s examine a technical implementation, based on a Data Product Platform.
A Data Product Platform continually syncs, transforms, and serves data via real-time data products, to deliver a trusted, up-to-date, and holistic view of any business entity, such as a customer, order, supplier, or device.
Every data product integrates a specific entity’s data from all relevant source systems, into its own, secure, high-performance Micro-Database™. It also ensures that the Micro-Database is always current with fresh data, and instantly accessible to authorized data consumers across the enterprise.
The platform is unique in its versatility to be deployed as a data fabric, data mesh, or data hub. Moreover, customers can first implement the platform as a centralized data fabric architecture or data hub architecture, and gradually phase into a federated data mesh architecture, at a pace that matches their business needs and the level of their data management maturity.
A Data Product Platform enables and accelerates legacy modernization by making it easy to systematically decouple data products from the legacy source systems.
Once data products are deployed, ingesting data from the legacy systems, modernized application functionality can be developed incrementally on top of the data products and their APIs, and the corresponding legacy components can be retired. The platform’s Micro-Databases then operate as the database of record for the new functionality.
By taking this approach, enterprises can decouple their legacy systems from their data, while moving critical functionality and data to the cloud, at the pace they require, and can support.
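To make the decoupling concrete, here’s a minimal sketch of what changes in a consuming application: instead of issuing SQL directly against a legacy database, the modernized component calls the data product’s API. The endpoint URL, payload shape, and token-based authorization are hypothetical, not a specific platform’s interface.

```python
import requests

# Hypothetical data product API endpoint; in the legacy design, this code
# would have queried the legacy system's database directly.
DATA_PRODUCT_API = "https://data-products.example.com/customer"

def get_customer_view(customer_id: str, token: str) -> dict:
    """Fetch a unified, up-to-date customer view from the data product,
    rather than reading from (and coupling to) the legacy source systems."""
    response = requests.get(
        f"{DATA_PRODUCT_API}/{customer_id}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()
```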
The shift to digital decoupling gives enterprises the freedom to modernize on their own terms. Here are some best practices to follow along the way:
Focus on value, not cost
Legacy application modernization is a marathon, not a sprint. Take the time to assess where you are now, and where you want to be.
Don’t try to boil the lake
The massive scale of a cloud modernization program can be daunting. Break your application portfolio down into bite-size segments.
Get your people to get with the program
One of the greatest obstacles to any cloud initiative is a lack of alignment between Business and IT.
Take a product approach to data
Transform fragmented siloed data into data products that are decoupled from the legacy systems.
Go with data mesh
An Operational Data Mesh platform is foundational to continuous application modernization because it operates at the data product level, including the product’s metadata, algorithms, data, access methods, sync rules, orchestrated data flows, lineage, audit log, and access controls.
Prioritize your apps
Decide which applications to modernize first, based on the specific app in question, and the value it brings. Don’t tackle your most challenging project first, even if it promises to deliver the most value. That’s a sure-fire way to lower morale from the get-go.
Take time to test
Legacy apps are typically products of years of accumulated effort, experience, and knowledge. To match that kind of performance, choose test data management tools capable of leveraging data products.
Contemplate, before you automate
There are many solutions out there that can automate parts of your modernization program, but don’t expect 100% accuracy.
A continuous application modernization platform:
Overcomes performance and scalability constraints of source systems, in support of new digital channels
Decouples legacy from digital channels, allowing for iterative application modernization
Enables 24x7 operations, even when source systems need to be taken down for maintenance
Reduces integration problems if data and systems are highly fragmented
Organizes data in a patented Micro-Database in support of operational and analytical workloads
Wade into the modern world
Notice we didn’t say “Leap into the modern world”, because you have to look before you leap. Full-scale modernization isn’t right for every application, every use case, or even every business. There may be valid business reasons to migrate to the cloud quickly, but even if you choose to put off modernization for now, know that it needs to be on your future agenda.
Ultimately, if you want to benefit from the full value of the cloud, you need to be working towards cloud-based applications, data, and infrastructure. A well-crafted answer to “What is legacy application modernization?”, grounded in the principles of data mesh, will get you there.