Modernizing Legacy Systems: Driving Toward Virtualization
Bridging the gap between the old and the new is a constant struggle for organizations. From a business process perspective the challenge takes many forms: the transition from analog to digital, for example, or the migration from paper to electronic workflows. But one issue that continues to draw significant attention is the need to modernize legacy applications.
Application modernization is not a new concept; it has been top of mind for CIOs for the past couple of decades. For many businesses, vital day-to-day operations depend on specialized core applications that were developed and put in place years, even decades, ago. These core applications are custom-built, composed of millions of lines of programming code, and designed to support high volumes of data-intensive processes, perhaps billions of transactions per day.
Most of these core applications are written in older, outdated programming languages and run on mainframe computers. While they play a fundamental role in everyday operations, many have grown stale over time. Modernizing them could deliver significant benefits in usability and supportability, improve productivity, and, perhaps most importantly, provide access to the enterprise tools now available through current-generation computing architectures, mobile technologies, and cloud-based solutions.
At the same time, supporting and maintaining legacy platforms has become increasingly expensive, particularly for large organizations that rely on older computing infrastructure. Many estimates show that businesses spend between 60 and 80 percent of their IT budgets supporting legacy systems. The ratio varies by company size and vertical industry, and it can be even higher among federal agencies and other public sector organizations. It is a big initiative, and a huge business opportunity: according to Gartner, application modernization will be a $210 billion industry in 2014.
Identifying strategic objectives
While application modernization is a priority for many organizations, most are moving at different speeds and approaching the problem in different ways. Because of the proprietary nature of these applications, converting to a new architecture is not simple, nor is it cheap. According to Brandon Edenfield, executive director for the Application Modernization Group at Dell, businesses need to identify specific objectives before launching into the process. “First, you have to make a decision about what your goal is,” he explains. “Is your goal to save money or is your goal to truly modernize?”
The two goals are not necessarily mutually exclusive, but the distinction matters. Most organizations want to reduce operating costs regardless of the initial intent, but the end goal of application modernization can differ entirely depending on strategic objectives. Edenfield believes there are three primary paths to take when considering modernization for legacy systems: re-hosting, conversion, and complete system re-architecture.
With re-hosting, the goal is not to improve the architecture or even to change the data structure, but to move to a more cost-effective computing platform without modifying the applications. In many cases, the aim is to reduce costs while ensuring that applications look and behave much as they did originally, so that processes and workflows are not disrupted.
With a conversion solution the intent is to convert the application to a new language; for example, to move from Natural or COBOL to Java. “What you get is a tool-generated framework,” Edenfield explains. “The application code looks the same as what the original programmer produced but the customer gets a fast migration to a new language.” He adds that a conversion solution should not be confused with a complete system re-architecture, in which the application code is rewritten from the ground up to support object-oriented functionality and, potentially, to deliver and support new services.
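To make that distinction concrete, consider a minimal, hypothetical sketch of what tool-generated Java from a small COBOL billing routine might look like. The class, field names, and values below are illustrative assumptions, not taken from any actual conversion tooling: the point is that the output compiles and runs on the new platform, yet the structure still mirrors the original procedural program.

```java
// Hypothetical illustration of a conversion ("transliteration") result.
// A small COBOL paragraph such as:
//   COMPUTE WS-TOTAL = WS-QTY * WS-UNIT-PRICE
//   ADD WS-TAX TO WS-TOTAL
// might be emitted as a Java method that preserves the original
// procedural structure and working-storage names for traceability.
public class Billing {

    // Working-storage fields carried over as instance variables
    // (illustrative values; amounts held in cents, as a COBOL program might).
    private long wsQty = 12;
    private long wsUnitPrice = 250;
    private long wsTax = 300;
    private long wsTotal;

    // One COBOL paragraph becomes one Java method; the logic is a
    // line-for-line transliteration rather than an object-oriented redesign.
    public void computeTotal() {
        wsTotal = wsQty * wsUnitPrice;
        wsTotal = wsTotal + wsTax;
    }

    public static void main(String[] args) {
        Billing billing = new Billing();
        billing.computeTotal();
        System.out.println("WS-TOTAL = " + billing.wsTotal);
    }
}
```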
A re-architected system takes the customer to a final end state, completely removed from legacy mainframe systems and running true object-oriented applications that provide better functionality and integration. A re-architecture can also involve more than the applications themselves: revamping organizational structure and processes, creating new business rules, or establishing a common data layer over disparate databases, for example.
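By contrast, a re-architected version of the same hypothetical billing logic would be redesigned around domain objects and services rather than transliterated line by line. Again, the names below are illustrative assumptions rather than an example from the article, but they show the kind of object-oriented structure a re-architecture aims for.

```java
import java.math.BigDecimal;

// Hypothetical re-architected version of the same billing logic:
// the working-storage fields become a domain object with behavior,
// suitable for reuse behind a service or API layer.
public class Invoice {

    private final BigDecimal quantity;
    private final BigDecimal unitPrice;
    private final BigDecimal tax;

    public Invoice(BigDecimal quantity, BigDecimal unitPrice, BigDecimal tax) {
        this.quantity = quantity;
        this.unitPrice = unitPrice;
        this.tax = tax;
    }

    // The business rule is expressed as a method on the domain object
    // rather than as a transliterated COBOL paragraph.
    public BigDecimal total() {
        return quantity.multiply(unitPrice).add(tax);
    }

    public static void main(String[] args) {
        Invoice invoice = new Invoice(new BigDecimal("12"),
                                      new BigDecimal("2.50"),
                                      new BigDecimal("3.00"));
        System.out.println("Total = " + invoice.total());
    }
}
```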
Wants versus needs
Of the three approaches, re-architecting the entire system is by far the most costly and potentially the most disruptive. Edenfield says it is not uncommon for customers to approach Dell with the initial mindset of pursuing a complete system redesign, only to back off quickly once they understand the costs and implementation issues associated with such a broad initiative.
Conversely, the market offers numerous examples of high-profile companies that pursued legacy system redesigns only to cancel them because of unexpected costs and program delays, often after investing millions in the failed effort. “We see a lot of companies coming in and saying that is what they need,” Edenfield says. “In reality what they are really saying is that is what they want. If you need it, you’re going to spend whatever it takes to get it but if you just want it you may or may not.”
As a result, most businesses approach application modernization in phases: they begin with re-hosting or language migration, and the savings realized from moving off the legacy system can then be applied toward further re-architecture in phase two or three. Edenfield admits that the majority of Dell’s success stories so far have been implementations where businesses were primarily focused on reducing operational costs. Nevertheless, he notes that customers have begun to show greater interest in the bigger picture, especially within the last six months.
Virtualization and the cloud
According to Edenfield, the trend toward virtualization and cloud services is a significant factor behind this movement. “When you go into these larger companies, a large percentage of their real run-the-business enterprise applications are running on these old legacy systems and they are behind the wall,” he explains. “They have proprietary data structures and programming languages, and they don’t fit well at all in the cloud or virtualization as we know it.”
Certainly, factors such as BYOD and the consumerization of IT are causing businesses to rethink existing infrastructure. Employees demand greater access to information, and interest in self-service access to enterprise applications is growing. There is also increased pressure on IT departments to deliver new applications faster, for both internal and external users.
At the same time, trends around big data are driving the need to revamp existing infrastructure, especially when it comes to centralizing data sources. Increasingly, businesses are looking to organize and analyze big data content; with data residing in silos across various legacy systems, that is virtually impossible to accomplish. “Even things like social analytics,” Edenfield says. “Good luck trying to pull all that data from all the different systems into something you can use. You have to get all that data into a distributed environment if you are going to achieve that goal.”
Competitive pressure is another factor contributing to an increased interest in system and platform migration. Leading enterprises have embraced the concept of application modernization and have achieved great success while doing so. This is surely driving further acceptance and adoption among businesses that might have previously viewed the process as somewhat intimidating.
Whatever the case, businesses will increasingly be forced to modernize applications and, to the extent necessary, migrate off legacy mainframe systems. In today’s business world, users demand instant access to information. It is no longer acceptable, for example, to wait for the system to generate a business report; employees pull relevant information and create their own reports on an as-needed basis. Likewise, when a new process or application is required, IT must be able to move quickly to deliver solutions that will run on any device and in the cloud. As a result, businesses will require a reliable back-end architecture that supports these changing computing models.
This article originally appeared in the May 2014 issue of Workflow