
What GenAI and LLMs Bring to Legacy Tech

By David Barry
You don't have to discard your legacy tech stack to modernize your company. Instead, try integrating GenAI and LLMs to gain an upper hand.

Prompted by new automation capabilities, machine learning and artificial intelligence, the digitalization of organizations has been in the works for years now — and well before the emergence of generative AI. 

To compete and remain relevant, companies have been updating and modernizing their digital workplace technology, rationalizing their tech stacks and upgrading applications, automating business processes and expanding their cloud reach.

While these developments were expected to displace legacy technology in many enterprises, some are finding ways to marry new and old to get the best of both worlds. Here's how.

LLMs and Legacy Tech

While a lot of the focus is on integrating new tools, it's a fair question to ask what's happening with legacy tech stacks: Will they really be consigned to the IT dustbin despite the billions of dollars invested in them over the years? 

Greg Benson, chief scientist at SnapLogic and professor at the University of San Francisco, believes that while interest in generative AI is likely to disrupt existing tech investments, the outcome will serve to improve legacy platforms and software rather than displace them.

“While GenAI is seeing widespread inclusion in modern platforms and applications, I believe it has great potential to transform and modernize legacy technology stacks,” he said. 

Legacy stacks tend to be based on older languages and libraries, and few of the original developers are still around, Benson said. This provides an opportunity for generative AI to accelerate the modernization of older codebases.  

“I've personally used GenAI to reason about and modify older code,” he said, noting that the technology has the potential to break down the skill and cost barriers to rewriting legacy code from scratch.

According to Amit Sood, CTO and head of product at software company Simplr, this is indeed one of the reasons why companies that have invested heavily in technology over the years should be thinking about integrating generative AI capabilities, not ignoring them.

That's because heavy tech investments mean high volumes of data flowing through their systems, he said, and data is the lifeblood of truly impactful generative AI initiatives.

Plus, he said, generative AI isn't an all-or-nothing technology. This means companies can start small, in areas where there are relatively low resource investments and risk to the company, and build from there.

But it must be a consideration. “Gartner predicts that 80% of enterprises will be using generative AI by 2026,” Sood said. “Those who think they can ignore it altogether are at serious risk of falling behind one of the most impactful tech innovations in decades.”

Related Article: Legacy Systems Just Won't Quit, So Here's How to Manage Them

Uplifting Legacy Tech

LLMs are powerful in helping organizations enhance their operations and decision-making processes. More specifically, they bring intelligent automation and data-driven insights to traditional tech environments, Dwarak Sri, global head of AI at Tampa, Fla.-based BlueCloud, said. 

The advantage for organizations operating with legacy stacks is that introducing LLMs can give legacy and modern tech stacks alike a significant uplift.

While there is no doubt that legacy tech must embrace contemporary architecture, he said, there are three important considerations to keep in mind before proceeding:

  1. Plan: In the new architecture, the cloud-based enterprise data platform takes center stage, accommodating data pipelines inclusive of semi-structured and unstructured data. Modernizing the tech stack entails a well-planned migration to this cloud data platform.  
  2. Enable: The transition introduces the use of embeddings and a vector/graph data store for indexing. Rather than rigidly structuring data, permit data to retain its natural form, which will result in faster processing and more efficient data retrieval.
  3. Fine-tune: The integration will require the selection of a foundation model such as BERT, GPT-3.5, GPT-4, Llama-2 or Claude, among others. However, since these foundation models are pre-trained on broad, general-purpose datasets, they may be unable to handle domain-specific or organization-specific concepts of interest. Hence, fine-tuning becomes imperative.

“This process doesn't necessitate training the model from scratch with a large dataset but instead refines a pre-trained model by modifying its weights for improved performance,” he explained.
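
To make the fine-tuning step concrete, here is a minimal sketch of how a pre-trained model's weights might be refined on organization-specific data using the Hugging Face Transformers library. The checkpoint, CSV file and label count are illustrative assumptions rather than part of Sri's recommendation, and exact APIs vary between library versions.

```python
# A minimal fine-tuning sketch: refine a pre-trained checkpoint on domain data
# rather than training a model from scratch. Dataset and labels are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "bert-base-uncased"  # any pre-trained checkpoint could stand in here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical domain-specific dataset with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "support_tickets.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="finetuned-model",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,  # small learning rate: adjust existing weights, don't retrain from scratch
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"])
trainer.train()  # updates the pre-trained weights on the domain data
trainer.save_model("finetuned-model")
```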

With these foundational steps in place, the final pieces of the puzzle can be assembled.

The data and orchestration framework is critical for crafting LLM applications, Sri said, because it integrates seamlessly with downstream vector or graph data stores and databases. APIs from frameworks such as LangChain, Haystack and LlamaIndex then give advanced users the means to customize and extend data connectors, indices, retrievers, query engines and other available modules to suit their specific needs.
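
To illustrate the embedding-and-retrieval pattern these frameworks wrap, here is a minimal, framework-free sketch using the sentence-transformers library. The documents and model name are placeholder assumptions; a production deployment would persist vectors in a dedicated vector or graph data store rather than in memory.

```python
# A minimal sketch of embedding-based retrieval, the pattern that frameworks
# like LangChain, Haystack and LlamaIndex build on. Documents are illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Invoices are archived in the legacy ERP after 90 days.",
    "Customer records are synced nightly from the mainframe.",
    "Password resets are handled by the internal service desk portal.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedder
doc_vectors = model.encode(documents, normalize_embeddings=True)

def retrieve(query: str, top_k: int = 2):
    """Return the top_k documents most similar to the query."""
    query_vector = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vector  # cosine similarity (vectors are normalized)
    best = np.argsort(scores)[::-1][:top_k]
    return [(documents[i], float(scores[i])) for i in best]

# The retrieved snippets would then be passed to an LLM as context for its answer.
print(retrieve("Where do old invoices live?"))
```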

“Whether you are a company that has adopted a modern tech stack or is still using legacy technology, a structured approach is necessary to incorporate LLMs into your architecture,” he said. “The recommended approach facilitates a seamless and secure transition to a transformative design. It capitalizes on existing tech investments by enhancing data pipelines and strategically building the essential components necessary to develop and empower LLM applications.”

Related Article: Intelligent Process Automation Is Here — Where Are You?

Building In GenAI

According to Mitri Dahdaly, VP of product management at automation company Legion Technologies, companies that do not invest in generative AI and LLMs will find themselves rapidly outclassed by those that do. Generative AI is going to shape the course of innovation, he said, and it's crucial that leaders modernize their tech stacks in an age of AI-driven business.

In his view, the organizations that will find the most success with incorporating generative AI are the ones approaching it with an open mind and exploring all the potential ways it can transform productivity.

For instance, by integrating it into their existing solutions, they can add value for customers. Or, by building it into their tech stacks, companies can unlock new features and updates to the software they currently use — and that's a convenient place to start, Dahdaly said. 

That said, as generative AI evolves and new functionalities emerge, it may be worth considering competing solutions to find one that will truly maximize productivity. When introducing new solutions, technology leaders should position generative AI and large language models as complements to existing work processes.

“Allowing employees to trial different solutions to see what harmonizes best with their current workflows can help leaders make the right decisions and incorporate generative AI seamlessly,” he said.

Related Article: How to Identify the Right Workplace Processes to Automate

The Best of Both Worlds

Rather than viewing LLMs as a replacement for current systems, Mike Hyzy, senior principal consultant at IT consultancy Daugherty Business Solutions, recommends organizations take a balanced approach that exploits the strengths of both LLMs and traditional legacy software.

It's better to see them as powerful new tools that can complement and enhance human knowledge workers, he said. Traditional software still excels at structured data processing, workflow management and delivering specialized functionality. With proper integration and governance, LLMs can make employees more productive by automating rote tasks and providing helpful information on demand.

The wise path forward, Hyzy said, is to carefully assess where LLMs can add the most value to each process and determine how to combine them with existing technologies. The outcome will likely involve developing new interfaces and APIs to connect LLMs to backend databases, ERPs and other legacy systems.
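
As a rough illustration of what such an interface might look like, the sketch below routes a natural-language question through an LLM to a read-only SQLite database. The schema, file names and llm_complete() helper are hypothetical placeholders, not a specific vendor's API.

```python
# A hypothetical interface layer that lets an LLM answer questions from a legacy
# database. llm_complete() stands in for whichever LLM client an organization uses.
import sqlite3

SCHEMA = "orders(id INTEGER, customer TEXT, total REAL, placed_on TEXT)"

def llm_complete(prompt: str) -> str:
    """Placeholder: call your LLM of choice and return its text response."""
    raise NotImplementedError("Wire this to your LLM provider's client library.")

def answer_from_legacy_db(question: str, db_path: str = "legacy.db") -> str:
    # Ask the LLM to translate the question into SQL against the known schema.
    sql = llm_complete(
        f"Schema: {SCHEMA}\n"
        f"Write a single read-only SQLite SELECT statement answering: {question}"
    )
    if not sql.strip().lower().startswith("select"):
        raise ValueError("Refusing to run a non-SELECT statement from the model.")

    # Execute against the legacy database in read-only mode.
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        rows = conn.execute(sql).fetchall()
    finally:
        conn.close()

    # Let the LLM phrase the raw rows as a plain-language answer.
    return llm_complete(f"Question: {question}\nRows: {rows}\nAnswer briefly.")
```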

Some tasks, like customer service, may shift primarily to LLMs, while for others, like financial reporting, LLMs may provide an assistive role. With a well-planned integration strategy, LLMs don't have to disrupt current technology investments; they can help those systems deliver more business value. 

Companies that can optimally bring humans and AI together will gain a solid competitive advantage, Hyzy said. This transition does require thoughtful change management and governance, but it presents an exciting opportunity to augment human skills, not replace them.

“With the right approach, I'm confident organizations can harness the power of LLMs while still leveraging their prior technology investments.”

About the Author
David Barry

David is a Europe-based journalist with 35 years of experience who has spent the last 15 following the development of workplace technologies, from the early days of document management through enterprise content management and content services. Now, with the development of new remote and hybrid work models, he covers the evolution of technologies that enable collaboration, communication and work, and has recently spent a great deal of time exploring the far reaches of AI, generative AI and general AI.
