
Leveraging LLMs and Generative AI for a game-changing legal solution

Our client offers innovative software services and solutions to the global legal industry. They had built a POC utilising OpenAI that could essentially perform the legwork of a legal aide: summarising documentation, extracting topics and finding the actors (role players) in the documents. The POC gave users the context of a case or trial and could then prompt them from there. When it came time to productionise the POC, however, OpenAI needed to be replaced as the core technology: it posed various data security and compliance issues for the client's customers in different geographic regions, and it limited which models could be used, impacting both quality and cost.

The client turned to BBD to strip OpenAI from the POC infrastructure and rebuild it with technology that would keep the highly sensitive data within specific geographic regions, thereby complying with the various data sovereignty policies at play, while also allowing flexibility in the models that could be utilised.

The team employed Large Language Models (LLMs), Generative AI, AWS cloud services and GPU compute to help our client develop a solution that could, in turn, be rolled out to customers around the world.

Objectives

  • Assess available LLM solutions to find the correct one for the legal fraternity
  • Replace OpenAI in POC infrastructure
  • Ensure data sovereignty compliance across geographic regions, including the USA, Canada, the EU, the Middle East, Hong Kong and Singapore
  • Utilise the cloud and foundation models on AWS Bedrock
  • Move the POC forward into production
  • Roll out in phases and test with customers to ensure the solution functions as a helpful tool in the real world, while adding incremental value and limiting technical debt

Benefits

  • Larger range of foundation model choices
  • Costs can be better predicted based on the model used and are, in most cases, lower as well
  • Better security, as the client’s data is not made public and stays within the AWS VPC
  • Added benefit of being able to swap out models with ease

Overview and approach

Trial and court case information is naturally highly sensitive. However, across the different regions our client services, the data sovereignty regulations vary quite drastically. For instance, Canadian and European data cannot move through the USA, and separate regions need to be set up for the likes of the Middle East and Singapore.

Although the POC the client built worked well, its base infrastructure, built on OpenAI, routed all data through the USA and, due to the nature of OpenAI’s service, took the data outside the client’s control. It was this issue that led the client to seek out assistance from BBD.

The team started by assessing and comparing various LLMs, seeking a model that would allow for the same tools, functionality and features across various regions, but with strict data regulations in place. Adopting AWS Bedrock, the first phase of the project was to rebuild the infrastructure, creating a solution for European clients. As their regulations are comparable, Canadian clients were federated into that solution as an MVP while the solution continues to grow. Following that, regions were created for the Middle East and Singapore (which includes Hong Kong).

Within the first phase, the following features were introduced: document summarisation, entity extraction, topic modelling and chat capabilities.
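To illustrate what invoking one of these features through AWS Bedrock can look like, the sketch below builds a request body for a document summarisation call. This is a minimal, hypothetical example: the helper function, the prompt wording and the model choice are assumptions for illustration, not the client's actual implementation.

```python
import json


def build_summarisation_request(document_text: str, max_tokens: int = 512) -> str:
    """Build a JSON request body for an Anthropic-family model on AWS Bedrock.

    Hypothetical helper: the prompt and parameters are illustrative only.
    """
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [
            {
                "role": "user",
                "content": (
                    "Summarise the following court document:\n\n" + document_text
                ),
            }
        ],
    }
    return json.dumps(body)


# The serialised body would then be sent to the Bedrock runtime, e.g.:
#   client = boto3.client("bedrock-runtime", region_name="eu-west-1")
#   client.invoke_model(
#       modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#       body=build_summarisation_request(doc),
#   )
```

Because the request is plain JSON sent to a regional endpoint, the same pattern works in any Bedrock region, which is what keeps each customer's data inside its own geography.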

Opting to take a phased approach, the client and team worked together to develop a solution that could then be rolled out to customers within those regions systematically, adapting and refining on a continuous basis to ensure the tools took customer and stakeholder feedback into account. This incremental approach has worked exceptionally well for the client and their customers.

Technical details

One of the biggest benefits of BBD’s approach to the project lies in how the productionised solution was written: it allows the team to switch between LLMs depending on which model best serves each individual task (summarising, extracting information, identifying actors). This powerful feature ensures that as LLM technology continues to mature and evolve, the client’s solution can evolve with it. An added benefit is the absence of model vendor lock-in for the client, as Bedrock is consumed from AWS as an off-the-shelf platform.
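The per-task model-switching design described above can be sketched as a simple configuration mapping. The task names mirror the features in the case study, but the Bedrock model IDs and the helper function are hypothetical choices for illustration; the production mapping is not public.

```python
# Illustrative per-task model routing. Model IDs are examples only and
# do not reflect the client's actual configuration.
TASK_MODELS = {
    "summarisation": "anthropic.claude-3-sonnet-20240229-v1:0",
    "entity_extraction": "anthropic.claude-3-haiku-20240307-v1:0",
    "topic_modelling": "amazon.titan-text-express-v1",
}


def model_for_task(task: str) -> str:
    """Return the Bedrock model ID configured for a given task.

    Swapping the model behind a task is a one-line change to TASK_MODELS;
    the calling code never changes, which is what makes model upgrades cheap.
    """
    try:
        return TASK_MODELS[task]
    except KeyError:
        raise ValueError(f"No model configured for task: {task}")
```

Centralising the mapping like this is what lets a newer or cheaper foundation model be adopted for one task without touching, or re-testing, the others.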

As in all AI projects, training the system on correct data is imperative. For this, GPU compute was leveraged to do the heavy lifting on the training, with a GPU farm leveraged in each available region. Once the Canadian GPU farm is in place, Canada will have its own region and will no longer need to federate with the EU.

Technology implemented includes:

  • AWS Bedrock, CloudWatch
  • Python with LangChain
  • MongoDB Atlas

Impact of BBD’s partnership

BBD’s partnership with this client had a major impact on their ability to move this innovative offering into production in a highly secure manner. The innovative LLM-based solution can now easily employ a variety of foundation models, allowing the client to reduce costs and leverage improved options as the technology evolves.

