Case Study | Healthcare
With a workforce exceeding 34,000 professionals and research and development facilities spanning the globe, this biopharmaceutical enterprise ranks among the Fortune 500. One of the largest corporations of its kind, it is dedicated to discovering, developing, and delivering groundbreaking medicines for patients grappling with severe diseases.
In the pharmaceutical sector, time is of the essence. Recognizing this imperative, the company is committed to a culture of perpetual advancement across all operational dimensions, steadfastly integrating state-of-the-art technologies and fostering agility to ensure optimal performance.
However, the organization encountered a significant challenge: excessively long turnaround times for user queries were impeding operational efficiency.
In pursuit of enhanced efficiency and streamlined operational processes, the company’s regulatory business team sought a comprehensive solution: a single tool capable of promptly addressing user queries without the customary three-day delay. They envisioned an interactive Q&A bot, proficient at leveraging enterprise data to furnish precise responses to specific business inquiries. With this conversational bot at their disposal, authorized users could access information instantaneously.
To achieve this objective, the company embarked on a quest to find a partner to spearhead this implementation. With deep expertise, seasoned talent, and a demonstrated track record of success in the AI space, Brillio emerged as the ideal partner.
Two separate solution approaches were devised to tackle the challenge and develop LLM modules, one tailored for unstructured data and another specifically designed for structured data.
In the case of unstructured data, a dual strategy was employed, consisting of a Retriever and a Generator. For the Retriever component, OpenSearch was chosen for its open-source nature, scalability, and cost-effectiveness, facilitating the extraction of relevant context for a given user query. This contextual information is subsequently utilized to generate responses. As for the Generator, an AzureOpenAI model—text-davinci-003—was utilized. This model harnesses the user query along with the relevant context retrieved from OpenSearch to craft natural language responses.
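The retrieve-then-generate flow above can be sketched as follows. This is a minimal illustration, not the client's implementation: the OpenSearch index name, field names, and prompt wording are assumptions, and the actual calls to OpenSearch and the Azure OpenAI text-davinci-003 deployment are noted in comments rather than made here.

```python
# Sketch of the Retriever + Generator flow for unstructured data.
# Index/field names and prompt text are illustrative assumptions.

def build_opensearch_query(user_query: str, top_k: int = 3) -> dict:
    """Full-text match query to pull the top-k relevant passages."""
    return {
        "size": top_k,
        "query": {"match": {"content": user_query}},
    }

def build_prompt(user_query: str, passages: list[str]) -> str:
    """Combine retrieved context with the user query for the generator."""
    context = "\n\n".join(passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {user_query}\nAnswer:"
    )

# In production, the query dict would be sent to OpenSearch (e.g. via
# opensearch-py) and the assembled prompt to the Azure OpenAI completion
# endpoint backing the text-davinci-003 deployment.
question = "What is the approval status of drug X?"
query = build_opensearch_query(question)
prompt = build_prompt(
    question,
    ["Drug X received approval in 2021.", "Drug X targets condition Y."],
)
```

The key design point is that the generator only ever sees the retrieved passages, which keeps its answers grounded in enterprise data.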
For structured data, the approach required an innovative solution. Given the sheer size of the input structured data, constructing a single generator posed challenges due to token limit errors and scalability issues. To overcome these roadblocks, a unique generator-generator approach was proposed and implemented, built on the AzureOpenAI gpt-35-turbo model. The first generator converts a given question into an SQL query, while the second generates a natural language response by integrating the user query with the output of the SQL query.
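The two-stage flow can be sketched as below. Both model calls are stubbed: `question_to_sql` stands in for the first gpt-35-turbo generator, and `build_answer_prompt` assembles the input the second generator would receive. An in-memory SQLite table stands in for the structured store; the table, columns, and sample rows are hypothetical.

```python
# Hedged sketch of the generator-generator approach for structured data.
# LLM calls are stubbed; sqlite3 stands in for the real data store.
import sqlite3

def question_to_sql(question: str) -> str:
    """Stand-in for generator 1 (gpt-35-turbo prompted to emit SQL).
    A real implementation would send the table schema plus the question
    to the model; here we return a canned query for the sketch."""
    return "SELECT COUNT(*) FROM trials WHERE phase = 3"

def build_answer_prompt(question: str, sql: str, result) -> str:
    """Input to generator 2: the user query plus the SQL query's output."""
    return (
        f"Question: {question}\n"
        f"SQL result: {result}\n"
        "Answer the question in plain English:"
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trials (name TEXT, phase INTEGER)")
conn.executemany("INSERT INTO trials VALUES (?, ?)",
                 [("T1", 3), ("T2", 2), ("T3", 3)])

question = "How many phase 3 trials are running?"
sql = question_to_sql(question)
result = conn.execute(sql).fetchone()[0]  # -> 2 with the sample rows above
prompt = build_answer_prompt(question, sql, result)
```

Because only the SQL result, not the full dataset, reaches the second generator, the token-limit problem described above is sidestepped.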
To ensure a seamless user experience, an intelligent classification system was employed to categorize input user queries as structured, unstructured, follow-up, or a combination of both, triggering the respective pipelines automatically. Memory management was handled using the langchain framework, with a fallback mechanism incorporated to activate alternate pipelines in case of misclassification, ensuring robustness and reliability in addressing user queries.
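The routing-with-fallback idea can be illustrated with the toy sketch below. The keyword classifier and lambda pipelines are hypothetical stand-ins (the real system used an intelligent classifier and the pipelines described above); only the control flow, classify, run the matching pipeline, and retry the other pipeline on a miss, reflects the design.

```python
# Illustrative sketch of query routing with a misclassification fallback.
# The keyword heuristic and stub pipelines are assumptions for the demo.

NO_ANSWER = "No information is available for the given query."

def classify(query: str) -> str:
    """Toy heuristic standing in for the real query classifier."""
    keywords = ("how many", "count", "total", "average")
    return "structured" if any(k in query.lower() for k in keywords) else "unstructured"

def answer(query: str, pipelines: dict) -> str:
    """Route to the classified pipeline; fall back to the other on a miss."""
    primary = classify(query)
    secondary = "unstructured" if primary == "structured" else "structured"
    response = pipelines[primary](query)
    if response is None:  # likely misclassification: try the other pipeline
        response = pipelines[secondary](query)
    return response if response is not None else NO_ANSWER

pipelines = {
    "structured": lambda q: "42 trials" if "how many" in q.lower() else None,
    "unstructured": lambda q: "See the filing summary." if "summary" in q.lower() else None,
}
```

For example, `answer("How many trials?", pipelines)` routes to the structured pipeline, while a query neither pipeline can handle falls through to the fixed no-answer response.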
While natural language responses are generated by Azure OpenAI models hosted within the Azure cloud, access to bots is tightly controlled through the creation of a security group, only allowing authorized users to submit queries.
For scalability, the Teams UI was chosen as the user interface due to its ability to effortlessly handle load issues, while the remaining components of this architecture, including the OpenSearch index and AzureOpenAI models, inherently possess scalability features, ensuring seamless expansion as demands increase.
These intelligent bots seamlessly navigate between various tabular datasets, addressing both direct inquiries and follow-up questions. When presented with a query lacking context, the model refrains from speculation and instead provides a straightforward response: “No information is available for the given query.”
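That refusal behavior amounts to a simple grounding guard: the generator is only invoked when retrieval actually returns context. A minimal sketch, with a hypothetical `respond` function and a generator passed in as a callable:

```python
# Sketch of the no-context guard: with nothing retrieved, the bot emits
# the fixed refusal rather than letting the model speculate.
# Function names and signatures are illustrative assumptions.

REFUSAL = "No information is available for the given query."

def respond(user_query: str, retrieved_passages: list[str], generate) -> str:
    """Call the generator only when grounding context exists."""
    if not retrieved_passages:
        return REFUSAL
    return generate(user_query, retrieved_passages)

# With no passages retrieved, the reply is the refusal string.
reply = respond("Unknown topic?", [], lambda q, p: "grounded answer")
```

Keeping the guard outside the model prompt makes the refusal deterministic instead of relying on the model to decline on its own.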
The integration of this one-stop tool has brought numerous benefits to the client, revolutionizing the user experience and enhancing operational efficiency.
Users now have instant access to information from documents and tabular datasets, resulting in streamlined processes and reduced turnaround times through intelligent enablement and automation. With an initial build time to market of just 12-14 weeks and the capability for quick enhancements based on requirements, the implementation ensures agility and responsiveness to evolving needs.
Additionally, the chat operates effortlessly within the familiar MS Teams environment, requiring no additional tools or specialized training. This accessibility empowers users of all technical proficiencies to leverage the tool effectively. The integration of the Bot into Teams enables users to enjoy real-time responses, eliminating the need for lengthy waits and follow-up requests. Queries are resolved in under 30 seconds, enhancing productivity and completely transforming the user experience.