Application Integration Framework for Large Language Models

Ho, Joe Ee and Ooi, Boonyaik Yaik and Westner, Markus K. (2024) Application Integration Framework for Large Language Models. In: UNSPECIFIED.

Full text not available from this repository.
Official URL: https://www.scopus.com/inward/record.uri?eid=2-s2....

Abstract

Large Language Models (LLMs) have unlocked new opportunities in processing unstructured information. However, integrating LLMs into conventional applications poses challenges due to their non-deterministic nature. This paper introduces a framework designed to integrate LLMs effectively into intermediate modules by ensuring more consistent and reliable outputs. The framework includes three key components: the Sieve, which captures and retries processing of incorrect outputs; the Circuit Breaker, which stops processing persistently incorrect outputs; and the Optimizer, which improves processing efficiency by combining inputs into single prompts. Experimental results employing a structured methodology demonstrate the framework's effectiveness, achieving significant improvements: a 71.05% reduction in processing time and an 82.97% reduction in token usage while maintaining high accuracy. The proposed framework, agnostic to specific LLM implementations, aids the integration of LLMs into diverse applications, enhancing automation and efficiency in fields such as finance, healthcare, and education. © 2024 IEEE.
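
Since the full text is not available from this repository, the sketch below is only a minimal illustration, under stated assumptions, of how the three components named in the abstract could fit together in code: a Sieve that catches and retries invalid outputs, a Circuit Breaker that abandons persistently incorrect inputs, and an Optimizer that combines several inputs into one prompt. All names (llm_call, validate, the class interfaces, batch sizes) are hypothetical and not taken from the paper.

    # Hypothetical sketch of the abstract's three components; not the paper's actual API.
    from typing import Callable, List, Optional

    class CircuitBreaker:
        """Stops further retries for an input that keeps producing incorrect outputs."""
        def __init__(self, max_failures: int = 3):
            self.max_failures = max_failures
            self.failures = 0

        def record_failure(self) -> None:
            self.failures += 1

        def is_open(self) -> bool:
            # Once open, retries for this input are abandoned.
            return self.failures >= self.max_failures

    class Sieve:
        """Captures incorrect outputs and retries them until the breaker opens."""
        def __init__(self, llm_call: Callable[[str], str],
                     validate: Callable[[str], bool],
                     max_failures: int = 3):
            self.llm_call = llm_call      # assumed generic text-generation callable
            self.validate = validate      # assumed application-specific output check
            self.max_failures = max_failures

        def process(self, prompt: str) -> Optional[str]:
            breaker = CircuitBreaker(self.max_failures)
            while not breaker.is_open():
                output = self.llm_call(prompt)
                if self.validate(output):
                    return output          # valid output passes through
                breaker.record_failure()   # invalid output is caught and retried
            return None                    # persistently incorrect: stop processing

    class Optimizer:
        """Combines several inputs into single prompts to reduce calls and token usage."""
        def __init__(self, batch_size: int = 5):
            self.batch_size = batch_size

        def combine(self, inputs: List[str]) -> List[str]:
            prompts = []
            for i in range(0, len(inputs), self.batch_size):
                batch = inputs[i:i + self.batch_size]
                prompts.append("\n".join(f"{n + 1}. {item}"
                                         for n, item in enumerate(batch)))
            return prompts

Under these assumptions, an application would pass Optimizer-combined prompts through the Sieve, so that invalid LLM outputs are retried and persistently failing inputs are dropped rather than propagated downstream.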

Item Type: Conference or Workshop Item (UNSPECIFIED)
Additional Information: Cited by: 2
Uncontrolled Keywords: Data assimilation; Data integration; Network security; reductions; AI integration; Application integration framework; Data pipeline optimization; Data pipelines; Data processing framework; Error handling; Language model; Pipeline optimization; Token efficiency
Depositing User: Mr Ahmad Suhairi UTP
Date Deposited: 12 Jan 2026 12:17
Last Modified: 12 Jan 2026 12:17
URI: https://khub.utp.edu.my/scholars/id/eprint/20421
