Generative AI chatbots like ChatGPT and their underlying Large Language Models (LLMs) have taken the world by storm thanks to their powerful ability to understand and generate human-language text. Despite the hype, the field is still exploring the best ways to leverage LLMs productively while avoiding their problematic tendency to produce hallucinated outputs.

We primarily investigate how AI chatbots can help with a widespread problem: finding documents, or the relevant information hidden within them. We have implemented numerous customer projects in this area, which led to the development of our HIBU platform. The platform already offers a rich set of features and enables the rapid, cost-effective development of custom search solutions.

A seamless integration of chatbot and search engine combines and extends the individual strengths of both systems, as demonstrated by the following use cases we have implemented:

  • Users search for documents (or content elements), open one of the search results in an integrated document viewer, and use the chatbot to ask questions about that document. All of this happens within the same UI without any context switching.

  • Users search for documents and then use the chat function to ask questions across all search results. Again, the user remains within the search UI and does not have to deal with context switches.
    With search terms and domain-specific search filters, the user can narrow down the set of search results with ease and accuracy such that the chat engine looks for answers only within the relevant documents.

  • Users chat with the entire document repository. In this use case, the chat functionality effectively also serves as the search engine, giving the user great flexibility in meeting their information needs.

All these use cases rely on Retrieval-Augmented Generation (RAG). This method greatly reduces the risk that the chat engine invents answers. With this approach, the LLM can fully leverage its strengths in understanding and generating language, but it is not expected to also supply the relevant expert knowledge. Instead, that knowledge is retrieved from the relevant documents and handed to the LLM as context.
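The RAG pattern can be illustrated with a minimal sketch. All names here (`retrieve`, `build_prompt`, the toy corpus, the keyword-overlap scoring) are illustrative assumptions, not HIBU's actual implementation; production systems typically use vector embeddings and a dedicated search index for retrieval.

```python
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query.
    (Real systems use embeddings and a proper search index.)"""
    query_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(query: str, context: list[str]) -> str:
    """Instruct the LLM to answer only from the retrieved passages,
    which is what keeps it from inventing facts."""
    passages = "\n".join(f"- {c}" for c in context)
    return (
        "Answer the question using only the passages below. "
        "If the answer is not contained in them, say so.\n\n"
        f"Passages:\n{passages}\n\nQuestion: {query}"
    )


corpus = [
    "The HIBU platform indexes documents for full-text search.",
    "Vacation requests must be submitted two weeks in advance.",
    "The cafeteria opens at 11:30 on weekdays.",
]
context = retrieve("When does the cafeteria open?", corpus)
prompt = build_prompt("When does the cafeteria open?", context)
# `prompt` would now be sent to the LLM of choice (external or self-hosted).
```

Note that the search step can be arbitrarily sophisticated (filters, ranking, access control) without changing the overall pattern: only the retrieved passages ever reach the LLM, which also keeps data traffic with the model small.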

In each of these cases, our chat functionality is enriched with a range of convenience features that boost user experience and productivity. For example, chat responses are linked to the most relevant document sections supporting these answers. This is a substantial step towards “explainable AI” (XAI).
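The linking of answers to their supporting passages might look like the following sketch. The structure and names (`SourcedAnswer`, `with_sources`, the sample file name) are hypothetical, not HIBU's actual API; the point is that each answer travels together with the excerpts that ground it, so users can verify it against the source documents.

```python
from dataclasses import dataclass, field


@dataclass
class SourcedAnswer:
    """A chat answer bundled with the document excerpts that support it."""
    answer: str
    # Each source is a (document id, supporting excerpt) pair.
    sources: list[tuple[str, str]] = field(default_factory=list)


def with_sources(answer: str, hits: list[tuple[str, str]]) -> SourcedAnswer:
    """Attach the retrieved passages the answer was grounded in."""
    return SourcedAnswer(answer=answer, sources=hits)


result = with_sources(
    "The cafeteria opens at 11:30.",
    [("handbook.pdf", "The cafeteria opens at 11:30 on weekdays.")],
)
```

A UI can then render each source as a link that opens the cited document section directly in the viewer.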


The HIBU Chat in Practice

  • Get the best of both worlds: Search and Chat
  • Run HIBU either in the cloud or on-premise.
  • Choose whether to use an LLM from an external service (e.g., OpenAI) or a self-hosted LLM (data privacy!).
  • Either way, HIBU integrates chat in a cost-effective way (minimizing data traffic with the LLM).

Karakun – Your AI Partner

Running AI systems productively and generating value with them takes more than AI knowledge alone. Solid know-how in engineering, DevOps, language technology, and other areas is just as critical. Karakun offers all these competences in combination.

And, despite the hype, chat is not the best AI option for all use cases.

Let us consult you as to how AI can help to automate some of your business processes and support your knowledge workers in their daily tasks.

Contact us by mail
Give us a call