June 20th, 2023

Customizing Your LLM Model on Mendable.ai

Use the custom prompt feature to take control of your model


One of the key features of Mendable is the Custom Prompt feature, which offers several use cases to enhance your chatbot's performance and provide a tailored experience for your customers. In this tutorial, we will focus on the three main use cases of the Custom Prompt feature: Hallucination Gatekeeping, Maintaining Voice, and Structure/Format Control. By making edits in the custom prompt, we can customize the output of our answers. The examples below use a bot trained on the Mendable docs.

Use Case 1: Hallucination Gatekeeping

One of the main use cases of the custom prompt feature is hallucination gatekeeping. This allows you to control the chatbot's responses to unpredictable or potentially harmful inputs. By customizing your prompt on Mendable, you can guide the chatbot to provide accurate and safe responses. Let's dive into an example:


Let's say our product requires a high degree of accuracy, and we want to prevent the model from 'guessing': it should only answer when the information can actually be found in the documentation.

Before the prompt change, I asked Mendable "How does Mendable's ClickUp Integration work?". We do not have a ClickUp Integration (yet). Here is the output:

Notice how the chatbot tried to answer anyway. To prevent this, we only need one sentence! In the custom prompt we can write: "If nothing is picked up in the information retrieval, say 'Sorry, I don't have the answer for that. Please contact support at help.mendable.ai for assistance!'".

Now... let's test with the same query to see if our custom prompt change is successful.

Awesome! We just prevented a hallucination! Feel free to experiment with different prompts and refine them based on your specific requirements.
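To make the gatekeeping behavior concrete, here is a minimal sketch of the logic that one sentence encodes. Mendable applies the custom prompt inside its own retrieval pipeline, so the function and parameter names below are purely illustrative, not Mendable's API:

```python
# Hypothetical sketch of the gatekeeping rule: if retrieval finds nothing,
# return the canned fallback instead of letting the model guess.

FALLBACK = ("Sorry, I don't have the answer for that. "
            "Please contact support at help.mendable.ai for assistance!")

def answer_with_gatekeeping(retrieved_chunks, generate=None):
    """Return the fallback when retrieval comes back empty; otherwise
    defer to the model with the retrieved documentation as context."""
    if not retrieved_chunks:
        return FALLBACK
    # `generate` stands in for the actual LLM call, which would receive
    # the retrieved documentation as grounding context.
    return generate(retrieved_chunks)
```

With an empty retrieval result, the guard answers with the fallback message rather than hallucinating; with real context, the model answers as usual.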

Use Case 2: Maintaining Voice

Maintaining Voice is crucial for creating a consistent user experience. The Custom Prompt feature allows you to define specific language patterns, tones, or styles that align with your brand or product. Let's try it out!


In this example, we'll focus on a couple of ways to bring our company's voice to life. Let's build:

  • A nice introductory message to make the user feel welcomed
  • A supportive voice when the user is running into issues

Let's jump back into the custom prompt. Starting with the introductory message, let's input something like "At the beginning of the conversation, say 'Welcome to Mendable and thanks for reaching out!'".

By specifying this message at the beginning of the conversation, we've pinned the welcome message so that it only appears on the first message of the conversation.

Additionally, the model can pick up on the tone of our customer's message. If it senses that they're frustrated, we can customize a message to be as helpful and supportive as possible. I'll set this up as such: "If the user is running into issues, say 'I'm sorry you're running into issues! I'll try my best to help'. Then try to answer. After the answer, say 'If this answer didn't help, please reach out to support at help.mendable.ai!'"

We've taken a few steps to improve CX:

  • Set an understanding and helpful tone as the user works through their error
  • Attempted to answer the question
  • Provided a clear and easy outlet to the company's support channel in case the issue exceeds the bot's ability
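The voice rules above can be sketched as a small wrapper around the model's answer. Again, this is only an illustration of the behavior the custom prompt produces; Mendable handles both the first-message detection and the tone detection internally, and the names here are hypothetical:

```python
# Illustrative sketch of the voice rules: welcome on the first message,
# supportive framing when the user's tone reads as frustrated.

WELCOME = "Welcome to Mendable and thanks for reaching out!"
SUPPORT_PRE = "I'm sorry you're running into issues! I'll try my best to help."
SUPPORT_POST = ("If this answer didn't help, please reach out to support "
                "at help.mendable.ai!")

def respond(history, answer, frustrated=False):
    """Wrap a model answer with the voice rules from the custom prompt."""
    parts = []
    if not history:      # first message of the conversation gets the welcome
        parts.append(WELCOME)
    if frustrated:       # negative tone, as detected by the model
        parts.extend([SUPPORT_PRE, answer, SUPPORT_POST])
    else:
        parts.append(answer)
    return "\n".join(parts)
```

Note that the frustration branch brackets the answer with the supportive opener and the support-channel outlet, matching the three-step prompt above.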

Here is everything that was added to the custom prompt:

As expected, we have our welcome message in the first message of the conversation:

In our follow-up question, we didn't specifically say that we were 'running into issues'. However, the way we wrote the prompt lets the model detect the negative tone in the user's question, and it will use the prompted response as a result:

Congrats! You've now learned how to get started matching your company's voice to its LLM.

Use Case 3: Structure and Format

Custom Prompt empowers you to guide the chatbot's responses according to your desired structure and formatting requirements. Let's explore an example:


Sometimes the answer comes out as one large paragraph, which can be difficult to read. Here's an example of what this looks like:

To combat this, I'll just put in a short prompt: "In your answer, start a different key point in another paragraph."

With this quick change, we can make the format a bit easier to digest. Testing the same question, here is the new output:

Much easier to read! This is just one of many ways you can customize the format to your liking.
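Putting all three use cases together, the full custom prompt is just the individual instructions stacked on top of each other. A sketch of the combined prompt, using the exact wording from this post (how Mendable concatenates these internally is an assumption):

```python
# The three custom-prompt rules from this tutorial, assembled into one prompt.
CUSTOM_PROMPT = "\n".join([
    # Use case 1: hallucination gatekeeping
    "If nothing is picked up in the information retrieval, say 'Sorry, I "
    "don't have the answer for that. Please contact support at "
    "help.mendable.ai for assistance!'",
    # Use case 2: maintaining voice
    "At the beginning of the conversation, say 'Welcome to Mendable and "
    "thanks for reaching out!'",
    "If the user is running into issues, say 'I'm sorry you're running into "
    "issues! I'll try my best to help'. Then try to answer. After the "
    "answer, say 'If this answer didn't help, please reach out to support "
    "at help.mendable.ai!'",
    # Use case 3: structure and format
    "In your answer, start a different key point in another paragraph.",
])
```

Each rule stays a single plain-English sentence, which is the whole point: the custom prompt is additive, so you can layer new behaviors one instruction at a time.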

Conclusion: We covered three main use cases of the Custom Prompt feature in Mendable: Hallucination Gatekeeping, Maintaining Voice, and Structure/Format Control. By customizing prompts, you can be confident that your chatbot provides accurate, consistent, and controlled responses. Feel free to explore more prompts and experiment with different scenarios to enhance your chatbot's conversational capabilities.

If you have any questions, feel free to reach out to us!