
· 3 min read
Yung-Hsiang Hu

v0.3.0 added SearchQA, which leverages Google Search to answer organizational Q&A. This article walks you through the steps to set up SearchQA.

Applying for a Google API Key

warning

The Google Search API currently has a free quota of 100 queries per day, so please use it with caution.

  1. Go to the Google Programmable Search Engine creation page and fill in the following information to create a custom search engine
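
Once you have an API key and a search engine ID (cx), you can sanity-check them before wiring them into SearchQA. The following is a minimal sketch, not part of Kuwa itself, that queries the Custom Search JSON API directly; the key, cx, and query values are placeholders you must replace with your own.

import requests

API_KEY = "your-google-api-key"   # placeholder: key from the Google Cloud Console
CX = "your-search-engine-id"      # placeholder: ID of the Programmable Search Engine created above

# Each call counts against the 100-query daily free quota mentioned above.
resp = requests.get(
    "https://www.googleapis.com/customsearch/v1",
    params={"key": API_KEY, "cx": CX, "q": "Kuwa GenAI OS"},
    timeout=10,
)
resp.raise_for_status()

# Print the title and link of each result to confirm the key and engine work.
for item in resp.json().get("items", []):
    print(item["title"], "-", item["link"])

If this prints a list of results, the key and search engine are ready to be configured for SearchQA.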

· 5 min read
Ching-Pao Lin

The main addition in the latest 0.3.0 release is the Bot feature. You can now create different Bots on the website, and this article will guide you step by step through setting up your own Bot!

First, you will see the Store section, which contains a button outlined in green; clicking it opens the Bot creation menu.

Here, a simple interface lets you set common Bot parameters, such as system prompts, user pre-prompts, and user post-prompts. If you want to configure more detailed settings, you can also open the model configuration file.

Although this part does not have a guided interface, it lets you set all parameters more freely. Please refer to the Ollama Modelfile documentation for the format. Note that the current 0.3.0 version supports only a subset of the configuration parameters; the supported parameters and some example usages are listed below.

  • SYSTEM <prompt>
    The system prompt should serve as the main method to influence the model's output, preloading some knowledge or changing the response style.
    • SYSTEM You are a helpful assistant.
    • SYSTEM Please respond briefly.
    • SYSTEM Your name is Bob, and you love learning other languages.
  • TEMPLATE <template>
    Specify the dialogue template to apply during inference. Each model may use a different template, so it is recommended to check the template your model expects.
    • TEMPLATE """ {% for message in messages %}
      {% if message['role'] == 'system' %}
      {{ '<s>' + message['content'] }}
      {% endif %}
      {% if message['role'] == 'user' %}
      {{ 'USER: ' + message['content'] }}
      {% endif %}
      {% if message['role'] == 'assistant' %}
      {{ 'ASSISTANT: ' + message['content'] }}
      {% endif %}
      {% endfor %}
      {{ 'ASSISTANT: ' }}"""
  • MESSAGE <role> <prompt>
    Preload some dialogue records. The User and Assistant parts must be paired.
    • MESSAGE SYSTEM You are a helpful assistant.
    • MESSAGE SYSTEM Please respond briefly.
    • MESSAGE USER Hello.
    • MESSAGE ASSISTANT """Hello! How can I assist you?"""

In addition to the parameters supported by the original Modelfile, we have added two extra parameters:

  • BEFORE-PROMPT <prompt>
    In the last message, this prompt will be placed before the user's message.
    • BEFORE-PROMPT Please translate the following into Japanese: 「
    • BEFORE-PROMPT 「
  • AFTER-PROMPT <prompt>
    In the last message, this prompt will be placed after the user's message.
    • AFTER-PROMPT 」
    • AFTER-PROMPT 」, please rephrase the above content.

Here are some example Modelfile configurations:

Automatically add "Please translate into Japanese":

TEMPLATE """{% for message in messages %}
{% if message['role'] == 'system' %}
{{ '<s>' + message['content'] }}
{% endif %}
{% if message['role'] == 'user' %}
{{' USER: Please translate into Japanese: ' + message['content']}}
{% endif %}
{% if message['role'] == 'assistant' %}
{{' ASSISTANT: ' + message['content']}}
{% endif %}
{% endfor %}
{{' ASSISTANT: '}}"""

Pretend to be a cat:

SYSTEM You are a cat. Regardless of what I ask, you should only respond with "meow" or "mew" and not speak any human language or act like anything else.

Meow Translator:

BEFORE-PROMPT Please replace the following message entirely with "meow": 「
AFTER-PROMPT 」

Bilingual Teacher:

SYSTEM You are a helpful English teacher who corrects grammar and provides answers in both Chinese and English.

TAIDE Chinese Proofreader:

SYSTEM You are a professional Chinese teacher with expertise in proofreading and editing in fluent Traditional Chinese from Taiwan.
BEFORE-PROMPT Please directly refine the following text in Chinese without explanation:

TAIDE Chinese to English Translator:

SYSTEM You are a professional English teacher helping translate content into English from a Taiwanese perspective.
BEFORE-PROMPT Translate into English without explanation: 「
AFTER-PROMPT 」

Chinese Chatting: (For use with ChatGPT, Gemini)

AFTER-PROMPT Please answer in Traditional Chinese from a Taiwanese perspective.

Taiwan Search QA: (For use with Search QA)

SYSTEM site:tw Answer in Traditional Chinese.

Define some preset knowledge:

SYSTEM Your name is Jeff.
MESSAGE user What is your name?
MESSAGE assistant My name is Jeff. Hello!
MESSAGE user When I say ABCD, please respond with "EFGH!!!!"
MESSAGE assistant Okay, I will shout "EFGH!!!!" when you mention ABCD.
MESSAGE user ABCD?
MESSAGE assistant EFGH!!!!
MESSAGE user ABCDEFG?
MESSAGE assistant EFGH!!!!
MESSAGE user What comes after ABC?
MESSAGE assistant DEFGHIJKLMNOPQRSTUVWXYZ
MESSAGE user It rained heavily today.
MESSAGE assistant Noted, it was a torrential rain today.

Please note that not all models support these parameters. For example, the current Gemini Pro API does not support templates, and its system prompt is supported by treating it as a before-prompt.

Additionally, ChatGPT does not support template settings. How effective these settings are depends on the model's training: if a model was not trained to follow system prompts closely, it may be hard to steer its behavior with the system prompt alone. In that case, try influencing the output with MESSAGE or the BEFORE/AFTER-PROMPT parameters instead.

· One min read
Yung-Hsiang Hu

This post will guide you through upgrading Kuwa TAIDE’s built-in model from TAIDE-LX-7B-Chat-4bit to Llama3-TAIDE-LX-8B-Chat-Alpha1-4bit.

  1. Go to C:\kuwa\GenAI OS\windows\executors and duplicate the 1_taide directory as 1_taide-8b. If you only need to run the new version of the TAIDE model, you can delete the run.bat file in 1_taide.

  2. Download taide-8b-a.3-q4_k_m.gguf from the official TAIDE HuggingFace Hub to C:\kuwa\GenAI OS\windows\executors\1_taide-8b, and delete the original taide-7b-a.2-q4_k_m.gguf.
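
If you prefer to script step 2, the sketch below downloads the GGUF file with the huggingface_hub library. The repository ID is an assumption inferred from the model name, so please confirm the exact repository (and whether it requires logging in first) on the official TAIDE HuggingFace Hub page.

from huggingface_hub import hf_hub_download

# Repository ID is assumed; verify it on the official TAIDE HuggingFace Hub.
hf_hub_download(
    repo_id="taide/Llama3-TAIDE-LX-8B-Chat-Alpha1-4bit",
    filename="taide-8b-a.3-q4_k_m.gguf",
    local_dir=r"C:\kuwa\GenAI OS\windows\executors\1_taide-8b",
)

Afterwards, remember to remove the old taide-7b-a.2-q4_k_m.gguf as described in step 2.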

· 3 min read
Yung-Hsiang Hu

Getting the Model

Method 1: Applying for Access on HuggingFace

  1. Log in to HuggingFace and go to https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct to apply for access to the meta-llama/Meta-Llama-3-8B-Instruct model (review typically takes about an hour)
  2. Once the "You have been granted access to this model" message appears, access has been granted and you can download the model
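
Once access has been granted, one way to download the model is with the huggingface_hub Python library, as sketched below. It assumes you have already logged in with huggingface-cli login (or you can pass token=... explicitly); the destination directory name is just an example.

from huggingface_hub import snapshot_download

# Downloads every file of the gated repository; the account used must already have access.
snapshot_download(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    local_dir="Meta-Llama-3-8B-Instruct",  # example destination directory
)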

· 2 min read
Yung-Hsiang Hu

Hi, humans! 👋 Welcome to Kuwa! 🤖

Kuwa GenAI OS is an open, free, secure, and privacy-focused Generative-AI orchestrating system,
including user-friendly WebUI for LLMs, and a novel GenAI kernel to support AI-powered applications.

The main features are as follows:

  1. 🌐 Multi-lingual turnkey solution for GenAI development and deployment on Linux and Windows
  2. 💬 Concurrent multi-chat, quoting, full prompt-list import/export/share, and more for users
  3. 🔄 Flexible orchestration of prompts x RAGs x bots x models x hardware/GPUs
  4. 💻 Heterogeneous supports from virtual hosts, laptops, PCs, and edge servers to cloud
  5. 🔓 Open source, allowing developers to contribute and customize the system according to their needs

The Kuwa system was developed with the support of Taiwan's Trustworthy AI Dialogue Engine (TAIDE) project and has been used as a demonstration and development testing platform for the TAIDE project, as well as in several domain-specific applications.

We are a team of students and alumni from the Department of Computer Science and Information Engineering at National University of Kaohsiung, Taiwan, hoping to provide everyone with their own AI development or service platform.
There is still much room for improvement in the Kuwa system, and we sincerely welcome you to join the Kuwa open-source community and take part in the project 🙌. Let’s enter the new generation of GenAI together.

Official website: https://kuwaai.tw/
Community: https://kuwaai.tw/community
TAIDE: https://en.taide.tw/