Build an enterprise-level AI Q&A system in 5 minutes: Getting started with the pure open-source FastGPT solution


In today's digital era, enterprises' demand for intelligent services keeps growing, and scenarios such as AI customer service and merchant knowledge Q&A are becoming increasingly common. Many readers have told me they need to build a similar Q&A system inside their company, and they are looking for an open-source solution along with guidance on integrating it into existing code. In this video, I will walk you through building an internal enterprise knowledge Q&A system with the open-source framework FastGPT. The process is actually not complicated; let's take a look.

FastGPT framework analysis

FastGPT is a knowledge base Q&A system built on large language models (LLMs), with many practical features. It not only provides out-of-the-box data processing and model invocation, but also supports visual workflow orchestration to handle complex Q&A scenarios. In the architecture diagram, the left side is responsible for core data and vector storage; the middle builds a model gateway through an API; and the bottom connects to a wide range of large models, including locally deployed open-source models, so the selection is very rich.

System deployment process

Server selection

Because running large models demands significant GPU resources, a CPU-only setup can take more than a minute to answer a single question. I therefore chose a GPU server this time, using the GPUEZ intelligent computing cloud platform. On this platform, we rent an instance with 48 GB of video memory on demand. Since this is a demo project, I chose pay-as-you-go billing, rented it for an hour, and specified the base image. The platform is very convenient for machine learning and AI research. Once the instance is running, both SSH connections and JupyterLab are supported. After entering JupyterLab, select Terminal, which works like an SSH login console: you can enter Linux commands and see the output.

FastGPT installation

To install FastGPT, we use the Docker Compose method, which works on Linux, macOS, and Windows. The operation is simple enough to be called a "no-brainer" install. The only configuration required is connecting a model through a unified API platform. Taking a free domestic large model as an example, it is not difficult to follow the documentation provided by FastGPT.
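The Docker Compose install can be sketched roughly as follows. Note that the exact download URLs and file names change between FastGPT releases, so treat the URLs below as illustrative placeholders and confirm them against the official FastGPT deployment documentation:

```shell
# Create a working directory for FastGPT
mkdir fastgpt && cd fastgpt

# Download the model config file and the Docker Compose file from the
# FastGPT repository (URLs are illustrative -- confirm them in the docs)
curl -O https://raw.githubusercontent.com/labring/FastGPT/main/projects/app/data/config.json
curl -o docker-compose.yml https://raw.githubusercontent.com/labring/FastGPT/main/deploy/docker/docker-compose-pgvector.yml

# Start all services in the background (FastGPT plus its databases)
docker-compose up -d

# Check that the containers came up; the web UI listens on port 3000 by default
docker ps
```

After the containers start, open the server's port 3000 (or the proxy link provided by the cloud platform) in a browser to reach the FastGPT login page.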

Build knowledge base

  1. Configure the proxy and create a new knowledge base: Configure the proxy port on the intelligent computing cloud platform and obtain the proxy link for access. After entering the system, click "New Knowledge Base" on the left, select the default general knowledge base, pick an easy-to-remember name, and keep the other defaults.
  2. Configure the dataset: The system supports importing Word and PDF files as well as static website content, and you can also write content by hand. Select local file upload; if you have multiple files, you can upload them in batches. After uploading, you can test the chunking and retrieval results. Thanks to the GPU, responses are very fast, in contrast to the sluggishness of the earlier CPU-only tests.
  3. Add an application shell to the knowledge base: Click Workbench to create a new application, give it a name, and keep the other defaults. Next, write the prompt. I have prepared prompt templates based on the scope of the knowledge base currently used online; adjust them as needed. Enter questions to test response speed and answer quality. The overall experience is good.

System Integration Method

  1. Page embedding: FastGPT provides a login-free chat window that can be opened directly in the browser or embedded into a page via an iframe. Copy the provided snippet into your existing page, and a floating icon appears in the lower-right corner; a single click starts a conversation.
  2. Code integration: If you need more customization, use code integration. First, create a new API key for authenticating interface calls. FastGPT does not provide code samples or an SDK, only curl command samples, but we can convert the curl command into Java code through a conversion website, paste it into the IDE, and run it. The integration works well and responds quickly.
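As a sketch of what the converted Java code might look like: FastGPT exposes an OpenAI-compatible chat completions endpoint, so the curl sample translates naturally into Java's built-in `HttpClient`. The base URL, API key, chatId, and question below are placeholders for your own deployment:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FastGptDemo {

    // Build the POST request for FastGPT's OpenAI-compatible chat endpoint.
    // baseUrl and apiKey are placeholders for your own deployment.
    static HttpRequest buildRequest(String baseUrl, String apiKey, String question) {
        String body = """
                {
                  "chatId": "demo-session-1",
                  "stream": false,
                  "messages": [{"role": "user", "content": "%s"}]
                }""".formatted(question);
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/v1/chat/completions"))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpRequest request = buildRequest(
                "http://localhost:3000",  // assumed FastGPT address
                "fastgpt-xxxxxx",         // API key created in the console
                "What is the reimbursement process?");
        // Send the request and print the raw JSON answer
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

Keeping `chatId` stable across calls lets FastGPT maintain conversation context for a session; set `stream` to `true` if you want token-by-token streaming instead of a single response.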

Advantages of GPUEZ intelligent computing cloud platform

The GPUEZ intelligent computing cloud platform used here has been an excellent experience. It supports pay-as-you-go billing, which is flexible and convenient; large single-card memory options such as 32 GB and 48 GB are rare on other platforms; it provides pre-installed environments, and custom images can be saved and reused; and it offers a rich set of machine learning and model-training datasets, which is a great help for scientific research. The platform works with teachers and students from many universities across the country as well as researchers at scientific institutions, so it is safe, stable, and reliable. Register now to get a 5 yuan trial credit and a 20% spending discount; the link is in the comments section. Friends who need to rent computing power may wish to give it a try.

Through the steps above, we have learned how to use FastGPT to build an internal knowledge base Q&A system for an enterprise. The whole process is simple and easy to follow, and I hope you will try building your own enterprise-level AI Q&A system. If you have any questions or ideas, please share them in the comments. See you in the next video!
