Flowise: A Comprehensive Guide


Introduction

Flowise is an open-source, low-code tool tailored for developers to build customised orchestration flows and AI agents for Large Language Models (LLMs). It simplifies the development cycle, enabling quick iterations and a smooth transition from testing to production. In this article, we will explore the primary use cases for Flowise.

Get Started

Flowise offers two deployment options: installing it locally or accessing the hosted product through the official website. We chose a local deployment for our experimentation.

Download with NodeJS

For a local installation with NodeJS, install the package globally with npm install -g flowise and launch it with npx flowise start; the application is then served at http://localhost:3000. Flowise also provides a Docker Compose setup that exposes the same interface on http://localhost:3000, which is what the screenshots below show.

Docker Compose Command Running Interface
Docker Compose Command Result Interface
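Before building any flows, it can be useful to confirm that the local instance is reachable. The snippet below is a minimal sketch in Python; the requests library and the default port 3000 are assumptions based on the standard setup.

```python
import requests

# Minimal health check for a locally running Flowise instance.
# Assumes the default port (3000); adjust the URL if you changed it.
response = requests.get("http://localhost:3000", timeout=5)
print("Flowise UI reachable:", response.status_code == 200)
```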

Drag-And-Drop UI: Build Customised LLM Applications

Flowise empowers users to create custom applications that harness LLM capabilities for specific needs through a user-friendly drag-and-drop interface. To kickstart your LLM application, explore the Marketplace section, where you can select a pre-made template and tailor it to your requirements. Let’s walk through this process with a step-by-step example.

1. First, once you access the interface, head over to the Marketplace tab. This page offers a selection of ready-to-use templates, complete with filters to search for specific use cases such as Customer Support.
Flowise Marketplace Tab

2. For our demonstration, we chose the Conversational Retrieval QA Chain template to create a chatbot that answers questions about uploaded text files. To start using it, open the template and click the “Use Template” button in the right-hand corner.

Flowise Template Page

3. To streamline the process of acquiring an API key, we replaced the Mistral chat models with OpenAI chat models. Additionally, we substituted the FAISS library with Pinecone for vector data storage and retrieval (highlighted by green rectangles in the image). Here’s a concise overview of each node shown in the screenshot:

Flowise Node Description
OpenAI API Creation

In this case, we’re using OpenAI’s text-embedding-ada-002, which produces embeddings with 1536 dimensions. Therefore, when creating the Pinecone index for these embeddings, set the dimension to 1536.

Pinecone Index Creation
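If you prefer to create the index programmatically rather than through the Pinecone console shown above, a minimal sketch with the Pinecone Python client (v3+ style) looks like the following; the API key, index name, cloud, and region are placeholders, while the dimension matches text-embedding-ada-002.

```python
from pinecone import Pinecone, ServerlessSpec

# Placeholder API key; replace with your own Pinecone credentials.
pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")

# text-embedding-ada-002 returns 1536-dimensional vectors,
# so the index dimension must also be 1536.
pc.create_index(
    name="flowise-demo",
    dimension=1536,
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-east-1"),
)
```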

With the API credentials in place, we can now proceed to upload our content into the workflow:

Flowise Workflow Running Instructions
Flowise Document Upserted To Vector Database
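Upserting does not have to go through the UI: Flowise also exposes a document upsert endpoint per chatflow. The sketch below assumes a local deployment, a placeholder chatflow ID and file name, and the /api/v1/vector/upsert path used by recent Flowise versions; check the flow’s API dialog if your version differs.

```python
import requests

# Placeholder chatflow ID; copy the real one from the Flowise UI.
CHATFLOW_ID = "your-chatflow-id"
API_URL = f"http://localhost:3000/api/v1/vector/upsert/{CHATFLOW_ID}"

# Send the text file as multipart form data so the flow's document
# loader can ingest, embed, and upsert it into the vector store.
with open("notes.txt", "rb") as f:
    response = requests.post(API_URL, files={"files": f})

print(response.json())
```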

4. Your chatbot is now operational and ready for interaction. To begin:

Flowise Workflow Start Button
Flowise Chatbot Dialog Box

API Integration: Extending Flowise Capabilities

Flowise offers a powerful API alongside its drag-and-drop interface. This API allows programmatic access to Flowise features, enabling automated workflows, custom integrations, and remote workflow management. For instance, interacting with our chatbot only requires a simple POST request to the prediction endpoint.

Flowise API Call
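The same call can be reproduced in a few lines of Python. The sketch below assumes a local deployment and uses a placeholder chatflow ID; the Flowise UI also provides ready-made code snippets for each flow in its API dialog.

```python
import requests

# Placeholder chatflow ID; replace with the ID of your deployed flow.
CHATFLOW_ID = "your-chatflow-id"
API_URL = f"http://localhost:3000/api/v1/prediction/{CHATFLOW_ID}"

# The prediction endpoint expects a JSON body with the user question.
payload = {"question": "What is the uploaded document about?"}
response = requests.post(API_URL, json=payload)

print(response.json())  # the chatbot's answer and metadata
```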

Conclusion

Flowise offers a versatile platform for building and deploying AI-powered applications. Whether you prefer the intuitive drag-and-drop interface for visual workflow creation or the powerful API for programmatic integration, Flowise provides the tools to bring your AI projects to life.

About the Author

Mingrui Gao
Intern at Research Graph Foundation