How to create your own AI chatbot
Why create your own AI chatbot?
Creating something from scratch yourself can be a highly rewarding and beneficial process.
- You get to build your own solution, tailored to what you need
- You sharpen your understanding of AI
- It builds confidence in your skills
- It is cheaper than a standard ChatGPT Pro subscription
- It is private - at least OpenAI claims they do not train on your data
- Once you know how to do this, you can use it in any kind of project
On the other hand, you don't want to make it too complex, otherwise you will never finish and will lose interest. So let's build a quick demo AI chatbot.
Assuming you have Node.js installed, let's get started.
What we will build
We will build a simple chatbot that answers questions (about the weather, for example), using the OpenAI API to generate the responses.
It is the simplest possible chatbot implementation: you can have it answer your questions or give it custom prompts. You can run it locally on your machine or deploy it to a server, although you should add authentication if you do that.
How to build it
Create project
Create a new Next.js project - I always use the latest version. You can do this by running the following command in your terminal:
npx create-next-app@latest
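The CLI will walk you through a few setup prompts (TypeScript, App Router, and so on). If you prefer to skip them, recent versions also accept flags - this is just a sketch with a placeholder project name, so check npx create-next-app@latest --help for the options your version supports:
npx create-next-app@latest my-chatbot --typescript --app --eslint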
Get OpenAI API key
There are countless tutorials out there on how to do this, so I will not go into detail here. You can find the instructions here: https://platform.openai.com/api-keys
Store your API key
Create a new file called .env.local in the project root and add your API key:
echo OPENAI_API_KEY=your-api-key > .env.local
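Install dependencies
The code in the next steps uses the openai package and the ai package (the Vercel AI SDK), which provides the streaming helpers and the useChat hook. Assuming you use npm, install both from the project folder:
npm install openai ai
Note that OpenAIStream and StreamingTextResponse ship with the older 2.x/3.x releases of the ai package; if the version you installed no longer exports them, pin an earlier major, for example npm install ai@3.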
Create a new API route
// app/api/chat/route.ts
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

// Optional, but recommended: run on the edge runtime.
// See https://vercel.com/docs/concepts/functions/edge-functions
export const runtime = 'edge';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
});

export async function POST(req: Request) {
  // Extract the `messages` from the body of the request
  const { messages } = await req.json();

  // Request the OpenAI API for the response based on the prompt
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: messages,
  });

  // Convert the response into a friendly text-stream
  const stream = OpenAIStream(response);

  // Respond with the stream
  return new StreamingTextResponse(stream);
}
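Once the dev server is running (npm run dev), you can test the route before wiring up the UI. A quick check with curl, assuming the default port 3000:
curl http://localhost:3000/api/chat -H "Content-Type: application/json" -d '{"messages":[{"role":"user","content":"Hello"}]}'
The answer streams back as plain text in your terminal.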
Create Chat component
// app/components/chat.tsx
"use client";

import { useChat } from "ai/react";

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      <ul>
        {messages.map((m, index) => (
          <li key={index}>
            {m.role === "user" ? "User: " : "AI: "}
            {m.content}
          </li>
        ))}
      </ul>
      <form onSubmit={handleSubmit}>
        <label>
          Say something...
          <input value={input} onChange={handleInputChange} />
        </label>
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
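Note that useChat posts to /api/chat by default, which is exactly the route we created above. If you put the route somewhere else, pass the path explicitly - a minimal sketch:
const { messages, input, handleInputChange, handleSubmit } = useChat({
  api: "/api/chat", // the default; change this if your route lives at a different path
});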
Adapt page
You can adapt the main page.tsx in your root app folder to use the chat component, or create a new page.tsx in whatever folder you like.
// app/page.tsx
import Chat from "./components/chat";

export const runtime = "edge";

export default function Page() {
  return <Chat />;
}
Run it
That's it - that is the most basic implementation of an AI chatbot that you can run in the browser and interact with. Of course, there is much more to extend upon, but this is the basic setup. You can run it locally with npm run dev or deploy it to Vercel to have it served as a webpage. If you deploy, remember to set OPENAI_API_KEY in your hosting provider's environment variables, since .env.local is not deployed.
If you did everything correctly, you should see a simple, unstyled chat interface in your browser.
From here you can extend it to your liking. You can add custom prompts, add more functionality, or even add a database to store the conversations. And if you care about how it looks, some CSS goes a long way.
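For example, to give the bot a persona or custom instructions, you can prepend a system message to the conversation inside the API route. A minimal sketch - the prompt text is just a placeholder:
// app/api/chat/route.ts (inside POST)
const response = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo',
  stream: true,
  // Prepend a system message so every request carries your custom instructions
  messages: [
    { role: 'system', content: 'You are a friendly weather assistant.' },
    ...messages,
  ],
});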
Errors
A frequent error I ran into is that Next.js will somehow not pick up your API key. It says it does, but it doesn't. I could only resolve that issue by deleting the .env.local file and creating it again - and restarting the dev server, since changes to environment files may not be picked up until it restarts.
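One way to confirm whether the key is actually loaded is to log its presence (never the value itself) at the top of the route handler. A temporary debugging sketch:
// app/api/chat/route.ts (remove after debugging)
console.log('OPENAI_API_KEY set:', Boolean(process.env.OPENAI_API_KEY));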