# AI Services

Shipped includes an AI Chatbot to showcase how you can integrate AI services into your product.

You can open the AI Chatbot at `http://localhost:3000/ai-chatbot`.

{% embed url="https://www.loom.com/share/a5ca9cf3309245a4a87fac673ea33eeb?sid=d8e636bf-85cb-4a53-938f-9e2703eb4a2c" %}

You can find the AI Chatbot UI in the file `src/components/pages/AIChatbot/AIChatbot.tsx`, while the backend route is at `src/app/api/chat/[provider]/route.ts`.
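The `[provider]` segment of the route determines which AI service handles the request. As a rough sketch of how that segment could be resolved (the names below are illustrative, not the actual Shipped code), the backend can map it to a default model id and the environment variable that holds the matching API key:

```typescript
// Hypothetical helper for a route like /api/chat/[provider] — one way such a
// mapping could look, not the actual Shipped implementation.
type Provider = "openai" | "anthropic" | "google";

interface ProviderConfig {
  model: string; // a default model id for the provider
  envVar: string; // environment variable holding the API key
}

const PROVIDERS: Record<Provider, ProviderConfig> = {
  openai: { model: "gpt-4o-mini", envVar: "OPENAI_API_KEY" },
  anthropic: { model: "claude-3-5-sonnet-latest", envVar: "ANTHROPIC_API_KEY" },
  google: { model: "gemini-1.5-flash", envVar: "GOOGLE_GENERATIVE_AI_API_KEY" },
};

export function resolveProvider(segment: string): ProviderConfig {
  const entry = PROVIDERS[segment.toLowerCase() as Provider];
  if (!entry) {
    throw new Error(`Unsupported provider: ${segment}`);
  }
  return entry;
}
```

The route handler can then pass the resolved model to the Vercel AI SDK and stream the response back to the UI.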

This example AI Chatbot implementation shows how to use the AI services **OpenAI**, **Anthropic**, and **Google Gemini**, but Shipped is built on the Vercel AI SDK, which supports many other AI services, like:

* xAI Grok
* Azure OpenAI
* Amazon Bedrock
* Google Vertex AI
* Mistral
* DeepSeek
* Perplexity
* Ollama

and many more. Check out the [official page](https://sdk.vercel.ai/docs/foundations/providers-and-models) for the full list of AI providers.

### Configuration

To use the AI Chatbot, you need to configure some environment variables:

* `OPENAI_API_KEY`
* `ANTHROPIC_API_KEY`
* `GOOGLE_GENERATIVE_AI_API_KEY`

You can generate the API keys by creating an account on the respective AI services' websites.
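For local development, these variables typically go in a `.env.local` file at the project root (the values below are placeholders, not real keys):

```
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_GENERATIVE_AI_API_KEY=...
```

Remember to keep this file out of version control, since it contains secrets.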

{% hint style="warning" %}
**Let Users Provide Their Own Keys**

If you prefer not to provide your own AI service API keys, and instead let users supply their own, you can:

* add a modal that asks for the API key
* save the API key in the browser's local storage
* update the backend request and endpoint to read the API key from the request body
{% endhint %}
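For the last of those steps, the endpoint would read the key from the parsed JSON body instead of `process.env`. A minimal sketch of such a validator, with illustrative names rather than Shipped's actual code:

```typescript
// Hypothetical validator: pull a user-supplied API key out of a parsed
// request body, throwing if it is missing or malformed.
interface ChatRequestBody {
  apiKey?: unknown;
  // ...other fields, e.g. the chat messages
}

export function extractApiKey(body: ChatRequestBody): string {
  const { apiKey } = body;
  if (typeof apiKey !== "string" || apiKey.trim() === "") {
    throw new Error("Missing API key in request body");
  }
  return apiKey.trim();
}
```

On the client side, the key saved in local storage would be included in the JSON body of the fetch call to the chat endpoint.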

The Vercel AI SDK is a powerful library, and the [documentation](https://sdk.vercel.ai/docs/foundations/overview) describes many use cases and features. I recommend reading it if you want to learn more about what's possible.
