Welcome! This guide walks you through enabling the openai extension for VPet-Simulator, step by step, and explains why you should point VPet at localhost rather than 0.0.0.0 when connecting to the local API.
1. Install oobabooga text-generation-webui
2. Openai Extension:
- At your oobabooga\oobabooga-windows installation directory, launch cmd_windows.bat (or micromamba-cmd.bat if you used the older version of the webui installer).
- In that console, change to the extension's directory: extensions\openai inside the text-generation-webui folder.
- Install the extension's requirements:
pip3 install -r requirements.txt
- Launch the webui. On the Session tab, tick the openai extension, then click "Apply and restart".
If all goes well, a line like this appears in your console:
OpenAI compatible API ready at: OPENAI_API_BASE=http://0.0.0.0:5001/v1
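Before wiring up VPet, you can sanity-check that the API is actually listening. This is a minimal sketch, assuming the webui is running locally with the extension enabled on port 5001 (the default shown in the console line above); it queries the OpenAI-style /v1/models endpoint, which the extension emulates.

```python
# Sanity check: ask the local openai extension which models it exposes.
# Assumes the webui is running on port 5001 (the default in this guide).
import json
import urllib.request

BASE_URL = "http://localhost:5001/v1"  # connect via localhost, not 0.0.0.0

def models_url(base_url: str) -> str:
    """Build the URL of the OpenAI-compatible model-listing endpoint."""
    return base_url.rstrip("/") + "/models"

try:
    with urllib.request.urlopen(models_url(BASE_URL), timeout=2) as resp:
        # A JSON object listing the models the server can serve.
        print(json.load(resp))
except OSError:
    print("Server not reachable - launch the webui with the openai extension first.")
```

If the request fails, re-check that the extension was enabled and the console printed the "OpenAI compatible API ready" line.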
3. Connect to Your Local API
Now is the time to show Chibi how to use our local API.
- Click on the character and select System->Settings.
- Select “Use the API requested by ChatGPT”.
- Open the ChatGPT settings panel (it is inside the same Settings menu).
Set the API URL to:
http://localhost:5001/v1/chat/completions
If you instead set it to http://0.0.0.0:5001/v1/chat/completions, a SocketException will appear: 0.0.0.0 is a "listen on all interfaces" bind address, not a destination you can connect to, so .NET rejects it. Using localhost in its place keeps .NET happy.
API Key can be anything. Ooba’s webui does not check the key.
Now you can chat with your Chibi through your local LLM, no cloud API required.
If you find anything incorrect or outdated in this guide, please let us know in the comments below and we will fix it as quickly as possible. We are indebted to yukinanka, whose perceptive guide served as the impetus for this one.