Merge remote-tracking branch 'refs/remotes/origin/dev' into dev

oobabooga 2023-11-07 08:25:22 -08:00
commit cee099f131
2 changed files with 7 additions and 4 deletions


@@ -20,9 +20,8 @@ Its goal is to become the [AUTOMATIC1111/stable-diffusion-webui](https://github.
 * [Multimodal pipelines, including LLaVA and MiniGPT-4](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/multimodal)
 * [Extensions framework](https://github.com/oobabooga/text-generation-webui/wiki/07-%E2%80%90-Extensions)
 * [Custom chat characters](https://github.com/oobabooga/text-generation-webui/wiki/03-%E2%80%90-Parameters-Tab#character)
-* Very efficient text streaming
 * Markdown output with LaTeX rendering, to use for instance with [GALACTICA](https://github.com/paperswithcode/galai)
-* OpenAI-compatible API server
+* OpenAI-compatible API server with Chat and Completions endpoints -- see the [examples](https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API#examples)
 ## Documentation
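
For context on the feature the updated README line advertises, here is a minimal non-streaming sketch of a request against the OpenAI-compatible Chat Completions endpoint. The local address comes from the `OPENAI_API_BASE` value shown later in this diff; the payload fields (including the project-specific `mode`) mirror the wiki examples linked above and are illustrative rather than authoritative.

```python
import requests

# Assumed default local address of the OpenAI-compatible API
# (see the OPENAI_API_BASE value further down in this diff).
url = "http://127.0.0.1:5000/v1/chat/completions"
headers = {"Content-Type": "application/json"}

history = [{"role": "user", "content": "Hello! Who are you?"}]
data = {
    "mode": "chat",  # project-specific field used in the wiki examples
    "messages": history,
}

response = requests.post(url, headers=headers, json=data, verify=False)
print(response.json()["choices"][0]["message"]["content"])
```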


@@ -174,11 +174,15 @@ while True:
     stream_response = requests.post(url, headers=headers, json=data, verify=False, stream=True)
     client = sseclient.SSEClient(stream_response)

+    assistant_message = ''
     for event in client.events():
         payload = json.loads(event.data)
-        print(payload['choices'][0]['message']['content'], end='')
+        chunk = payload['choices'][0]['message']['content']
+        assistant_message += chunk
+        print(chunk, end='')

     print()
+    history.append({"role": "assistant", "content": assistant_message})
 ```

 #### Python completions example with streaming
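
The hunk above shows only the tail of the streaming chat example, so the new `assistant_message` accumulation can be hard to read in isolation. Below is a self-contained sketch of how the modified loop fits together, assuming the default local address and the `url`/`headers`/`history` setup used earlier in the same example; the payload fields are illustrative.

```python
import json

import requests
import sseclient  # pip install sseclient-py

url = "http://127.0.0.1:5000/v1/chat/completions"
headers = {"Content-Type": "application/json"}
history = []

while True:
    user_message = input("> ")
    history.append({"role": "user", "content": user_message})
    data = {
        "mode": "chat",  # project-specific field, as in the wiki examples
        "stream": True,
        "messages": history,
    }

    stream_response = requests.post(url, headers=headers, json=data, verify=False, stream=True)
    client = sseclient.SSEClient(stream_response)

    # Accumulate the streamed chunks so the full reply can be appended to the
    # conversation history once the stream ends -- this is what the diff adds.
    assistant_message = ''
    for event in client.events():
        payload = json.loads(event.data)
        chunk = payload['choices'][0]['message']['content']
        assistant_message += chunk
        print(chunk, end='')

    print()
    history.append({"role": "assistant", "content": assistant_message})
```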
@@ -233,7 +237,7 @@ OPENAI_API_BASE=http://127.0.0.1:5000/v1
 With the [official python openai client](https://github.com/openai/openai-python), the address can be set like this:
-```shell
+```python
 import openai
 openai.api_key = "..."