Commit Graph

3407 Commits

Author SHA1 Message Date
oobabooga
5c3eb22ce6 Bump llama-cpp-python to 0.2.14 2023-11-07 14:20:43 -08:00
oobabooga
3fc505dc0f Document unused parameters 2023-11-07 08:56:09 -08:00
oobabooga
3d59346871 Implement echo/suffix parameters 2023-11-07 08:43:45 -08:00
oobabooga
cee099f131 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-11-07 08:25:22 -08:00
oobabooga
48c9c31440 Document the "preset" option in the API 2023-11-07 08:23:17 -08:00
oobabooga
d59f1ad89a Update README.md 2023-11-07 13:05:06 -03:00
oobabooga
0c440877de Update 12 - OpenAI API.md 2023-11-07 12:59:40 -03:00
oobabooga
55dc9845cb Update 12 - OpenAI API.md 2023-11-07 12:51:41 -03:00
oobabooga
b0b999dd68 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-11-07 07:46:08 -08:00
oobabooga
2bda1a9c9b Mention --api-key 2023-11-07 07:45:55 -08:00
oobabooga
cc04abda49 Update 12 - OpenAI API.md 2023-11-07 12:40:52 -03:00
oobabooga
ddca6948b2 Update 12 - OpenAI API.md 2023-11-07 12:39:59 -03:00
oobabooga
40e73aafce Update 12 - OpenAI API.md 2023-11-07 12:38:39 -03:00
oobabooga
6ec997f195 Update 12 - OpenAI API.md 2023-11-07 12:36:52 -03:00
oobabooga
15d4ea180d Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-11-07 07:35:36 -08:00
oobabooga
b2afdda4e8 Add more API examples 2023-11-07 07:35:04 -08:00
Morgan Cheng
349604458b Update 12 - OpenAI API.md (#4501)
Fix the typo in the argument: it should be `--api-port` instead of `--port`.

Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-11-07 11:22:17 -03:00
dependabot[bot]
fd893baba1 Bump optimum from 1.13.1 to 1.14.0 (#4492) 2023-11-07 00:13:41 -03:00
dependabot[bot]
18739c8b3a Update peft requirement from ==0.5.* to ==0.6.* (#4494) 2023-11-07 00:12:59 -03:00
oobabooga
79b3f5a546 Add /v1/internal/stop-generation to OpenAI API (#4498) 2023-11-07 00:10:42 -03:00
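
The `/v1/internal/stop-generation` endpoint added in this entry lets a client interrupt an in-progress generation. A minimal sketch with `requests`, assuming the server listens on the default local address and port 5000 and that the endpoint accepts a bare POST (both are assumptions here, not guarantees):

```python
import requests

# Assumed local server address; adjust host/port to match how the
# server was launched (e.g. with --api-port).
BASE_URL = "http://127.0.0.1:5000"

# A POST with an empty body asks the server to stop the current generation.
response = requests.post(f"{BASE_URL}/v1/internal/stop-generation")
response.raise_for_status()
print(response.text)
```
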
oobabooga
97c21e5667 Don't strip leading spaces in OpenAI API 2023-11-06 19:09:41 -08:00
oobabooga
4a45dc4041 Reorder the parameters in the FastAPI documentation 2023-11-06 09:55:36 -08:00
oobabooga
1fba6db69f Merge pull request #4488 from oobabooga/dev
Merge dev branch
2023-11-06 12:18:55 -03:00
oobabooga
0ed6a17ed4 Update warning 2023-11-06 07:17:49 -08:00
oobabooga
0db81355bc Reorder a parameter 2023-11-06 07:11:49 -08:00
oobabooga
b87c6213ae Remove obsolete endpoint 2023-11-06 05:45:45 -08:00
oobabooga
fcc9114b58 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2023-11-06 05:38:47 -08:00
oobabooga
ceb8c92dfc Update 12 - OpenAI API.md 2023-11-06 10:38:22 -03:00
oobabooga
28fd535f9c Make chat API more robust 2023-11-06 05:22:01 -08:00
oobabooga
5b5ef57049 Remove file 2023-11-05 21:39:59 -08:00
oobabooga
ec17a5d2b7 Make OpenAI API the default API (#4430) 2023-11-06 02:38:29 -03:00
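
With the OpenAI-compatible API as the project's default, a client can talk to the server with standard OpenAI-style requests. A minimal sketch, assuming the server listens on 127.0.0.1:5000; the key value is a placeholder, and the Authorization header is only needed when the server was started with `--api-key` (mentioned in an entry above):

```python
import requests

BASE_URL = "http://127.0.0.1:5000"
API_KEY = "sk-placeholder"  # placeholder; should match the value passed to --api-key

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {API_KEY}",  # omit when no --api-key is set
}

payload = {
    "messages": [{"role": "user", "content": "Hello, what can you do?"}],
    "max_tokens": 200,
    "temperature": 0.7,
}

r = requests.post(f"{BASE_URL}/v1/chat/completions", headers=headers, json=payload)
r.raise_for_status()
print(r.json()["choices"][0]["message"]["content"])
```
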
俞航
84d957ba62 [Fix] fix openai embedding_model loading as str (#4147) 2023-11-05 20:42:45 -03:00
kabachuha
e18a0460d4 fix openai extension not working because of absent new defaults (#4477) 2023-11-04 16:12:51 -03:00
oobabooga
b7a409ef57 Merge pull request #4476 from oobabooga/dev
Merge dev branch
2023-11-04 15:04:43 -03:00
oobabooga
fb3bd0203d Update docs 2023-11-04 11:02:24 -07:00
oobabooga
1d8c7c1fc4 Update docs 2023-11-04 11:01:15 -07:00
oobabooga
b5c53041b8 Merge pull request #4475 from oobabooga/dev
Merge dev branch
2023-11-04 14:19:55 -03:00
oobabooga
40f7f37009 Update requirements 2023-11-04 10:12:06 -07:00
Orang
2081f43ac2 Bump transformers to 4.35.* (#4474) 2023-11-04 14:00:24 -03:00
feng lui
4766a57352 transformers: add use_flash_attention_2 option (#4373) 2023-11-04 13:59:33 -03:00
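
The `use_flash_attention_2` option corresponds to the flag of the same name in the `transformers` `from_pretrained` call. A minimal sketch of using it outside the web UI, assuming a CUDA GPU, an installed flash-attn package, and the 4.35 release bumped in the entry just above; the model id is a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # placeholder; any model with FA2 support

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,    # FlashAttention-2 requires fp16 or bf16 weights
    use_flash_attention_2=True,   # enable the FlashAttention-2 attention path
).to("cuda")
```
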
wouter van der plas
add359379e fixed two links in the ui (#4452) 2023-11-04 13:41:42 -03:00
Casper
cfbd108826 Bump AWQ to 0.1.6 (#4470) 2023-11-04 13:09:41 -03:00
oobabooga
aa5d671579 Add temperature_last parameter (#4472) 2023-11-04 13:09:07 -03:00
oobabooga
1ab8700d94 Change frequency/presence penalty ranges 2023-11-03 17:38:19 -07:00
oobabooga
45fcb60e7a Make truncation_length_max apply to max_seq_len/n_ctx 2023-11-03 11:29:31 -07:00
oobabooga
7f9c1cbb30 Change min_p default to 0.0 2023-11-03 08:25:22 -07:00
oobabooga
4537853e2c Change min_p default to 1.0 2023-11-03 08:13:50 -07:00
kalomaze
367e5e6e43 Implement Min P as a sampler option in HF loaders (#4449) 2023-11-02 16:32:51 -03:00
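
Min P keeps only the tokens whose probability is at least `min_p` times the probability of the most likely token, so the cutoff adapts to how confident the model is. A rough sketch of the filtering step in isolation (not the project's actual implementation):

```python
import torch

def min_p_filter(logits: torch.Tensor, min_p: float) -> torch.Tensor:
    """Mask out tokens whose probability is below min_p * max_probability."""
    probs = torch.softmax(logits, dim=-1)
    threshold = min_p * probs.max(dim=-1, keepdim=True).values
    return logits.masked_fill(probs < threshold, float("-inf"))

# Example: with min_p=0.1, any token less than 10% as likely as the top
# token is removed before sampling.
logits = torch.tensor([[4.0, 3.0, 1.0, -2.0]])
print(torch.softmax(min_p_filter(logits, min_p=0.1), dim=-1))
```

With the default of 0.0 settled on in the entries above, the filter is effectively a no-op until the user raises it.
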
oobabooga
fcb7017b7a Remove a checkbox 2023-11-02 12:24:09 -07:00
Julien Chaumond
fdcaa955e3 transformers: Add a flag to force load from safetensors (#4450) 2023-11-02 16:20:54 -03:00
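
In `transformers`, forcing safetensors loading corresponds to the `use_safetensors=True` argument of `from_pretrained`, which errors out instead of falling back to pickle-based `.bin` checkpoints; this sketch assumes that is roughly what the flag toggles (the model id is a placeholder):

```python
from transformers import AutoModelForCausalLM

# use_safetensors=True makes loading fail rather than silently fall back to
# pickle-based *.bin weights, which can execute arbitrary code when unpickled.
model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-125m",  # placeholder model id
    use_safetensors=True,
)
```
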