Author | Commit | Message | Date
kalomaze | b6077b02e4 | Quadratic sampling (#5403) (co-authored with oobabooga) | 2024-02-04 00:20:02 -03:00
Forkoz | 528318b700 | API: Remove tiktoken from logit bias (#5391) | 2024-01-28 21:42:03 -03:00
oobabooga | aa575119e6 | API: minor fix | 2024-01-22 04:38:43 -08:00
oobabooga | 821dd65fb3 | API: add a comment | 2024-01-22 04:15:51 -08:00
oobabooga | 6247eafcc5 | API: better handle temperature = 0 | 2024-01-22 04:12:23 -08:00
oobabooga | 817866c9cf | Lint | 2024-01-22 04:07:25 -08:00
oobabooga | aad73667af | Lint | 2024-01-22 03:25:55 -08:00
Cohee | fbf8ae39f8 | API: Allow content arrays for multimodal OpenAI requests (#5277) | 2024-01-22 08:10:26 -03:00
Ercan | 166fdf09f3 | API: Properly handle Images with RGBA color format (#5332) | 2024-01-22 08:08:51 -03:00
lmg-anon | db1da9f98d | Fix logprobs tokens in OpenAI API (#5339) | 2024-01-22 08:07:42 -03:00
Stefan Daniel Schwarz | 232c07bf1f | API: set do_sample=false when temperature=0 (#5275) | 2024-01-17 23:58:11 -03:00
oobabooga | e055967974 | Add prompt_lookup_num_tokens parameter (#5296) | 2024-01-17 17:09:36 -03:00
Samuel Weinhardt | 952a05a7c8 | Correct field alias types for OpenAI extension (#5257) | 2024-01-14 13:30:36 -03:00
oobabooga | bb2c4707c4 | API: fix bug after previous commit | 2024-01-09 19:08:02 -08:00
oobabooga | 4332e24740 | API: Make user_name/bot_name the official and name1/name2 the alias | 2024-01-09 19:06:11 -08:00
oobabooga | a4c51b5a05 | API: add "user_name" and "bot_name" aliases for name1 and name2 | 2024-01-09 19:02:45 -08:00
oobabooga | 29c2693ea0 | dynatemp_low, dynatemp_high, dynatemp_exponent parameters (#5209) | 2024-01-08 23:28:35 -03:00
oobabooga | 0d07b3a6a1 | Add dynamic_temperature_low parameter (#5198) | 2024-01-07 17:03:47 -03:00
kalomaze | 48327cc5c4 | Dynamic Temperature HF loader support (#5174) (co-authored with oobabooga) | 2024-01-07 10:36:26 -03:00
Philipp Claßen | 3eca20c015 | Typo fixed in variable names (#5184) | 2024-01-06 03:05:03 -03:00
kabachuha | dbe438564e | Support for sending images into OpenAI chat API (#4827) | 2023-12-22 22:45:53 -03:00
oobabooga | 23818dc098 | Better logger (credits: vladmandic/automatic) | 2023-12-19 20:38:33 -08:00
Felipe Ferreira | 11f082e417 | [OpenAI Extension] Add more types to Embeddings Endpoint (#4895) | 2023-12-15 00:26:16 -03:00
Kim Jaewon | e53f99faa0 | [OpenAI Extension] Add 'max_logits' parameter in logits endpoint (#4916) | 2023-12-15 00:22:43 -03:00
oobabooga | 39d2fe1ed9 | Jinja templates for Instruct and Chat (#4874) | 2023-12-12 17:23:14 -03:00
oobabooga | 2a335b8aa7 | Cleanup: set shared.model_name only once | 2023-12-08 06:35:23 -08:00
oobabooga | 2c5a1e67f9 | Parameters: change max_new_tokens & repetition_penalty_range defaults (#4842) | 2023-12-07 20:04:52 -03:00
oobabooga | b6d16a35b1 | Minor API fix | 2023-11-21 17:56:28 -08:00
oobabooga | f0d66cf817 | Add missing file | 2023-11-19 10:12:13 -08:00
Jordan Tucker | cb836dd49c | fix: use shared chat-instruct_command with api (#4653) | 2023-11-19 01:19:10 -03:00
oobabooga | 771e62e476 | Add /v1/internal/lora endpoints (#4652) | 2023-11-19 00:35:22 -03:00
oobabooga | ef6feedeb2 | Add --nowebui flag for pure API mode (#4651) | 2023-11-18 23:38:39 -03:00
oobabooga | 0fa1af296c | Add /v1/internal/logits endpoint (#4650) | 2023-11-18 23:19:31 -03:00
oobabooga | 8f4f4daf8b | Add --admin-key flag for API (#4649) | 2023-11-18 22:33:27 -03:00
wizd | af76fbedb8 | Openai embedding fix to support jina-embeddings-v2 (#4642) | 2023-11-18 20:24:29 -03:00
oobabooga | e0a7cc5e0f | Simplify CORS code | 2023-11-16 20:11:55 -08:00
oobabooga | c0233bb9d3 | Minor message change | 2023-11-16 18:36:57 -08:00
oobabooga | 510a01ef46 | Lint | 2023-11-16 18:03:06 -08:00
oobabooga | a475aa7816 | Improve API documentation | 2023-11-15 18:39:08 -08:00
oobabooga | a85ce5f055 | Add more info messages for truncation / instruction template | 2023-11-15 16:20:31 -08:00
oobabooga | e6f44d6d19 | Print context length / instruction template to terminal when loading models | 2023-11-15 16:00:51 -08:00
oobabooga | be125e2708 | Add /v1/internal/model/unload endpoint | 2023-11-15 15:48:33 -08:00
oobabooga | 52758f15da | Remove sentence-transformers requirement (for #1575) | 2023-11-10 07:35:29 -08:00
oobabooga | c5be3f7acb | Make /v1/embeddings functional, add request/response types | 2023-11-10 07:34:27 -08:00
oobabooga | 0777b0d3c7 | Add system_message parameter, document model (unused) parameter | 2023-11-10 06:47:18 -08:00
oobabooga | 4aabff3728 | Remove old API, launch OpenAI API with --api | 2023-11-10 06:39:08 -08:00
GuizzyQC | 6a7cd01ebf | Fix bug with /internal/model/load (#4549): update shared.model_name after loading a model through an API call | 2023-11-10 00:16:38 -03:00
oobabooga | d86f1fd2c3 | OpenAI API: stop streaming on client disconnect (closes #4521) | 2023-11-09 06:37:32 -08:00
oobabooga | effb3aef42 | Prevent deadlocks in OpenAI API with simultaneous requests | 2023-11-08 20:55:39 -08:00
oobabooga | 678fd73aef | Document /v1/internal/model/load and fix a bug | 2023-11-08 17:41:12 -08:00
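
Several of the commits above introduce the API surface referenced throughout this history: the OpenAI-compatible server launched with --api, the --admin-key flag, and the internal endpoints /v1/internal/model/load, /v1/internal/model/unload, and /v1/internal/logits. Below is a minimal client-side sketch, not taken from the commits themselves: it assumes the server listens on http://127.0.0.1:5000, that the admin key is sent as a bearer token, and that the request bodies use a "model_name" / "prompt" / "max_logits" shape (only "max_logits" is confirmed by the log above, via #4916).

```python
import requests

# Assumptions (not from the commit log): server started with
# "--api --admin-key ADMIN_KEY", default local port 5000, admin key
# passed as a bearer token, and the JSON field names shown below.
BASE = "http://127.0.0.1:5000"
HEADERS = {"Authorization": "Bearer ADMIN_KEY"}

# Load a model by name (field name assumed).
resp = requests.post(f"{BASE}/v1/internal/model/load",
                     headers=HEADERS,
                     json={"model_name": "my-model"})
resp.raise_for_status()

# Request raw logits for a prompt; "max_logits" is the parameter
# added in #4916, the rest of the payload shape is assumed.
logits = requests.post(f"{BASE}/v1/internal/logits",
                       headers=HEADERS,
                       json={"prompt": "Hello", "max_logits": 10}).json()
print(logits)

# Unload the currently loaded model.
requests.post(f"{BASE}/v1/internal/model/unload", headers=HEADERS)
```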