Commit Graph

3082 Commits

Author     SHA1        Message  Date
oobabooga  3ffd7d36fd  Increase the repetition penalty for pygmalion  2023-02-24 12:31:30 -03:00
oobabooga  0b90e0b3b6  Update README.md  2023-02-24 12:01:07 -03:00
oobabooga  1a23e6d185  Add Pythia to README  2023-02-24 11:38:01 -03:00
oobabooga  fe5057f932  Simplify the extensions implementation  2023-02-24 10:01:21 -03:00
oobabooga  e26118eba9  Merge branch 'main' of github.com:oobabooga/text-generation-webui  2023-02-24 09:41:11 -03:00
oobabooga  2fb6ae6970  Move chat preprocessing into a separate function  2023-02-24 09:40:48 -03:00
oobabooga  f4f508c8e2  Update README.md  2023-02-24 09:03:09 -03:00
oobabooga  876761329b  Merge branch 'elwolf6-max_memory'  2023-02-24 08:55:55 -03:00
oobabooga  f6f792363b  Separate command-line params by spaces instead of commas  2023-02-24 08:55:09 -03:00
oobabooga  e260e84e5a  Merge branch 'max_memory' of https://github.com/elwolf6/text-generation-webui into elwolf6-max_memory  2023-02-24 08:47:01 -03:00
oobabooga  146f786c57  Reorganize a bit  2023-02-24 08:44:54 -03:00
oobabooga  c2f4c395b9  Clean up some chat functions  2023-02-24 08:31:30 -03:00
luis       5abdc99a7c  gpu-memory arg change  2023-02-23 18:43:55 -05:00
oobabooga  9ae063e42b  Fix softprompts when deepspeed is active (#112)  2023-02-23 20:22:47 -03:00
oobabooga  dac6fe0ff4  Reset the history if no default history exists on reload  2023-02-23 19:53:50 -03:00
oobabooga  3b8cecbab7  Reload the default chat on page refresh  2023-02-23 19:50:23 -03:00
oobabooga  f1914115d3  Fix minor issue with chat logs  2023-02-23 16:04:47 -03:00
oobabooga  682f7bdbba  Merge pull request #110 from oobabooga/refactored (Refactor everything)  2023-02-23 15:30:32 -03:00
oobabooga  b78561fba6  Minor bug fix  2023-02-23 15:26:41 -03:00
oobabooga  2e86a1ec04  Move chat history into shared module  2023-02-23 15:11:18 -03:00
oobabooga  c87800341c  Move function to extensions module  2023-02-23 14:55:21 -03:00
oobabooga  2048b403a5  Reorder functions  2023-02-23 14:49:02 -03:00
oobabooga  7224343a70  Improve the imports  2023-02-23 14:41:42 -03:00
oobabooga  364529d0c7  Further refactor  2023-02-23 14:31:28 -03:00
oobabooga  e46c43afa6  Move some stuff from server.py to modules  2023-02-23 13:42:23 -03:00
oobabooga  1dacd34165  Further refactor  2023-02-23 13:28:30 -03:00
oobabooga  ce7feb3641  Further refactor  2023-02-23 13:03:52 -03:00
oobabooga  98af4bfb0d  Refactor the code to make it more modular  2023-02-23 12:05:25 -03:00
oobabooga  18e0ec955e  Improve some descriptions in --help  2023-02-23 10:11:58 -03:00
oobabooga  ced5d9ab04  Update README.md  2023-02-23 10:04:07 -03:00
oobabooga  b18071330f  Update README.md  2023-02-23 01:32:05 -03:00
oobabooga  c72892835a  Don't show *-np models in the list of choices  2023-02-22 11:38:16 -03:00
oobabooga  95e536f876  Merge branch 'main' of github.com:oobabooga/text-generation-webui  2023-02-22 11:24:14 -03:00
oobabooga  044b963987  Add stop parameter for flexgen (#105)  2023-02-22 11:23:36 -03:00
oobabooga  b4a7f5fa70  Update README.md  2023-02-22 01:54:12 -03:00
oobabooga  ea21a22940  Remove redundant preset  2023-02-22 01:01:26 -03:00
oobabooga  b8b3d4139c  Add --compress-weight parameter  2023-02-22 00:43:21 -03:00
oobabooga  193fb1660a  Conversion seems to work better this way  2023-02-22 00:35:10 -03:00
oobabooga  eef6fc3cbf  Add a preset for FlexGen  2023-02-21 23:33:15 -03:00
oobabooga  311404e258  Reuse disk-cache-dir parameter for flexgen  2023-02-21 22:11:05 -03:00
oobabooga  f3c75bbd64  Add --percent flag for flexgen  2023-02-21 22:08:46 -03:00
oobabooga  b83f51ee04  Add FlexGen support #92 (experimental)  2023-02-21 21:00:06 -03:00
oobabooga  e52b697d5a  Add bf16 back here (the fp16 -> bf16 conversion takes a few seconds)  2023-02-21 00:54:53 -03:00
oobabooga  bc856eb962  Add some more margin  2023-02-20 20:49:21 -03:00
oobabooga  444cd69c67  Fix regex bug in loading character jsons with special characters  2023-02-20 19:38:19 -03:00
oobabooga  f867285e3d  Make the circle a bit less red  2023-02-20 18:41:38 -03:00
oobabooga  e4440cd984  Make highlighted text gray in cai-chat mode  2023-02-20 16:43:32 -03:00
oobabooga  bb1dac2f76  Convert the download option (A-Z) to upper case  2023-02-20 15:50:48 -03:00
oobabooga  d7a738fb7a  Load any 13b/20b/30b model in 8-bit mode when no flags are supplied  2023-02-20 15:44:10 -03:00
oobabooga  c1de491c63  No need to have bf16 support here  2023-02-20 15:12:42 -03:00