Commit Graph

  • 113f94b61e Bump transformers (16-bit llama must be reconverted/redownloaded) oobabooga 2023-04-06 16:04:03 -0300
  • 7d9728b719 added .env and dockerfile to .dockerignore loeken 2023-04-06 20:58:24 +0200
  • 5f4f38ca5d Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-04-06 14:38:29 -0300
  • 9dbb219212 Add CSS for lists DavG25 2023-04-06 19:33:11 +0200
  • ef0f748618 Prevent CPU version of Torch from being installed (#10 from jllllll/oobabooga-windows) oobabooga 2023-04-06 13:54:14 -0300
  • d9e7aba714 Update README.md oobabooga 2023-04-06 13:42:24 -0300
  • 59058576b5 Remove unused requirement oobabooga 2023-04-06 13:28:21 -0300
  • eec3665845 Add instructions for updating requirements oobabooga 2023-04-06 13:24:01 -0300
  • 03cb44fc8c Add new llama.cpp library (2048 context, temperature, etc now work) oobabooga 2023-04-06 13:12:14 -0300
  • 39f3fec913 Broaden GPTQ-for-LLaMA branch support (#820) EyeDeck 2023-04-06 11:16:48 -0400
  • 075b2b4510 Sort imports oobabooga 2023-04-06 12:14:39 -0300
  • 8cd899515e Change instruct html a bit oobabooga 2023-04-06 12:00:20 -0300
  • d71fe9d5ea Note the Gradio bug that prevents the login page from working catalpaaa 2023-04-06 03:04:03 -0700
  • 8d1e346351 Fix Vicuna not stopping talking catalpaaa 2023-04-06 00:16:43 -0700
  • 4a28f39823 Update README.md oobabooga 2023-04-06 02:47:27 -0300
  • 77d6a1e12d Restore old branch arguments EyeDeck 2023-04-06 01:45:25 -0400
  • 42fcd0ec7f Broaden GPTQ-for-LLaMA branch support EyeDeck 2023-04-06 01:26:02 -0400
  • 158ec51ae3 Increase instruct mode padding oobabooga 2023-04-06 02:20:52 -0300
  • 0c7ef26981 Lora trainer improvements (#763) Alex "mcmonkey" Goodwin 2023-04-05 22:04:11 -0700
  • 5e4fb1254b Minor change oobabooga 2023-04-06 02:00:57 -0300
  • 5b301d9a02 Create a Model tab oobabooga 2023-04-06 01:54:05 -0300
  • 4a400320dd Clean up oobabooga 2023-04-06 01:47:00 -0300
  • e94ab5dac1 Minor fixes oobabooga 2023-04-06 01:43:10 -0300
  • 641646a801 Fix crash if missing instructions directory (#812) Randell Miller 2023-04-05 23:24:22 -0500
  • 3f3e42e26c Refactor several function calls and the API oobabooga 2023-04-06 01:22:15 -0300
  • 302e3b7973 Fix api extension oobabooga 2023-04-06 01:21:24 -0300
  • b6cb93fca0 Do not create the API in chat mode oobabooga 2023-04-06 00:53:34 -0300
  • 3ac7c9c80a Create new API oobabooga 2023-04-06 00:42:13 -0300
  • 9c3a585915 Create new API oobabooga 2023-04-06 00:31:58 -0300
  • e6569653b1 Bug fix oobabooga 2023-04-05 23:44:32 -0300
  • 9e31fe65ce Rename variables oobabooga 2023-04-05 23:38:01 -0300
  • 572f1d8bdb Remove unused import oobabooga 2023-04-05 23:34:27 -0300
  • 26935af4b6 Fix preset loading oobabooga 2023-04-05 23:31:31 -0300
  • c58fd41f46 Attempt at fixing preset loading oobabooga 2023-04-05 23:23:18 -0300
  • 119726d986 Fix bug oobabooga 2023-04-05 23:08:47 -0300
  • 23f319bb40 Clean up oobabooga 2023-04-05 23:00:19 -0300
  • 126bbc6970 Clean up oobabooga 2023-04-05 22:55:38 -0300
  • 849a54ef2d Remove variables oobabooga 2023-04-05 22:53:21 -0300
  • f1dd728413 Remove unneeded variables oobabooga 2023-04-05 22:48:11 -0300
  • 9a064b78e6 Minor simplification oobabooga 2023-04-05 22:43:10 -0300
  • 92ea89e59a Reorder parameters oobabooga 2023-04-05 22:39:03 -0300
  • 77232fa68e Rename a variable oobabooga 2023-04-05 22:32:52 -0300
  • cfdbc8bd23 Move more widgets into generation_parameters oobabooga 2023-04-05 22:30:45 -0300
  • 9c4f5f178d Fix impersonate DavG25 2023-04-06 03:26:42 +0200
  • 5606577d13 Merge branch 'oobabooga:main' into api-example-bash-script SDS 2023-04-06 03:11:51 +0200
  • 64978b45fe Remove broken api oobabooga 2023-04-05 22:00:23 -0300
  • 97e8ea219b Use **kwargs in generate_chat_prompt oobabooga 2023-04-05 21:38:49 -0300
  • 10d7d85dcb Fix crash if missing instructions directory Randell Miller 2023-04-05 19:23:33 -0500
  • cf239c1232 Merge branch 'main' into state_as_function_params oobabooga 2023-04-05 19:36:54 -0300
  • 3a38fa18d0 Image strip captioning and plain language wrapper dibrale 2023-04-05 16:52:49 -0500
  • 378d21e80c Add LLaMA-Precise preset (#767) SDS 2023-04-05 23:52:36 +0200
  • 1e656bef25 Specifically target cuda 11.7 ver. of torch 2.0.0 jllllll 2023-04-05 16:52:05 -0500
  • 0ab3046504 Merge branch 'oobabooga:main' into patch-1 dibrale 2023-04-05 16:47:18 -0500
  • 613996dd01 Use state as function param oobabooga 2023-04-05 17:22:05 -0300
  • 19b516b11b fix link to streaming api example (#803) eiery 2023-04-05 13:50:23 -0400
  • 7617ed5bfd Add AMD instructions oobabooga 2023-04-05 14:42:58 -0300
  • 290416bd8a fix link to streaming api example eiery 2023-04-05 13:41:22 -0400
  • 770ef5744f Update README oobabooga 2023-04-05 14:38:11 -0300
  • 8203ce0cac Stop character pic from being cached when changing chars or clearing. (#798) Forkoz 2023-04-05 12:25:01 -0500
  • 7f66421369 Fix loading characters oobabooga 2023-04-05 14:22:32 -0300
  • aa6a9aaee3 Merge branch 'main' into default-character Brian O'Connor 2023-04-05 13:16:28 -0400
  • 90141bc1a8 Fix saving prompts on Windows oobabooga 2023-04-05 14:08:54 -0300
  • cf2c4e740b Disable gradio analytics globally oobabooga 2023-04-05 14:05:50 -0300
  • e722c240af Add Instruct mode oobabooga 2023-04-05 11:49:59 -0300
  • ef71ebf638 improve error messages, especially for LoRA-over-LoRA Alex "mcmonkey" Goodwin 2023-04-05 09:41:07 -0700
  • a00374717b Stop character pic from being cached when changing chars or clearing. Forkoz 2023-04-05 10:23:34 -0500
  • 46d2a2d721 Update api-bash-script.sh: usage and examples SDS 2023-04-05 14:33:56 +0200
  • dafb5d35c5 Rename api-example.sh to api-bash-script.sh SDS 2023-04-05 14:32:03 +0200
  • b7189a7d25 Update readme.md innightwolfsleep 2023-04-05 17:36:01 +0600
  • 53c5307aab Update readme.md innightwolfsleep 2023-04-05 17:34:55 +0600
  • d42de1e0c4 Merge branch 'main' into patch-1 ye7iaserag 2023-04-05 11:38:31 +0200
  • 9f1673e3a7 Merge branch 'oobabooga:main' into patch-1 dibrale 2023-04-05 02:34:35 -0500
  • 3d6cb5ed63 Minor rewrite oobabooga 2023-04-05 01:21:40 -0300
  • f3a2e0b8a9 Disable pre_layer when the model type is not llama oobabooga 2023-04-05 01:19:26 -0300
  • 08820d0769 Move cutoff_len back oobabooga 2023-04-05 00:42:31 -0300
  • ca8bb38949 Simplify gallery oobabooga 2023-04-05 00:34:17 -0300
  • 8781c84287 Add support for latest cuda branch oobabooga 2023-04-05 00:09:53 -0300
  • 4ab679480e allow quantized model to be loaded from model dir (#760) catalpaaa 2023-04-04 19:19:38 -0700
  • fe56417f41 Minor change oobabooga 2023-04-04 23:19:14 -0300
  • ae1fe45bc0 One more cache reset oobabooga 2023-04-04 23:15:57 -0300
  • 8ef89730a5 Try to better handle browser image cache oobabooga 2023-04-04 23:09:28 -0300
  • 6fcf13c138 Merge branch 'main' into default-character Brian O'Connor 2023-04-04 22:07:59 -0400
  • cc6c7a37f3 Add make_thumbnail function oobabooga 2023-04-04 23:03:58 -0300
  • 80dfba05f3 Better crop/resize cached images oobabooga 2023-04-04 22:52:15 -0300
  • 65d8a24a6d Show profile pictures in the Character tab oobabooga 2023-04-04 22:28:49 -0300
  • 5c289a41b9 Use --model-dir argument in path_to_model Charles Nguyen 2023-04-05 03:21:58 +0200
  • f70a2e3ad4 Second attempt at fixing empty space oobabooga 2023-04-04 18:30:34 -0300
  • 9c86acda67 Fix huge empty space in the Character tab oobabooga 2023-04-04 18:07:34 -0300
  • 38258c4471 Remove unused locale import da3dsoul 2023-04-04 16:34:45 -0400
  • 1848938f7f Improve Silero Preprocessing to Handle European Numbers and Decimals; also add a test script to generate audio clips from CLI da3dsoul 2023-04-04 16:24:25 -0400
  • 38afc2470c Change indentation oobabooga 2023-04-04 16:32:27 -0300
  • b2ce7282a1 Use past transformers version #773 oobabooga 2023-04-04 16:11:42 -0300
  • d5f3036687 Improve Silero's Preprocessor to Handle Abbreviations and Initials Better da3dsoul 2023-04-04 14:15:58 -0400
  • 5aaf771c7d Add additional sanity check jllllll 2023-04-04 12:31:26 -0500
  • 9a5e27889b tested 8bit, added examples for 8bit model download/cli args to start loeken 2023-04-04 18:45:38 +0200
  • ee4547cd34 Detect "vicuna" as llama model type (#772) OWKenobi 2023-04-04 18:23:27 +0200
  • 14a6a5f623 Merge pull request #1 from OWKenobi/OWKenobi-vicuna OWKenobi 2023-04-04 18:14:58 +0200
  • ba9537ad19 Merge branch 'oobabooga:main' into OWKenobi-vicuna OWKenobi 2023-04-04 18:12:35 +0200
  • 881dbc3d44 Add back the name oobabooga 2023-04-04 13:11:34 -0300
  • 347c9b10a3 Merge branch 'oobabooga:main' into OWKenobi-vicuna OWKenobi 2023-04-04 18:08:11 +0200