Commit Graph

  • debda65833 Reorder parameters oobabooga 2023-05-22 19:31:04 -0300
  • a3c45f77b0 Add a dark_theme settings RDeckard 2023-05-22 23:15:00 +0200
  • 7eb2bfd12b Change step for mirostat_tau oobabooga 2023-05-22 18:02:13 -0300
  • 9bfb10e0bc Add mirostat parameters for llama.cpp oobabooga 2023-05-22 17:55:25 -0300
  • 393ec243e1 Support of the --gradio-auth flag RDeckard 2023-05-22 21:28:26 +0200
  • 9f747d462e Merge branch 'main' into main flurb18 2023-05-22 11:28:42 -0400
  • ec7437f00a Better way to toggle light/dark mode oobabooga 2023-05-22 03:19:01 -0300
  • d46f5a58a3 Add a button for toggling dark/light mode oobabooga 2023-05-22 03:11:44 -0300
  • c6f4dccce5 Bump llama-cpp-python from 0.1.51 to 0.1.53 dependabot[bot] 2023-05-22 05:51:01 +0000
  • baf75356d4 Bump transformers from 4.29.1 to 4.29.2 (#2268) dependabot[bot] 2023-05-22 02:50:18 -0300
  • 4ba358e2c7 Bump gradio from 3.31.0 to 3.32.0 dependabot[bot] 2023-05-22 05:49:47 +0000
  • 01824c4594 Bump rwkv from 0.7.3 to 0.7.4 dependabot[bot] 2023-05-22 05:49:35 +0000
  • c26bc928bd Bump transformers from 4.29.1 to 4.29.2 dependabot[bot] 2023-05-22 05:49:31 +0000
  • 4372eb228c Increase the interface area by 10px oobabooga 2023-05-22 00:55:33 -0300
  • 753f6c5250 Attempt at making interface restart more robust oobabooga 2023-05-22 00:26:07 -0300
  • 30225b9dd0 Fix --no-stream queue bug oobabooga 2023-05-22 00:02:59 -0300
  • 07233a85c4 fixed num_chunks nabelekm 2023-05-22 04:37:43 +0200
  • 288912baf1 Add a description for the extensions checkbox group oobabooga 2023-05-21 23:33:37 -0300
  • 6e77844733 Add a description for penalty_alpha oobabooga 2023-05-21 23:08:44 -0300
  • d63ef59a0f Apply LLaMA-Precise preset to Vicuna by default oobabooga 2023-05-21 23:00:42 -0300
  • e3d578502a Improve "Chat settings" tab appearance a bit oobabooga 2023-05-21 22:58:14 -0300
  • dcc3e54005 Various "impersonate" fixes oobabooga 2023-05-21 22:54:28 -0300
  • e116d31180 Prevent unwanted log messages from modules oobabooga 2023-05-21 22:42:34 -0300
  • 3ad880e396 Added API endpoints for superbooga nabelekm 2023-05-22 03:26:45 +0200
  • fb91406e93 Fix generation_attempts continuing after an empty reply oobabooga 2023-05-21 22:14:50 -0300
  • e18534fe12 Fix "continue" in chat-instruct mode oobabooga 2023-05-21 22:05:59 -0300
  • a3cd2275ec update llama-cpp-python to v0.1.53 for ggml v3, fixes #2245 eiery 2023-05-21 20:30:16 -0400
  • d16c496a05 circular import Flurb 2023-05-21 17:44:06 -0400
  • daaac8ac26 Initial changes Flurb 2023-05-21 17:23:46 -0400
  • d7fabe693d Reorganize parameters tab oobabooga 2023-05-21 16:24:35 -0300
  • 8ac3636966 Add epsilon_cutoff/eta_cutoff parameters (#2258) oobabooga 2023-05-21 15:11:57 -0300
  • 16ca30833b Add parameters to openai extension oobabooga 2023-05-21 15:06:00 -0300
  • d70c146396 Fix a label oobabooga 2023-05-21 15:01:03 -0300
  • 84eb5fce6b Add epsilon_cutoff/eta_cutoff oobabooga 2023-05-21 14:59:10 -0300
  • 767a767989 Fix elevenlabs_tts too oobabooga 2023-05-21 14:11:46 -0300
  • 1e5821bd9e Fix silero tts autoplay (attempt #2) oobabooga 2023-05-21 13:24:54 -0300
  • a5d5bb9390 Fix silero tts autoplay oobabooga 2023-05-21 12:11:59 -0300
  • c916fe1261 Fix Merge Conflicts da3dsoul 2023-05-21 09:22:18 -0400
  • 3296c0b68a Merge remote-tracking branch 'upstream/main' da3dsoul 2023-05-21 09:19:34 -0400
  • 78b2478d9c assistant: space fix, system: prompt fix (#2219) matatonic 2023-05-20 22:32:34 -0400
  • 05593a7834 Minor bug fix oobabooga 2023-05-20 23:22:36 -0300
  • 9c53517d2c Fix superbooga error when querying empty DB (Issue #2160) (#2212) Luis Lopez 2023-05-21 09:27:22 +0800
  • 20d8d59246 Style changes oobabooga 2023-05-20 22:26:12 -0300
  • ab6acddcc5 Add Save/Delete character buttons (#1870) Matthew McAllister 2023-05-20 17:48:45 -0700
  • a9ad5d7108 Minor changes oobabooga 2023-05-20 21:46:13 -0300
  • b920d00555 Change a font oobabooga 2023-05-20 21:43:16 -0300
  • 2c7462d328 Some qol changes oobabooga 2023-05-20 21:41:30 -0300
  • aacc6c0dc8 Add a confirmation for deleting a character oobabooga 2023-05-20 21:33:30 -0300
  • 40cce26418 Change the layout / ask the user for the file name oobabooga 2023-05-20 21:29:34 -0300
  • 7872ff8a4b Merge branch 'main' into matthew-mcallister-character-saving oobabooga 2023-05-20 20:29:45 -0300
  • 722bfbbfe3 use pkg_resources to display the runtime version of llama-cpp-python Brandon McClure 2023-05-20 16:06:00 -0600
  • c5af549d4b Add chat API (#2233) oobabooga 2023-05-20 18:42:17 -0300
  • 75c68dc0de Fix a bug oobabooga 2023-05-20 18:31:47 -0300
  • 72e619c37d Merge branch 'main' into history-state oobabooga 2023-05-20 18:16:12 -0300
  • f2627f7971 Add chat API oobabooga 2023-05-20 17:58:10 -0300
  • 2aa01e2303 Fix broken version of peft (#2229) jllllll 2023-05-20 15:54:51 -0500
  • fca0c2dd8b set host port back to 7860 from 7861 Minecrafter20 2023-05-20 15:16:52 -0500
  • 6208b4c846 Fix broken version of peft jllllll 2023-05-20 13:59:28 -0500
  • f67ea705f2 assistant: space fix, system: prompt fix Matthew Ashton 2023-05-20 11:19:01 -0400
  • fe60cbb78b Update server.py ChobPT 2023-05-20 13:02:33 +0100
  • 14ae6655e4 Fix superbooga error when querying empty DB toast22a 2023-05-20 17:06:18 +0800
  • 159eccac7e Update Audio-Notification.md oobabooga 2023-05-19 23:20:42 -0300
  • a3e9769e31 Added an audible notification after text generation in web. (#1277) HappyWorldGames 2023-05-20 05:16:06 +0300
  • 90b16510fb Change docs oobabooga 2023-05-19 23:12:44 -0300
  • 5ac3f16084 Merge branch 'main' into HappyWorldGames-detached oobabooga 2023-05-19 23:11:07 -0300
  • 1b52bddfcc Mitigate UnboundLocalError (#2136) Konstantin Gukov 2023-05-19 19:46:18 +0200
  • 50c70e28f0 Lora Trainer improvements, part 6 - slightly better raw text inputs (#2108) Alex "mcmonkey" Goodwin 2023-05-19 08:58:54 -0700
  • 511470a89b Bump llama-cpp-python version oobabooga 2023-05-19 12:13:25 -0300
  • 5a007612cd Minor changes oobabooga 2023-05-19 12:00:36 -0300
  • 9cb9ee902e Merge branch 'main' into mcmonkey4eva-lora-improvements-6 oobabooga 2023-05-19 11:53:47 -0300
  • 8b4ea45a6b Merge branch 'main' into mcmonkey4eva-lora-improvements-6 oobabooga 2023-05-19 11:53:10 -0300
  • a9733d4a99 Metharme context fix (#2153) Carl Kenner 2023-05-20 00:16:13 +0930
  • c86231377b Wizard Mega, Ziya, KoAlpaca, OpenBuddy, Chinese-Vicuna, Vigogne, Bactrian, H2O support, fix Baize (#2159) Carl Kenner 2023-05-20 00:12:41 +0930
  • 5d3d7cbba6 Fix regression oobabooga 2023-05-19 11:39:12 -0300
  • 182b9197d9 Merge branch 'main' into CarlKenner-wizard-mega oobabooga 2023-05-19 11:34:59 -0300
  • c98d6ad27f Create chat_style-messenger.css (#2187) Mykeehu 2023-05-19 16:31:06 +0200
  • 499c2e009e Remove problematic regex from models/config.yaml oobabooga 2023-05-19 11:20:35 -0300
  • 9d5025f531 Improve error handling while loading GPTQ models oobabooga 2023-05-19 11:20:08 -0300
  • 39dab18307 Add a timeout to download-model.py requests oobabooga 2023-05-19 11:19:34 -0300
  • 19fa083096 Use a library for log coloring Konstantin Gukov 2023-05-19 12:11:50 +0200
  • 9348035e80 Manticore support CarlKenner 2023-05-19 16:56:18 +0930
  • 309767a4e6 Create chat_style-messenger.css Mykeehu 2023-05-19 07:38:49 +0200
  • 69eaf0f242 Update llama.cpp-models.md Crimsonfart 2023-05-19 01:12:58 +0200
  • 4b667bf164 Update docker-compose.yml to reflect the move to the "docker" folder instead of being at the root. ramblingcoder 2023-05-18 17:11:44 -0500
  • 4ef2de3486 Fix dependencies downgrading from gptq install (#61) jllllll 2023-05-18 10:46:04 -0500
  • bd5812d758 h2o model_types CarlKenner 2023-05-19 00:03:56 +0930
  • 1ecc92e95f Support H2O instruction formats CarlKenner 2023-05-18 23:38:51 +0930
  • 07510a2414 Change a message oobabooga 2023-05-18 10:58:37 -0300
  • 0bcd5b6894 Soothe anxious users oobabooga 2023-05-18 10:56:49 -0300
  • 9b9c11bd6e Better recognise Open Assistant and some Alpaca models. Fix Bactrian. CarlKenner 2023-05-18 21:34:45 +0930
  • 363c8e398b Bactrian support CarlKenner 2023-05-18 20:31:56 +0930
  • 5845d65234 Support TheBloke/LLaMa-65B-GPTQ-3bit CarlKenner 2023-05-18 20:07:26 +0930
  • 7aff78770d Vigogne support CarlKenner 2023-05-18 19:53:47 +0930
  • d269c4e909 "Chinese-Vicuna" support (not actually Vicuna format) CarlKenner 2023-05-18 19:01:05 +0930
  • 16ffb8f34d OpenBuddy support CarlKenner 2023-05-18 18:02:28 +0930
  • 48602ff29d KoAlpaca support CarlKenner 2023-05-18 17:32:58 +0930
  • 46096afc4d Add Ziya-LLaMA-13B-v1 instruction support CarlKenner 2023-05-18 15:51:27 +0930
  • 59a0f0741a Fix Baize instruction format CarlKenner 2023-05-18 15:44:09 +0930
  • cf45d6033b Add Wizard Mega support. There are three characters because it was trained on two different prompt formats. Also fixed order of GPTQ in config.yaml (defaults must come before specified values). CarlKenner 2023-05-18 14:40:18 +0930
  • 66d904d852 Remove irrelevant Metharme context about text adventure games CarlKenner 2023-05-18 12:06:02 +0930