Commit Graph

  • 58b34c0841 Fix chat_prompt_size oobabooga 2023-04-10 20:06:42 -0300
  • 16c780ef98 Merge branch 'oobabooga:main' into webui.conf sukipop 2023-04-10 15:37:27 -0600
  • 39fa6e57cc Bump llama-cpp-python from 0.1.30 to 0.1.32 dependabot[bot] 2023-04-10 21:05:51 +0000
  • 5234071c04 Improve Instruct mode text readability oobabooga 2023-04-10 17:41:07 -0300
  • 09d8119e3c Add CPU LoRA training (#938) IggoOnCode 2023-04-10 22:29:00 +0200
  • 0caf718a21 add on-page documentation to parameters (#1008) Alex "mcmonkey" Goodwin 2023-04-10 13:19:12 -0700
  • e5c30d7d65 Minor changes, reactivate min_length oobabooga 2023-04-10 17:15:03 -0300
  • 85a7954823 Update settings-template.json oobabooga 2023-04-10 16:53:07 -0300
  • d37b4f76b1 Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-04-10 16:45:09 -0300
  • bd04ff27ad Make the bos token optional oobabooga 2023-04-10 16:44:22 -0300
  • f035b01823 Update README.md oobabooga 2023-04-10 16:20:23 -0300
  • b7ca89ba3f Mention that build-essential is required (#1013) Jeff Lefebvre 2023-04-10 14:19:10 -0500
  • 52339e9b20 add make/g++ to docker (#1015) loeken 2023-04-10 21:18:07 +0200
  • 4961f43702 Improve header bar colors oobabooga 2023-04-10 16:15:16 -0300
  • 617530296e Instruct mode color/style improvements oobabooga 2023-04-10 16:04:21 -0300
  • 14960660e9 Update readme.md to reflect llama-cpp-python requirements m40l 2023-04-10 11:49:04 -0700
  • 0f1627eff1 Don't treat Instruct mode histories as regular histories oobabooga 2023-04-10 15:48:07 -0300
  • 5172fbc87b Merge branch 'oobabooga:main' into patch-1 dibrale 2023-04-10 13:44:00 -0500
  • faec3e8ed3 tested on windows needs make/g++ to build new dependency llama cpp loeken 2023-04-10 20:36:05 +0200
  • 13208656c0 Resolved ubuntu dependency CMAKE_C_COMPILER Jeff Lefebvre 2023-04-10 13:10:00 -0500
  • 8a3168abd3 Merge branch 'oobabooga:main' into patch-2 ImpactFrames 2023-04-10 18:28:49 +0100
  • 0a89bc026c script.py ImpactFrames 2023-04-10 18:05:56 +0100
  • 143141f098 script.py ImpactFrames 2023-04-10 16:35:36 +0100
  • 751934a620 add on-page documentation to parameters Alex "mcmonkey" Goodwin 2023-04-10 08:16:52 -0700
  • 94c6ef1c8b script.py ImpactFrames 2023-04-10 16:12:47 +0100
  • 086c77f5fe Using --cpu parameter to enable cpu training. IggoOnCode 2023-04-10 17:07:39 +0200
  • d679c4be13 Change a label oobabooga 2023-04-10 11:44:37 -0300
  • 45244ed125 More descriptive download info oobabooga 2023-04-10 11:42:12 -0300
  • 7e70741a4e Download models from Model tab (#954 from UsamaKenway/main) oobabooga 2023-04-10 11:38:30 -0300
  • 11b23db8d4 Remove unused imports oobabooga 2023-04-10 11:37:42 -0300
  • 2c14df81a8 Use download-model.py to download the model oobabooga 2023-04-10 11:36:39 -0300
  • db07e844db script.py ImpactFrames 2023-04-10 15:19:51 +0100
  • c6e9ba20a4 Merge branch 'main' into UsamaKenway-main oobabooga 2023-04-10 11:14:03 -0300
  • 843f672227 fix random seeds to actually randomize (#1004 from mcmonkey4eva/seed-fix) oobabooga 2023-04-10 10:56:12 -0300
  • 769aa900ea Print the used seed oobabooga 2023-04-10 10:53:31 -0300
  • 254609daca Update llama-cpp-python link to official wheel (#19) jllllll 2023-04-10 08:48:56 -0500
  • 4b39490f46 Issue 1001: max_new_tokens maxed ignores previous conversation andresdelcampo 2023-04-10 15:46:46 +0200
  • 32d078487e Add llama-cpp-python to requirements.txt oobabooga 2023-04-10 10:45:51 -0300
  • 30befe492a fix random seeds to actually randomize Alex "mcmonkey" Goodwin 2023-04-10 06:29:10 -0700
  • 7812b03880 Added CPU training. IggoOnCode 2023-04-07 19:12:18 +0200
  • 001db69b45 Merge branch 'main' into custom-stopping-strings catalpaaa 2023-04-09 23:35:01 -0700
  • 9a9740bc01 Updated to add textbox to save custom prompt name BlueprintCoding 2023-04-09 22:20:11 -0600
  • c3e1a58cb3 Correct llama-cpp-python wheel link (#17) jllllll 2023-04-09 21:46:54 -0500
  • 1911504f82 Minor bug fix oobabooga 2023-04-09 23:45:41 -0300
  • 8178fde2cb Added dropdown to character bias. (#986) BlueprintCoding 2023-04-09 20:44:31 -0600
  • c025d98893 Add blank lines oobabooga 2023-04-09 23:43:32 -0300
  • dba2000d2b Do things that I am not proud of oobabooga 2023-04-09 23:40:17 -0300
  • 97840c92f9 Add working llama-cpp-python install from wheel. (#13 from Loufe/oobabooga-windows) oobabooga 2023-04-09 23:23:27 -0300
  • 65552d2157 Merge branch 'main' of github.com:oobabooga/text-generation-webui oobabooga 2023-04-09 23:19:53 -0300
  • 8c6155251a More robust 4-bit model loading oobabooga 2023-04-09 23:19:28 -0300
  • 992663fa20 Added xformers support to Llama (#950) MarkovInequality 2023-04-09 22:08:40 -0400
  • 595ab10a44 Minor changes oobabooga 2023-04-09 23:08:04 -0300
  • 9a0616b23d Minor readability changes oobabooga 2023-04-09 23:06:48 -0300
  • 7d5b73f318 Add to README, add more flashy warning oobabooga 2023-04-09 23:02:07 -0300
  • b8e2f3b48d Merge branch 'oobabooga:main' into load-character Brian O'Connor 2023-04-09 21:28:31 -0400
  • 625d81f495 Update character log logic (#977) Brian O'Connor 2023-04-09 21:20:21 -0400
  • 57f768eaad Better preset in api-example.py oobabooga 2023-04-09 22:18:40 -0300
  • 1d2177efcf Remove errant space Brian O'Connor 2023-04-09 21:07:21 -0400
  • 63e52f38ec First pass at chat API Brian O'Connor 2023-04-09 20:51:33 -0400
  • a3085dba07 Fix LlamaTokenizer eos_token (attempt) oobabooga 2023-04-09 21:19:39 -0300
  • 120f5662cf Better handle spaces for Continue oobabooga 2023-04-09 20:37:31 -0300
  • b27d757fd1 Minor change oobabooga 2023-04-09 20:06:20 -0300
  • d29f4624e9 Add a Continue button to chat mode oobabooga 2023-04-09 20:04:16 -0300
  • 2b523f782f Update character log logic Brian O'Connor 2023-04-09 17:03:43 -0400
  • f0d544756e Add ability to load a character via args Brian O'Connor 2023-04-09 16:54:15 -0400
  • 787b7e208b Merge pull request #1 from BlueprintCoding/BlueprintCoding-CharacterBiasUpgraded BlueprintCoding 2023-04-09 14:18:53 -0600
  • 9cd243a4b0 Updated Script.py to add a dropdown BlueprintCoding 2023-04-09 14:18:10 -0600
  • 170e0c05c4 Typo oobabooga 2023-04-09 17:00:59 -0300
  • 34ec02d41d Make download-model.py importable oobabooga 2023-04-09 16:59:59 -0300
  • 1b9aaa90bc Fix message text overflow DavG25 2023-04-09 21:10:39 +0200
  • 821f4106e7 Fix code snippet text overflow DavG25 2023-04-09 21:05:35 +0200
  • 9b9a16b2f9 Merge branch 'main' of https://github.com/oobabooga/text-generation-webui myluki2000 2023-04-09 20:35:46 +0200
  • 7a8da9a211 lora training fixes: fix wrong input format being picked; fix crash when an entry in the dataset has an attribute of value None myluki2000 2023-04-09 20:01:14 +0200
  • f91d3a3ff4 server.py readability oobabooga 2023-04-09 14:46:32 -0300
  • 33475255cb Added print statement to indicate loading in sdp_attention MarkovInequality 2023-04-09 11:32:47 -0400
  • ebdf4c8c12 path fixed Usama Kenway 2023-04-09 16:53:21 +0500
  • 9579226895 Moved init of no_config to arg parser sukipop 2023-04-09 05:18:35 -0600
  • 7436dd5b4a download custom model menu (from hugging face) added in model tab Usama Kenway 2023-04-09 16:11:43 +0500
  • 1fefabdebb Added sdp attention support to Llama MarkovInequality 2023-04-09 07:00:42 -0400
  • a7be688ec3 Updated README.md sukipop 2023-04-09 03:32:21 -0600
  • b08219fa9a Added xformers support to Llama MarkovInequality 2023-04-09 04:57:56 -0400
  • 4d3c9c80a3 Added config opts and webui.conf info sukipop 2023-04-09 02:34:43 -0600
  • 0086fd2da5 Rewrote get_args_from_file sukipop 2023-04-09 02:17:11 -0600
  • a899ab4420 Added config args to parser sukipop 2023-04-09 02:02:32 -0600
  • d4c1c7772f Defined no_config in Default settings sukipop 2023-04-09 02:01:17 -0600
  • 1eb8f24913 Refactor arg parsing & added config opts sukipop 2023-04-09 01:52:27 -0600
  • 0c88fbc523 moved get_args_from_file sukipop 2023-04-09 01:44:17 -0600
  • f6824d1666 Created a modified version ImpactFrames 2023-04-09 07:51:47 +0100
  • b11cdcaeb3 Merge branch 'oobabooga:main' into webui.conf sukipop 2023-04-09 00:40:48 -0600
  • bce1b7fbb2 Update README.md oobabooga 2023-04-09 02:19:40 -0300
  • f7860ce192 Update README.md oobabooga 2023-04-09 02:19:17 -0300
  • ece8ed2c84 Update README.md oobabooga 2023-04-09 02:18:42 -0300
  • f294e050ff Added comments sukipop 2023-04-08 22:41:51 -0600
  • cc693a7546 Remove obsolete code oobabooga 2023-04-09 00:51:07 -0300
  • 0818bc93ad Add working llama-cpp-python install from wheel. Lou Bernardi 2023-04-08 22:44:55 -0400
  • 90cd9a3f9a Add check to skip lines starting with '#' sukipop 2023-04-08 20:32:38 -0600
  • 3e2f28745e Added comment above function get_args_from_file sukipop 2023-04-08 20:01:22 -0600
  • 9d4895d0b1 Update to handle quoted values in webui.conf sukipop 2023-04-08 19:59:38 -0600
  • 533fed6b03 Merge branch 'oobabooga:main' into webui.conf sukipop 2023-04-08 19:54:30 -0600
  • 2fde50a800 Delete docker.md oobabooga 2023-04-08 22:37:54 -0300
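
A listing like the one above can be reproduced from any local clone with `git log`. The format string below is a plausible sketch, not necessarily the exact command that generated this page:

```shell
# One commit per line: abbreviated hash, subject, author name, and author
# date, with graph edges drawn down the left margin.
git log --graph --date=iso --pretty=format:'%h %s %an %ad'
```

Here `%h` is the abbreviated commit hash, `%s` the subject line, `%an` the author name, and `%ad` the author date in the style selected by `--date`.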