Commit Graph

  • bee73cedbd Streamline GPTQ-for-LLaMa support jllllll 2023-08-09 23:42:34 -0500
  • 376fdb3caa Minor changes oobabooga 2023-08-09 19:24:09 -0700
  • 87e6dc10ea Add Vicuna-v1.5 detection berkut1 2023-08-09 23:54:04 +0300
  • 45f25643f6 Merge branch 'oobabooga:main' into main Gennadij 2023-08-09 23:07:47 +0300
  • a3295dd666 Detect n_gqa and prompt template for wizardlm-70b oobabooga 2023-08-09 10:38:35 -0700
  • 0c83af97e0 Bump aiofiles from 23.1.0 to 23.2.1 dependabot[bot] 2023-08-09 15:32:11 +0000
  • a4e48cbdb6 Bump AutoGPTQ oobabooga 2023-08-09 08:31:17 -0700
  • fb1ad4b08e Bump safetensors from 0.3.1 to 0.3.2 dependabot[bot] 2023-08-09 15:09:20 +0000
  • e0b4d26214 Bump aiofiles from 23.1.0 to 23.2.0 dependabot[bot] 2023-08-09 15:09:13 +0000
  • 7c1300fab5 Pin aiofiles version to fix statvfs issue oobabooga 2023-08-09 08:07:55 -0700
  • 6c6a52aaad Change the filenames for caches and histories oobabooga 2023-08-09 07:47:19 -0700
  • 58fd60cb72 Merge pull request #17 from Cognitage/main Ricardo Pinto 2023-08-09 14:25:59 +0100
  • eb94a23382 Merge branch 'cognitage' into main Ricardo Pinto 2023-08-09 14:24:48 +0100
  • 2255349f19 Update README oobabooga 2023-08-09 05:46:25 -0700
  • 5bfcfcfc5a Added the logic for starchat model series (#3185) GiganticPrime 2023-08-09 21:26:12 +0900
  • 57fe533bbb Added configuration for the starchat model series and removed the logic added in the previous commit. GiganticPrime 2023-08-09 07:48:20 +0000
  • fa4a948b38 Allow users to write one flag per line in CMD_FLAGS.txt oobabooga 2023-08-09 01:58:23 -0300
  • d8fb506aff Add RoPE scaling support for transformers (including dynamic NTK) oobabooga 2023-08-08 21:24:28 -0700
  • f4caaf337a Fix superbooga when using regenerate (#3362) Hans Raaf 2023-08-09 04:26:28 +0200
  • 430e90e0b7 Remove file oobabooga 2023-08-08 18:44:41 -0700
  • 2d2fbef7f6 Minor changes oobabooga 2023-08-08 18:44:14 -0700
  • 901b028d55 Add option for named cloudflare tunnels (#3364) Friedemann Lipphardt 2023-08-09 03:20:27 +0200
  • 58bcdcbef1 Minor changes oobabooga 2023-08-08 18:16:57 -0700
  • 891d64b5d6 Documentation update oobabooga 2023-08-08 15:24:48 -0700
  • e44dc09736 Add AutoGPTQ.md oobabooga 2023-08-08 15:17:31 -0700
  • ce773f8370 Remove unused param & monkey patch code oobabooga 2023-08-08 15:08:34 -0700
  • 47c5bfca56 Remove GPTQ-for-LLaMA support oobabooga 2023-08-08 14:31:55 -0700
  • 4ba30f6765 Add OpenChat template oobabooga 2023-08-08 14:10:04 -0700
  • bf08b16b32 Fix disappearing profile picture bug oobabooga 2023-08-08 14:09:01 -0700
  • b7447cefa0 Merge pull request #1 from Thutmose3/improve-readability Thomas De Bonnet 2023-08-08 10:54:17 +0200
  • 6fa23e3031 Chore: Improve readability of download-model.py Thomas De Bonnet 2023-08-08 10:53:10 +0200
  • 0e78f3b4d4 Fixed a typo in "rms_norm_eps", incorrectly set as n_gqa (#3494) Gennadij 2023-08-08 06:31:11 +0300
  • fd6f65d308 Fixed a typo where the parameters displayed for llama.cpp models did not correctly show "rms_norm_eps" berkut1 2023-08-08 06:17:52 +0300
  • 37fb719452 Increase the Context/Greeting boxes sizes oobabooga 2023-08-08 00:09:00 -0300
  • 6d354bb50b Allow the webui to do multiple tasks simultaneously oobabooga 2023-08-07 23:57:25 -0300
  • 584dd33424 Fix missing example_dialogue when uploading characters oobabooga 2023-08-07 23:44:59 -0300
  • bbe4a29a25 Add back dark theme code oobabooga 2023-08-07 23:03:09 -0300
  • 2d0634cd07 Bump transformers commit for positive prompts oobabooga 2023-08-07 08:57:19 -0700
  • 3b27404865 Make dockerfile respect specified cuda version (#3474) Sam 2023-08-07 23:19:16 +1000
  • 412f6ff9d3 Change alpha_value maximum and step oobabooga 2023-08-07 06:08:51 -0700
  • a373c96d59 Fix a bug in modules/shared.py oobabooga 2023-08-06 20:36:35 -0700
  • 2cf64474f2 Use chat_instruct_command in API (#3482) jllllll 2023-08-06 21:46:25 -0500
  • 91a42ec352 Minor change oobabooga 2023-08-06 19:44:22 -0700
  • 3d48933f27 Remove ancient deprecation warnings oobabooga 2023-08-06 18:58:59 -0700
  • b8dc832528 Use chat_instruct_command in API jllllll 2023-08-06 20:45:37 -0500
  • c237ce607e Move characters/instruction-following to instruction-templates oobabooga 2023-08-06 17:50:07 -0700
  • 65aa11890f Refactor everything (#3481) oobabooga 2023-08-06 21:49:27 -0300
  • f01efedac3 Add missing event handlers oobabooga 2023-08-06 17:05:10 -0700
  • 20d1e91eb1 Fix extensions oobabooga 2023-08-06 16:54:26 -0700
  • 13f9a64e73 Simplify password processing oobabooga 2023-08-06 16:44:31 -0700
  • 50f171baf6 Change some comments oobabooga 2023-08-06 16:36:22 -0700
  • 65fab0f643 Organize modules/text-generation.py oobabooga 2023-08-06 16:25:51 -0700
  • bc5a229ae3 Reorganize oobabooga 2023-08-06 16:19:24 -0700
  • 73b8c083d3 Change some comments oobabooga 2023-08-06 15:43:36 -0700
  • bc2bfbe259 Rename a function oobabooga 2023-08-06 15:33:44 -0700
  • a6424e634c Revert "Move characters/instruction-following to instruction-templates" oobabooga 2023-08-06 15:26:33 -0700
  • e2f99f1c74 Move characters/instruction-following to instruction-templates oobabooga 2023-08-06 15:25:12 -0700
  • 9d4f194fc1 Add a comment oobabooga 2023-08-06 14:08:05 -0700
  • ea22450b87 Space oobabooga 2023-08-06 14:06:16 -0700
  • 9989d0159b Fix the audio notification oobabooga 2023-08-06 13:53:59 -0700
  • d4b851bdc8 Credit turboderp oobabooga 2023-08-06 13:42:43 -0700
  • 07c9c02e48 Remove duplicate code oobabooga 2023-08-06 13:39:17 -0700
  • f2f2fe727b Merge branch 'main' into refactor oobabooga 2023-08-06 13:28:28 -0700
  • 0af10ab49b Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) oobabooga 2023-08-06 17:22:48 -0300
  • e1feb5488d Fix preset saving oobabooga 2023-08-06 13:19:06 -0700
  • d9526dcb19 Fix regression oobabooga 2023-08-06 13:09:56 -0700
  • b9c8071182 Add to the API oobabooga 2023-08-06 13:07:13 -0700
  • 23c6d63503 Change the minimum max_seq_len for ExLlama in the UI oobabooga 2023-08-06 12:59:26 -0700
  • 95df726b4a Fix a bug oobabooga 2023-08-06 12:42:00 -0700
  • 4abffd7b10 Merge branch 'main' into cfg oobabooga 2023-08-06 12:40:08 -0700
  • c06a1019e1 Update to transformers dev branch oobabooga 2023-08-06 12:35:11 -0700
  • faa758af77 Move a common declaration to server.py oobabooga 2023-08-06 12:18:41 -0700
  • 972fc54587 Change a comment oobabooga 2023-08-06 12:10:45 -0700
  • 9c226445de Remove duplicate function oobabooga 2023-08-06 12:03:00 -0700
  • f10d3cea53 Organize oobabooga 2023-08-06 11:57:17 -0700
  • 20d1192e7e Refactor everything oobabooga 2023-08-06 11:49:38 -0700
  • 2e03630c53 just test [do not merge] tianchen z 2023-08-06 10:25:23 -0700
  • 08e6dfde35 pycodestyle cleanup MB7979 2023-08-06 12:24:07 +0100
  • 334efe987c Fixes to splitting of raw strings MB7979 2023-08-06 11:57:33 +0100
  • 16240582aa Make dockerfile respect specified cuda version Sam 2023-08-06 16:31:27 +1000
  • 5134878344 Fix chat message order (#3461) missionfloyd 2023-08-05 10:53:54 -0600
  • 44f31731af Create logs dir if missing when saving history (#3462) jllllll 2023-08-05 11:47:16 -0500
  • 5ee95d126c Bump exllama wheels to 0.0.10 (#3467) jllllll 2023-08-05 11:46:14 -0500
  • 9dcb37e8d4 Fix: Mirostat fails on models split across multiple GPUs Forkoz 2023-08-05 16:45:47 +0000
  • 39c7b98b55 Bump exllama wheels to 0.0.10 jllllll 2023-08-05 10:52:18 -0500
  • 9e17325207 Add CMD_FLAGS.txt functionality to WSL installer (#119) jllllll 2023-08-05 08:26:24 -0500
  • a61825e8cd sampler_hijack.py: put topk on cuda Forkoz 2023-08-05 10:46:12 +0000
  • d0126be6fc Create logs dir if missing when saving history jllllll 2023-08-05 04:26:43 -0500
  • c8f84c6561 Fix chat style missionfloyd 2023-08-05 00:09:09 -0600
  • 0f0009ef7e Render messages in normal order missionfloyd 2023-08-04 23:41:38 -0600
  • 23055b21ee [Bug fix] Remove html tags from the Prompt sent to Stable Diffusion (#3151) SodaPrettyCold 2023-08-05 07:20:28 +0800
  • 12a94798b6 Remove duplicate import oobabooga 2023-08-04 16:19:44 -0700
  • 5b3c06c158 Bump fastapi from 0.95.2 to 0.101.0 dependabot[bot] 2023-08-04 22:29:35 +0000
  • 6e30f76ba5 Bump bitsandbytes to 0.41.1 (#3457) jllllll 2023-08-04 17:28:59 -0500
  • 11908b7256 Bump bitsandbytes to 0.41.1 jllllll 2023-08-04 16:46:34 -0500
  • 7f438e2cb2 [vits] add new extension tianchen z 2023-08-04 11:04:27 -0700
  • 8df3cdfd51 Add SSL certificate support (#3453) oobabooga 2023-08-04 13:57:31 -0300
  • 01f10a1ea9 Simplify the syntax oobabooga 2023-08-04 09:56:31 -0700
  • bfa905fc07 Add SSL certificate support oobabooga 2023-08-04 09:46:36 -0700
  • 2a291774f1 hotfix in get characters Paulo Henrique Silveira 2023-08-04 12:15:28 -0300