Commit Graph

  • 890897cf4c Merge branch 'main' into happyme531-stop-everything-blocking oobabooga 2023-06-24 11:17:57 -0300
  • b735a7e86b Merge branch 'main' into fix-tokenizer yiximail 2023-06-24 22:11:32 +0800
  • 3a5010726f Merge branch 'main' into api yiximail 2023-06-24 22:10:29 +0800
  • ec482f3dae Apply input extensions after yielding *Is typing...* oobabooga 2023-06-24 11:07:11 -0300
  • 3e80f2aceb Apply the output extensions only once oobabooga 2023-06-24 10:59:07 -0300
  • 8c9ecabe10 Make stop_everything work with non-streamed generation happyme531 2023-06-24 21:47:18 +0800
  • 77baf43f6d Add CORS support to the API (#2718) rizerphe 2023-06-24 16:16:06 +0300
  • 8c36c19218 8k size only for minotaur-15B (#2815) matatonic 2023-06-24 09:14:19 -0400
  • 38897fbd8a fix: added model parameter check (#2829) Roman 2023-06-24 14:09:34 +0100
  • eac8450ef7 Move special character check to start script (#92) jllllll 2023-06-24 08:06:35 -0500
  • 51a388fa34 Organize chat history/character import menu (#2845) missionfloyd 2023-06-24 06:55:02 -0600
  • 8bb3bb39b3 Implement stopping string search in string space (#2847) oobabooga 2023-06-24 09:43:00 -0300
  • 99965df5b9 Bug fix oobabooga 2023-06-24 09:39:37 -0300
  • 82048811b2 Fix a small issue oobabooga 2023-06-24 09:18:43 -0300
  • 199c46d2d1 Implement stopping string search in string space oobabooga 2023-06-24 08:56:10 -0300
  • ad234898d5 Move Chat history upload/download labels missionfloyd 2023-06-24 01:47:22 -0600
  • ad1719ebe7 Organize character import menu missionfloyd 2023-06-24 01:15:34 -0600
  • cfbbdbd1c0 update tc 2023-06-23 21:14:30 -0700
  • 641e7b839c Merge branch 'main' into stopping_strings yiximail 2023-06-24 12:11:50 +0800
  • 5fd901ecd6 Merge branch 'oobabooga:main' into pull_branch FartyPants 2023-06-23 13:12:42 -0400
  • 37c7811fd6 Merge branch 'oobabooga:main' into main FartyPants 2023-06-23 13:12:33 -0400
  • 16a91a7851 bug fix yiximail 2023-06-23 23:33:21 +0800
  • 0f9088f730 Update README oobabooga 2023-06-23 12:24:43 -0300
  • 3ae9af01aa Add --no_use_cuda_fp16 param for AutoGPTQ oobabooga 2023-06-23 12:22:56 -0300
  • 5646690769 Fix some models not loading on exllama_hf (#2835) Panchovix 2023-06-23 10:31:02 -0400
  • f997c2c067 Unify 'LlamaForCausalLM', 'LlamaGPTQForCausalLM', 'ExllamaHF' oobabooga 2023-06-23 11:29:13 -0300
  • 642bd6586c update dependency tc 2023-06-23 07:16:20 -0700
  • c10ecc1962 Merge branch 'oobabooga:main' into api-instruct-fixes atriantafy 2023-06-23 15:40:46 +0300
  • 6fd6e31a17 but fix yiximail 2023-06-23 19:27:40 +0800
  • 53ac9f3770 Merge branch 'main' into stopping-strings-for-HF yiximail 2023-06-23 18:43:47 +0800
  • 5dbe298ed8 improve performance yiximail 2023-06-23 18:43:06 +0800
  • 000d20e803 Fix some models not loading on exllama_hf Panchovix 2023-06-23 04:04:07 -0400
  • 98901990e7 Merge branch 'oobabooga:main' into pull_branch FartyPants 2023-06-23 03:33:12 -0400
  • 5293e99ec2 Resolved merge conflict by incorporating both suggestions. FPHam 2023-06-23 03:32:47 -0400
  • 383c50f05b Replace old presets with the results of Preset Arena (#2830) oobabooga 2023-06-23 01:48:29 -0300
  • 089383a9f3 Update README oobabooga 2023-06-23 01:36:33 -0300
  • 1a1d726a1d Add new presets oobabooga 2023-06-22 22:20:43 -0300
  • d709746984 fix: added model parameter check Roman Siebert 2023-06-23 05:17:07 +0100
  • 28efbc04b8 Add CT2 Streaming support, use auto-optimized load and generally cleanup the generator Timothy Alexander 2023-06-22 22:31:03 -0500
  • f88fd9f4e1 Initial support for (non-streaming) ct2 models, allowing int8 inference on either CPU or CUDA. Timothy Alexander 2023-06-22 20:53:40 -0500
  • 3264aa0058 Merge branch 'oobabooga:main' into api_extras matatonic 2023-06-22 20:15:51 -0400
  • 0a5b91c0c0 Bump exllama module to 0.0.2 jllllll 2023-06-22 17:18:12 -0500
  • 5964582c44 word change FPHam 2023-06-22 17:28:08 -0400
  • 1d49ac4468 Merge branch 'oobabooga:main' into pull_branch FartyPants 2023-06-22 17:27:22 -0400
  • db46c4bd7e Merge branch 'main' of https://github.com/FartyPants/text-generation-webui FPHam 2023-06-22 17:25:53 -0400
  • 96ab583340 Word fixes FPHam 2023-06-22 17:24:54 -0400
  • 1ddcfcc229 Merge branch 'oobabooga:main' into main FartyPants 2023-06-22 17:16:20 -0400
  • b031a0f6aa Revert c5a3596bae jllllll 2023-06-22 15:55:42 -0500
  • c5a3596bae Update requirements.txt to use wheel index for exllama jllllll 2023-06-22 15:10:14 -0500
  • 37b31b72a1 Merge branch 'oobabooga:main' into exllama-module jllllll 2023-06-22 15:09:25 -0500
  • aa1f1ef46a Fix printing, take two. (#2810) missionfloyd 2023-06-22 13:06:49 -0600
  • b4a38c24b7 Fix Multi-GPU not working on exllama_hf (#2803) Panchovix 2023-06-22 15:05:25 -0400
  • 7c68cef3a9 Comment fused_attn / fused_mlp_thd oobabooga 2023-06-22 15:54:35 -0300
  • 69520fb55f support torch.Tensor type and bug fix yiximail 2023-06-23 02:01:55 +0800
  • d7888d97a2 Training improvements FPHam 2023-06-22 13:39:03 -0400
  • bbe33a3702 Training improvements FPHam 2023-06-22 13:38:39 -0400
  • 224fb4324d Training Enhancements: Stop at Loss, saving training_turnaround.json FPHam 2023-06-22 13:24:16 -0400
  • 16a0239a91 Training Enhancements: Stop at Loss, saving turnaround.json FPHam 2023-06-22 13:22:49 -0400
  • 632b44c426 Removed act_order config Panchovix 2023-06-22 11:16:51 -0400
  • 8df11aeeb8 Lora fixes for AutoGPTQ Forkoz 2023-06-22 07:58:08 -0500
  • ef0f142def 8k size only for minotaur-15B Matthew Ashton 2023-06-22 07:17:33 -0400
  • c733c5ec27 minor improvements, stop on \n### only for alpaca Matthew Ashton 2023-06-22 07:13:18 -0400
  • ead7eab926 Replace stopping_criteria to better match strings yiximail 2023-06-22 18:29:28 +0800
  • 5bcea1298a remove the space yiximail 2023-06-22 17:38:30 +0800
  • d618d0daf2 leave an empty row yiximail 2023-06-22 17:36:51 +0800
  • 853dc03cf4 Merge branch 'main' into stopping_strings yiximail 2023-06-22 17:36:02 +0800
  • 2b077af85d Update blocking_api.py Sebastian Bodza 2023-06-22 10:59:28 +0200
  • 3bb33960ed small change FPHam 2023-06-22 04:02:02 -0400
  • d83fbec367 Removed TODO Panchovix 2023-06-22 02:51:06 -0400
  • b66971e9d6 Fix tensors for more than 1 GPU. Panchovix 2023-06-22 02:47:57 -0400
  • 64afa8c748 Merge branch 'oobabooga:main' into patch-1 Panchovix 2023-06-22 02:46:49 -0400
  • 31539c83e1 Better printing missionfloyd 2023-06-21 22:55:28 -0600
  • 3077ec1024 Merge remote-tracking branch 'upstream/main' into print-chat missionfloyd 2023-06-21 22:50:36 -0600
  • 809bdb2b38 Add comments Panchovix 2023-06-21 23:59:42 -0400
  • a855d7ff33 Merge branch 'pull_branch' of https://github.com/FartyPants/text-generation-webui into pull_branch FPHam 2023-06-21 21:53:07 -0400
  • 7260f39758 training updated FPHam 2023-06-21 21:52:45 -0400
  • c6aaae5eea small changes FPHam 2023-06-21 21:50:17 -0400
  • ce1beecb63 Merge branch 'oobabooga:main' into pull_branch FartyPants 2023-06-21 21:40:52 -0400
  • d3c5cb0e9c Merge branch 'oobabooga:main' into main FartyPants 2023-06-21 21:40:44 -0400
  • d94ea31d54 more models. +minotaur 8k (#2806) matatonic 2023-06-21 20:05:08 -0400
  • 104e430cb4 +minotaur Matthew Ashton 2023-06-21 18:31:12 -0400
  • 12030fe0e1 update gpu_split where it should be Panchovix 2023-06-21 16:50:13 -0400
  • 6ad74030e6 Added self.ex_config Panchovix 2023-06-21 16:39:36 -0400
  • 7456be14ca Update exllama_hf.py Panchovix 2023-06-21 16:28:54 -0400
  • 1804f169aa Merge branch 'oobabooga:main' into api_extras matatonic 2023-06-21 15:27:54 -0400
  • ff6caa9d3f Extend exllama module support to exllama_hf.py jllllll 2023-06-21 14:18:09 -0500
  • 6254203f84 Merge branch 'oobabooga:main' into exllama-module jllllll 2023-06-21 14:15:08 -0500
  • 04cae3e5db Remove bitsandbytes compatibility workaround (#91) jllllll 2023-06-21 13:40:41 -0500
  • 580c1ee748 Implement a demo HF wrapper for exllama to utilize existing HF transformers decoding. (#2777) LarryVRH 2023-06-22 02:31:42 +0800
  • abf0cbe45a Update README oobabooga 2023-06-21 15:27:48 -0300
  • 015a13e19f Let ExLlama and ExLlama_HF loaders coexist oobabooga 2023-06-21 15:25:04 -0300
  • fbd8015aa8 Merge branch 'main' into api yiximail 2023-06-22 02:18:11 +0800
  • a06acd6d09 Update bitsandbytes to 0.39.1 (#2799) jllllll 2023-06-21 13:04:45 -0500
  • 31e47d0165 Update bitsandbytes to 0.39.1 jllllll 2023-06-21 12:54:32 -0500
  • 7c079f6b7f Merge branch 'oobabooga:main' into main FartyPants 2023-06-21 13:49:56 -0400
  • 74e2ae39b5 fix tokenizer yiximail 2023-06-22 01:42:36 +0800
  • e3efb398e2 Merge branch 'oobabooga:main' into api_extras matatonic 2023-06-21 12:24:42 -0400
  • 89fb6f9236 Fixed the ZeroDivisionError when downloading a model (#2797) Gaurav Bhagchandani 2023-06-21 11:31:50 -0400
  • 90be1d9fe1 More models (match more) & templates (starchat-beta, tulu) (#2790) matatonic 2023-06-21 11:30:44 -0400
  • 0830d5fe9a Small change oobabooga 2023-06-21 12:29:12 -0300
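A listing in this shape (abbreviated hash, subject, author, ISO date) can be regenerated from a checkout with `git log`. The sketch below is an assumption about the format string used, not the exact command that produced the graph above; it builds a throwaway repository so it is self-contained, but in practice you would run only the final `git log` line inside the real repository:

```shell
# Create a disposable repository with one commit so the command is runnable anywhere.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name='demo' -c user.email='demo@example.com' \
    commit -q --allow-empty -m 'Initial commit'

# Graph with abbreviated hash, subject, author name, and ISO-8601 author date,
# approximating the layout of the listing above.
git log --graph --format='%h %s %an %ad' --date=iso
```

`--date=iso` yields the `2023-06-24 11:17:57 -0300` date style seen above; swap `%ad` for `%cd` if committer dates are wanted instead.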