Commit Graph

  • 6063a66414 Update accelerate requirement from ==0.33.* to ==0.34.* (#6416) dependabot[bot] 2024-09-30 18:50:38 -0300
  • 4d9ce586d3 Update llama_cpp_python_hijack.py, fix llamacpp_hf oobabooga 2024-09-30 14:04:21 -0700
  • 9a6fafdbcb Update optimum requirement from ==1.17.* to ==1.22.* dependabot[bot] 2024-09-30 20:46:06 +0000
  • 232b745342 Update gradio requirement from ==4.26.* to ==4.44.* dependabot[bot] 2024-09-30 20:45:54 +0000
  • 990f133c03 Update peft requirement from ==0.12.* to ==0.13.* dependabot[bot] 2024-09-30 20:45:46 +0000
  • a7e25bfdc6 Bump fastapi from 0.112.4 to 0.115.0 dependabot[bot] 2024-09-30 20:45:39 +0000
  • d4fcad7d24 Update accelerate requirement from ==0.33.* to ==0.34.* dependabot[bot] 2024-09-30 20:45:35 +0000
  • 9ca0cd7749 Bump llama-cpp-python to 0.3.1 oobabooga 2024-09-29 20:47:04 -0700
  • bbdeed3cf4 Make sampler priority high if unspecified oobabooga 2024-09-29 20:45:27 -0700
  • 01362681f2 Bump exllamav2 to 0.2.4 oobabooga 2024-09-29 07:42:44 -0700
  • e4b0467f9f Add beforeunload event to add confirmation dialog when leaving page (#6279) Hanusz Leszek 2024-09-29 06:14:19 +0200
  • 0f90a1b50f Do not set value for histories in chat when --multi-user is used (#6317) Manuel Schmid 2024-09-29 06:08:55 +0200
  • 109644103a Merge branch 'dev' into mashb1t-main oobabooga 2024-09-28 21:08:02 -0700
  • 02661c55ba Naming oobabooga 2024-09-28 21:07:57 -0700
  • 055f3f5632 Fix after #6386 (thanks @Touch-Night) oobabooga 2024-09-28 20:55:26 -0700
  • 57160cd6fa Update README oobabooga 2024-09-28 20:50:41 -0700
  • 3f0571b62b Update README oobabooga 2024-09-28 20:48:30 -0700
  • 3fb02f43f6 Update README oobabooga 2024-09-28 20:38:43 -0700
  • 3b99532e02 Remove HQQ and AQLM from requirements oobabooga 2024-09-28 20:34:59 -0700
  • c61b29b9ce Simplify the warning when flash-attn fails to import oobabooga 2024-09-28 20:33:17 -0700
  • b92d7fd43e Add warnings for when AutoGPTQ, TensorRT-LLM, or HQQ are missing oobabooga 2024-09-28 20:30:24 -0700
  • 65e5864084 Update README oobabooga 2024-09-28 20:25:26 -0700
  • 1a870b3ea7 Remove AutoAWQ and AutoGPTQ from requirements (no wheels available) oobabooga 2024-09-28 19:38:56 -0700
  • 85994e3ef0 Bump pytorch to 2.4.1 oobabooga 2024-09-28 09:44:08 -0700
  • ca5a2dba72 Bump rocm to 6.1.2 oobabooga 2024-09-28 09:39:37 -0700
  • 7276dca933 Fix a typo oobabooga 2024-09-27 20:26:36 -0700
  • 46996f6519 ExllamaV2 tensor parallelism to increase multi gpu inference speeds (#6356) RandoInternetPreson 2024-09-27 23:26:03 -0400
  • 725a463d26 Simplify oobabooga 2024-09-27 19:01:57 -0700
  • 1a4c0543be Lint oobabooga 2024-09-27 18:56:27 -0700
  • dc064954c1 Merge branch 'dev' into RandomInternetPreson-main oobabooga 2024-09-27 18:54:29 -0700
  • 301375834e Exclude Top Choices (XTC): A sampler that boosts creativity, breaks writing clichés, and inhibits non-verbatim repetition (#6335) Philipp Emanuel Weidmann 2024-09-28 07:20:12 +0530
  • 29d38a74ba Add missing : oobabooga 2024-09-27 18:34:07 -0700
  • 15daf366bc Merge branch 'dev' into p-e-w-xtc oobabooga 2024-09-27 18:33:44 -0700
  • 0cd9cd1b04 Merge 3d17c80954 into 626b0a0437 randoentity 2024-09-28 02:45:18 +0200
  • 3492e33fd5 Bump bitsandbytes to 0.44 oobabooga 2024-09-27 16:59:30 -0700
  • 626b0a0437 Force /bin/bash shell for conda (#6386) Thireus ☠ 2024-09-27 23:47:04 +0100
  • 5c918c5b2d Make it possible to sort DRY oobabooga 2024-09-27 15:40:48 -0700
  • 78b8705400 Bump llama-cpp-python to 0.3.0 (except for AMD) oobabooga 2024-09-27 15:06:31 -0700
  • c5f048e912 Bump ExLlamaV2 to 0.2.2 oobabooga 2024-09-27 15:04:08 -0700
  • 7424f789bf Fix the sampling monkey patch (and add more options to sampler_priority) (#6411) oobabooga 2024-09-27 19:03:25 -0300
  • 203783a58c Remove debug statements oobabooga 2024-09-27 15:02:36 -0700
  • 3f5d8a05b5 Fix the sampling monkey patch (and add more options to sampler_priority) oobabooga 2024-09-27 14:57:51 -0700
  • c497a32372 Bump transformers to 4.45 oobabooga 2024-09-26 11:55:51 -0700
  • aeaa05a72c Bump pydantic from 2.8.2 to 2.9.2 dependabot[bot] 2024-09-23 20:03:51 +0000
  • 440292a074 Update Mistral V1.yaml pandora 2024-09-21 18:57:07 +0200
  • c6047508b7 Update Mistral V1.yaml pandora 2024-09-21 18:54:48 +0200
  • 015ad8505b Rename Mistral.yaml to Mistral V1.yaml pandora 2024-09-21 18:51:36 +0200
  • b98635d823 Fixing Mistral Templates pandora 2024-09-21 18:48:28 +0200
  • dcc3100b38 Force /bin/bash shell for conda Thireus ☠ 2024-09-17 20:01:47 +0100
  • 32c7c0077c Update gradio requirement from ==4.26.* to ==4.44.* dependabot[bot] 2024-09-16 20:20:35 +0000
  • a8ecbc5db2 Bump hqq from 0.1.7.post3 to 0.2.2 dependabot[bot] 2024-09-16 20:20:24 +0000
  • 2ae1c98f26 Merge branch 'oobabooga:main' into i18n Guanghua Lu 2024-09-13 10:44:24 +0800
  • 97240b2762 Merge branch 'oobabooga:dev' into dev Artificiangel 2024-09-10 12:25:02 -0400
  • cf0c623267 Bump pydantic from 2.8.2 to 2.9.1 dependabot[bot] 2024-09-09 20:57:10 +0000
  • 597a1cb99d Update numpy requirement from ==1.26.* to ==2.1.* dependabot[bot] 2024-09-09 20:57:01 +0000
  • 2b35b362a7 Update gradio requirement from ==4.26.* to ==4.43.* dependabot[bot] 2024-09-09 20:56:54 +0000
  • ae16e958ee Bump lm-eval from 0.3.0 to 0.4.4 dependabot[bot] 2024-09-09 20:56:44 +0000
  • 9630b4a953 Bump hqq from 0.1.7.post3 to 0.2.1.post1 dependabot[bot] 2024-09-07 01:48:40 +0000
  • f98431c744 Apply the change to all requirements (oops) oobabooga 2024-09-06 18:47:25 -0700
  • a50477ec85 Apply the change to all requirements (oops) oobabooga 2024-09-06 18:47:25 -0700
  • ac30b004ef Pin fastapi/pydantic requirement versions oobabooga 2024-09-06 18:38:39 -0700
  • e86ab37aaf Merge remote-tracking branch 'refs/remotes/origin/dev' into dev oobabooga 2024-09-06 18:44:43 -0700
  • 27797a92d0 Pin fastapi/pydantic requirement versions oobabooga 2024-09-06 18:38:39 -0700
  • 4924ee2901 typo in OpenAI response format (#6365) Jean-Sylvain Boige 2024-09-06 02:42:23 +0200
  • 5aee123f13 typo in OpenAI response format Jean-Sylvain Boige 2024-09-04 23:30:06 +0200
  • bba5b36d33 Don't import PEFT unless necessary oobabooga 2024-09-03 19:40:53 -0700
  • c5b40eb555 llama.cpp: prevent prompt evaluation progress bar with just 1 step oobabooga 2024-09-03 17:37:06 -0700
  • 2cb8d4c96e Bump llama-cpp-python to 0.2.90 oobabooga 2024-09-03 05:53:18 -0700
  • 64919e0d69 Bump flash-attention to 2.6.3 oobabooga 2024-09-03 05:51:46 -0700
  • 68d52c60f3 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev oobabooga 2024-09-02 21:16:39 -0700
  • d1168afa76 Bump ExLlamaV2 to 0.2.0 oobabooga 2024-09-02 21:15:51 -0700
  • 0f62744df1 Check for EOS and \n oobabooga 2024-09-02 19:54:47 -0700
  • 9a150c3368 API: Relax multimodal format, fixes HuggingFace Chat UI (#6353) Stefan Merettig 2024-09-03 04:03:15 +0200
  • 4c74c7a116 Fix UnicodeDecodeError for BPE-based Models (especially GLM-4) (#6357) GralchemOz 2024-09-03 10:00:59 +0800
  • 41a8eb4eeb Training pro update script.py (#6359) FartyPants (FP HAM) 2024-09-02 22:00:15 -0400
  • 5f8cf3da20 Bump hqq from 0.1.7.post3 to 0.2.1 dependabot[bot] 2024-09-02 20:03:52 +0000
  • b29cc5ccae Update script.py FartyPants (FP HAM) 2024-09-02 00:25:27 -0400
  • 6e294af0a6 Adding parameter to compose API's model list jsboige 2024-09-01 12:24:51 +0200
  • fe2c268088 Merge branch 'oobabooga:dev' into dev Artificiangel 2024-09-01 03:53:23 -0400
  • 3e44373e8d Updated hf version too RandoInternetPreson 2024-08-31 11:17:48 -0400
  • dc5149464d Fix UnicodeDecodeError for partial character output in BPE tokenizer GralchemOz 2024-08-30 17:44:13 +0800
  • 377018eb22 Add files via upload RandoInternetPreson 2024-08-29 18:14:06 -0400
  • 2104aa8285 Update docs Touch-Night 2024-08-29 14:40:24 +0800
  • 3e87d43447 Fix bugs Touch-Night 2024-08-29 14:30:55 +0800
  • 366a54c623 Update zh-cn.json Touch-Night 2024-08-29 13:57:28 +0800
  • 88c1829b35 Select localization in session tab Touch-Night 2024-08-29 11:46:55 +0800
  • 6148e634b6 Basic localization support Touch-Night 2024-08-28 23:32:07 +0800
  • 0f1cb0a90b Add ui_block(), ui_tab() and ui_params() Vasyanator 2024-08-27 21:43:05 +0400
  • f2a46333ef Update script.py Vasyanator 2024-08-27 21:34:04 +0400
  • 8abadc93ec make ui_params() work Vasyanator 2024-08-27 21:18:55 +0400
  • c14f7c4e7c Add ui_block(), ui_tab() and ui_params() support Vasyanator 2024-08-27 21:17:26 +0400
  • 5acbec2732 extensions/openai/completions.py: Relax multimodal format Stefan Merettig 2024-08-27 13:46:24 +0200
  • bc7ceabc03 Update gradio requirement from ==4.26.* to ==4.42.* dependabot[bot] 2024-08-26 20:48:43 +0000
  • e460da94a3 Update optimum requirement from ==1.17.* to ==1.21.* dependabot[bot] 2024-08-26 20:48:36 +0000
  • a87e3dd5b8 Update numpy requirement from ==1.26.* to ==2.1.* dependabot[bot] 2024-08-26 20:48:24 +0000
  • 9fdfd5ecb0 Add user_input in impersonate reply for consistency Yiximail 2024-08-23 09:46:17 +0800
  • 1f288b4072 Bump ExLlamaV2 to 0.1.9 oobabooga 2024-08-22 12:40:15 -0700
  • 38b3daad55 Fix the impersonate response role Yiximail 2024-08-22 16:26:41 +0800
  • 34b493b231 Fix the debug_msg didn't show impersonate prompt Yiximail 2024-08-22 16:17:09 +0800
  • ce6a836b46 Add impersonate feature to API /v1/chat/completions Yiximail 2024-08-22 15:56:37 +0800