Commit Graph

  • bfa905fc07 Add SSL certificate support oobabooga 2023-08-04 09:46:36 -0700
  • 2a291774f1 hotfix in get characters Paulo Henrique Silveira 2023-08-04 12:15:28 -0300
  • f1f9a70d88 implemented set method in api for set characters Paulo Henrique Silveira 2023-08-04 12:06:13 -0300
  • 1c3fd9df74 Fix: Rope freq base by alpha value in llamacpp module gabriel-pena 2023-08-04 11:55:24 -0300
  • ed57a79c6e Add back silero preview by @missionfloyd (#3446) oobabooga 2023-08-04 02:29:14 -0300
  • 0147cc4ac4 Minor changes oobabooga 2023-08-03 22:27:36 -0700
  • 3858e8679b Add back silero preview by @missionfloyd oobabooga 2023-08-03 22:23:22 -0700
  • 2336b75d92 Remove unnecessary chat.js (#3445) missionfloyd 2023-08-03 22:58:37 -0600
  • d8aa184f87 Remove unnecessary chat.js missionfloyd 2023-08-03 22:17:01 -0600
  • 4b3384e353 Handle unfinished lists during markdown streaming oobabooga 2023-08-03 17:10:57 -0700
  • f4005164f4 Fix llama.cpp truncation (#3400) Pete 2023-08-03 19:01:15 -0400
  • 33ca9b0e78 Remove print oobabooga 2023-08-03 15:59:17 -0700
  • 8095adc9ec Handle truncation only in llamacpp_model.py oobabooga 2023-08-03 15:57:07 -0700
  • ac379e4829 Add files via upload FartyPants 2023-08-03 18:38:12 -0400
  • 4e6dc6d99d Add Contributing guidelines oobabooga 2023-08-03 14:36:35 -0700
  • 173d46af71 Changes to make the blocking API work with user history. RuntimeRacer 2023-08-03 22:09:22 +0200
  • 8f98268252 extensions/openai: include content-length for json replies (#3416) matatonic 2023-08-03 15:10:49 -0400
  • 32e7cbb635 More models: +StableBeluga2 (#3415) matatonic 2023-08-03 15:02:54 -0400
  • f61573bbde Add standalone Dockerfile for NVIDIA Jetson (#3336) Paul DeCarlo 2023-08-03 21:57:33 +0300
  • d578baeb2c Use character settings from API properties if present (#3428) rafa-9 2023-08-03 14:56:40 -0400
  • 2a2af38cff Some fixes oobabooga 2023-08-03 11:47:14 -0700
  • 60f7df35ed Give a more up-to-date example oobabooga 2023-08-03 11:43:36 -0700
  • 8e086428f5 Minor changes oobabooga 2023-08-03 11:37:39 -0700
  • a7e362d255 Merge branch 'main' into uogbuji-main oobabooga 2023-08-03 11:32:54 -0700
  • 601fc424cd Several improvements (#117) oobabooga 2023-08-03 14:39:46 -0300
  • 9f8f6c9274 Update api-example-chat-stream.py rafa-9 2023-08-03 10:23:40 -0700
  • a390c7cdc9 Update api-example-chat.py rafa-9 2023-08-03 10:22:49 -0700
  • fa8d8d06c9 Merge branch 'oobabooga:main' into openai_update matatonic 2023-08-03 12:32:01 -0400
  • d93087adc3 Merge remote-tracking branch 'refs/remotes/origin/main' oobabooga 2023-08-03 08:14:10 -0700
  • 1839dff763 Use Esc to Stop the generation oobabooga 2023-08-03 08:13:17 -0700
  • 87dab03dc0 Add the --cpu option for llama.cpp to prevent CUDA from being used (#3432) oobabooga 2023-08-03 11:00:36 -0300
  • 3e70bce576 Properly format exceptions in the UI oobabooga 2023-08-03 06:57:21 -0700
  • 4403de74f5 Fix a bug oobabooga 2023-08-03 06:51:05 -0700
  • 129b3654bd README updates oobabooga 2023-08-03 06:32:28 -0700
  • 895998f07c Document the new --cpu use in the README oobabooga 2023-08-03 06:28:57 -0700
  • 275399a935 Add the --cpu option for llama.cpp oobabooga 2023-08-03 06:17:38 -0700
  • 3390196a14 Add some javascript alerts for confirmations oobabooga 2023-08-02 22:13:57 -0700
  • e074538b58 Revert "Make long_replies ban the eos token as well" oobabooga 2023-08-02 21:45:10 -0700
  • 6bf9e855f8 Minor change oobabooga 2023-08-02 21:39:56 -0700
  • 32c564509e Fix loading session in chat mode oobabooga 2023-08-02 21:13:16 -0700
  • 4b6c1d3f08 CSS change oobabooga 2023-08-02 20:20:23 -0700
  • 0e8f9354b5 Add direct download for session/chat history JSONs oobabooga 2023-08-02 18:50:13 -0700
  • aca5679968 Properly fix broken gcc_linux-64 package (#115) jllllll 2023-08-02 21:39:07 -0500
  • 6242d8ca61 Update util.py rafa-9 2023-08-02 19:10:48 -0700
  • aea447c0a8 fixed shared secret Paulo Henrique Silveira 2023-08-02 21:30:43 -0300
  • c68353c483 added run.sh and fixed shared variable api-token Paulo Henrique Silveira 2023-08-02 21:29:57 -0300
  • bed9009f63 Add files via upload FartyPants 2023-08-02 16:13:56 -0400
  • 32a2bbee4a Implement auto_max_new_tokens for ExLlama oobabooga 2023-08-02 11:01:29 -0700
  • e931844fe2 Add auto_max_new_tokens parameter (#3419) oobabooga 2023-08-02 14:52:20 -0300
  • e40c923152 Add to openai API oobabooga 2023-08-02 10:46:33 -0700
  • 4265f2ca3b Minor change oobabooga 2023-08-02 10:43:12 -0700
  • b8a472db5e Add auto_max_new_tokens parameter oobabooga 2023-08-02 10:41:35 -0700
  • 0d9932815c Improve TheEncrypted777 on mobile devices oobabooga 2023-08-02 08:45:14 -0700
  • 976dbfc8e5 include content-length for json replies Matthew Ashton 2023-08-02 11:30:41 -0400
  • b1b6ed9f1c +StableBeluga2 Matthew Ashton 2023-08-02 11:17:06 -0400
  • 6afc1a193b Add a scrollbar to notebook/default, improve chat scrollbar style (#3403) Pete 2023-08-02 11:02:36 -0400
  • 6d2cd7d29e Minor change oobabooga 2023-08-02 08:01:18 -0700
  • 3d3d9407a9 Minor change oobabooga 2023-08-02 08:00:58 -0700
  • c0089f8299 Hide resize handle oobabooga 2023-08-02 07:58:38 -0700
  • aea32ca0be Add a style for the scrollbars oobabooga 2023-08-02 07:51:04 -0700
  • 7bd21b03a7 Merge branch 'oobabooga:main' into multi-lora Googulator 2023-08-02 16:17:09 +0200
  • 75f45bc31c Merge branch 'oobabooga:main' into chat-token-count atriantafy 2023-08-02 14:51:20 +0100
  • ada7a17852 Move js to main.js, add more scrollbars oobabooga 2023-08-01 20:54:26 -0700
  • 6c521ce967 Make long_replies ban the eos token as well oobabooga 2023-08-01 18:47:49 -0700
  • 9ae0eab989 extensions/openai: +Array input (batched) , +Fixes (#3309) matatonic 2023-08-01 21:26:00 -0400
  • 40038fdb82 add chat instruction config for BaiChuan model (#3332) CrazyShipOne 2023-08-02 09:25:20 +0800
  • 11bb71df06 Main (#15) Ricardo Pinto 2023-08-02 00:24:58 +0100
  • 57daf042d8 Make function generic Pete 2023-08-01 17:17:43 -0400
  • da3ca64683 Give the Raw textbox a unique identifier Pete 2023-08-01 17:10:34 -0400
  • 346465228a Update server.py Pete 2023-08-01 17:05:34 -0400
  • 88e0b427f4 Sync settings-template.yaml with main branch Pete 2023-08-01 16:40:17 -0400
  • 30d30a4d3b Merge branch 'oobabooga:main' into fix-llama-truncation Pete 2023-08-01 16:29:07 -0400
  • c8a59d79be Add a template for NewHope oobabooga 2023-08-01 13:27:29 -0700
  • b53ed70a70 Make llamacpp_HF 6x faster oobabooga 2023-08-01 13:15:14 -0700
  • baf63da2ca Remove debug code Pete 2023-07-30 18:31:31 -0400
  • ce54b4ab59 Simplify code Pete 2023-07-30 18:30:30 -0400
  • e6906b5f62 Fix truncation for ExLlama and RWKV models Pete 2023-07-30 18:27:21 -0400
  • a2d067bfda Fix truncation for llama.cpp models Pete 2023-07-30 00:03:11 -0400
  • 10d1adfaa9 Merge branch 'oobabooga:main' into chat-token-count atriantafy 2023-08-01 18:45:42 +0100
  • 385229313f Increase the interface area a bit oobabooga 2023-08-01 09:41:57 -0700
  • 8d46a8c50a Change the default chat style and the default preset oobabooga 2023-08-01 09:35:17 -0700
  • 9773534181 Update Chat-mode.md oobabooga 2023-08-01 07:57:47 -0700
  • 959feba602 When saving model settings, only save the settings for the current loader oobabooga 2023-08-01 06:10:09 -0700
  • f92f01b16e Template for Open Chat Instruction Following Louis Del Valle 2023-08-01 04:45:52 -0500
  • ebb4f22028 Change a comment oobabooga 2023-07-31 20:06:10 -0700
  • 8e2217a029 Minor changes to the Parameters tab oobabooga 2023-07-31 19:55:11 -0700
  • b2207f123b Update docs oobabooga 2023-07-31 19:20:33 -0700
  • f094330df0 When saving a preset, only save params that differ from the defaults oobabooga 2023-07-31 19:13:29 -0700
  • 84297d05c4 Add a "Filter by loader" menu to the Parameters tab oobabooga 2023-07-31 18:44:00 -0700
  • 410e4e993f Bump fastapi from 0.95.2 to 0.100.1 dependabot[bot] 2023-07-31 20:47:43 +0000
  • 3648880f98 Bump gradio from 3.33.1 to 3.39.0 dependabot[bot] 2023-07-31 20:47:31 +0000
  • 7e2f582746 Bump gradio-client from 0.2.5 to 0.3.0 dependabot[bot] 2023-07-31 20:47:18 +0000
  • 0c4e061f25 Merge branch 'main' into cfg oobabooga 2023-07-31 12:04:03 -0700
  • abea8d9ad3 Make settings-template.yaml more readable oobabooga 2023-07-31 12:01:50 -0700
  • 7de7b3d495 Fix newlines in exported character yamls oobabooga 2023-07-31 10:46:02 -0700
  • 6490a71f3a Merge branch 'oobabooga:main' into main FartyPants 2023-07-31 13:34:21 -0400
  • d06c34dea5 Add an extension that makes chat replies longer (#3363) oobabooga 2023-07-31 13:34:41 -0300
  • 96bb919eb0 Change the defaults oobabooga 2023-07-31 09:21:44 -0700
  • 9c085eecd2 Initial support for LLaVA-LLaMA-2. Haotian Liu 2023-07-30 22:43:58 -0500
  • e6be25ea11 Fix a regression oobabooga 2023-07-30 18:12:30 -0700