Commit Graph

  • aec9e4ef79 Merge branch 'main' into models_api matatonic 2023-05-30 22:12:39 -0400
  • 5686498e3a resolved conflicts Matthew Ashton 2023-05-30 22:02:04 -0400
  • 13a18a1946 models api Matthew Ashton 2023-05-26 18:23:00 -0400
  • 6627f7feb9 Add notice about downgrading gcc and g++ (#2446) AlpinDale 2023-05-31 05:58:53 +0430
  • bfbd13ae89 Update docker repo link (#2340) Atinoda 2023-05-31 02:14:49 +0100
  • a6d3f010a5 extensions/openai: include all available models in Model.list (#2368) matatonic 2023-05-30 21:13:37 -0400
  • e5b756ecfe Fixes #2331, IndexError: string index out of range (#2383) matatonic 2023-05-30 21:07:40 -0400
  • b984a44f47 fix error when downloading a model for the first time (#2404) Juan M Uys 2023-05-31 02:07:12 +0100
  • 4715123f55 Add a /api/v1/stop-stream API that allows the user to interrupt the generation (#2392) Yiximail 2023-05-31 09:03:40 +0800
  • ebcadc0042 extensions/openai: cross_origin + chunked_response (updated fix) (#2423) matatonic 2023-05-30 20:54:24 -0400
  • df50f077db fixup missing tfs top_a params, defaults reorg (#2443) matatonic 2023-05-30 20:52:33 -0400
  • 2931260684 Add notice about downgrading gcc and g++ AlpinDale 2023-05-31 04:35:47 +0430
  • 86b2bd3ee1 Only return the new tokens while streaming oobabooga 2023-05-30 21:00:01 -0300
  • 4ace541eb0 Don't stream at more than 24 fps oobabooga 2023-05-30 20:27:20 -0300
  • 90e6d3f07e Add streaming support oobabooga 2023-05-30 20:19:00 -0300
  • 8e246fb943 Fixed missing TFS and Top_A ChobPT 2023-05-31 00:09:34 +0100
  • f5790324eb Add exllama support (janky) oobabooga 2023-05-30 19:43:22 -0300
  • c7d816abbf fixup missing tfs top_a params, defaults reorg Matthew Ashton 2023-05-30 12:32:42 -0400
  • b21e4dba70 Merge 0a17565e53 into 9ab90d8b60 tohrnii 2023-05-30 15:46:45 +0100
  • 9ab90d8b60 Fix warning for qlora (#2438) Forkoz 2023-05-30 09:09:18 -0500
  • 39708ff317 Fix warning for qlora Forkoz 2023-05-30 08:39:09 -0500
  • c45cfe3b6e Add fixed Flash attention code kaiokendev 2023-05-30 02:08:53 -0400
  • 0db4e191bd Improve chat buttons on mobile devices oobabooga 2023-05-30 00:30:15 -0300
  • 3209440b7c Rearrange chat buttons oobabooga 2023-05-30 00:17:31 -0300
  • 3578dd3611 Change a warning message oobabooga 2023-05-29 22:40:54 -0300
  • 3a6e194bc7 Change a warning message oobabooga 2023-05-29 22:39:23 -0300
  • e763ace593 Update GPTQ-models-(4-bit-mode).md oobabooga 2023-05-29 22:35:49 -0300
  • 86ef695d37 Update GPTQ-models-(4-bit-mode).md oobabooga 2023-05-29 22:20:55 -0300
  • 8e0a997c60 Add new parameters to API extension oobabooga 2023-05-29 22:03:08 -0300
  • 9e7204bef4 Add tail-free and top-a sampling (#2357) Luis Lopez 2023-05-30 08:40:01 +0800
  • 43911c15b7 Merge branch 'main' into custom-samplers oobabooga 2023-05-29 21:39:42 -0300
  • 4286c1b780 Include Anse in the compatible apps Matthew Ashton 2023-05-29 17:48:27 -0400
  • 621f47bb48 add top_a and tfs Elinas 2023-05-29 16:39:21 -0500
  • 6409479171 cross_origin + chunked_response Matthew Ashton 2023-05-29 17:13:40 -0400
  • cf63913e07 Bump llama-cpp-python from 0.1.53 to 0.1.55 dependabot[bot] 2023-05-29 21:02:57 +0000
  • 6938107794 Add starcoder.cpp to docs Sergey Kostyaev 2023-05-27 16:45:02 +0700
  • 7c36f344b7 Add starcoder and starchat ggml support Sergey Kostyaev 2023-05-27 03:33:31 +0700
  • b4662bf4af Download gptq_model*.py using download-model.py oobabooga 2023-05-29 16:12:54 -0300
  • 540a161a08 Update GPTQ-models-(4-bit-mode).md oobabooga 2023-05-29 15:45:40 -0300
  • b8d2f6d876 Merge remote-tracking branch 'refs/remotes/origin/main' oobabooga 2023-05-29 15:33:05 -0300
  • 1394f44e14 Add triton checkbox for AutoGPTQ oobabooga 2023-05-29 15:32:45 -0300
  • 166a0d9893 Update GPTQ-models-(4-bit-mode).md oobabooga 2023-05-29 15:07:59 -0300
  • 962d05ca7e Update README.md oobabooga 2023-05-29 14:56:55 -0300
  • 4a190a98fd Update GPTQ-models-(4-bit-mode).md oobabooga 2023-05-29 14:56:05 -0300
  • 2b7ba9586f Fixes #2326, KeyError: 'assistant' (#2382) matatonic 2023-05-29 13:19:57 -0400
  • 6de727c524 Improve Eta Sampling preset oobabooga 2023-05-29 13:56:15 -0300
  • f34d20922c Minor fix oobabooga 2023-05-29 13:31:17 -0300
  • 983eef1e29 Attempt at evaluating falcon perplexity (failed) oobabooga 2023-05-29 13:28:25 -0300
  • 204731952a Falcon support (trust-remote-code and autogptq checkboxes) (#2367) Honkware 2023-05-29 08:20:18 -0500
  • d750c3fdb3 Add autogptq checkbox oobabooga 2023-05-29 10:16:49 -0300
  • 4b60e6c26d Some small adaptations oobabooga 2023-05-29 10:04:40 -0300
  • 11605bb776 Merge branch 'main' into friyin friyin 2023-05-29 13:42:45 +0200
  • 02ce057d31 Modify generated HTML player to go back to the first text when it is done playing all audio files. Cocktail Boy 2023-05-29 00:48:34 -0700
  • 64f9a8e28f removed chat-13b Orion 2023-05-29 14:34:39 +0800
  • df852d0b0a my own modifications Orion 2023-05-29 14:32:32 +0800
  • ac0e002b8d Remove unnessary \ redirects Cocktail Boy 2023-05-28 22:33:22 -0700
  • 0527d959ef Add cross-origin headers, chunked responses, update [done] termination andreas.echavez 2023-05-28 23:29:57 -0600
  • 7f9c78d3ed Remove weird characters oobabooga 2023-05-28 23:42:47 -0300
  • 60ae80cf28 Fix hang in tokenizer for AutoGPTQ llama models. (#2399) Forkoz 2023-05-28 21:10:10 -0500
  • 2f811b1bdf Change a warning message oobabooga 2023-05-28 22:48:20 -0300
  • 9ee1e37121 Fix return message when no model is loaded oobabooga 2023-05-28 22:46:32 -0300
  • f27135bdd3 Add Eta Sampling preset oobabooga 2023-05-28 22:42:43 -0300
  • 00ebea0b2a Use YAML for presets and settings oobabooga 2023-05-28 22:34:12 -0300
  • ff6b08148a Add handling of custom abbreviations like Dr. Cocktail Boy 2023-05-28 16:54:15 -0700
  • a5b80b59d1 Modify audio and text file names to be sortable format. Cocktail Boy 2023-05-28 16:11:28 -0700
  • 8f8d3e69c3 Add function to save HTML audio/text player. Cocktail Boy 2023-05-28 15:43:39 -0700
  • a03a2d221c Add trust_remote_code support for AutoGPTQ Honkware 2023-05-28 21:26:53 +0000
  • 72eab89624 Split output text into paragraphs and generate audio files. Cocktail Boy 2023-05-28 14:12:13 -0700
  • 8eb134e58e error when downloading a model for the first time Juan M Uys 2023-05-28 21:42:13 +0100
  • a66e67c4bf Update Dockerfile to resolve superbooga requirement error ramblingcoder 2023-05-28 13:52:26 -0500
  • 48af24d4f3 Fix hang in tokenizer for AutoGPTQ llama models. Forkoz 2023-05-28 16:17:07 +0000
  • a5e19834dd The API is only available for the stream api yiximail 2023-05-28 18:37:36 +0800
  • 09099e3249 Add a stop API yiximail 2023-05-28 18:02:14 +0800
  • 83bcff42e1 Fixes #2331, IndexError: string index out of range Matthew Ashton 2023-05-27 14:51:22 -0400
  • b358832e12 Fixes #2326, KeyError: 'assistant' Matthew Ashton 2023-05-27 14:39:57 -0400
  • 473a57e352 Add trust_remote_code support for AutoGPTQ TheBloke 2023-05-27 09:23:50 +0100
  • 8a2f621662 docker compose working with wsl ehgp 2023-05-27 00:10:02 -0400
  • f0960ffd3a working docker version ehgp 2023-05-26 22:40:46 -0400
  • b0f7fe411b include all available models in Model.list Matthew Ashton 2023-05-26 20:07:23 -0400
  • 65f605ad40 Falcon support Honkware 2023-05-27 00:02:17 +0000
  • 680ada7a53 models api Matthew Ashton 2023-05-26 18:23:00 -0400
  • 8e41fa1684 Merge branch 'oobabooga:main' into subpath-support catalpaaa 2023-05-26 12:55:54 -0700
  • fffa4eeeb4 Created using Colaboratory Aitrepreneur 2023-05-26 21:28:52 +0300
  • c4f02936d3 Add tail-free and top-a sampling toast22a 2023-05-24 03:57:26 +0800
  • 0a17565e53 add api endpoint for finetuning tohrnii 2023-05-25 22:42:32 +0100
  • 837b7e5e3c Delete UPDATE_pyg_13b_GPTQ_4bit_128g.ipynb Aitrepreneur 2023-05-26 12:33:55 +0300
  • 8614dab874 Update README.md Aitrepreneur 2023-05-26 12:33:37 +0300
  • 65c60f0a83 Created using Colaboratory Aitrepreneur 2023-05-26 12:32:42 +0300
  • 2cf711f35e update SpeechRecognition dependency (#2345) Elias Vincent Simon 2023-05-26 05:34:57 +0200
  • 6c7c1d093e Merge remote-tracking branch 'origin/superapi' into superapi nabelekm 2023-05-26 04:48:39 +0200
  • 235de7e0b1 chromadb fixed missing [] nabelekm 2023-05-26 04:45:31 +0200
  • aa0fce1cce Merge branch 'main' into superapi nabelekm 2023-05-26 04:33:34 +0200
  • 78dbec4c4e Add 'scipy' to requirements.txt #2335 (#2343) jllllll 2023-05-25 21:26:25 -0500
  • 0dbc3d9b2c Fix get_documents_ids_distances return error when n_results = 0 (#2347) Luis Lopez 2023-05-26 10:25:36 +0800
  • ba0604ef0e Delete chroma directory nabelekm 2023-05-26 04:15:24 +0200
  • 0064109172 Delete .chroma directory nabelekm 2023-05-26 04:15:07 +0200
  • a7f45e5622 updated to latest main nabelekm 2023-05-26 04:12:54 +0200
  • 4a86edc6ae Fix get_documents_ids_distances return error when n_results = 0 toast22a 2023-05-26 08:46:55 +0800
  • 17b8afdda5 Merge branch 'oobabooga:main' into whisper-extension-auto-record-stop Elias Vincent Simon 2023-05-26 01:40:18 +0200
  • c264783bfa Added PDF parsing. Messy code nabelekm 2023-05-26 01:08:22 +0200