Author | Hash | Commit message | Date
oobabooga | 7078d168c3 | Missing import | 2023-03-23 22:16:08 -03:00
oobabooga | d1327f99f9 | Fix broken callbacks.py | 2023-03-23 22:12:24 -03:00
oobabooga | b0abb327d8 | Update LoRA.py | 2023-03-23 22:02:09 -03:00
oobabooga | bf22d16ebc | Clear cache while switching LoRAs | 2023-03-23 21:56:26 -03:00
oobabooga | 4578e88ffd | Stop the bot from talking for you in chat mode | 2023-03-23 21:38:20 -03:00
oobabooga | 9bf6ecf9e2 | Fix LoRA device map (attempt) | 2023-03-23 16:49:41 -03:00
oobabooga | c5ebcc5f7e | Change the default names (#518): Update shared.py, Update settings-template.json | 2023-03-23 13:36:00 -03:00
oobabooga | 29bd41d453 | Fix LoRA in CPU mode | 2023-03-23 01:05:13 -03:00
oobabooga | eac27f4f55 | Make LoRAs work in 16-bit mode | 2023-03-23 00:55:33 -03:00
oobabooga | bfa81e105e | Fix FlexGen streaming | 2023-03-23 00:22:14 -03:00
oobabooga | de6a09dc7f | Properly separate the original prompt from the reply | 2023-03-23 00:12:40 -03:00
wywywywy | 61346b88ea | Add "seed" menu in the Parameters tab | 2023-03-22 15:40:20 -03:00
oobabooga | 45b7e53565 | Only catch proper Exceptions in the text generation function | 2023-03-20 20:36:02 -03:00
oobabooga | db4219a340 | Update comments | 2023-03-20 16:40:08 -03:00
oobabooga | 7618f3fe8c | Add -gptq-preload for 4-bit offloading (#460): this now works on a 4GB card with `python server.py --model llama-7b-hf --gptq-bits 4 --gptq-pre-layer 20` | 2023-03-20 16:30:56 -03:00
oobabooga | 9a3bed50c3 | Attempt at fixing 4-bit with CPU offload | 2023-03-20 15:11:56 -03:00
oobabooga | 75a7a84ef2 | Exception handling (#454): Update text_generation.py, Update extensions.py | 2023-03-20 13:36:52 -03:00
oobabooga | ddb62470e9 | --no-cache and --gpu-memory in MiB for fine VRAM control | 2023-03-19 19:21:41 -03:00
oobabooga | a78b6508fc | Make custom LoRAs work by default (#385) | 2023-03-19 12:11:35 -03:00
oobabooga | c753261338 | Disable stop_at_newline by default | 2023-03-18 10:55:57 -03:00
oobabooga | 7c945cfe8e | Don't include PeftModel every time | 2023-03-18 10:55:24 -03:00
oobabooga | e26763a510 | Minor changes | 2023-03-17 22:56:46 -03:00
Wojtek Kowaluk | 7994b580d5 | clean up duplicated code | 2023-03-18 02:27:26 +01:00
Wojtek Kowaluk | 30939e2aee | add mps support on apple silicon | 2023-03-18 00:56:23 +01:00
oobabooga | 9256e937d6 | Add some LoRA params | 2023-03-17 17:45:28 -03:00
oobabooga | 9ed2c4501c | Use markdown in the "HTML" tab | 2023-03-17 16:06:11 -03:00
oobabooga | f0b26451b4 | Add a comment | 2023-03-17 13:07:17 -03:00
oobabooga | 3bda907727 | Merge pull request #366 from oobabooga/lora: Add LoRA support | 2023-03-17 11:48:48 -03:00
oobabooga | 614dad0075 | Remove unused import | 2023-03-17 11:43:11 -03:00
oobabooga | a717fd709d | Sort the imports | 2023-03-17 11:42:25 -03:00
oobabooga | 29fe7b1c74 | Remove LoRA tab, move it into the Parameters menu | 2023-03-17 11:39:48 -03:00
oobabooga | 214dc6868e | Several QoL changes related to LoRA | 2023-03-17 11:24:52 -03:00
askmyteapot | 53b6a66beb | Update GPTQ_Loader.py: correcting decoder layer for renamed class | 2023-03-17 18:34:13 +10:00
oobabooga | 0cecfc684c | Add files | 2023-03-16 21:35:53 -03:00
oobabooga | 104293f411 | Add LoRA support | 2023-03-16 21:31:39 -03:00
oobabooga | ee164d1821 | Don't split the layers in 8-bit mode by default | 2023-03-16 18:22:16 -03:00
oobabooga | e085cb4333 | Small changes | 2023-03-16 13:34:23 -03:00
awoo | 83cb20aad8 | Add support for --gpu-memory with --load-in-8bit | 2023-03-16 18:42:53 +03:00
oobabooga | 1c378965e1 | Remove unused imports | 2023-03-16 10:18:34 -03:00
oobabooga | a577fb1077 | Keep GALACTICA special tokens (#300) | 2023-03-16 00:46:59 -03:00
oobabooga | 4d64a57092 | Add Interface mode tab | 2023-03-15 23:29:56 -03:00
oobabooga | 66256ac1dd | Make the "no GPU has been detected" message more descriptive | 2023-03-15 19:31:27 -03:00
oobabooga | c1959c26ee | Show/hide the extensions block using javascript | 2023-03-15 16:35:28 -03:00
oobabooga | 348596f634 | Fix broken extensions | 2023-03-15 15:11:16 -03:00
oobabooga | c5f14fb9b8 | Optimize the HTML generation speed | 2023-03-15 14:19:28 -03:00
oobabooga | bf812c4893 | Minor fix | 2023-03-15 14:05:35 -03:00
oobabooga | 05ee323ce5 | Rename a file | 2023-03-15 13:26:32 -03:00
oobabooga | d30a14087f | Further reorganize the UI | 2023-03-15 13:24:54 -03:00
oobabooga | cf2da86352 | Prevent *Is typing* from disappearing instantly while streaming | 2023-03-15 12:51:13 -03:00
oobabooga | ec972b85d1 | Move all css/js into separate files | 2023-03-15 12:35:11 -03:00