oobabooga
f7ad634634
Remove --chat flag
2023-08-12 21:13:50 -07:00
oobabooga
949c92d7df
Create README.md
2023-08-10 14:32:40 -03:00
jllllll
28e3ce4317
Simplify GPTQ-for-LLaMa installation ( #122 )
2023-08-10 13:19:47 -03:00
oobabooga
fa4a948b38
Allow users to write one flag per line in CMD_FLAGS.txt
2023-08-09 01:58:23 -03:00
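The commit above only changes how CMD_FLAGS.txt is read (one flag per line). As a rough illustration, a minimal Python sketch of that parsing, assuming the file name CMD_FLAGS.txt and that lines beginning with "#" are comments (details not confirmed by the commit):

```python
# Hypothetical sketch: read launch flags from CMD_FLAGS.txt, one flag per line.
# Assumes "#" starts a comment line and blank lines are ignored.
from pathlib import Path

def read_cmd_flags(path="CMD_FLAGS.txt"):
    flags_file = Path(path)
    if not flags_file.exists():
        return ""
    lines = flags_file.read_text().splitlines()
    flags = [line.strip() for line in lines
             if line.strip() and not line.strip().startswith("#")]
    # Join the per-line flags back into a single command-line string.
    return " ".join(flags)

print(read_cmd_flags())
```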
jllllll
9e17325207
Add CMD_FLAGS.txt functionality to WSL installer ( #119 )
2023-08-05 10:26:24 -03:00
oobabooga
601fc424cd
Several improvements ( #117 )
2023-08-03 14:39:46 -03:00
jllllll
aca5679968
Properly fix broken gcc_linux-64 package ( #115 )
2023-08-02 23:39:07 -03:00
jllllll
ecd92d6a4e
Remove unused variable from ROCm GPTQ install ( #107 )
2023-07-26 22:16:36 -03:00
jllllll
1e3c950c7d
Add AMD GPU support for Linux ( #98 )
2023-07-26 17:33:02 -03:00
jllllll
52e3b91f5e
Fix broken gxx_linux-64 package. ( #106 )
2023-07-26 01:55:08 -03:00
oobabooga
cc2ed46d44
Make chat the default again
2023-07-20 18:55:09 -03:00
jllllll
fcb215fed5
Add check for compute support for GPTQ-for-LLaMa ( #104 )
...
Installs from the main CUDA repo if the fork is not supported.
Also removed the cuBLAS llama-cpp-python installation in preparation for 4b19b74e6c
2023-07-20 11:11:00 -03:00
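The #104 commit above gates which GPTQ-for-LLaMa source gets installed on GPU compute support. A minimal, hypothetical sketch of such a check, assuming torch is already installed, a 6.0 minimum compute capability, and qwopqwop200/GPTQ-for-LLaMa as the main-repo fallback (none of these specifics are confirmed by the commit):

```python
# Hypothetical sketch of a GPU compute-capability gate for GPTQ-for-LLaMa.
# The 6.0 threshold and the fallback repo URL are assumptions for illustration.
import torch

def fork_supported(min_capability=(6, 0)):
    if not torch.cuda.is_available():
        return False
    major, minor = torch.cuda.get_device_capability(0)
    return (major, minor) >= min_capability

repo = ("https://github.com/oobabooga/GPTQ-for-LLaMa" if fork_supported()
        else "https://github.com/qwopqwop200/GPTQ-for-LLaMa")
print("Installing GPTQ-for-LLaMa from:", repo)
```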
jllllll
4df3f72753
Fix GPTQ fail message not being shown on update ( #103 )
2023-07-19 22:25:09 -03:00
jllllll
11a8fd1eb9
Add cuBLAS llama-cpp-python wheel installation ( #102 )
...
Parses requirements.txt using a regex to determine the required version.
2023-07-16 01:31:33 -03:00
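The commit body above says the required llama-cpp-python version is pulled out of requirements.txt with a regex. A minimal sketch of that idea; the actual pattern used by the installer may differ:

```python
# Hypothetical sketch: extract the pinned llama-cpp-python version from
# requirements.txt with a regular expression.
import re

with open("requirements.txt") as f:
    match = re.search(r"llama[_-]cpp[_-]python==([0-9.]+)", f.read())

if match:
    version = match.group(1)
    print(f"Required llama-cpp-python version: {version}")
else:
    print("No pinned llama-cpp-python version found.")
```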
oobabooga
bb79037ebd
Fix wrong pytorch version on Linux+CPU
...
It was installing NVIDIA wheels.
2023-07-07 20:40:31 -03:00
oobabooga
564a8c507f
Don't launch chat mode by default
2023-07-07 13:32:11 -03:00
jllllll
eac8450ef7
Move special character check to start script ( #92 )
...
Also port print_big_message function to batch
2023-06-24 10:06:35 -03:00
jllllll
04cae3e5db
Remove bitsandbytes compatibility workaround ( #91 )
...
The new bitsandbytes version does not need it.
Commented out in case it is needed in the future.
2023-06-21 15:40:41 -03:00
jllllll
d1da22d7ee
Fix -y from previous commit ( #90 )
2023-06-20 22:48:59 -03:00
oobabooga
80a615c3ae
Add space
2023-06-20 22:48:45 -03:00
oobabooga
a2116e8b2b
Use uninstall -y
2023-06-20 21:24:01 -03:00
oobabooga
c0a1baa46e
Minor changes
2023-06-20 20:23:21 -03:00
jllllll
5cbc0b28f2
Workaround for Peft not updating their package version on the git repo ( #88 )
...
* Workaround for Peft not updating their git package version
* Update webui.py
---------
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2023-06-20 20:21:10 -03:00
jllllll
9bb2fc8cd7
Install Pytorch through pip instead of Conda ( #84 )
2023-06-20 16:39:23 -03:00
jllllll
b1d05cbbf6
Install exllama ( #83 )
...
* Install exllama
* Handle updating exllama
2023-06-17 19:10:36 -03:00
jllllll
657049d7d0
Fix cmd_macos.sh ( #82 )
...
The macOS version of Bash does not support process substitution.
2023-06-17 19:09:42 -03:00
jllllll
b2483e28d1
Check for special characters in path on Windows ( #81 )
...
Displays a warning message if any are detected.
2023-06-17 19:09:22 -03:00
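The #81 commit above adds a special-character check for the installation path on Windows. A rough Python sketch of the idea; the character set here is an illustrative assumption, not the installer's actual list:

```python
# Hypothetical sketch: warn if the installation path contains characters that
# commonly break batch/conda scripts (spaces, punctuation, etc.).
import os

SPECIAL_CHARS = set("!#$%&()*+,;<=>?@[]^`{|}~ ")

def warn_if_special_path(path=None):
    path = path or os.getcwd()
    found = sorted(set(path) & SPECIAL_CHARS)
    if found:
        print("WARNING: the installation path contains special characters:",
              "".join(found))
        print("This is known to break the installer. Consider moving the")
        print("installation to a path without spaces or special characters.")

warn_if_special_path()
```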
jllllll
c42f183d3f
Installer for WSL ( #78 )
2023-06-13 00:04:15 -03:00
oobabooga
53496ffa80
Create stale.yml
2023-06-05 17:15:31 -03:00
oobabooga
522b01d051
Grammar
2023-06-01 14:05:29 -03:00
oobabooga
5540335819
Better way to detect if a model has been downloaded
2023-06-01 14:01:19 -03:00
oobabooga
248ef32358
Print a big message for CPU users
2023-06-01 01:40:24 -03:00
oobabooga
290a3374e4
Don't download a model during installation
...
And some other updates/minor improvements
2023-06-01 01:30:21 -03:00
oobabooga
2e53caa806
Create LICENSE
2023-05-31 16:28:36 -03:00
Sam
dea1bf3d04
Parse g++ version instead of using string matching ( #72 )
2023-05-31 14:44:36 -03:00
gavin660
97bc7e3fb6
Adds functionality for users to set flags via an environment variable ( #59 )
2023-05-31 14:43:22 -03:00
Sam
5405635305
Install pre-compiled wheels for Linux ( #74 )
2023-05-31 14:41:54 -03:00
jllllll
be98e74337
Install older bitsandbytes on older gpus + fix llama-cpp-python issue ( #75 )
2023-05-31 14:41:03 -03:00
jllllll
b1b3bb6923
Improve environment isolation ( #68 )
2023-05-25 11:15:05 -03:00
oobabooga
c8ce2e777b
Add instructions for CPU mode users
2023-05-25 10:57:52 -03:00
oobabooga
996c49daa7
Remove bitsandbytes installation step
...
Following 548f05e106
2023-05-25 10:50:20 -03:00
jllllll
4ef2de3486
Fix dependencies downgrading from gptq install ( #61 )
2023-05-18 12:46:04 -03:00
oobabooga
07510a2414
Change a message
2023-05-18 10:58:37 -03:00
oobabooga
0bcd5b6894
Soothe anxious users
2023-05-18 10:56:49 -03:00
oobabooga
1309cdd257
Add a space
2023-05-10 18:03:12 -03:00
oobabooga
3e19733d35
Remove obsolete comment
2023-05-10 18:01:04 -03:00
oobabooga
4ab5deeea0
Update INSTRUCTIONS.TXT
2023-05-10 18:00:37 -03:00
oobabooga
d7d3f7f31c
Add a "CMD_FLAGS" variable
2023-05-10 17:54:12 -03:00
oobabooga
b8cfc20e58
Don't install superbooga by default
2023-05-09 14:17:08 -03:00
jllllll
29727c6502
Fix Windows PATH fix ( #57 )
2023-05-09 01:49:27 -03:00