Here is how to install a chatbot on a Linux desktop. 16 GB are required.
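Before starting, it is worth checking that the machine actually has the memory and free disk space the page calls for. A minimal check using standard tools (nothing here is specific to this setup):
<pre>
# Available RAM (the note above states 16 GB are needed)
free -h
# Free disk space in the home directory (the model download alone is several GB)
df -h ~
</pre>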
==chatbot==
<pre>
cd
mkdir chatbot
cd chatbot
# Download the quantized Vicuna 13B model (GGML format, several GB)
wget https://huggingface.co/Pi3141/vicuna-13b-v1.1-ggml/resolve/main/ggml-model-q4_2.bin
# Download and unpack the Alpaca Electron desktop client
wget https://github.com/ItsPi3141/alpaca-electron/releases/download/v1.0.5/Alpaca-Electron-linux-x64-v1.0.5.tar.gz
tar -xvf Alpaca-Electron-linux-x64-v1.0.5.tar.gz
# Launch the application (the directory and binary names contain spaces, hence the quotes)
cd "release-builds/Alpaca Electron-linux-x64"
./"Alpaca Electron"
</pre>
* Load the ggml-model-q4_2.bin binary (a quick check of the download is sketched below).
* That's it!
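If the application refuses to load the model, the most likely cause is an incomplete download. A minimal check, assuming the file was saved to ~/chatbot as in the commands above:
<pre>
# The full ggml-model-q4_2.bin is several GB; a much smaller file means the download was cut short
ls -lh ~/chatbot/ggml-model-q4_2.bin
# Re-running wget with -c resumes an interrupted download instead of starting over
wget -c https://huggingface.co/Pi3141/vicuna-13b-v1.1-ggml/resolve/main/ggml-model-q4_2.bin -P ~/chatbot
</pre>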
===old=== | |||
#https://www.youtube.com/watch?v=KopKQDmGk_o | |||
#https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/blob/main/ggml-model-q8_0.bin | |||
#https://github.com/ItsPi3141/alpaca-electron/releases | |||
#wget https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/resolve/main/ggml-model-q8_0.bin | |||
#wget https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/resolve/main/ggml-model-q4_1.bin | |||
https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/resolve/main/consolidated.00.pth | |||
https://huggingface.co/Pi3141/vicuna-13b-v1.1-ggml/resolve/main/ggml-model-q4_2.bin | |||
https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/resolve/main/ggml-model-q4_3.bin | |||
https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/resolve/main/ggml-model-q4_1.bin
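Several of the links above are alternative GGML quantizations of alpaca-7b-native-enhanced; in principle any of these .bin files can be loaded the same way, provided Alpaca Electron accepts that quantization format. A sketch, reusing the q4_1 link from the list above as an example only:
<pre>
cd ~/chatbot
# Download an alternative model (example: the q4_1 quantization listed above)
wget https://huggingface.co/Pi3141/alpaca-7b-native-enhanced/resolve/main/ggml-model-q4_1.bin
# Restart Alpaca Electron and point it at this file instead of the Vicuna one
</pre>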