Mirror of https://github.com/ggerganov/llama.cpp.git (synced 2024-12-29 15:44:18 +01:00)
Commit 807b0c49ff
* llama : add inference support and model types for T5 and FLAN-T5 model families
* llama : add new API functions to support encoder-decoder models: llama_encode(), llama_model_has_encoder(), llama_model_decoder_start_token()
* common, llama-cli, llama-batched : add support for encoder-decoder models
* convert-hf : handle shared token embeddings tensors in T5Model
* convert-hf : add support for SentencePiece BPE tokenizer in T5Model (for Pile-T5 models)
* convert-hf : add MT5ForConditionalGeneration and UMT5ForConditionalGeneration to architectures supported by T5Model
* convert : add t5 tokenizer tests, use "slow" HF tokenizer for t5

Co-authored-by: Stanisław Szymczyk <sszymczy@gmail.com>
Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
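The API functions named above (llama_encode(), llama_model_has_encoder(), llama_model_decoder_start_token()) are the new public entry points for encoder-decoder models. What follows is a rough, non-authoritative sketch of how a caller might drive them for a T5-style model, loosely following the llama-cli flow; batch construction, sampling, and error reporting are elided, and the fall-back to the BOS token is an assumption rather than something documented on this page.

// Sketch only: drive an encoder-decoder (T5-style) model with the new API.
// Assumes llama.h from a build that contains this change; the BOS fallback
// for a missing decoder start token is an assumption.
#include "llama.h"

bool eval_prompt(llama_context * ctx, const llama_model * model, llama_batch prompt_batch) {
    if (llama_model_has_encoder(model)) {
        // encoder-decoder path: run the encoder over the whole prompt first
        if (llama_encode(ctx, prompt_batch) != 0) {
            return false;
        }
        // generation then starts from the model's dedicated decoder start token
        llama_token dec_start = llama_model_decoder_start_token(model);
        if (dec_start == -1) {
            dec_start = llama_token_bos(model); // assumed fallback when none is defined
        }
        // ... feed a one-token batch containing dec_start to llama_decode() and
        //     continue sampling as usual (omitted here)
        return true;
    }
    // decoder-only path: the prompt goes straight to llama_decode(), unchanged
    return llama_decode(ctx, prompt_batch) == 0;
}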
47 lines · 2.6 KiB · Plaintext
474 287 29871 29946 29871 30226 7378
383 4000 261
259
1678
268
29871 12
29871 13
29871 13 13
29871 13 13 13
29871 12 13
15043 3186
29871 15043 3186
15043 2787
29871 15043 2787
29871 15043 2787 29991
15043 29892 3186 29991
29871 15043 29892 3186 29991
29871 445 338 29871 243 162 169 156 29889 8223
281 29900 29946 29947 29871 29955 9161 13535 18031 2176 6905
1538 4851 665 1386 29713 1305
29871 31849 31324 31934 228 162 142 228 161 146 228 162 133 228 161 153 228 161 186 31708 228 162 132 31708 228 161 165 31324 228 161 136 228 161 132 228 161 158 228 161 136 228 162 132 228 161 140
29871 243 162 157 131 313 8945 29897 29871 243 162 155 185 30722 243 162 143 174 30598 313 20787 953 3848 275 16125 630 29897 29871 31681 313 6194 953 29877 2397 393 756 967 1914 5993 29897
15043
29871 15043
259 15043
1678 15043
268 15043
268 15043 13 1678 15043
29871 313
29871 13 353
525 3152
15043 29892 343 29915 497 29991 1128 526 366 29871 243 162 155 132 1577 30672 31522 30505 11548 31041 30732 29896 29941 29896 29946 29896 29945 29896 30408 30739
1738 6824 21004
29871 29941
29871 29941 29941
29871 29941 29941 29941
29871 29941 29941 29941 29941
29871 29941 29941 29941 29941 29941
29871 29941 29941 29941 29941 29941 29941
29871 29941 29941 29941 29941 29941 29941 29941
29871 29941 29941 29941 29941 29941 29941 29941 29941
29871 29941 29941 29941 29941 29941 29941 29941 29941 29941
315 228 190 176 29874 10630 30529 29873
29871 2313 3163
29871 13 29871 13 13 29871 13 13 13 29871 12 29871 12 12 29871 12 13 259 13 1678 13 268 13 418 13 243 162 157 131 313 8945 29897 29871 243 162 155 185 30722 243 162 143 174 30598 313 20787 953 3848 275 16125 630 29897 29871 31681 29871 243 162 169 156 243 162 169 156 29871 29941 29871 29941 29941 29871 29941 29941 29941 29871 29941 29941 29941 29941 29871 29941 29941 29941 29941 29941 29871 29941 29941 29941 29941 29941 29941 29871 29941 29941 29941 29941 29941 29941 29941 29871 29941 29941 29941 29941 29941 29941 29941 29941 29871 29941 29889 29941 29871 29941 636 29941 29871 29941 856 29941 29871 31849 31324 31934 228 162 142 228 161 146 228 162 133 228 161 153 228 161 186 31708 228 162 132 31708 228 161 165 31324 228 161 136 243 162 155 132 1577 30672 31522 30505 11548 31041 30732 29896 29941 29896 29946 29896 29945 29896 30408 30739 448 23648 2751 25512 1538 4851 665 1386 29713 1305 14550 4907 11120 16159 16159 16159 15945 15945 3045 636 6824 6824 6824 8773 8773 8773 306 29915 345 1063 525 29873 1025 540 29915 29879 727 29892 525 1525 366 1854 29973 525 29924 451 1854 306 29915 645 1207 372 29892 525 29928 366 763 777 23429 29973 1334 29915 29963 29872 263 29915 29880 29931
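The file body above is one line of space-separated token IDs per tokenizer test prompt, in the style of llama.cpp's ggml-vocab-*.gguf.out fixtures referenced by the tokenizer tests mentioned in the commit message. Below is a minimal sketch of how one such line could be checked against the tokenizer, assuming only the llama_tokenize() C API; the helper names, buffer sizing, and flag choices are illustrative, not code from the repository.

// Sketch: compare llama_tokenize() output against one fixture line of
// space-separated token IDs. Only llama_tokenize() is the real C API call;
// everything else here is an illustrative assumption.
#include "llama.h"

#include <sstream>
#include <string>
#include <vector>

static std::vector<llama_token> tokenize_prompt(const llama_model * model, const std::string & text) {
    // generous upper bound: at most one token per byte, plus a little headroom
    std::vector<llama_token> tokens(text.size() + 8);
    const int n = llama_tokenize(model, text.c_str(), (int) text.size(),
                                 tokens.data(), (int) tokens.size(),
                                 /*add_special=*/false,   // the lines above carry no BOS token
                                 /*parse_special=*/false);
    tokens.resize(n > 0 ? n : 0);
    return tokens;
}

static std::vector<llama_token> parse_fixture_line(const std::string & line) {
    // each fixture line is nothing more than space-separated token IDs
    std::vector<llama_token> ids;
    std::istringstream ss(line);
    for (llama_token id; ss >> id; ) {
        ids.push_back(id);
    }
    return ids;
}

bool check_line(const llama_model * model, const std::string & prompt, const std::string & fixture_line) {
    return tokenize_prompt(model, prompt) == parse_fixture_line(fixture_line);
}

Under the LLaMA SPM vocabulary, for instance, the line "15043 3186" above appears to correspond to a prompt such as "Hello world".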