oobabooga 8781c84287 Add support for latest cuda branch 2 years ago
GPTQ_loader.py 8781c84287 Add support for latest cuda branch 2 years ago
LoRA.py a21e580782 Move an import 2 years ago
RWKV.py 09b0a3aafb Add repetition_penalty 2 years ago
callbacks.py b246d17513 Fix `type object is not subscriptable` 2 years ago
chat.py ae1fe45bc0 One more cache reset 2 years ago
deepspeed_parameters.py f38c9bf428 Fix deepspeed (oops) 3 years ago
extensions.py 1edfb96778 Fix loading extensions from within the interface 2 years ago
html_generator.py 8ef89730a5 Try to better handle browser image cache 2 years ago
llamacpp_model.py 2c52310642 Add --threads flag for llama.cpp 2 years ago
models.py 4ab679480e allow quantized model to be loaded from model dir (#760) 2 years ago
shared.py 65d8a24a6d Show profile pictures in the Character tab 2 years ago
text_generation.py b0890a7925 Add shared.is_chat() function 2 years ago
training.py 2a267011dc Use Path.stem for simplicity 2 years ago
ui.py d30a14087f Further reorganize the UI 2 years ago