
Detect "vicuna" as llama model type (#772)

OWKenobi, 2 years ago
commit ee4547cd34
1 file changed, 1 insertion(+), 1 deletion(-)
modules/GPTQ_loader.py  +1 −1

@@ -52,7 +52,7 @@ def load_quantized(model_name):
     if not shared.args.model_type:
         # Try to determine model type from model name
         name = model_name.lower()
-        if any((k in name for k in ['llama', 'alpaca'])):
+        if any((k in name for k in ['llama', 'alpaca', 'vicuna'])):
             model_type = 'llama'
         elif any((k in name for k in ['opt-', 'galactica'])):
             model_type = 'opt'
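
For reference, a minimal self-contained sketch of the name-based detection this one-line change extends. The guess_model_type helper and the None fallback are illustrative assumptions; only the keyword lists come from the diff above.

    def guess_model_type(model_name):
        # Illustrative helper (not in the repo): infer the GPTQ model type
        # from the checkpoint name, mirroring the check in load_quantized().
        name = model_name.lower()
        if any(k in name for k in ['llama', 'alpaca', 'vicuna']):
            return 'llama'
        elif any(k in name for k in ['opt-', 'galactica']):
            return 'opt'
        return None  # unknown; the user would need to pass --model_type explicitly

    # Example: a vicuna checkpoint now resolves to the llama loader.
    print(guess_model_type('vicuna-13b-4bit-128g'))  # -> llama
    print(guess_model_type('opt-6.7b'))              # -> opt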