Commit History

Author  SHA1  Message  Date
oobabooga 145c725c39 Bump RWKV version 2 years ago
oobabooga 2af66a4d4c Fix <USER> in pygmalion replies 2 years ago
oobabooga a54b91af77 Improve readability 2 years ago
oobabooga 8e706df20e Fix a memory leak when text streaming is on 2 years ago
oobabooga 5492e2e9f8 Add sentencepiece 2 years ago
oobabooga 90206204aa Merge pull request #163 from oobabooga/hf_llama 2 years ago
oobabooga c33715ad5b Move towards HF LLaMA implementation 2 years ago
oobabooga bd8aac8fa4 Add LLaMA 8-bit support 2 years ago
oobabooga c93f1fa99b Count the tokens more conservatively 2 years ago
oobabooga 736f61610b Update README 2 years ago
oobabooga ed8b35efd2 Add --pin-weight parameter for FlexGen 2 years ago
oobabooga 05e703b4a4 Print the performance information more reliably 2 years ago
oobabooga 5a79863df3 Increase the sequence length, decrease batch size 2 years ago
oobabooga e62b9b1074 Revamp the "Default" preset with HF defaults 2 years ago
oobabooga a345a2acd2 Add a tokenizer placeholder 2 years ago
oobabooga 4cc36dc434 Tweak the Naive preset (for LLaMA/RWKV) 2 years ago
oobabooga 5b354817f6 Make chat minimally work with LLaMA 2 years ago
oobabooga ea5c5eb3da Add LLaMA support 2 years ago
oobabooga 2bff646130 Stop chat from flashing dark when processing 2 years ago
oobabooga 7c70e0e2a6 Fix the download script (sort of) 2 years ago
oobabooga bcea196c9d Bump flexgen version 2 years ago
oobabooga 76378c6cc2 Update README 2 years ago
oobabooga 169209805d Model-aware prompts and presets 2 years ago
oobabooga 024d30d1b4 Reorder imports 2 years ago
oobabooga 7bbe32f618 Don't return a value in an iterator function 2 years ago
oobabooga ff9f649c0c Remove some unused imports 2 years ago
oobabooga 1a05860ca3 Ensure proper no-streaming with generation_attempts > 1 2 years ago
oobabooga a2a3e8f797 Add --rwkv-strategy parameter 2 years ago
oobabooga 99dc95e14e Minor aesthetic change 2 years ago
oobabooga 449116a510 Fix RWKV paths on Windows (attempt) 2 years ago