Commit History

Author SHA1 Message Date
  oobabooga c7aa51faa6 Use a list of eos_tokens instead of just a number 2 years ago
  Xan b3e10e47c0 Fix merge conflict in text_generation 2 years ago
  oobabooga 341e135036 Various fixes in chat mode 2 years ago
  oobabooga b0e8cb8c88 Various fixes in chat mode 2 years ago
  oobabooga 0bd5430988 Use 'with' statement to better handle streaming memory 2 years ago
  oobabooga 37f0166b2d Fix memory leak in new streaming (second attempt) 2 years ago
  oobabooga 59b5f7a4b7 Improve usage of stopping_criteria 2 years ago
  oobabooga add9330e5e Bug fixes 2 years ago
  Xan 5648a41a27 Merge branch 'main' of https://github.com/xanthousm/text-generation-webui 2 years ago
  Xan ad6b699503 Better TTS with autoplay 2 years ago
  oobabooga 33fb6aed74 Minor bug fix 2 years ago
  oobabooga ad2970374a Readability improvements 2 years ago
  oobabooga 72d539dbff Better separate the FlexGen case 2 years ago
  oobabooga ab50f80542 New text streaming method (much faster) 2 years ago
  oobabooga 8e89bc596b Fix encode() for RWKV 2 years ago
  oobabooga 19a34941ed Add proper streaming to RWKV 2 years ago
  oobabooga 8660227e1b Add top_k to RWKV 2 years ago
  oobabooga 20bd645f6a Fix bug in multigpu setups (attempt 3) 2 years ago
  oobabooga 09a7c36e1b Minor improvement while running custom models 2 years ago
  oobabooga 24c4c20391 Fix bug in multigpu setups (attempt #2) 2 years ago
  oobabooga d88b7836c6 Fix bug in multigpu setups 2 years ago
  oobabooga e91f4bc25a Add RWKV tokenizer 2 years ago
  oobabooga a54b91af77 Improve readability 2 years ago
  oobabooga 8e706df20e Fix a memory leak when text streaming is on 2 years ago
  oobabooga c33715ad5b Move towards HF LLaMA implementation 2 years ago
  oobabooga c93f1fa99b Count the tokens more conservatively 2 years ago
  oobabooga 05e703b4a4 Print the performance information more reliably 2 years ago
  oobabooga a345a2acd2 Add a tokenizer placeholder 2 years ago
  oobabooga 5b354817f6 Make chat minimally work with LLaMA 2 years ago
  oobabooga ea5c5eb3da Add LLaMA support 2 years ago