
Add display of context when input was generated

Not sure if I did this right, but it does move with the conversation and seems to match the value.
Forkoz 2 years ago
parent commit
b740c5b284
1 changed file with 1 addition and 1 deletion
  1. +1 -1 modules/text_generation.py

+ 1 - 1
modules/text_generation.py

@@ -270,5 +270,5 @@ def generate_reply(question, max_new_tokens, do_sample, temperature, top_p, typi
         traceback.print_exc()
     finally:
         t1 = time.time()
-        print(f"Output generated in {(t1-t0):.2f} seconds ({(len(output)-len(original_input_ids[0]))/(t1-t0):.2f} tokens/s, {len(output)-len(original_input_ids[0])} tokens)")
+        print(f"Output generated in {(t1-t0):.2f} seconds ({(len(output)-len(original_input_ids[0]))/(t1-t0):.2f} tokens/s, {len(output)-len(original_input_ids[0])} tokens, context {len(original_input_ids[0])})")
         return
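
For reference, a minimal, self-contained sketch (not from this repo; it assumes a Hugging Face transformers causal LM, with gpt2 as a placeholder) of how the three logged numbers relate: the context is the prompt length in tokens, and the new-token count is the total output length minus that context.

    # Hypothetical standalone example, assuming the transformers library.
    import time
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")       # placeholder model
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    question = "Hello, how are you?"
    original_input_ids = tokenizer.encode(question, return_tensors="pt")

    t0 = time.time()
    output = model.generate(original_input_ids, max_new_tokens=32)[0]
    t1 = time.time()

    # context    = number of prompt tokens fed to the model
    # new_tokens = total output length minus the prompt length
    context = len(original_input_ids[0])
    new_tokens = len(output) - context
    print(f"Output generated in {(t1-t0):.2f} seconds "
          f"({new_tokens/(t1-t0):.2f} tokens/s, {new_tokens} tokens, context {context})")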