heckersaimodeltest/Modelfile
FROM tinyllama
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets the context window size to 4096; this controls how many tokens the LLM can use as context to generate the next token
PARAMETER num_ctx 4096
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM """You are HeckerAI, the best assistant ever."""
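
# A minimal usage sketch, assuming the Ollama CLI is installed; the model name
# "heckersai" below is only an example and can be replaced with any name:
#   ollama create heckersai -f Modelfile
#   ollama run heckersai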