With the updated config.json, the model can handle a context window larger than 8192 tokens

#9
opened by Joseph717171

@mlabonne Test? After converting your model to GGUF with the updated config.json, it runs fine with a context window greater than 8K. I ran it with a 24k context window in LM Studio and was able to continue my last chat where I left off at 16k, and the model stayed coherent and kept conversing and interacting with me with no problems. πŸ€”
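
For context, here is a minimal sketch of the kind of config.json change being discussed. The field name `max_position_embeddings` is the standard context-length field in transformers configs, but the exact value shown is an assumption, not taken from this thread:

```json
{
  "max_position_embeddings": 32768
}
```

After editing the config, the model can be re-converted to GGUF (for example with llama.cpp's convert_hf_to_gguf.py script), and the runtime context length is then set in LM Studio when loading the model.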

Owner

Hero! πŸ‘πŸ‘πŸ‘ I will update my GGUFs
