model not working as intended

#14
by Daemontatox - opened

What's the best way to run inference with this model? Whenever I use it I get gibberish, like this:

[screenshot: Screenshot_20240908_183617_Chrome.jpg, showing garbled output]

Anything besides the examples produces gibberish.

Daemontatox changed discussion title from Inference Requirements to model not working as intended

If your message is too short, you can get weird results.
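For reference, here is a minimal inference sketch with `transformers`. It assumes the checkpoint is `tiiuae/falcon-mamba-7b-instruct` (swap in the exact repo id from this model page) and that the tokenizer ships a chat template; prompting an instruct checkpoint without its chat template is a common cause of gibberish output.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed checkpoint id; replace with the repo id from this model page.
MODEL_ID = "tiiuae/falcon-mamba-7b-instruct"

def generate_reply(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a reply, wrapping the prompt in the model's chat template."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Format as a chat turn instead of passing the raw string; instruct
    # models are trained on this template and may degenerate without it.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(
        input_ids,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )
    # Strip the prompt tokens and decode only the new completion.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

Call it as `generate_reply("What is a state-space model?")`; sampling parameters here are illustrative defaults, not values recommended by the model authors.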

Sadly, it's not working. I tried everything from prompting to fine-tuning; I even tried modifying the example prompts in the demo. Still not working. I had high hopes for a Mamba LLM.

This is what I got from the demo:

[screenshot: image.png]

What type of task are you trying?
