The only issue I see is that it hallucinates a lot when you ask it for factual knowledge. Which makes sense, because 8B parameters just isn't much room to store detailed information. But reciting training knowledge is really a misuse of LLMs anyway, more a peculiar side effect than a feature. I combine it with Google searches (through OpenWebUI and SearXNG) and then it works amazingly well.
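For anyone curious about the pattern, here's a minimal sketch of the search-augmented idea: query SearXNG's JSON API and inline the top snippets into the prompt so the model answers from retrieved text instead of its weights. The instance URL, helper names, and prompt wording are my own illustrations, not how OpenWebUI actually wires it up internally.

```python
import urllib.parse

# Assumed local SearXNG instance; format=json must be enabled in its settings
SEARXNG_URL = "http://localhost:8080/search"

def build_search_url(query: str) -> str:
    # SearXNG returns JSON results when asked with format=json
    params = urllib.parse.urlencode({"q": query, "format": "json"})
    return f"{SEARXNG_URL}?{params}"

def build_prompt(query: str, results: list, max_results: int = 3) -> str:
    # Inline the top result snippets so the model grounds its answer
    # in retrieved text rather than reciting (possibly hallucinated) facts
    snippets = "\n".join(
        f"- {r.get('title', '')}: {r.get('content', '')}"
        for r in results[:max_results]
    )
    return (
        "Answer using only the sources below.\n\n"
        f"Sources:\n{snippets}\n\n"
        f"Question: {query}"
    )
```

Fetching the URL (e.g. with `urllib.request.urlopen`) and passing `build_prompt(...)` to the model is the rest of the loop; OpenWebUI does roughly this for you when you point it at a SearXNG endpoint.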