1. Show HN: Project AELLA – Open LLMs for structuring 100M research papers (aella.inference.net) | 6 points by funfunfunction 4 months ago | 2 comments
2. Hybrid-Attention models are the future for SLMs (inference.net) | 4 points by funfunfunction 4 months ago | 0 comments
3. Show HN: Using LLMs and >1k 4090s to visualize 100k scientific research articles (twitter.com) | 5 points by funfunfunction 5 months ago | 2 comments
4. Viral GPT wrappers are now training their own LLMs (twitter.com) | 8 points by funfunfunction 5 months ago | 0 comments
5. UWU – generate CLI commands without leaving the terminal (github.com) | 16 points by funfunfunction 7 months ago | 2 comments
6. Show HN: UwU – Generate CLI commands inline with GPT-5 (github.com) | 3 points by funfunfunction 7 months ago | 0 comments
7. How much energy does it take to produce an LLM token? (energy.inference.net) | 3 points by funfunfunction 8 months ago | 0 comments
8. When to use model distillation in production (inference.net) | 1 point by funfunfunction 8 months ago | 0 comments
9. Show HN: Batch inference for large-scale synthetic data generation (inference.net) | 2 points by funfunfunction 1 year ago | 0 comments
13. VC-backed AI startups are struggling. Indie devs are not (twitter.com) | 2 points by funfunfunction 2 years ago | 0 comments
14. Show HN: Chatbots for Technical Documentation (usecontext.io) | 2 points by funfunfunction 2 years ago | 1 comment
15. BabyAGI-ts: An NPM module to easily install, run, and play with BabyAGI locally (github.com) | 2 points by funfunfunction 2 years ago | 0 comments