NHacker Next
Show HN: 1-Bit Bonsai, the First Commercially Viable 1-Bit LLMs (prismml.com)
alyxya 5 minutes ago [-]
I expect large machine learning models to trend toward operating on bits rather than floats. Floats are inefficient here: weights are typically close to normally distributed, so most values cluster in a small range and much of a float's dynamic range is wasted, both in storage and in computation. Neural networks may be rooted in real-valued functions, which we simulate with floats, but float operations are just bitwise operations underneath. The main obstacles are that GPUs are built to operate on floats and that standard ML theory works over the reals.
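A minimal sketch of the idea (not Bonsai's actual method, just the classic binary-weight scheme): replace each float weight with its sign and keep a single shared float scale, so most of the per-weight storage collapses to one bit.

```python
# Hypothetical illustration: binarize weights to {-1, +1} with one shared
# scale (the mean absolute value), as in binary weight networks.
def binarize(weights):
    """Return sign bits plus a single fp scale approximating the tensor."""
    scale = sum(abs(w) for w in weights) / len(weights)
    signs = [1 if w >= 0 else -1 for w in weights]
    return signs, scale

def dequantize(signs, scale):
    """Reconstruct an approximation of the original weights."""
    return [s * scale for s in signs]

w = [0.4, -0.2, 0.1, -0.3]
signs, scale = binarize(w)          # signs = [1, -1, 1, -1], scale = 0.25
approx = dequantize(signs, scale)   # [0.25, -0.25, 0.25, -0.25]
```

Each weight now costs one bit plus an amortized share of the scale, at the price of the quantization error visible in `approx`.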
Archit3ch 1 minutes ago [-]
Doesn't Jevons paradox dictate larger 1-bit models?
OutOfHere 2 minutes ago [-]
How do I run this on Android?
syntaxing 24 minutes ago [-]
Super interesting; building their llama.cpp fork on my Jetson Orin Nano to test this out.
yodon 45 minutes ago [-]
Is Bonsai 1 Bit or 1.58 Bit?
woadwarrior01 43 minutes ago [-]
1-bit g128 with a shared 16-bit scale for every group. So, effectively 1.125 bit.
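The 1.125-bit figure follows directly from the group layout described above (assuming one fp16 scale per group of 128 one-bit weights):

```python
# Storage cost per weight for 1-bit weights with group-wise fp16 scales.
group_size = 128      # weights per group ("g128")
weight_bits = 1       # one sign bit per weight
scale_bits = 16       # one shared fp16 scale per group

bits_per_weight = (group_size * weight_bits + scale_bits) / group_size
print(bits_per_weight)  # 1.125
```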
stogot 36 minutes ago [-]
What is the value of 1 bit? For those of us who do not know.
jacquesm 32 minutes ago [-]
That you can process many operations with a single instruction.
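To make this concrete (a hedged sketch, not necessarily how Bonsai's kernels work): with {-1, +1} weights packed into machine words, a whole dot product reduces to one XOR plus one popcount, so a single instruction covers as many weights as the word has bits.

```python
# Dot product of two n-element {-1, +1} vectors, each packed into an
# n-bit integer with bit value 1 encoding +1 and 0 encoding -1.
# Matching bits contribute +1, differing bits -1, so:
#   dot = n - 2 * popcount(a XOR b)
def packed_dot(a, b, n):
    return n - 2 * bin(a ^ b).count("1")

print(packed_dot(0b1010, 0b1010, 4))  # 4  (identical vectors)
print(packed_dot(0b1010, 0b0101, 4))  # -4 (opposite vectors)
```

On hardware this is one XOR and one popcount per 64 weights, versus 64 multiply-adds for float weights, which is where the speed (and, per the sibling comment, density) comes from.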
SwellJoe 31 minutes ago [-]
0 or 1
trebligdivad 33 minutes ago [-]
Speed and density.