Source: Hacker News
Article note: I've been suggesting that most of the shitty floating point formats could be replaced with {-1,0,1,NaN} as a joke about the terribleness of shitty number representations, and the general sloppiness of the current "AI" hype cycle... it kind of looks like it's not actually a joke: you _can_ replicate a lot of the "AI" models with only {-1,0,1}. And since that definitionally doesn't involve any multiply hardware, it's _ridiculously_ more power efficient.
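To see why ternary weights need no multiplier: when every weight is in {-1, 0, 1}, each "multiply" in a matrix-vector product collapses to an add, a subtract, or a skip. A minimal sketch (the function name and values here are illustrative, not from any particular model):

```python
def ternary_matvec(weights, x):
    """Matrix-vector product with weights drawn from {-1, 0, 1}.

    No multiplication is performed: +1 adds the input, -1 subtracts it,
    and 0 skips it entirely -- which is why this is cheap in hardware.
    """
    out = []
    for row in weights:
        acc = 0.0
        for w, xi in zip(row, x):
            if w == 1:
                acc += xi      # add
            elif w == -1:
                acc -= xi      # subtract
            # w == 0: contributes nothing, skip
        out.append(acc)
    return out

W = [[1, 0, -1],
     [-1, 1, 0]]
x = [2.0, 3.0, 4.0]
print(ternary_matvec(W, x))  # [-2.0, 1.0]
```

The zero case is also where the power savings compound: a sparse ternary matrix skips most of its work outright, rather than multiplying by zero.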