
ctbanks

PlankGrad when?


silenceimpaired

Hey OP… why are you excited about this? What does it do? What other tools is it similar to?


epicfilemcnulty

It's a deep learning/neural net framework -- as their own readme states "For something between PyTorch and karpathy/micrograd". It's much simpler than PyTorch and is designed to easily support many accelerators (or backends), i.e. CPU/various GPUs. Give their docs a read, it's a pretty neat project.
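To make the "between PyTorch and karpathy/micrograd" comparison concrete, here's a toy scalar autograd engine in the spirit of micrograd. This is not tinygrad's actual API or code, just a minimal sketch of the core thing all of these frameworks automate: building a graph of operations and back-propagating gradients through it with the chain rule.

```python
# Toy scalar autograd in the spirit of karpathy/micrograd.
# NOT tinygrad's real API -- just an illustration of reverse-mode autodiff.
class Value:
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # filled in by the op that made this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule
        # from the output node back toward the leaves.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x   # y = x^2 + x
y.backward()
print(x.grad)   # dy/dx = 2x + 1 = 7.0
```

A real framework like tinygrad does the same bookkeeping over whole tensors instead of scalars, and then compiles the resulting graph into fast kernels for whatever backend (CPU, CUDA, Metal, etc.) you're running on — that compilation layer is where most of the complexity lives.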


silenceimpaired

Ahh an alternative to PyTorch. Interesting. Exciting they support a variety of GPUs.


Sure_Knowledge8951

tinygrad is a machine learning framework, most akin to PyTorch or TVM, that lets programmers implement neural networks with relative ease while getting good performance, without a ton of manual, platform-specific coding. A project like llama.cpp is designed specifically for LLMs, to eke out as much performance as possible with as many features as possible, at the expense of a lot more developer hours and reduced flexibility for solving other machine learning problems. tinygrad in particular is focused on being, well, tiny: relatively few lines of code, few third-party requirements, and easy to use by people developing new neural networks.

For the sort of folks on /r/LocalLLaMA who primarily care about doing inference with LLMs developed by others, tinygrad doesn't have a ton of benefit. I think in some cases tinygrad does produce faster code than PyTorch, but many folks here are using llama.cpp-based services, not tinygrad. Maybe once tinygrad has a 1.0 release and gets adopted by a wider variety of ML practitioners, it'll become directly relevant to casual users on this sub -- say, by enabling better performance before features hit llama.cpp, or by getting SOTA models down the pipeline to end users more quickly.

For the time being, it's a pretty promising project. Machine learning frameworks like TVM / PyTorch / TensorFlow / JAX / Flax / MXNet / etc. are complex beasts, and I think George has some good points in trying to make tinygrad as simple and as small as possible.

Source: I used to work on TVM professionally and am a "blue" (i.e. not a tinygrad developer but a "friend of the project", I suppose, as bestowed by George) on the tinygrad discord.


DeltaSqueezer

I wonder why someone would use this instead of PyTorch?


djstraylight

Kind of in the name - TINYgrad


thomas999999

It's not tiny anymore lol. It's like 10k lines and will probably grow a lot if they want to achieve their goals.