Rewriting PyTorch Badly... on Purpose

Error, Error on the wall...

A few months ago, right after I wrapped up xsNumPy, I set out on a journey. Not to build the fastest model, nor the flashiest architecture. No, this journey was different... it was about going slow, on purpose.

I called the project SlowTorch.

At first glance, the name might sound like an ironic jab at PyTorch's efficiency (it isn't), or perhaps a cheeky play on slow learning (which it is). But beneath the pun lies something deeply personal: a return to the fundamentals, a deliberate re-immersion into the building blocks of neural networks, and the mindset required to truly understand them.

Motivation behind SlowTorch

In a field like AI, which moves at the Flash's speed, where models are deployed before their papers finish peer review and tutorials skip straight to “just import this”, it's easy to feel like you're constantly racing just to stay relevant. I've felt that too, even as someone who's worked on systems like GitHub Copilot and contributed to projects at OpenAI.

But one thing is true no matter what... understanding doesn’t come from speed. It comes from slowing down.

It's been a while since I last wrote anything on LinkedIn, and things have changed: I'm a teacher now 😎 (yay!). So SlowTorch began as a teaching tool, but it quickly became a learning tool for me. As you might've figured from the name, it's a cheeky re-implementation of PyTorch from scratch (of the more important constructs, at least), one piece at a time. No magic, no abstraction walls. Just tensors, gradients, and a lot of curiosity.
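
To give a flavour of what "no abstraction walls" means in practice, here's a minimal sketch, in plain Python, of the kind of building block a project like this reconstructs: a value that remembers the operations applied to it so gradients can flow back through them. The class and method names here are illustrative, not SlowTorch's actual API.

    class Scalar:
        """A toy autograd value: holds data, a gradient, and a local backward rule."""

        def __init__(self, data, parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = parents
            self._backward = lambda: None  # set by whichever op created this value

        def __add__(self, other):
            out = Scalar(self.data + other.data, (self, other))

            def _backward():
                self.grad += out.grad   # d(a + b)/da = 1
                other.grad += out.grad  # d(a + b)/db = 1

            out._backward = _backward
            return out

        def __mul__(self, other):
            out = Scalar(self.data * other.data, (self, other))

            def _backward():
                self.grad += other.data * out.grad  # d(a * b)/da = b
                other.grad += self.data * out.grad  # d(a * b)/db = a

            out._backward = _backward
            return out

        def backward(self):
            # Order the graph topologically, then apply each local rule in reverse.
            order, seen = [], set()

            def visit(node):
                if node not in seen:
                    seen.add(node)
                    for parent in node._parents:
                        visit(parent)
                    order.append(node)

            visit(self)
            self.grad = 1.0
            for node in reversed(order):
                node._backward()

    x, w = Scalar(3.0), Scalar(2.0)
    y = x * w + x  # y = 3 * 2 + 3 = 9
    y.backward()
    print(x.grad, w.grad)  # dy/dx = w + 1 = 3.0, dy/dw = x = 3.0

PyTorch's real autograd does the same bookkeeping over tensors and a far richer set of operations, but the chain-rule plumbing has exactly this shape.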

Teaching Myself (and Others)

As a professor, mentor, and open-source contributor, I often guide others through ML systems. But with SlowTorch, I’ve flipped the script... I’m the student again.

And it’s humbling.

Documenting this journey has reminded me of the power of pedagogy — not just teaching what we do in machine learning, but why we do it. The plan was to write a slow-paced blog series, much in the spirit of xsNumPy, exploring each primitive in a way that demystifies it. If a student can say, “Oh, I see how the gradients are computed from scratch now,” then SlowTorch will have done its job.

With SlowTorch, I was rebuilding models by hand: not just using PyTorch's modules, but recreating them. Linear layers, activation functions, loss calculations, backpropagation… each line of code felt like a rediscovery.

Some highlights of what I've learned (or rather, remembered deeply), with small sketches after this list:

  • Why weight initialisation really matters — and not just in terms of convergence. Trust me, it's not truly random...
  • How gradients flow (or don’t) through non-linearities.
  • The beauty of writing your own .backward() pass — and the clarity it brings.
  • What truly happens inside nn.Module when you hit .forward().
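
On the first point, the "not truly random" remark is about variance scaling. Here's a hedged sketch (NumPy only, illustrative layer width and depth, not SlowTorch code) of how weights drawn naively from N(0, 1) blow up activations across stacked layers, while fan-in scaling in the spirit of Kaiming initialisation keeps them stable:

    import numpy as np

    rng = np.random.default_rng(0)
    fan_in, depth = 512, 10  # illustrative width and number of linear+ReLU layers
    x = rng.standard_normal((64, fan_in))  # a dummy batch of activations

    for name, scale in [("naive N(0, 1)", 1.0), ("Kaiming", np.sqrt(2.0 / fan_in))]:
        h = x
        for _ in range(depth):
            w = rng.standard_normal((fan_in, fan_in)) * scale
            h = np.maximum(h @ w, 0.0)  # linear layer followed by ReLU
        # Naive weights grow the std by roughly sqrt(fan_in / 2) per layer;
        # Kaiming scaling keeps it near 1 all the way down.
        print(f"{name}: std after layer {depth} = {h.std():.3g}")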
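On gradient flow through non-linearities: ReLU, for instance, passes gradients only where its input was positive, so a neuron stuck in the negative region gets a zero gradient and stops learning. PyTorch itself shows this in four lines:

    import torch

    x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)
    torch.relu(x).sum().backward()
    print(x.grad)  # tensor([0., 0., 1., 1.]): the gradient vanishes wherever x < 0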
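And on the last point: when you call a module, Python routes the call through __call__, which (after running any registered hooks) dispatches to your forward. A stripped-down stand-in that mirrors the idea, with all the hook machinery omitted:

    class Module:
        """Bare-bones stand-in for torch.nn.Module's dispatch."""

        def __call__(self, *args, **kwargs):
            # The real nn.Module runs pre- and post-forward hooks around this.
            return self.forward(*args, **kwargs)

        def forward(self, *args, **kwargs):
            raise NotImplementedError

    class Doubler(Module):
        def forward(self, x):
            return 2 * x

    print(Doubler()(21))  # 42: the call hit __call__, which dispatched to forward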

Slowness is a Feature, Not a Bug 😌

In an age of accelerated everything, I believe there’s something beautiful about slow software. Not because it’s inefficient — but because it’s transparent. You can see the ideas unfold. You can trace the logic. And most importantly, you can learn.

Because sometimes, the best way forward is to take a step back.
