I created minimal one-file implementations (~160 LOC) of the JEPA family (I-JEPA, V-JEPA, V-JEPA 2, C-JEPA) for educational purposes [P]
Mirrored from r/MachineLearning for archival readability.
Hi all,
I made my own minimal implementation of JEPA algorithms.
Making things minimal, removing everything needed only for scaling, has always helped me understand the essence of an algorithm. So I stripped everything but the algorithm itself. What's left is 160-200 lines of code that distills the essence of the mathematics.
This makes it easy to compare the math in each paper against the code and see how it can be implemented in PyTorch.
I added [algo]_tutorial.md files to help with understanding.
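To give a flavor of what such a stripped-down version looks like, here is a minimal sketch of the skeleton the JEPA variants share: a context encoder trained by gradient descent, a target encoder updated by EMA, and a predictor that regresses target-patch representations from the context. All names (`TinyJEPA`, `update_target`, the MLP encoders) are hypothetical illustrations, not the author's actual code.

```python
import copy

import torch
import torch.nn as nn

class TinyJEPA(nn.Module):
    """Minimal JEPA-style skeleton (illustrative, not the linked repo's code)."""

    def __init__(self, dim=32, ema=0.99):
        super().__init__()
        # Context encoder: trained by backprop.
        self.encoder = nn.Sequential(
            nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim)
        )
        # Target encoder: an EMA copy of the context encoder, never
        # updated by gradients (this is what avoids representation collapse).
        self.target_encoder = copy.deepcopy(self.encoder)
        for p in self.target_encoder.parameters():
            p.requires_grad_(False)
        # Predictor: maps context representations to predicted target ones.
        self.predictor = nn.Sequential(
            nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim)
        )
        self.ema = ema

    def forward(self, x, mask):
        # x: (B, N, D) patch embeddings; mask: (B, N) bool, True = target patch.
        ctx = self.encoder(x * (~mask).unsqueeze(-1))  # encode context patches only
        pred = self.predictor(ctx)                      # predict target representations
        with torch.no_grad():
            tgt = self.target_encoder(x)                # targets from the EMA encoder
        # Loss is computed only at the masked (target) positions.
        return nn.functional.smooth_l1_loss(pred[mask], tgt[mask])

    @torch.no_grad()
    def update_target(self):
        # EMA update: tp <- ema * tp + (1 - ema) * p
        for p, tp in zip(self.encoder.parameters(),
                         self.target_encoder.parameters()):
            tp.lerp_(p, 1.0 - self.ema)

# Usage: one training step on random "patch embeddings".
model = TinyJEPA()
x = torch.randn(2, 8, 32)
mask = torch.zeros(2, 8, dtype=torch.bool)
mask[:, :3] = True                 # first three patches are prediction targets
loss = model(x, mask)
loss.backward()                    # gradients flow into encoder + predictor only
model.update_target()              # then move the target encoder by EMA
```

The variants differ mainly in what `x` is (image patches, video clips, etc.) and how the masks are sampled; the encoder/EMA-target/predictor loop above is the shared core.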