I created minimal one-file implementations (~160 LOC) of the JEPA family (ijepa, vjepa, vjepa2, cjepa) for educational purposes [P]
Mirrored from r/MachineLearning for archival readability.
Hi all,
I made my own minimal implementation of JEPA algorithms.
Stripping away everything that exists only to scale an algorithm has always helped me understand its essence, so I removed all but the core algorithmic parts. What's left is 160-200 lines of code per algorithm that distill the essential mathematics.
This makes it easy to compare the math in each paper directly against the code and see how it can be implemented in PyTorch.
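To give a flavor of what the core of such an implementation looks like, here is a minimal sketch of one JEPA-style training step in PyTorch. All module names and sizes are illustrative assumptions, not the actual repo code: a context encoder processes visible patches, a predictor guesses the latents that a stop-gradient EMA target encoder produces for the masked patches, and the target encoder is updated by exponential moving average.

```python
import copy
import torch
import torch.nn as nn

# Hypothetical toy setup: encoder/predictor architectures and sizes
# are illustrative, not taken from the repo.
dim, n_patches = 32, 16
encoder = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
target_encoder = copy.deepcopy(encoder)  # EMA copy, never backpropped
for p in target_encoder.parameters():
    p.requires_grad_(False)
predictor = nn.Linear(dim, dim)
opt = torch.optim.AdamW(
    list(encoder.parameters()) + list(predictor.parameters()), lr=1e-3
)

patches = torch.randn(4, n_patches, dim)       # batch of patch embeddings
mask = torch.zeros(n_patches, dtype=torch.bool)
mask[: n_patches // 2] = True                  # half the patches are targets

# One JEPA step: predict target-encoder latents of the masked patches
# from context-encoder latents of the visible ones.
ctx = encoder(patches[:, ~mask])               # context representation
with torch.no_grad():
    tgt = target_encoder(patches[:, mask])     # stop-gradient targets
pred = predictor(ctx.mean(dim=1, keepdim=True)).expand_as(tgt)
loss = nn.functional.mse_loss(pred, tgt)       # loss lives in latent space
loss.backward()
opt.step()
opt.zero_grad()

# EMA update of the target encoder with momentum m
m = 0.996
with torch.no_grad():
    for p, tp in zip(encoder.parameters(), target_encoder.parameters()):
        tp.mul_(m).add_(p, alpha=1 - m)
```

The key design point the papers share is that the loss is computed between latent representations, never reconstructed pixels, and that the target branch receives no gradients, only EMA updates.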
I added [algo]_tutorial.md files to help with understanding.