Adithya "Adi" Yerramsetty
Howdy! My name's Adi. Below you'll find some things I've made public, which I hope you find interesting or useful.
I'm currently
- a junior at ASU going for a double major in CS + Math
- building a NeRF of all of Seattle from driving data I collected
- implementing and playing with various algorithms and problems in ML/CV, including Diffusion Models, NeRFs, Visual-SLAM, and more
In the past I
- was a Deep Learning Intern at Silicon Valley Bank (May 2022 - January 2023), where I worked on Graph Neural Networks for investment predictions
- was president of the ASU Machine Learning Club, where I taught ML, Deep Learning, and more to interested members
- was an undergrad researcher at the ASU DREAM Lab, working on NeRF/view synthesis
- won some hackathons, including DataFest @ ASU, ASU AI In Education, and SVB @ ASU 2021
For fun, I've been working on
- a toy scalar-only autograd library, inspired by Karpathy's MicroGrad. It's not very performant, but it can fit Linear Models and an MNIST MLP, and in principle can handle most models (see the sketch after this list for the core idea)
- SimpleVAE: some simple VAEs I've trained up, as stepping stones towards building an LDM for NeRF supervision.
- Linear Regression in OCaml: out of curiosity about how OCaml works, I built a Linear Regression model in it. As it turns out, OCaml is pretty nice.
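
For the curious, here's what "scalar-only autograd" boils down to: a minimal sketch in the spirit of MicroGrad, not the library's actual code (the `Value` class here is just illustrative). Every operation records its inputs and a local derivative rule, and `backward()` replays the graph in reverse, applying the chain rule.

```python
# Minimal sketch of a scalar autograd engine, in the spirit of MicroGrad.
# Illustrative only -- not the actual library's code.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # local chain-rule step, set by each op

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            # product rule: each input's grad is scaled by the other's value
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward_fn
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# z = x*y + x, so dz/dx = y + 1 = 3 and dz/dy = x = 3
x, y = Value(3.0), Value(2.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```
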
I'm currently writing up
- a whirlwind tour of ML, going over some of the topics we would normally cover at the ML club
- an explainer of the convergence proof for gradient descent; a lot of it is detailed nicely by this article, but I want to work through it myself
- a derivation of the SVD solution to homogeneous LLS (the punchline is sketched just after this list). I have it on my old site but haven't moved it here yet. It's pretty useful in V-SLAM, from triangulation to Fundamental/Essential Matrix computations
- a post on the basics of vector search (LSH, HNSW, etc.). They're a super cool complement to Neural Nets, and they fascinated me during my freshman year
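
As promised above, the punchline of the SVD derivation fits in a few lines. This is the standard argument, compressed; the full write-up will fill in the steps:

```latex
% Homogeneous LLS: minimize \|Ax\| subject to \|x\| = 1.
% Take the SVD A = U \Sigma V^T, with singular values
% \sigma_1 \ge \dots \ge \sigma_n.
\min_{\|x\|=1} \|Ax\|^2
  = \min_{\|x\|=1} \|\Sigma V^T x\|^2        % U is orthogonal: it preserves norms
  = \min_{\|y\|=1} \sum_i \sigma_i^2 y_i^2   % substitute y = V^T x (V orthogonal)
  \ge \sigma_n^2
% Equality holds at y = e_n, i.e. x = V e_n: the right singular vector for the
% smallest singular value -- the last column of V.
```
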