2020 - 2021

Master's degree (MS) in Machine Learning

at San Jose State University, California, USA

Hands-on experience with most traditional ML techniques (one-hot encoding, feature scaling, normalization, PCA), the surrounding library ecosystem, and common model architectures, having applied them to multiple datasets.
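For the curious, here is a minimal sketch of the kind of preprocessing stack that means in practice, using scikit-learn; the toy table and column names are hypothetical, not from any particular project.

# Minimal sketch (assumed setup): one-hot encode a categorical column,
# scale the numeric ones, then project everything onto 2 principal components.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.decomposition import PCA
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "color": ["red", "blue", "red", "green"],  # categorical feature (hypothetical)
    "height": [1.7, 1.6, 1.9, 1.8],            # numeric features (hypothetical)
    "weight": [65.0, 58.0, 90.0, 72.0],
})

preprocess = ColumnTransformer(
    [
        ("onehot", OneHotEncoder(handle_unknown="ignore"), ["color"]),
        ("scale", StandardScaler(), ["height", "weight"]),
    ],
    sparse_threshold=0.0,  # keep the output dense so PCA can consume it
)

pipeline = Pipeline([
    ("preprocess", preprocess),
    ("pca", PCA(n_components=2)),
])

features = pipeline.fit_transform(df)
print(features.shape)  # (4, 2)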

Explored the evolution of AI, from the early days of the perceptron to Hopfield and Boltzmann networks, followed by Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and the more recent developments in attention mechanisms. Realized that the history of computing is really just the history of trying to build Artificial Intelligence, and that the Von Neumann architecture might just be a detour.

I tried to unlock GPT-2, but apparently it was too powerful for mere mortals like me at the time. So I had to make do with the ancient arts: unigrams, bigrams, TF-IDF, beginner-level tokenization, basic NLP spells, Word2Vec, and the legendary ImageNet/WordNet combo. Basically, I was stuck in the Stone Age while GPT-2 laughed from its ivory tower.
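Since bigrams and TF-IDF get name-dropped above, here is a tiny scikit-learn sketch of what that actually looks like; the example sentences are made up.

# Tiny sketch: unigram + bigram TF-IDF features over a few made-up sentences.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "the perceptron was the beginning",
    "word2vec maps words to vectors",
    "attention is all you need, allegedly",
]

# ngram_range=(1, 2) keeps both unigrams and bigrams in the vocabulary.
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
tfidf = vectorizer.fit_transform(docs)

print(tfidf.shape)                              # (3 documents, vocabulary size)
print(vectorizer.get_feature_names_out()[:5])   # a few learned unigram/bigram features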

I dove into the wild world of Diffusion models, Transformers, and GANs. Basically, I spent my time making machines dream, transform, and battle in pixelated arenas. Fun times in the AI dungeon.
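Since Transformers (and the attention mechanisms mentioned earlier) are the stars of that story, here is a toy NumPy sketch of scaled dot-product attention, their core operation; the shapes and random values are arbitrary.

# Toy sketch of scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
import numpy as np

def scaled_dot_product_attention(q, k, v):
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                    # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # attention-weighted sum of the values

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))   # 4 query positions, dimension 8
k = rng.normal(size=(6, 8))   # 6 key positions
v = rng.normal(size=(6, 8))   # one value vector per key

print(scaled_dot_product_attention(q, k, v).shape)  # (4, 8)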

A Few Projects

Coursework

hello world (Python)
def hello_world():
    print("Hello, AI World!")

hello_world()