Vizuara’s Substack
The three horsemen of Classical Reinforcement Learning
All about Dynamic Programming, Monte Carlo, and Temporal Difference methods
Aug 8 • Vizuara AI
Hands-on RL Bootcamp Lecture 1
A practical and easy-to-follow program from Q-learning and DQNs to RLHF and GRPO!
Aug 1 • Vizuara AI
July 2025
Multi-modal Representation Learning: A Deep Dive and Comparison of the CLIP, SigLIP, and BLIP Architectures
This article discusses in detail refined parametric methods like CLIP, BLIP, and SigLIP, which help us design an architecture capable of…
Jul 28 • Siddhant Rai
A Primer on Self-supervised Learning
Learn without labels, think beyond tasks. Self-supervised learning is how machines teach themselves, from contrast to clustering, from pixels to…
Jul 14 • Siddhant Rai
Decoding Naive Bayes: From Word Counts to Smart Predictions
Uncover the simple yet powerful math of probability that powers spam filters and text classifiers. Discover how Bayes' Theorem and a "naive" assumption…
Jul 10 • Naman Dwivedi
Decoding the Transformer: From Sequential Chains to Parallel Webs of Attention
Unravel the architecture that dethroned RNNs. Discover how Self-Attention allows models to look at all words at once, enabling parallel processing and a…
Jul 8 • Naman Dwivedi
Decoding Multi-Head Latent Attention (Part 2): Solving the RoPE Paradox
In Part 1, we solved the memory wall with latent compression. Now, discover how standard RoPE breaks this efficiency and why DeepSeek's "Decoupled RoPE…
Jul 7 • Naman Dwivedi
Decoding Multi-Head Latent Attention (Part 1): The KV Cache Memory Bottleneck, Solved
Discover why the KV Cache is the biggest bottleneck in LLM inference, how MQA and GQA tried to fix it, and how DeepSeek's Latent Attention masterfully…
Jul 6 • Naman Dwivedi
Welcome to the Era of Experience!
Why Reinforcement Learning will dominate the next phase of AI intelligence.
Jul 5 • Vizuara AI
Vizuara AI Agents Bootcamp Day 10
n8n and AI Automation
Jul 4 • Vizuara AI
Vizuara AI Agents Bootcamp Day 9
CrewAI: A simple yet powerful platform for building multi-agent frameworks
Jul 4 • Vizuara AI
Decoding Convolutional Neural Networks: From Pixels to Patterns
Discover how CNNs mimic the human eye with sliding filters, learn to see features instead of just pixels, and unlock the power of image classification.
Jul 3 • Naman Dwivedi