Toqi Tahamid Sarker
© 2026 Toqi Tahamid Sarker • All rights reserved

Links

13 links

Modern Transformer Architecture: A Curated YouTube Course (x.com)

Jia-Bin Huang (UMD) curated a short YouTube course on Transformer architecture: attention, positional encodings, vision transformers, and recent variants. If you work in CV and feel like you picked up transformers by osmosis rather than really understanding them, this is a good way to fill the gaps. Links in the thread, in order.
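The course's first topic, attention, boils down to a few lines. A minimal NumPy sketch of single-head scaled dot-product attention on toy random inputs (shapes and names are illustrative, not from the course):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d)) V — the core operation of a transformer."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # (n_q, n_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                             # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)        # shape (4, 8)
```

Everything else in the course (multi-head, positional encodings, vision transformers) is scaffolding around this operation.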

Apr. 10, 2026 · transformers, deep-learning, computer-vision, learning-resources

The Idea File: Why LLM Agents Change How We Share Work (x.com)

Karpathy's follow-up to a viral tweet. The argument: now that agents can write the code, what you actually want to share is the idea, not the implementation. He calls it an "idea file": a short spec of what to build, nothing more. Worth thinking about for anyone who shares research tools or scripts with collaborators.

Apr. 4, 2026 · llm, agents, research, productivity, ai

Using LLMs to Build Personal Research Knowledge Bases (x.com)

Karpathy on using LLMs to build knowledge bases from your own reading, papers, and notes instead of asking the model to recall from its training data. For niche research areas, the model just doesn't know enough. You feed it your corpus, and it becomes a reference you can actually interrogate. For fields like precision agriculture or remote sensing where survey coverage is thin, this is genuinely practical.

Apr. 2, 2026 · llm, research-tools, productivity, knowledge-management

Flow Matching and Diffusion Models from Scratch: Free Lecture Notes (arxiv.org)

Lecture notes from Peter Holderrieth and Ezra Erives covering flow matching and diffusion models from scratch. No background in generative modeling assumed. Goes from the math up to current state of the art. Reading the original papers in sequence (DDPM, DDIM, flow matching) works, but each one assumes you got something from the previous. These notes don't make that assumption.
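For orientation: the central training objective in this family, as usually written for the linear (rectified-flow style) probability path — the notes may use different notation — is the conditional flow matching loss, where a network $v_\theta$ regresses onto the known conditional velocity:

```latex
\mathcal{L}_{\mathrm{CFM}}(\theta)
  = \mathbb{E}_{t \sim \mathcal{U}[0,1],\; x_1 \sim q,\; x_0 \sim \mathcal{N}(0, I)}
    \big\| v_\theta(t, x_t) - (x_1 - x_0) \big\|^2,
\qquad x_t = t\,x_1 + (1 - t)\,x_0 .
```

Once you can parse this loss, the diffusion objectives in the same notes read as special cases with a different path and noise schedule.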

Mar. 20, 2026 · diffusion-models, generative-ai, deep-learning, learning-resources

LLM Architecture Gallery: Every Major Architecture in One Place (sebastianraschka.com)

Sebastian Raschka collected architecture diagrams for most of the major LLM families in one place: GPT, BERT, T5, LLaMA variants, Mistral, Gemma. When a paper says it builds on LLaMA-2 with GQA and you want to know what that actually looks like, this is faster than digging through GitHub readmes.
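As a refresher on what GQA actually does: query heads are grouped so that several of them share one key/value head, which shrinks the KV cache. A minimal NumPy sketch with toy dimensions (weight shapes and names are hypothetical, not from any particular model):

```python
import numpy as np

def grouped_query_attention(x, Wq, Wk, Wv, n_q_heads, n_kv_heads):
    """Grouped-query attention: n_q_heads query heads share n_kv_heads
    key/value heads (n_q_heads must be a multiple of n_kv_heads)."""
    seq, d_model = x.shape
    d_head = d_model // n_q_heads
    group = n_q_heads // n_kv_heads              # query heads per KV head

    q = (x @ Wq).reshape(seq, n_q_heads, d_head)
    k = (x @ Wk).reshape(seq, n_kv_heads, d_head)
    v = (x @ Wv).reshape(seq, n_kv_heads, d_head)

    # Repeat each KV head so it lines up with its group of query heads.
    k = np.repeat(k, group, axis=1)              # (seq, n_q_heads, d_head)
    v = np.repeat(v, group, axis=1)

    scores = np.einsum("qhd,khd->hqk", q, k) / np.sqrt(d_head)
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)                # per-head softmax over keys
    out = np.einsum("hqk,khd->qhd", w, v)
    return out.reshape(seq, d_model)

rng = np.random.default_rng(0)
d_model, n_q, n_kv = 32, 8, 2                    # 8 query heads, 2 KV heads
d_head = d_model // n_q
x = rng.standard_normal((5, d_model))
Wq = rng.standard_normal((d_model, d_model))
Wk = rng.standard_normal((d_model, n_kv * d_head))
Wv = rng.standard_normal((d_model, n_kv * d_head))
out = grouped_query_attention(x, Wq, Wk, Wv, n_q, n_kv)  # shape (5, 32)
```

With `n_kv = 2` instead of 8, the KV projections (and the cache at inference time) are 4× smaller, which is the whole point of GQA.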

Mar. 15, 2026 · llm, transformers, architecture, learning-resources, deep-learning

The Shorthand Guide to Everything Claude Code (x.com)

Affaan Mustafa's complete Claude Code setup after 10 months of daily use, covering skills, hooks, subagents, MCPs, plugins, context window management, and editor integration. Dense with practical patterns: hook examples for auto-formatting and tmux reminders, rules structure, and subagent scoping. The one rule that matters most: keep your active MCPs under 10, or your 200k context window quietly becomes 70k.

Feb. 28, 2026 · claude-code, tools, productivity, ai

Polars — A Lightning-Fast DataFrame Library (pola.rs)

Polars is a DataFrame library written in Rust with a Python API, often dramatically faster than pandas on common workloads. Lazy evaluation, SIMD, and zero-copy Arrow memory let it handle datasets that would exhaust RAM in pandas. Worth trying if you're still reaching for pandas by default.

Feb. 28, 2026 · python, data, tools

Everything We Learned About LLMs in 2025: Simon Willison's Annual Roundup (simonwillison.net)

Simon Willison's annual LLM recap, this time 26 sections long. Covers reasoning models, multimodal, tool use, agents, fine-tuning, inference efficiency, safety, and open weights. He's been doing this for three years so there's a lot of accumulated context. Don't read it straight through. Pick a section from the table of contents and start there.

Dec. 31, 2025 · llm, ai, year-in-review, research

Reading Research Papers with a 3-Pass LLM Method (x.com)

Andrej Karpathy's reading habit: three passes through anything worth understanding. First pass is manual. Second, ask the LLM to explain and summarize. Third, Q&A on the parts that didn't land. He says he comes away with noticeably better understanding than if he'd just moved on. Works especially well for CV papers where the actual contribution is buried under five pages of related work.

Nov. 18, 2025 · research, llm, productivity, academic, reading

What We Know About AI Agents: A 264-Page Survey from Meta, DeepMind, Stanford (arxiv.org)

A 264-page survey on AI agents from researchers at Meta, Yale, Stanford, Google DeepMind, and Microsoft. Covers memory, planning, tool use, multi-agent coordination, and evaluation. If you only read one chapter, make it the one on evaluation. That's where most agent benchmark claims quietly stop making sense.

Apr. 5, 2025 · agents, llm, research, survey, ai

How to Write the Discussion Section of a Research Paper (x.com)

An annotated example showing what a well-written discussion section looks like, paragraph by paragraph. Most papers either restate the results or jump straight to future work. This one shows how to connect your findings to existing literature, be honest about what the limitations actually mean, and say something that sticks. More practical than any generic academic writing guide.

Jun. 21, 2024 · academic-writing, research, phd, paper-writing

A Complete Guide to Learning ML from YouTube (13 Courses) (x.com)

Sanju Sinha put this list together from the courses he actually watched during his PhD, in the order he'd recommend watching them. Not just a list of popular channels: the sequence is designed so each course builds on the last. Good to share with anyone just getting started in the lab.

Nov. 7, 2022 · machine-learning, deep-learning, learning-resources, phd

Zotero 101: The Best Free Citation Manager for Academics (x.com)

Mushtaq Bilal's visual walkthrough for getting started with Zotero. Covers the browser connector, library organization, citation styles, and syncing across devices. If you're still wrestling with a .bib file by hand or copy-pasting DOIs one at a time, this is the right place to start.

Aug. 25, 2022 · academic, research-tools, phd, productivity, writing

Years

2026 (7)
2025 (3)
2024 (1)
2022 (2)

Tags

productivity (7)
ai (6)
llm (6)
research (5)
tools (4)
deep-learning (4)
learning-resources (4)
claude-code (3)
phd (3)
transformers (2)
agents (2)
research-tools (2)
academic (2)
memory (1)
computer-vision (1)
knowledge-management (1)
diffusion-models (1)
generative-ai (1)
architecture (1)
python (1)