Thread Easy
  • Explore
  • Compose Thread

Your all-in-one Twitter thread assistant


Explore

Newest first; browse threads as cards


I see a lot of bad takes on X about PhDs and frontier labs (not just this quoted tweet), so let me chime in.

For context, I didn't do a prestigious undergrad, worked a bit in a startup as an applied ML engineer, then did a PhD, and now work in a frontier lab.

A PhD isn't following a course from a teacher or magically becoming an overpowered scientist. It's about doing a few things with intense focus for a long time, alone. You may have a great advisor guiding you, but that only affects the learning rate. At the end of the day, it's a lot of alone time with you and your thoughts on a problem few people care about.

Good news: if you can find time, you can get a very similar experience! It'll be slower since you may not have as much free time, compute (though a free T4 on Colab is great), or an advisor and teammates (but there are great open communities).

Today's distinction between ML research and applied ML is often small. Grinding paper reproductions on Colab and improving them one step at a time is a great way to become a researcher.

The real worry is "Can I join a frontier lab without a PhD?" You'll face fierce competition for research scientist jobs, but even top PhD grads do—it's a mix of talent and luck. Re-implementing a paper and posting it on X probably isn't enough now, but publicly trying improvements and sharing interesting results, even as a blog post, can work! I know several frontier lab researchers, some extremely famous on X, who started this way.

Personally, I loved my PhD. It was time to learn and explore ideas fully and freely. Do you absolutely need one? No.


And if some of you are curious about the grinding part of a PhD, I really enjoyed reading https://t.co/JnotoVqzLI. It was strangely comforting, during a PhD often spent alone, to share in another fellow PhD student's experience through reading.

Arthur Douillard
Thu Nov 06 20:48:40
RT @ekuyda: there really was a lull in terms of interest in language models at some point - Google was still working on them but without ev…


AI Apps investing @ A16Z; A1111; Boards of Krea, Deel, Clutch, Titan, Arc Boats, Untitled, Happy Robot + more; If you’re not at the table, you’re on the menu

Anish Acharya
Thu Nov 06 20:38:54
RT @daniel_d_kang: It's that time of year again! I'm actively recruiting PhD students, please apply to UIUC if you're interested in working…


Asst professor @MIT EECS & CSAIL (@nlp_mit). Author of https://t.co/VgyLxl0oa1 and https://t.co/ZZaSzaRaZ7 (@DSPyOSS). Prev: CS PhD @StanfordNLP. Research @Databricks.

Omar Khattab
Thu Nov 06 20:31:58
RT @dillon_mulroy: life update...

very excited to share i'll be joining @cloudflare to work on the developer platform and workers!

the mo…


vp developers & ai @cloudflare ✨ and how does that error make you feel?

rita kozlov 🐀
Thu Nov 06 20:27:42
RT @sach1n: This is @LiquidAI_’s LFM2-ColBERT-350M model converted to CoreML, uses MLX Swift for generating an indexed store and for search…


FOLLOWS YOU. Artificial Intelligence, Cognitive Architectures, Computation. The goal is integrity, not conformity. https://t.co/rFUNzdYXuK

Joscha Bach
Thu Nov 06 20:26:49
an interesting result that doesn’t surprise me at all: dropping memorization significantly affects math results. from what I’ve seen, similarly to humans, models need to memorize standard operations with small numbers.


Reasoning models coming (very) soon. Co-founder @pleiasfr

Alexander Doria
Thu Nov 06 20:17:34