Thread Easy

Your all-in-one Twitter thread assistant


Explore

Newest first; browse threads as cards


Not only is it way too easy to look back in hindsight & convince yourself your predictions (about any given thing) were right, but the algo will conspire to help you with this. It's a risk for everyone, myself included. I see ppl doing this all the time now. Concerning.

Author. Coder. CTO. θηριομάχης. Building: https://t.co/otXT4Wy6WR. Writing: https://t.co/dBPBtyCIHw.

Jon Stokes
Mon Dec 22 22:21:53
RT @lusya68911418: @PandaTalk8 There's another approach, one I only learned after talking with 500 traditional business owners:

Don't talk about money, and don't talk about business; only talk about the person.

Talk about your recognition of them: recognize their effort, their past, the mistakes they've made, their current successes and failures, their current understanding…

Indie developer | Personal brand coach | Helping newcomers through early growth on X | WeChat official account: PandaTalk8

Mr Panda
Mon Dec 22 22:21:48
I admit I'm tiring of seeing DS-MoE everywhere. 
Whale has got to reinvent this one more time. Read Google papers, discern kernels of usable ideas, add your own, then everyone will be like "oh of course" for 2 more years

We're in a race. It's not USA vs China but humans and AGIs vs ape power centralization. @deepseek_ai stan #1, 2023–Deep Time «C’est la guerre.» ®1

Teortaxes▶️ (DeepSeek 推特🐋铁粉 2023 – ∞)
Mon Dec 22 22:20:48
Korea (Upstage) has pretrained a 100B class MoE that they claim is "enterprise-grade" on Blackwells
Previous Solar model that made a splash was 10.7B, an interesting experiment in depth-upscaling Mistral-7B that rivaled Mixtral 8x7B.

Teortaxes▶️ (DeepSeek 推特🐋铁粉 2023 – ∞)
Mon Dec 22 22:18:03
humans are the best. because of course you'd build a massive model train village in your house every year just for the holidays.

https://t.co/fg81bRxIuy

✨ dabbler 🪀 @SuperFanToys 🎨 @habitisdead

Josh Pigford
Mon Dec 22 22:17:48
RT @nntaleb: Very important!
@ylecun is delivering the AI consequence of Quine's "Dogmas of empiricism".
There are structural reasons tha…

Professor at NYU. Chief AI Scientist at Meta. Researcher in AI, Machine Learning, Robotics, etc. ACM Turing Award Laureate.

Yann LeCun
Mon Dec 22 22:15:48