Thread Easy
  • Explore
  • Create thread

Your complete partner for Twitter threads

© 2025 Thread Easy All Rights Reserved.

Explore

Newest first — browse tweet threads


RT @HuggingPapers: Facebook just released the OMC25 dataset on Hugging Face.

This massive collection of over 27M molecular crystal structu…


Co-founder & CEO @HuggingFace 🤗, the open and collaborative platform for AI builders

clem 🤗
Fri Dec 12 18:04:24
RT @aiDotEngineer: 🆕 Leadership talks: is all this AI code being spammed out any good? how do we prove any ROI?

@yegordb is back with an e…


achieve ambition with intentionality, intensity, & integrity i made: - @dxtipshq - @cognition - @sveltesociety - @aidotengineer - @latentspacepod + @smol_ai

swyx
Fri Dec 12 18:04:11
CUDA-L2 uses reinforcement learning to outperform cuBLAS on matrix multiplication.
Tested across 1000 HGEMM configs, it beats torch.matmul, cuBLAS, and cuBLASLt AutoTuning on A100.

+22% in offline mode.
+28.7% in server mode.
LLMs are now tuning kernels.


📄 Paper: https://t.co/8u9Vmul5KX 🔗 GitHub: https://t.co/xijapZLAgY

GitHub Projects Community
Fri Dec 12 18:00:13
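The tweet above describes sweeping many HGEMM shapes and comparing timings across backends. As a loose illustration of that kind of shape-sweep benchmark (not the paper's method: `bench_gemm` and the shapes here are invented, and NumPy on CPU stands in for the A100 kernels the paper actually times):

```python
import time
import numpy as np

def bench_gemm(m, n, k, reps=3):
    """Best-of-reps wall-clock time for one half-precision GEMM.

    Hypothetical harness; CUDA-L2's evaluation runs tuned HGEMM
    kernels on an A100 GPU, not NumPy on CPU.
    """
    a = np.random.rand(m, k).astype(np.float16)
    b = np.random.rand(k, n).astype(np.float16)
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        _ = a @ b  # the GEMM under test
        best = min(best, time.perf_counter() - t0)
    return best

# Sweep a handful of shapes, in the spirit of the paper's 1000-config sweep.
configs = [(64, 64, 64), (128, 128, 128), (256, 256, 64)]
times = {cfg: bench_gemm(*cfg) for cfg in configs}
for (m, n, k), t in times.items():
    print(f"{m}x{n}x{k}: {t * 1e6:.1f} us")
```

The paper's reported gains come from comparing such per-shape timings between its RL-generated kernels and the cuBLAS/cuBLASLt baselines.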
Ted Underwood et al. also trained a GPT-1914, which may well still be the largest historical model to date (nearly 800M parameters). https://t.co/wSA1WCmz2D

Artisanal baker of reasoning models @pleiasfr

Alexander Doria
Fri Dec 12 17:58:38
i had thought we were past peak webpack but apparently not, what happened?

(vite not shown due to data error but yes has overtaken webpack)


achieve ambition with intentionality, intensity, & integrity i made: - @dxtipshq - @cognition - @sveltesociety - @aidotengineer - @latentspacepod + @smol_ai

swyx
Fri Dec 12 17:57:03
Example of historically grounded generation (the model has zero idea who Trump is and reverts to the early 20th century). I see the model is actually still popular on HF. https://t.co/1ye5VEGerd

Ted Underwood et al. also trained a GPT-1914, which may well still be the largest historical model to date (nearly 800M parameters). https://t.co/wSA1WCmz2D

Alexander Doria
Fri Dec 12 17:52:36