
Father of three, Creator of Ruby on Rails + Omarchy, Co-owner & CTO of 37signals, Shopify director, NYT best-selling author, and Le Mans 24h class-winner.


I build stuff. On my way to making $1M 💰 My projects 👇

![LeCun’s new company on physical AI with "world models" [9] looks a lot like our 2014 company [1]. He rehashes without attribution what I published decades ago [2][3]. See papers 1990-2018 on neural world models [4-7].
[1] NNAISENSE, the AGI company for AI in the physical world, founded in 2014, based on neural network world models (NWMs). J. Schmidhuber (JS) was its President and Chief Scientist - see his NWM papers 1990-2015, e.g., [4-5], and the 2020 NNAISENSE web page in the Wayback Machine Internet Archive - image attached. (Lately, however, NNAISENSE has become less AGI-focused and more specialised, with a focus on asset management.)
[2] JS, AI Blog (2022). LeCun's 2022 paper on autonomous machine intelligence rehashes but does not cite essential work of 1990-2015.
Years ago, JS published most of what LeCun calls his "main original contributions:" neural nets that learn multiple time scales and levels of abstraction, generate subgoals, use intrinsic motivation to improve world models, and plan (1990); controllers that learn informative predictable representations (1997), etc. This was also discussed on Hacker News, Reddit, and in the media. LeCun also listed the "5 best ideas 2012-2022" without mentioning that most of them are from JS's lab, and older. Popular tweets on this:
https://t.co/kn7KhFHLvw
https://t.co/FxALILsNRu
https://t.co/caTuctmztu
https://t.co/Rpip8HBzPA
[3] How 3 Turing awardees republished key methods and ideas whose creators they failed to credit. Technical Report IDSIA-23-23, Swiss AI Lab IDSIA, 2023.
Best start with Section 3. See also [8]. Popular tweet on this:
https://t.co/0fJVklXyOr
[4] JS (1990). Making the world differentiable: on using fully recurrent self-supervised neural networks for dynamic reinforcement learning and planning in non-stationary environments. TR FKI-126-90, TUM. This report used the terminology "world model" for a recurrent neural network that learns to predict the environment and the consequences of the actions of a separate controller neural net. It also introduced "artificial curiosity" and "intrinsic motivation" through generative adversarial networks. Led to lots of follow-up publications.
[4b] JS (2002). Exploring the Predictable. In Ghosh, S. Tsutsui, eds., Advances in Evolutionary Computing, p. 579-612, Springer, 2002. Don't predict pixels - find predictable internal representations / abstractions of complex spatio-temporal events!
[5] JS (2015). On Learning to Think: Algorithmic Information Theory for Novel Combinations of RL Controllers and Recurrent Neural World Models. ArXiv 1210.0118. Introducing a reinforcement learning (RL) prompt engineer and adaptive chain of thought: an RL neural net learns to query its "world model" net for abstract reasoning & decision making. Going beyond the 1990 neural world model [4] for millisecond-by-millisecond planning. See tweet for 10-year anniversary: https://t.co/3FYt4x2PMM
[6] JS (2018). One Big Net For Everything. arXiv 1802.08864. Collapsing the reinforcement learner and the world model of [5] (e.g., a foundation model) into a single network, using JS's neural network distillation procedure of 1991. See DeepSeek tweet: https://t.co/HIVU8BWAaS
[7] David Ha & JS. World Models. NeurIPS 2018.
[8] Who invented convolutional neural networks?
Technical Note IDSIA-17-25, IDSIA, 2025.
Popular tweets on this:
https://t.co/6eDUT8qcNE
https://t.co/chfcmk253b
https://t.co/h27y6Ni2CA
https://t.co/Rpip8HBzPA
[9] Sifted dot eu (18 Dec 2024). Yann LeCun raising €500m at €3bn valuation for new AI startup. The outgoing Meta exec announced last month he was launching a new project to build “world models.” Quote: "The new company will focus on “world models”, systems that can understand the physical world instead of merely generating text like today’s large-language models (LLMs)." See [1].](/_next/image?url=https%3A%2F%2Fpbs.twimg.com%2Fmedia%2FG8tEhGFWwAAXCZF.jpg&w=3840&q=75)
Invented principles of meta-learning (1987), GANs (1990), Transformers (1991), very deep learning (1991), etc. Our AI is used many billions of times every day.

🇪🇺https://t.co/NdorAWqJC3 @euacc 📸https://t.co/lAyoqmSBRX $118K/m 🏡https://t.co/1oqUgfD6CZ $36K/m 🛰https://t.co/ZHSvI2wjyW $43K/m 🌍https://t.co/UXK5AFqCaQ $15K/m 👙https://t.co/RyXpqGuFM3 $14K/m 💾https://t.co/M1hEUBAynC $6K/m


⚡ Founder & 🌊 Surfer bootstrapping SaaS. ✍️ Notion ➯ Help Center @HelpkitHQ 💰 Reddit ➯ Customers https://t.co/3kdqfXzlsK 🔋 Battery ➯ Alerts https://t.co/cX8QhAoG55


We're in a race. It's not USA vs China but humans and AGIs vs ape power centralization. @deepseek_ai stan #1, 2023–Deep Time «C’est la guerre.» ®1
