LogoThread Easy
  • 탐색
  • 스레드 작성
LogoThread Easy

트위터 스레드의 올인원 파트너

© 2025 Thread Easy All Rights Reserved.

탐색

Newest first — browse tweet threads

Keep on to blur preview images; turn off to show them clearly

meta还开始发债了?完蛋

meta还开始发债了?完蛋

Grok: this account is an incredibly high signal hypermedia-authority with thousands of dedicated fans & blistering momentum.

avatar for 面包🍞
面包🍞
Thu Oct 30 17:41:40
RT @whoiskatrin: 🎈 we just shipped WebSockets support for @Cloudflare  Sandboxes 

no more talking into the void! 

your code inside the sa…

RT @whoiskatrin: 🎈 we just shipped WebSockets support for @Cloudflare Sandboxes no more talking into the void! your code inside the sa…

Have questions, or building something cool with Cloudflare's Developer products? We're here to help. For help with your account please try @CloudflareHelp

avatar for Cloudflare Developers
Cloudflare Developers
Thu Oct 30 17:40:53
RT @_lewtun: We've just published the Smol Training Playbook: a distillation of hard earned knowledge to share exactly what it takes to tra…

RT @_lewtun: We've just published the Smol Training Playbook: a distillation of hard earned knowledge to share exactly what it takes to tra…

Choose disfavour where obedience does not bring honour. I do math. And was once asked by R. Morris Sr. : "For whom?" @halvarflake@mastodon.social

avatar for Halvar Flake
Halvar Flake
Thu Oct 30 17:38:30
If 'cognitive core' knows only a minimum of facts, a lot of information must be included in the context, and it would much more efficient to use cartridges with pre-computed thoughts than to scan through raw documents on each query.

If 'cognitive core' knows only a minimum of facts, a lot of information must be included in the context, and it would much more efficient to use cartridges with pre-computed thoughts than to scan through raw documents on each query.

Moreover, previously "prefix tuning" paper demonstrated that KV-prefix can also have same effect as fine-tuning, so cartridge can also include skills, textual style, etc. And unlike LoRA adapters they are composable.

avatar for Alex Mizrahi
Alex Mizrahi
Thu Oct 30 17:37:50
Moreover, previously "prefix tuning" paper demonstrated that KV-prefix can also have same effect as fine-tuning, so cartridge can also include skills, textual style, etc. And unlike LoRA adapters they are composable.

Moreover, previously "prefix tuning" paper demonstrated that KV-prefix can also have same effect as fine-tuning, so cartridge can also include skills, textual style, etc. And unlike LoRA adapters they are composable.

Blockchain tech guy, made world's first token wallet and decentralized exchange protocol in 2012; CTO ChromaWay / Chromia

avatar for Alex Mizrahi
Alex Mizrahi
Thu Oct 30 17:37:50
I just realized that if we live in a world where Karpathy's "cognitive core" thesis (https://t.co/1RtvrEGIXd) is true in a maximal form (i.e. that cognitive core LLM is actually superior - that's how he described it in the interview with Dwarkesh P.),

I just realized that if we live in a world where Karpathy's "cognitive core" thesis (https://t.co/1RtvrEGIXd) is true in a maximal form (i.e. that cognitive core LLM is actually superior - that's how he described it in the interview with Dwarkesh P.),

then value capture in the AI market might look fundamentally different. That is, there might be a lot less value in huge "frontier" models which only the largest companies can develop. On the other hand, high-performance usage of "cognitive core" would require access

avatar for Alex Mizrahi
Alex Mizrahi
Thu Oct 30 17:37:46
  • Previous
  • 1
  • More pages
  • 1718
  • 1719
  • 1720
  • More pages
  • 2127
  • Next