Maybe the most delusional take I've seen this week, and I read a lot of China hawks and other wackos. On-device inference is niche until we change the hardware (e.g. compute-in-memory) and the very mode of interaction with AI. All the big boy functions will remain reliant on the cloud.