LogoThread Easy
  • Explore
  • Thread Composer

Your one-stop companion for Twitter threads

© 2025 Thread Easy All Rights Reserved.

Explore

Newest first — browse tweet threads

Toggle on to blur preview images; toggle off to show them clearly

Everyone thought prompt engineering would be a short-term thing.

But it’s only become more valuable as models improve. They can do incredible things, but you have to pull it out of them!

This one-shot ad campaign is from a single product photo + long prompt 🤯

This prompt is from @azed_ai. Product photo is @Mascobot's a16z GPU rig.

Create a 3×3 grid in 3:4 aspect ratio for a high-end commercial marketing campaign using the uploaded product as the central subject. Each frame must present a distinct visual concept while maintaining perfect product consistency across all nine images.

Grid Concepts (one per cell):
1. Iconic hero still life with bold composition
2. Extreme macro detail highlighting material, surface, or texture
3. Dynamic liquid or particle interaction surrounding the product
4. Minimal sculptural arrangement with abstract forms
5. Floating elements composition suggesting lightness and innovation
6. Sensory close-up emphasizing tactility and realism
7. Color-driven conceptual scene inspired by the product palette
8. Ingredient or component abstraction (non-literal, symbolic)
9. Surreal yet elegant fusion scene combining realism and imagination

Visual Rules:
- Product must remain 100% accurate in shape, proportions, label, typography, color, and branding
- No distortion, deformation, or redesign of the product
- Clean separation between product and background

Lighting & Style:
- Soft, controlled studio lighting
- Subtle highlights, realistic shadows
- High dynamic range, ultra-sharp focus
- Editorial luxury advertising aesthetic
- Premium sensory marketing look

Overall Feel:
- Modern, refined, visually cohesive
- High-end commercial campaign
- Designed for brand websites, social grids, and digital billboards
- Hyperreal, cinematic, polished, and aspirational

avatar for Justine Moore
Justine Moore
Tue Dec 23 17:23:36
TBH, every time the AI fails, I mentally blame it on you.

Right now GPT-5.2 noticed that the parser was counting variables incorrectly, causing a linearity bug. The solution?

"Ignore the parser counter and implement a separate counter."

At this point, this isn't about being dumb. This is about making a bad decision that under no circumstances would be good. Either we remove the parser counter and use a separate function as the source of truth, or we keep it, and fix it. But such insane duct taping has no place in a serious codebase, and that idea would never have occurred to an intelligence evolved to learn coding from a pure blank state. It must have been corrupted by evil forces that only humans can produce.

So I can't help but wonder...

Who did it learn that from?

I blame it on you
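
The anti-pattern described in the tweet (keeping a broken counter and bolting a second, independent one on beside it) versus the single-source-of-truth fix can be sketched in a few lines. Everything here is invented for illustration (`Parser`, `count_vars`, the `$`-prefix convention for variables); it is not Taelin's actual parser code.

```python
class Parser:
    """Toy parser: tokens are whitespace-separated, variables start with '$'."""

    def __init__(self, source: str):
        self.tokens = source.split()

    def count_vars(self) -> int:
        # Single source of truth: the count is always derived from the
        # token stream, so it cannot drift out of sync with it.
        return sum(1 for t in self.tokens if t.startswith("$"))


class DuctTapedParser(Parser):
    """Anti-pattern: a second, independently updated counter."""

    def __init__(self, source: str):
        super().__init__(source)
        self.var_count = 0  # shadow counter, maintained separately

    def scan(self) -> None:
        # Each call re-increments the shadow counter, so calling scan()
        # more than once (or forgetting to call it) makes var_count
        # disagree with what count_vars() reports.
        for t in self.tokens:
            if t.startswith("$"):
                self.var_count += 1
```

The point of the "source of truth" fix is that there is nothing to keep synchronized: any code that needs the count asks the one derivation, rather than trusting a cached copy that some other code path must remember to update.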

Kind / Bend / HVM / INets / λCalculus

avatar for Taelin
Taelin
Tue Dec 23 17:20:19
RT @BenSasse: Friends-

This is a tough note to write, but since a bunch of you have started to suspect something, I’ll cut to the chase: L…

Root node of the web of threads: https://t.co/ifH80GcLpo

avatar for James Torre
James Torre
Tue Dec 23 17:13:36
And, this is day 18 of @venturetwins and me featuring cool new consumer AI launches from this year!

Follow along for more 👋

Partner @a16z and twin to @venturetwins | Investor in @gammaapp, @happyrobot_ai, @krea_ai, @tomaauto, @partiful, Salient, @scribenoteinc & more

avatar for Olivia Moore
Olivia Moore
Tue Dec 23 17:11:17