You’re Prompting AI in the Wrong Language

From prompt language to infrastructure strategy, here’s what’s actually working for AI builders right now.


Hey there,

We've been prompting AI in English this whole time.

Turns out, we've been doing it wrong.

Polish is the most effective language for prompting AI. Not English. Not Mandarin. Polish.

I know it’s quite unexpected, but researchers from the University of Maryland and Microsoft tested this across 26 languages. Polish scored an average accuracy of 88%. And English isn't even second or third; it's the sixth most effective language for conversational AI.

Most businesses and developers have been optimizing prompts in English. But the syntax of Polish, with its inflections, cases, and linguistic structure, aligns better with how language models process information.

Here's what this means: If you're building an AI product and optimizing for English prompts, you're leaving performance on the table. Your Claude agent works 12% better with Polish prompts. Your chatbot understands requests with more nuance.

This isn't just a fun fact. It's a competitive advantage most people aren't using.

Now, let's talk about what's happening in the AI infrastructure game.

OpenAI's $1.4 Trillion Infrastructure Play

OpenAI announced a $38 billion partnership with Amazon Web Services through 2032. That's seven years of compute locked in. But here's what's wild: this is just one piece.

OpenAI now has $1.4 trillion in total infrastructure commitments across Nvidia, Broadcom, AMD, SoftBank, and UAE data centers.

Why does this matter for you?

Infrastructure is becoming the new moat. Not innovation. Not models. Infrastructure.

Anyone can fine-tune a model or engineer prompts. But can you afford to train AGI-level models, or run a frontier multimodal model like Qwen3-Max at scale?

The companies securing this compute are securing the future. But here's the thing: you don't need to compete on infrastructure. You need to compete on utility.

While OpenAI and Google fight over data center capacity, builders are making money by solving problems the big models can't: memory, consistency, and specialization.

AI Memory Is Broken

Users constantly complain about AI memory issues: the model forgets what you told it five messages ago, and once that context slips, it starts hallucinating.

The issue isn't model intelligence. It's context windows and how AI manages memory during long interactions. It's about maintaining coherence as conversations extend.

So what are smart builders doing?

They're creating memory systems. Summarizing previous messages. Building loops that reintegrate context. Some are building regex-powered systems to automatically detect intents and restructure prompts for clarity.
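Here's a minimal sketch of that pattern, combining a rolling summary with regex intent detection. The intent labels are hypothetical, and the summarization step is a simple truncation stand-in for what would be a real model call in production:

```python
import re
from collections import deque

# Hypothetical intent patterns; a real system would tune these per product.
INTENT_PATTERNS = {
    "schedule": re.compile(r"\b(book|schedule|calendar|meeting)\b", re.I),
    "refund":   re.compile(r"\b(refund|money back|cancel order)\b", re.I),
}

def detect_intent(message: str) -> str:
    """Return the first matching intent label, or 'general'."""
    for label, pattern in INTENT_PATTERNS.items():
        if pattern.search(message):
            return label
    return "general"

class RollingMemory:
    """Keep the last N turns verbatim; fold older turns into a summary."""

    def __init__(self, keep_last: int = 5):
        self.recent = deque(maxlen=keep_last)  # verbatim recent turns
        self.summary = ""                      # compressed older context

    def add(self, turn: str) -> None:
        if len(self.recent) == self.recent.maxlen:
            # Oldest turn is about to fall out of the window: fold it into
            # the summary. A real system would summarize with a model call;
            # here we just truncate as a placeholder.
            evicted = self.recent[0]
            self.summary = (self.summary + " | " + evicted[:40]).strip(" |")
        self.recent.append(turn)

    def build_prompt(self, new_message: str) -> str:
        """Reintegrate summary + recent turns into every new prompt."""
        parts = []
        if self.summary:
            parts.append(f"Earlier context (summarized): {self.summary}")
        parts.extend(self.recent)
        parts.append(f"[intent={detect_intent(new_message)}] {new_message}")
        return "\n".join(parts)
```

The point isn't this exact code; it's the loop: evict, compress, reinject. That loop is what keeps a conversation coherent past the context window.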

This is a business opportunity. People are building tools for prompt management, memory augmentation, and conversation continuity. And not enough people are talking about it.

Vision Language Models (VLMs) are another frontier. Research shows that when images are converted into tokens, each image token represents multiple textual concepts simultaneously. Multimodal AI isn't just "text + image"; it's a different way of encoding information.

Companies building specialized VLM applications for document processing, medical imaging, or real estate are creating defensible products now.

The AI Safety Divide

Geoffrey Hinton, the godfather of neural networks, believes we're building intelligences far smarter than us that could arrive within the next decade. He thinks only Anthropic and Google are taking this seriously.

Now, you might be thinking: "Why should I care?"

Because this matters for your business model.

If Anthropic becomes the "safety" brand, they win enterprise trust. If OpenAI remains the "move fast" option, that's another market segment. If Meta ships models without guardrails, that creates regulatory pressure affecting everyone.

Utah and California already require AI disclosure. More states will follow.

Your business might need to pivot for compliance or double down on markets that don't care. Either way, the landscape is shifting.

What This Means For You

Let me be direct:

  1. Language matters more than we thought. If you're building for global markets, test your prompts in multiple languages, especially for coding tasks. Polish's results signal that linguistic structure affects AI performance in ways we don't fully understand yet.

  2. Infrastructure is consolidating. If you're building an AI product that requires massive compute, you're competing against billion-dollar companies. Build solutions that work with constrained resources instead: local models, edge compute, on-device inference.

  3. Memory is the next frontier. The company that solves AI conversation continuity, context management, and persistent memory systems will be profitable. This is underexplored.

  4. Safety is becoming table stakes. Regulations are coming. Build compliance into your product from day one.

I'm watching companies building memory-augmented tools, multi-language prompt optimization platforms, and infrastructure for local/edge AI. They're not flashy. They don't get TechCrunch coverage. But they're profitable, defensible, and solving real problems.

The infrastructure wars are for big players. The real money is in tools that sit on top of these models, helping them work better, remember longer, and stay compliant.

The question is: where are you building?

Let me know. I'm curious what segment you think will win.

- Aashish

P.S. Test a prompt in Polish and compare it to the same prompt in English.
