KAINDLY × Live Coverage
HumanX 2026 venue entrance with signature [X] banner

HUMAN[X]

San Francisco April 6–9, 2026

Day 1 — The Plain-Language Version

What happened at the biggest AI conference — and what it actually means for your work

You don't need to be technical to understand this. That's the whole point.

April 6, 2026 | Leanna Baker Williams | KAINDLY Collective

Why This Exists

AI conferences talk to the industry.
We translate it for everyone else.

Most conference coverage assumes you already know the jargon, follow the players, and understand the technical landscape. That leaves out the vast majority of professionals who need to make decisions about AI but didn't grow up in the ecosystem.

KAINDLY's HumanX readouts are written for the person who doesn't have time to attend a four-day conference — but whose budget, team, and strategy depend on understanding what's happening in AI right now. Every term is defined. Every insight comes with a concrete next step. No gatekeeping.

The world's biggest AI conference just kicked off in San Francisco. You don't need to have been there — or even follow AI closely — to care about what happened. Three things came up on Day 1 that will affect how your organization works, spends money, and makes decisions about technology. Here's what they are, in plain English.

Quick Glossary — 10 Terms You'll See Below
AI
Artificial intelligence — software that can learn from data and make decisions or predictions, rather than just following fixed rules.
Language Model
An AI system trained on text that can read, write, summarize, and answer questions. ChatGPT is the most well-known example.
Spatial AI
AI that understands physical, three-dimensional spaces — not just text and images, but rooms, buildings, objects, and how they relate to each other.
AI Stack
The layers of technology that make AI work — from the computer chips at the bottom to the apps you use at the top. Think of it like a building: foundation, structure, plumbing, wiring, and the rooms you actually live in.
Inference
When an AI model actually runs and gives you an answer. Every time you ask ChatGPT a question, that's inference. It costs money each time.
Compute
The raw computing power (usually specialized chips called GPUs) needed to train and run AI. It's expensive, and who controls it matters.
Fine-Tuning
Taking a general-purpose AI model and customizing it with your own data so it works better for your specific needs.
Generative AI
AI that creates new content — text, images, video, code — rather than just analyzing existing data. Most of the AI tools you hear about today are generative.
Personalization
Using data about a specific person to tailor what they see, read, or experience. AI makes this possible at massive scale.
A/B Testing
Showing two different versions of something (an email, a webpage, an ad) to different groups and measuring which performs better.

What You Need to Know

Three Takeaways

Dr. Fei-Fei Li on stage at HumanX 2026
01

AI Is Learning to Understand Physical Spaces

What Happened

Fei-Fei Li — often called the Godmother of AI — took the stage to present the vision behind World Labs, her spatial AI company. Her core argument: the next frontier of AI is not generating more text or images. It is building machines that understand the three-dimensional physical world — that can navigate environments, understand how objects relate to each other, and simulate spaces before they're built.

Li positioned this not as a distant possibility but as an active area of development with real products in the pipeline. World Labs is backed by significant venture capital, the research comes out of Stanford, and the first applications are already being built for enterprises.

What This Means for You

Most AI tools you've heard of are language models — they read text, generate text, process images, create images. Fundamentally flat. But a lot of real work happens in physical space: warehouses, hospitals, construction sites, retail stores, manufacturing floors.

If this category matures the way language models did, it will reshape logistics and warehouse automation, product design and prototyping, surgical planning, architecture and construction, and retail — potentially within 36 months. If your organization's AI strategy starts and ends with chatbots and document summarizers, the landscape just got a lot bigger. Most organizations are not tracking spatial AI yet. It's worth understanding now, before it becomes urgent.

One Thing to Try

Think about where your team does work that involves physical spaces or objects. Ask: "If AI could 'see' and understand our physical workspace, what would we do differently?" You don't need to buy anything — just start noticing where space matters in your work.

HUMAN[X] main stage during the Five-Layer AI Stack panel
02

AI Is Not One Thing — It's Five

What Happened

A panel featuring NVIDIA's Bill Catanzaro, Fireworks AI CEO Lin Qiao, and Augment Code CEO Denis Yarats broke AI down into five layers — what they called the AI stack. Each panelist represented a different layer: Catanzaro spoke from NVIDIA's position at the bottom (chips and compute), Qiao represented the serving and speed layer, and Yarats represented the top — the apps you actually use.

The Five Layers, Explained

L1

Chips & Compute

The physical hardware — specialized computer chips (GPUs) that do the heavy math AI requires. NVIDIA dominates this layer and commands extraordinary pricing power. Every layer above depends on this one.

L2

Model Training

Where AI models are built by feeding them enormous amounts of data. This is extremely expensive — millions of dollars per model — and only a handful of companies (OpenAI, Google, Anthropic, Meta) operate at this scale.

L3

Serving & Inference

Where the trained model actually runs and gives you answers. Every time you ask an AI a question, that's inference — and it costs money each time. Companies like Fireworks AI work to make this faster and cheaper.

L4

Fine-Tuning & Customization

Taking a general-purpose model and customizing it with your company's own data so it works better for your specific needs. This is where AI goes from "generic" to "useful for your business."

L5

Applications

The tools you actually see and use — ChatGPT, Copilot, Jasper, the AI features in your CRM. This is the layer most people think of when they say "AI." It sits on top of all four layers below it.

What This Means for You

When most people say "we're using AI," they mean they bought an app — the top layer. But when that app is slow, the problem might live at Layer 3 (inference). When it's expensive, it might be a Layer 1 problem (compute). When it gives bad answers, it might need work at Layer 4 (fine-tuning). These are different problems with different solutions, and mixing them up leads to bad purchasing decisions and wasted money.

Each layer has its own cost structure, its own vendors, and its own dynamics. NVIDIA sits at the bottom and commands extraordinary pricing power — every layer above depends on them. The further up the stack you go, the less control you have over your own costs. You don't need to become an engineer — but knowing that your vendor operates on one specific layer, and that problems at a different layer require a different kind of fix, changes how you evaluate tools, negotiate contracts, and diagnose issues.

One Thing to Try

Next time your team is evaluating an AI tool, ask: "What exactly are we paying for when we pay for this?" Is it the computing power? The model itself? The customization? The app on top? If no one can answer clearly, that's a sign you're buying something you don't fully understand yet — and that's worth fixing before you sign.

AI for Marketing session at HumanX 2026
03

One Industry Already Runs on AI. Yours Might Be Next.

What Happened

The marketing track at HumanX ran all day with 15+ speakers — from PJ Pereira's keynote on creative AI to sessions on personalization, content workflows, brand strategy, and the operational mechanics of AI-augmented teams. The message was clear: marketing teams aren't experimenting with AI anymore. They're running on it.

Speakers described production environments where AI handles first-draft content creation, audience segmentation, A/B testing at scale, real-time personalization, and performance analytics. These aren't pilot programs — they're how the work gets done now.

What This Means for You

The most important thing marketing teams have figured out isn't which AI tools to use — it's how to redesign their entire workflow to move faster. The human role is shifting from creator to curator and editor: setting the creative direction, reviewing AI-generated output, making judgment calls that require brand sensitivity and cultural context. People aren't being replaced — but their job description is changing.

This matters even if you're not in marketing. Marketing is the canary in the coal mine. The patterns forming there — how teams restructure around AI tools, where human judgment remains essential, how output quality is measured, what happens to headcount and skill requirements — are the same patterns that will play out in finance, operations, HR, legal, and every other knowledge-work function over the next 18–36 months. The transition from "we have an AI tool" to "AI is built into how we work" is the hardest step, and marketing is the first function to cross it at scale.

One Thing to Try

Find someone on your company's marketing team (or a peer at another company) and ask them: "What does your AI workflow actually look like day to day?" Their answer will tell you more about where AI adoption is heading than any conference keynote.

On the Ground

Leanna and colleague outside HumanX venue in San Francisco
Attendees networking at HumanX 2026

Your Next Step

One question to ask your team this week

"Do we actually know which part of AI we're paying for — and whether it's the part that matters most for our work?"

One decision to sit with

AI is a stack of layers. If you only understand the top layer (the app you bought), you can't diagnose problems or negotiate well. That's worth 15 minutes of your time this week.

That's what KAINDLY helps with. Not selling you AI. Helping you understand it.

Try This

Take the Assessment

Not sure where you stand with AI? KAINDLY's AI Readiness Assessment gives you a personalized report on your current comfort level, strengths, and where to focus next.

You don't need to know everything.
You just need to start.

KAINDLY helps professionals build real AI fluency — at your own pace, in your own context, without the jargon.