Search in the Age of AI
Search is no longer about finding—it’s about understanding. From keywords to context, here’s how AI is changing the way we ask, find, and think.

From puzzles to conversations. How search became fluent.
Search used to feel like solving a puzzle. You had to guess the right keywords, strip away the natural rhythm of your question, and hope the system knew what you meant. Most of the time, it didn’t.
Today, something different is happening.
Search is becoming fluid. Conversational. Even responsive. The keyword box is giving way to an interface that listens, responds, remembers. What once felt like a static lookup is now turning into a dialogue.
This change is more than a surface-level improvement. It’s a shift in how we think, how we find, and how we connect ideas.
Behind the scenes, the architecture has evolved—from keyword indexes to semantic embeddings, from matching results to generating answers. But on the surface, it’s changing how we use search to orient ourselves in the world.
In this article, I’ll walk through how search has evolved. Not just technically, but experientially. I’ll share what this shift means for how we ask, how we retrieve, and how we make sense of things.
1. The Keyword Era: When Search Was a Puzzle
The early web taught us to speak in code. Instead of asking “What are the risks of RSV for newborns?” you learned to type “RSV newborn risk factors”. And even then, it helped if you knew how Google thought.
This style of search was efficient, but narrow. It required precision. The interface didn’t support uncertainty, follow-up questions, or clarification. Search was a one-shot action. You either hit the target, or you rewrote the query.
And behind the curtain? It was a beautiful but limited machine. Tokenisation, inverted indexes, relevance scores, link-based ranking. All built for speed, not for nuance.
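That classic pipeline is simpler than it sounds. Here's a toy sketch of tokenisation plus an inverted index (illustrative only, with made-up documents; real engines add relevance scoring and link-based ranking on top):

```python
from collections import defaultdict

# Toy corpus: document id -> text.
docs = {
    1: "RSV newborn risk factors",
    2: "newborn sleep schedule",
    3: "RSV vaccine trial results",
}

# Tokenisation: lowercase and split on whitespace.
def tokenize(text):
    return text.lower().split()

# Inverted index: token -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for token in tokenize(text):
        index[token].add(doc_id)

# A keyword query matches only documents containing ALL query terms.
# This is exactly the "one-shot" rigidity described above: miss a
# word, and the document is invisible.
def search(query):
    token_sets = [index[t] for t in tokenize(query)]
    return sorted(set.intersection(*token_sets)) if token_sets else []

print(search("RSV newborn"))  # -> [1]
```

Note what's missing: the index knows nothing about meaning. "Newborn" and "infant" are simply different strings.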
For years, that’s what search meant: find the right words, match them well, and rank the results.
But human language isn’t made of keywords. It’s made of questions, intentions, detours, and follow-up thoughts.
2. Enter AI: Search Becomes a Conversation
The arrival of large language models didn’t just change how we interact with information. It changed how systems retrieve it.
With AI, you can ask a half-formed question. You can clarify it later. You can stay in the flow. The model doesn’t just return links, it interprets what you mean, fetches relevant content, and composes an answer in real time.
This shift is profound. Under the hood, a process called retrieval-augmented generation (RAG) quietly rewires everything:
- The model retrieves chunks of relevant information from a database
- It blends them with your query
- It generates a contextual response
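The three steps above can be sketched as a small loop. The helpers here are hypothetical stand-ins: in a real pipeline, `retrieve` would query an embedding index and `generate` would call an LLM.

```python
# Minimal RAG data flow with stubbed components (illustrative only).
KNOWLEDGE_BASE = [
    "RSV is a common respiratory virus.",
    "Newborns are at higher risk of severe RSV.",
    "Handwashing reduces RSV transmission.",
]

def retrieve(query, k=2):
    # Step 1: retrieve the most relevant chunks. Stub scoring:
    # count shared lowercase words (a real system uses embeddings).
    def score(chunk):
        return len(set(query.lower().split()) & set(chunk.lower().split()))
    return sorted(KNOWLEDGE_BASE, key=score, reverse=True)[:k]

def generate(query, chunks):
    # Steps 2-3: blend retrieved context with the query and compose
    # a response (this is the LLM call in a real pipeline).
    context = " ".join(chunks)
    return f"Q: {query}\nContext: {context}\nA: (answer grounded in context)"

print(generate("RSV risk for newborns", retrieve("RSV risk for newborns")))
```

The point of the sketch is the shape, not the stubs: retrieval narrows the world down, and generation composes within it.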
This doesn’t replace search, it wraps around it. Suddenly, the search engine isn’t just a finder. It’s a participant.
Instead of clicking through results, you’re building understanding together.

3. Search Expands: From Text to Everything
The old model of search worked best in one mode: text in, text out. But human understanding isn’t limited to words. We process images, sound, video, structure. And now, search is catching up.
AI has cracked open the door to multimodal search: the ability to look inside other forms of content. Not just text, but timelines. Not just titles, but tone.
- Tools like Whisper turn speech into searchable transcripts.
- Apps like Descript let you scrub through video as if it were text.
- LLMs like GPT-4 can “watch” images, interpret screenshots, even extract meaning from diagrams or interfaces.
- Zeta Alpha brings deep, semantic search to internal research data—designed for teams working with complex knowledge in academic or scientific contexts.
- Glean focuses on intelligent enterprise search, helping employees find the right document, person, or answer across internal company systems.
- Gemini adds powerful multimodal capabilities, making it possible to search within images, diagrams, and even across video content.

Underneath this magic is a technical breakthrough: embeddings. These are mathematical representations of meaning. Instead of indexing words, systems now index relationships between concepts, sentences, sounds, even moods.
This means search doesn’t just look for matches. It looks for meaningful proximity. What feels related, what sounds aligned, what rhymes with your intention, even if it doesn’t use the same words.
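"Meaningful proximity" has a concrete measure: cosine similarity between embedding vectors. The vectors below are made up and only three-dimensional (real models use hundreds or thousands of dimensions), but they show the idea: two phrases can share no words and still sit close together.

```python
import math

# Toy "embeddings" with hand-picked values, purely for illustration.
embeddings = {
    "infant illness": [0.9, 0.1, 0.0],
    "newborn disease": [0.8, 0.2, 0.1],
    "stock prices": [0.0, 0.1, 0.9],
}

def cosine(a, b):
    # Cosine similarity: how aligned two meaning-vectors are (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

query = embeddings["infant illness"]
# "newborn disease" shares no words with "infant illness",
# yet its vector sits far closer than "stock prices".
for phrase in ("newborn disease", "stock prices"):
    print(phrase, round(cosine(query, embeddings[phrase]), 2))
```

A keyword index would treat all three phrases as unrelated; the vectors make the kinship visible.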
This is why AI-powered search feels intuitive. It moves like thought.

4. Infrastructure Shift: From Indexing to Understanding
Behind the scenes, the entire infrastructure of search has changed. The old stack of tokens, inverted indexes, and PageRank still exists. But around it, new layers have formed:
- Vector databases store semantic embeddings instead of just text
- RAG pipelines orchestrate retrieval, re-ranking, and response generation
- Agent frameworks allow systems to run multiple queries in parallel, adapt them, test answers, and respond in natural language
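The first of these layers, the vector database, can be reduced to a few lines to show what it actually does. This toy class is an illustrative stand-in, not any real product's API:

```python
import math

class TinyVectorStore:
    """A minimal in-memory stand-in for a vector database: it stores
    (id, embedding) pairs and returns nearest neighbours by cosine
    similarity."""

    def __init__(self):
        self.items = []  # list of (doc_id, vector) pairs

    def add(self, doc_id, vector):
        self.items.append((doc_id, vector))

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    def query(self, vector, k=1):
        # Rank every stored embedding against the query. Real vector
        # databases avoid this linear scan with approximate-nearest-
        # neighbour indexes (e.g. HNSW), but the contract is the same.
        ranked = sorted(self.items,
                        key=lambda item: self._cosine(vector, item[1]),
                        reverse=True)
        return [doc_id for doc_id, _ in ranked[:k]]

store = TinyVectorStore()
store.add("doc-a", [1.0, 0.0])
store.add("doc-b", [0.0, 1.0])
print(store.query([0.9, 0.1]))  # -> ['doc-a']
```

RAG pipelines and agent frameworks sit on top of exactly this contract: give me a vector, get back the closest meanings.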
These layers are still under rapid construction, but they point to something clear: we’re not just building better finders. We’re building interpreters. Systems that can reason across time, across modalities, across contexts.
This makes search an active process, not just a lookup, but a loop. Each answer refines the next question.
If that sounds familiar, it’s because it’s how humans learn.

4.5 Adding Structure: The Role of the Knowledge Graph
Probabilistic models like GPT are powerful, but they don't “know” things in the traditional sense. They work by recognising patterns and predicting likely continuations, not by retrieving structured facts. That’s where knowledge graphs come in.
A knowledge graph represents entities (people, places, concepts) and their relationships in a structured, deterministic way.
It’s a way of encoding meaning with precision. You could think of it as a map of interlinked truths.
When combined with language models, knowledge graphs do two important things:
- Grounding: They reduce hallucination by letting the system check facts and draw from verified sources.
- Enrichment: They provide explicit relationships that help the AI understand context and nuance it might otherwise miss.
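Both roles are easy to see in miniature. A knowledge graph is, at its simplest, a set of (subject, relation, object) triples; grounding means checking a claim against them, enrichment means walking the explicit links. The facts below are illustrative:

```python
# A tiny knowledge graph as (subject, relation, object) triples.
TRIPLES = {
    ("RSV", "is_a", "respiratory virus"),
    ("RSV", "affects", "newborns"),
    ("Paris", "capital_of", "France"),
}

def is_grounded(subject, relation, obj):
    # Grounding: a generated claim is accepted only if it exists
    # as an explicit, verified triple.
    return (subject, relation, obj) in TRIPLES

def neighbours(entity):
    # Enrichment: explicit relationships supply context the
    # language model can draw on.
    return {(r, o) for s, r, o in TRIPLES if s == entity}

print(is_grounded("RSV", "affects", "newborns"))   # -> True
print(is_grounded("RSV", "capital_of", "France"))  # -> False
print(neighbours("RSV"))
```

The determinism is the point: unlike a language model's prediction, membership in the triple set is a yes/no fact.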
This hybrid approach, combining probabilistic reasoning with structured logic, is becoming the new norm.
It lets AI systems not only sound fluent, but also stay anchored in facts. And it brings us one step closer to search systems that don’t just guess well, but also understand what they're working with.

5. From Results to Real Understanding
Search used to be about answers. But increasingly, it's about orientation. What’s going on here? What’s relevant? What matters right now?
AI systems are getting better not at knowing everything, but at knowing what to show you next. They retrieve what you didn’t know to ask. They connect pieces. They compress the irrelevant and surface what’s useful. They don’t replace your thinking, they scaffold it.
This changes how we work, learn, research, explore.
It even changes how we write.
This article, for example, has been shaped by conversations I’ve had with AI, not just to find information, but to build understanding.

6. Context Is the New Query
One of the biggest shifts in search, often overlooked, is the rise of contextuality. We no longer just ask isolated questions. We build conversations, trajectories, threads of thought.
In traditional search, every query was a reset. The system had no idea what you just asked. But with AI, memory matters. Each prompt can build on the last.
Each session becomes a layer of context. Some systems, like ChatGPT with memory turned on, even remember your past sessions, your tone, your preferences.
This changes everything.
It means we can think more fluidly, ask in fragments, revise our line of inquiry without starting over. The system becomes less of a tool, more of a thought partner.
And technically, it’s made possible by embedding layers, session-aware architectures (think identity), and increasingly sophisticated memory models.
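The mechanism can be sketched very simply: store each turn, and fold recent turns into the query that actually goes to retrieval. The class and its names are hypothetical; real systems do this with embeddings and far richer memory models.

```python
# A sketch of session-aware querying: the "effective query" sent
# downstream is the recent conversation, not just the latest fragment.

class Session:
    def __init__(self, window=3):
        self.turns = []
        self.window = window  # how many past turns to carry as context

    def ask(self, fragment):
        self.turns.append(fragment)
        # Fold recent turns into the new one, so a fragment like
        # "what about prevention?" still carries the earlier topic.
        return " | ".join(self.turns[-self.window:])

session = Session()
session.ask("RSV in newborns")
print(session.ask("what about prevention?"))
# -> RSV in newborns | what about prevention?
```

Crude as it is, this is the difference between a reset on every query and a line of inquiry that accumulates.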
In practical terms, it’s the difference between searching for a fact, and working something out over time.
As users, we’re beginning to expect this. We want systems that remember what we’ve said, how we said it, and what we meant. Not because it’s convenient, but because it reflects how thinking really works: iteratively, relationally, in motion.
In that sense, context has become the query.

From Search to Sensemaking
What we’re witnessing isn’t the end of search. It’s its expansion.
Search is no longer just a way to find things. It’s how we think through them. How we enter a topic, stay with it, revise our questions, and follow the thread.
From puzzles to conversations. From keywords to context. From input/output to flow.
The tools are evolving quickly. The infrastructure is still taking shape. But the experience is already here: search that listens, that learns, that adapts.
And it’s just the beginning.
Happy Easter, keep on finding answers!