
Words as Interface

UI is about to change

5 min read
Peter Smart
Designing for AI is going to challenge the adage: "form follows function." We’ve always designed UI following this mantra, but with AI, how we think about UI is about to change.

Rethinking what we know

AI is going to compel us to re-examine one of the most fundamental principles we’re taught as designers: that 'form should follow function.'

Nowhere is this felt more pointedly than in the design of tools. Our human existence is dominated by them: the pen on your desk, the fork in the kitchen, the TV remote. If a tool has been designed well, no one needs to explain how to use it. Its physical form clearly represents what it does. A hammer's form clearly communicates its function.

We design digital products and software in the exact same way. We create physical interfaces. They have buttons, menus and controls. This not only enables us to use our tools but, importantly, communicates the extent and limits of their capabilities.

The problem, when designing UI for AI, is that AI isn't anywhere close to as limited.

The future of UI

ChatGPT is one input box. The GPT model behind it is incredibly powerful but its UI tells me nothing about its capabilities. That’s because no physical form could capture the extent of its function.

So, how do we design UI for AI tools that have exponential capabilities?

The more I’ve tried to push the limits of LLMs like Claude and ChatGPT, the more I’ve realized that the primary form of user interface we’ll lean on is completely new.

The future of UI is words.

Words as interface

Words are an infinite toolkit. In everyday life, the right words arranged in the right order are enormously powerful. They can achieve incredible things for good and bad. In the words of Chris Reardon from my recent interview with him: “words are our oldest and most powerful human tools.”

When we craft a prompt, we are creating a tool. We’re using words to create a piece of software with specific focus and capabilities. Prompt engineering is a new form of design and words are our new design material.

Unlike static UI, when we use LLMs our words can create our own hyper-personalized controls. Our words set the AI's context, declare our intent, define its steps, dictate its output format, and establish its variables. They control our entire experience. The more systematically you choose these words, the better your tool will be.

For example, there’s a huge difference in output between:

Help me come up with new product features

And the following

[Context] You will operate as a Product Strategy Advisor
[Intent] to ideate on novel product functionality with me.
[Format] Use a conversational style, mimicking an IM chat exchange.
[Function] Offer ideas occasionally but ask me three times as many short questions in the process to inspire me to think in unconventional ways.
[Steps] First, ask me questions to clearly align with my goals. Then begin our conversation.
[Variables] If my ideas stop flowing, choose a new set of focus areas.

We can use limitless words to design limitless functions.
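To make the structure of that prompt concrete, here is a minimal sketch in Python that assembles the labeled sections into a single instruction string. The `build_prompt` function and its field names are illustrative, not a real library API; the labels simply mirror the [Context]/[Intent]/[Format]/[Function]/[Steps]/[Variables] pattern above.

```python
def build_prompt(context, intent, fmt, function, steps, variables):
    """Assemble labeled prompt sections into one instruction string.

    Each (label, text) pair becomes a bracketed section, so the prompt
    itself carries the 'controls' that static UI would otherwise provide.
    """
    sections = [
        ("Context", context),
        ("Intent", intent),
        ("Format", fmt),
        ("Function", function),
        ("Steps", steps),
        ("Variables", variables),
    ]
    return " ".join(f"[{label}] {text}" for label, text in sections)


prompt = build_prompt(
    context="You will operate as a Product Strategy Advisor",
    intent="to ideate on novel product functionality with me.",
    fmt="Use a conversational style, mimicking an IM chat exchange.",
    function=("Offer ideas occasionally but ask me three times as many "
              "short questions to inspire unconventional thinking."),
    steps="First, ask me questions to align with my goals. Then begin.",
    variables="If my ideas stop flowing, choose a new set of focus areas.",
)
print(prompt)
```

Treating the prompt as data like this makes each "control" easy to swap independently: change only the Format line and the same tool answers in a different voice.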

The future of Product UI

UI is about to change drastically.  

The AI-driven services we're helping design and launch are multi-modal. As humans, we will always need visual interfaces, but increasingly they'll be dynamic and run in parallel with natural language input. Control of products and services won't rest solely on words, but words will play an increasingly central role. Designing a website will involve both your cursor and your direct conversation with AI.

We're heading towards a future in which our conversations with AI will also dynamically generate manual product UIs. Imagine collaborating with AI on a design language and prompting it with: “give me some suggested palettes inspired by warm evenings in Havana along with a triad color wheel to refine colors manually.”
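One way this could work under the hood is for the model's reply to pair conversational text with a machine-readable UI spec that the product renders as a manual control. The sketch below is a hypothetical illustration of that idea: the JSON shape, field names, and the `render` stub are all assumptions, not any product's actual protocol.

```python
import json

# Hypothetical response pairing a chat message with a UI spec
# (field names "message", "ui", "component", "mode", "palettes"
# are invented for illustration).
response = json.loads("""
{
  "message": "Here are palettes inspired by warm evenings in Havana.",
  "ui": {
    "component": "color_wheel",
    "mode": "triad",
    "palettes": [["#E2725B", "#F4A950", "#2E5266"]]
  }
}
""")


def render(ui_spec):
    """Dispatch a generated UI spec to a manual control (stub)."""
    if ui_spec.get("component") == "color_wheel":
        return (f"Rendering a {ui_spec['mode']} color wheel "
                f"with {len(ui_spec['palettes'])} palette(s)")
    return "Unknown component"


print(response["message"])
print(render(response["ui"]))
```

The design choice here is the split itself: the conversation stays the primary interface, while the structured payload lets the product materialize a cursor-friendly control, such as the triad color wheel, exactly when the dialogue calls for it.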

This paradigm shift will mean a whole new class of experiences that we get to envisage as designers. To make these AI-powered tools intuitive for people, we'll need to master the art of conversation design ourselves, understand the principles of natural language processing, and develop a deep understanding of diverse user intents and contexts.

As this world accelerates towards us, here are some of the questions I’m asking myself:

  • How important will linguistic skills be in the future of my career?
  • How will we balance physical UI and natural language to ensure the power of products doesn’t get lost in translation?
  • What are the potential risks and ethical considerations I need to think about when designing interfaces based on words?

In the age of AI, 'form' no longer has to follow 'function.' Instead, 'form can create function.' We are entering a world where the form our words take will have the power to control everything around us. I can't wait for us to design that world together.
