Can the "Preview Window" Be a UX Bridge to the Future for AI Holdouts?
As a UX enthusiast, I find pondering the future of AI interaction usability too much fun. One concept I've been kicking around in my head is re-imagining the "preview window" for AI tools that aid in content creation.
You know what I mean when I say preview window, right?
Well, just in case: a preview window in apps provides a visual representation of the output or changes being made, allowing users to review and edit content in real time.
The idea comes in various forms.
For example, in Adobe Photoshop, users can preview the effects of their edits before applying them.
Word processors are slightly different: the main interface IS the preview. But even then, we could hit the "Print Preview" button to see exactly how the final product would render. Or think of the split-pane interfaces used in Markdown editors, such as Ghostwriter.
Well, imagine that, but with tools like ChatGPT. You enter your prompt on the left and get the AI's conversational feedback there too...but the actual CONTENT output gets displayed separately.
It could be baked into the primary interface (similar to a Markdown preview mode), or it could appear as a separate pop-up window. Either way, if you tell the AI tool to change something, you see that same content window update automatically.
This small piece of functionality could have a huge impact on new user adoption, since it gives the AI-apprehensive a workflow they are already used to...preview windows.
Here are just a few use cases that could make users more comfortable and more efficient, instead of leaving them scrolling and sifting through the output dump that today's AI tools produce.
1. Better Visuals and Tracking of Iterations
Imagine saying "create a blog post out of my outline".
The AI tool would produce the results and prompt the user at the end:
"Would you like to launch the preview window?"
You click yes, and a new window appears displaying the 1ST DRAFT of the content.
Let's say the opening paragraph sounds too academic. You can enter a follow-up prompt, like,
"Replace the first paragraph with something more catchy"
Now, instead of dumping it all back into the prompt window with that one change, only your preview window gets updated, and the window's title gets renamed to "2nd Draft" or something similar that tells the user how many iterations exist.
Now you can swiftly identify and reference the sections of the material that require further iteration.
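For the programmers reading along, the draft-versioning idea above could be modeled with something as simple as this Python sketch. The class and method names are entirely made up for illustration; they don't come from any real AI tool's API:

```python
# Hypothetical sketch: a preview window that keeps every AI-generated draft
# and labels its title with the current iteration number.

class PreviewWindow:
    """Tracks successive AI-generated drafts and labels each one."""

    ORDINALS = {1: "1st", 2: "2nd", 3: "3rd"}  # good enough for small counts

    def __init__(self, doc_name):
        self.doc_name = doc_name
        self.drafts = []  # every iteration is kept, oldest first

    def add_draft(self, content):
        self.drafts.append(content)

    @property
    def title(self):
        n = len(self.drafts)
        ordinal = self.ORDINALS.get(n, f"{n}th")
        return f"{self.doc_name} - {ordinal} Draft"


window = PreviewWindow("My blog")
window.add_draft("An academic-sounding opening paragraph...")
window.add_draft("A catchier opening paragraph!")
print(window.title)  # prints "My blog - 2nd Draft"
```

Because every draft is retained rather than overwritten, "revert to an earlier copy" is just a list lookup.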
That was all still through the prompt, though. What if I want to engage with the content right within the preview window itself? That brings us to #2.
2. More Precise Feedback and Iterations
Imagine I find a word ChatGPT used that I want to check. I can right-click the word and hit a "Define" button that shows a floating overlay with the definition. Or I can press "Use Synonym" and it replaces the word with a better choice for the context of the content.
Or better yet, I can highlight a sentence, right-click, and enter a "context-prompt" targeting specifically that text, saying something like, "This sentence is pretty long. Break it up into two and simplify it".
The preview pop-up refreshes and you see the change applied instantly, along with the title, now reading "My blog - 3rd Draft". This makes fine-tuning more precise for you and more efficient for the AI tool. You also get a "revision history" that is tracked and logged, in case you want to revert to an earlier copy or share the evolution of your work with co-workers (similar to revision history in Google Docs).
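To make the "context-prompt" idea concrete, here's a rough Python sketch of the flow: the highlighted span plus an instruction goes to the model, and only that span gets spliced back into the draft. The `ask_ai` function here is a hard-coded stand-in, not a real model API:

```python
# Sketch of the "context-prompt" flow: rewrite only the highlighted span,
# leaving the rest of the draft untouched.

def ask_ai(instruction, selected_text):
    # Stub standing in for a real language-model call: splits a long
    # sentence at ", and " and capitalizes each resulting sentence.
    parts = selected_text.split(", and ")
    return ". ".join(p[0].upper() + p[1:] for p in parts)

def apply_context_prompt(draft, start, end, instruction):
    """Send only draft[start:end] to the AI, then splice the rewrite back in."""
    selection = draft[start:end]
    rewrite = ask_ai(instruction, selection)
    return draft[:start] + rewrite + draft[end:]

draft = "The intro is long, and it rambles. The body is fine."
new_draft = apply_context_prompt(
    draft, 0, draft.index(". The body"),
    "This sentence is pretty long. Break it up into two and simplify it",
)
print(new_draft)  # prints "The intro is long. It rambles. The body is fine."
```

The key UX point is in `apply_context_prompt`: the untouched text never round-trips through the model, so nothing outside your highlight can silently change.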
3. Learning and Adaptation for AI
The preview window could also benefit the AI itself. By observing how users interact with the visual output, the AI could learn which parts didn't require user intervention while collecting data points on the parts that did.
This targeted feedback loop could lead to continuous improvement in the quality and relevance of its responses. And if you want to pre-emptively tell the AI tool WHY you didn't like that closing sentence, or HOW it got the wrong answer (because we all know it happens!), it will be a highlight and a click away.
Currently, most tools use a basic thumbs-up/thumbs-down feedback interface, where the AI has to make assumptions or, worse, a person has to review the feedback...I'm looking at you, Bard!
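As a sketch of what richer signals than a thumbs-up could look like, imagine each preview-window interaction logged as a structured data point. All the field names below are hypothetical, invented purely to illustrate the idea:

```python
# Sketch: turning preview-window interactions into structured feedback
# signals. A span the user never touched is itself an implicit positive.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FeedbackSignal:
    draft_number: int
    span: tuple          # (start, end) offsets of the text the user touched
    action: str          # e.g. "context_prompt", "use_synonym", "kept_as_is"
    user_note: str = ""  # the WHY, if the user chose to give one
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

log = [
    FeedbackSignal(1, (0, 120), "context_prompt", "opening felt too academic"),
    FeedbackSignal(2, (121, 480), "kept_as_is"),  # implicit positive signal
]

# The model's trainers can now separate what worked from what didn't.
untouched = [s for s in log if s.action == "kept_as_is"]
```

Compare that to a single thumbs-down on a whole response: the span, the action, and the optional note pinpoint exactly what went wrong and where.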
4. Paving the Way for Voice
Looking ahead, the preview window could, in theory, pave the way for even better iteration experiences in the future.
Imagine being able to highlight a specific area within the preview and "speak" the changes aloud to the AI without typing into a "context prompt".
This seamless integration of voice commands could potentially further streamline the editing and revision process, making interactions with AI tools for content creation even more intuitive and efficient.
You simply hit the "mic" button and say "Italicize all quotes mentioned in the blog". And voilà. No highlighting needed, because you described what needed to change audibly. And if there was a mix-up, you simply hit the "undo" button.
No more copying and pasting and fussing through draft fatigue. This gets us closer to "Minority Report" levels of interacting with our tools...
The Preview Window - a Bridge to Futuristic Functionality (and Tool Loyalty)
The irony isn't lost on me: recycling a standard feature that has been around in content creation tools for decades could become the very thing that onboards holdout users to the future.
While the addition of a preview window to AI tools is undeniably intriguing, it serves another purpose: it keeps users inside their tool of choice for longer. How? Because the iteration process no longer means transplanting content into another editor or bouncing back and forth.
By owning the editing process, not just the "output dumps," you give users a chance to build a pristine final draft, as opposed to that "FINAL FINAL DRAFT.docx" (we all have that one co-worker who names files that way).
Owning the creative process from beginning to end within the AI tool will keep users from needing yet another product, like Word, to finish the job (cough, Copilot, cough).
If such a feature were to become a reality, it could not only empower users to interact with AI output in a more visual and interactive way, but also feel native to the creative process. So, as we speculate on the future of AI interfaces, let's envision how time-tested UX paradigms can help usher new users into the future.
If you enjoyed this piece or have any feedback, let me know by messaging me on LinkedIn! I love talking to other UX enthusiasts.
-Joed
(with a little assistance from "Bing Chat")