
A Few Things You Should Know on Improving UX with AI

Tamara Lang

Artificial Intelligence continues to grow thanks to improvements in deep learning and GPU acceleration. One prediction holds that AI will create a net increase of 58 million jobs by 2022, and the technology is poised to transform how we do everything. AI advocates promise that our devices will soon know us better than our own families do. But can AI/ML be used in the here and now, as a tool rather than an all-consuming, self-sufficient program? And how is AI influencing user experience?

Before We Begin: A Note on Terminology

You may have noticed that people are rather free with the terms we’ve been using so far. Artificial Intelligence and Machine Learning are often used interchangeably, even by people who aren’t sure what either one means, and User Experience is vaguer still. This section is here to eliminate any doubts and resolve any confusion going forward.

AI is the most general term, describing an algorithm that acts with human-like flexibility. Machine Learning is a subtype of AI that pursues this goal by designing algorithms that can learn, either from training data or from experience.

User experience is how someone feels while using an interface. Feelings are hard for computers to process, so it’s more practical to define user experience as the complex connection between the measurable behavior of a user interacting with an interface and the outcomes of that interaction.

Measurable Behavior and Testing

The first step in understanding something is to measure it, and with UX this is difficult. Scientists have determined that bias always finds a way to creep into any observation, which is why the gold standard for research is the double- or even triple-blinded study. Interface tests, however, usually require interpretation.

That interpretation causes chaos. If the user is asked directly, the UX designer must contend with demand characteristics: every participant forms an idea of what the experiment is for, and that idea immediately introduces bias. Worse yet, if the designer runs the testing themselves, they are inevitably biased as well.

The solution is to introduce a note of objectivity with extended telemetry: note the length of each session, where the user clicks, how long each operation takes, how often users fail, and how often they quit. A human analyst would drown in this sea of data, but for a neural network, the more data the better.
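To make the idea concrete, here is a minimal sketch of what collecting that kind of telemetry might look like. The event names, fields, and in-memory store are all illustrative assumptions; a real product would stream these events to an analytics backend instead of a list.

```python
import time
from dataclasses import dataclass

@dataclass
class TelemetryEvent:
    session_id: str
    event: str        # e.g. "click", "task_complete", "task_fail", "quit"
    target: str       # the UI element involved (hypothetical naming)
    timestamp: float

def log_event(store: list, session_id: str, event: str, target: str) -> None:
    """Append a timestamped event; a real app would send this to a backend."""
    store.append(TelemetryEvent(session_id, event, target, time.time()))

# Usage: record a short session, then compute a simple metric from it.
events: list = []
log_event(events, "s1", "click", "signup_button")
log_event(events, "s1", "task_fail", "signup_form")
log_event(events, "s1", "task_complete", "signup_form")

failures = sum(1 for e in events if e.event == "task_fail")
print(f"{len(events)} events, {failures} failure(s)")
```

Even this toy log already yields the raw counts (failures, durations, drop-offs) that a model can learn from at scale.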

The promise of chatbots

For most people, chatbots are still intimidating automation that requires a team of developers, and there’s a prejudice that chatbots are stilted and can disrupt the user experience.

The truth is that conversational design, done right, can boost not only the user experience but also your sales, ROI, and your employees’ working hours. The trick is to know exactly what you can and should automate, and to always leave an option for human contact. You also need to design conversations carefully; starting with FAQs is a safe bet. A plain chatbot can usually recognize only the inputs that exist in the flow you built, especially if the language you use has complex conjugations.

That’s exactly where machine learning excels. AI chatbots are regular chatbots with the added power to recognize nuances in a user’s sentences, and the more people use your chatbot, the finer its recognition gets. You can see how that leads to improved UX.
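The gap between a scripted flow and an ML-assisted one can be illustrated with a toy intent classifier. The training phrases, intents, and similarity threshold below are all invented for illustration; a production bot would use a proper NLU model rather than bag-of-words cosine similarity, but the principle (match fuzzy input to a known intent, fall back to a human otherwise) is the same.

```python
import math
from collections import Counter

# Hypothetical training examples mapping user phrases to intents.
TRAINING = {
    "what are your opening hours": "hours",
    "when do you open": "hours",
    "how much does shipping cost": "shipping",
    "do you ship internationally": "shipping",
}

def vectorize(text: str) -> Counter:
    """Bag-of-words word counts for a sentence."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(utterance: str) -> str:
    """Return the closest intent, or 'fallback' to route to a human."""
    vec = vectorize(utterance)
    best_intent, best_score = "fallback", 0.2  # threshold is arbitrary
    for phrase, intent in TRAINING.items():
        score = cosine(vec, vectorize(phrase))
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(classify("when will you open today"))  # → "hours"
```

A scripted bot would reject "when will you open today" outright because it never appeared in the flow; similarity-based matching recovers the intent anyway.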

What can we do about emotions?

Earlier, we decided not to guess what the user is feeling and simply track their behavior, but can’t we do at least a little guessing? Computer Vision offers a tentative ‘yes.’ 

Right now it’s possible to train a neural network to classify faces according to expressions associated with the primary emotions: sadness, happiness, anger, disgust, and fear. The accuracy of these classifiers keeps improving. The problem is that the central idea, that facial expression maps directly to emotion, is false: different people simply show emotions differently.

The solution is the growing field of artificial empathy, which studies emotions in a broader context than facial expressions alone. A fully general solution is still far away, but software tuned to a very specific emotional state, like anger or distraction, is already in use, especially in high-risk situations like driving.
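One way such narrowly tuned software avoids the frame-by-frame mapping problem is to look at persistence over time. The sketch below is an assumption about how that could be structured, not a real driver-monitoring system: it flags a target state (here "anger") only when a hypothetical per-frame classifier reports it across most of a sliding window of frames.

```python
from collections import deque

def make_monitor(window: int = 10, threshold: float = 0.7):
    """Return a closure that flags a state only when it persists."""
    frames = deque(maxlen=window)

    def observe(label: str) -> bool:
        frames.append(label)
        if len(frames) < window:
            return False  # not enough context yet
        return frames.count("anger") / window >= threshold

    return observe

# Usage: feed in per-frame labels from some upstream classifier.
monitor = make_monitor()
labels = ["anger"] * 8 + ["neutral"] * 2 + ["anger"] * 5
alerts = [monitor(label) for label in labels]
print(any(alerts))  # → True: anger persisted long enough to trigger
```

A single misread frame never fires the alert; only a sustained pattern does, which is closer to how context-aware emotion detection has to work.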

One step ahead of the user

Your app or website might be broad, but the paths users take are surprisingly predictable. With A/B testing or heat maps you can learn user behavior, and with that data you can reshape the interface so it matches the path your average user is most likely to trace.

From recommendation systems to targeted advertising, this level of analysis is possible thanks to machine learning. In particular, an AI can be fed big sets of user actions, grouped by user, and machine learning can then produce either universal insights or insights tailored to a specific user.
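The split between universal and per-user insights can be shown with plain counting before any model enters the picture. The click log below is made-up data; real pipelines would feed the same per-user aggregates into a recommender instead of printing them.

```python
from collections import Counter, defaultdict

# Hypothetical click log: (user_id, page) pairs.
log = [
    ("u1", "home"), ("u1", "pricing"), ("u1", "signup"),
    ("u2", "home"), ("u2", "pricing"),
    ("u3", "home"), ("u3", "blog"),
]

# Universal insight: which pages are most visited overall?
overall = Counter(page for _, page in log)

# Per-user insight: each user's own most-visited pages.
per_user = defaultdict(Counter)
for user, page in log:
    per_user[user][page] += 1

print(overall.most_common(1))  # → [('home', 3)]
print(dict(per_user["u1"]))    # → {'home': 1, 'pricing': 1, 'signup': 1}
```

The universal counts suggest where to put the most prominent navigation; the per-user counts are exactly the kind of sorted-by-user data a personalization model trains on.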

AI isn’t magic

Certainly, AI isn’t a wand you can wave to make your UX problems go away. But it is a powerful tool: a regular Swiss army chainsaw that can tackle most UX problems in more ways than one.
