Good practice #1. Focusing on user feedback
An iterative, feedback-driven process is central to agile and commonly used in AI projects. Yet, far too many agile projects fail to extend this approach to encompass end-user feedback. In our experience, incorporating UX techniques for user research in addition to stakeholder feedback can be a game-changer, especially when engaging broad, diverse user groups. Consider how OpenAI approaches its user testing needs. As of the latest available data, ChatGPT boasts over 100 million users, with the website receiving 1.6 billion visits in June 2023. Despite these numbers, ChatGPT remains in beta, gathering feedback from this substantial user base.
But how do you collect user feedback without a 100-million-strong user base?
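Even at a modest scale, an in-app feedback widget can tie a simple rating and an optional comment to the exact AI response being rated. The sketch below is illustrative only: the `/api/feedback` endpoint and the field names are assumptions, not a prescribed API.

```typescript
// Minimal in-app feedback capture: ties a user rating to the specific
// AI response it refers to, so feedback can be analyzed per model version.
interface FeedbackEvent {
  responseId: string;        // ID of the AI output being rated
  modelVersion: string;      // which model produced it
  rating: "up" | "down";     // simple thumbs-up / thumbs-down signal
  comment?: string;          // optional free-text comment
  timestamp: string;         // ISO-8601 time of the rating
}

// Hypothetical collection endpoint; replace with your own backend route.
const FEEDBACK_ENDPOINT = "/api/feedback";

export async function submitFeedback(
  responseId: string,
  modelVersion: string,
  rating: "up" | "down",
  comment?: string
): Promise<void> {
  const event: FeedbackEvent = {
    responseId,
    modelVersion,
    rating,
    comment,
    timestamp: new Date().toISOString(),
  };

  // Fire-and-forget POST; failures are logged but never block the UI.
  try {
    await fetch(FEEDBACK_ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(event),
    });
  } catch (err) {
    console.warn("Feedback submission failed", err);
  }
}
```

Keeping the model version on every event is what later lets you compare feedback across releases rather than lumping all ratings together.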



Good practice #2. Having user controls in place
User control is fundamental to UX design. By offering options to customize and fine-tune AI output based on real-world user needs, companies can enable users to shape their own experiences, boosting engagement with AI solutions. Let’s look at Google’s ML-powered Translate app to see what customizations could be applied to improve AI output.
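As a concrete sketch, user-facing controls can map directly onto generation parameters. The control names below (formality, regional variant, creativity) are illustrative assumptions for a translation-style app, not Google Translate's actual settings or API.

```typescript
// Illustrative mapping from user-facing controls to model request parameters.
// Control names and parameter names are assumptions for this sketch.
interface OutputControls {
  formality: "formal" | "informal";   // tone of the generated translation
  regionalVariant?: string;           // e.g. "pt-BR" vs "pt-PT"
  creativity: number;                 // 0 (literal) .. 1 (free), mapped to temperature
}

interface ModelRequest {
  prompt: string;
  temperature: number;
  systemHint: string;
}

// Translate user-visible controls into a request the model backend understands.
export function buildRequest(text: string, controls: OutputControls): ModelRequest {
  const hints = [
    `Use a ${controls.formality} register.`,
    controls.regionalVariant ? `Prefer the ${controls.regionalVariant} variant.` : "",
  ].filter(Boolean);

  return {
    prompt: text,
    temperature: Math.min(Math.max(controls.creativity, 0), 1), // clamp to [0, 1]
    systemHint: hints.join(" "),
  };
}

// Example: a user who wants an informal Brazilian Portuguese translation.
const request = buildRequest("How are you?", {
  formality: "informal",
  regionalVariant: "pt-BR",
  creativity: 0.2,
});
console.log(request);
```

The point of the exercise is that every knob the user sees should change something observable in the output; controls that do nothing erode trust faster than having no controls at all.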


Bad practice #1. Disregarding negative feedback
Issues like model drift are inherent in AI systems, and catching them early depends on deliberately collecting negative user feedback through UI/UX mechanisms. Disregarding that feedback removes the clearest early-warning signal you have that output quality is degrading.
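One way to turn negative feedback into a drift signal is to aggregate it over time and flag when the complaint rate rises for a given model version. The sketch below assumes feedback events like those collected earlier; the weekly bucketing and the 10% threshold are arbitrary illustrations, not recommended values.

```typescript
// Aggregate negative feedback over time to surface potential model drift.
// The 7-day buckets and the 10% threshold are illustrative assumptions.
interface RatedEvent {
  modelVersion: string;
  rating: "up" | "down";
  timestamp: string; // ISO-8601
}

function weekKey(iso: string): string {
  const d = new Date(iso);
  // Approximate weekly buckets counted from January 1 of the event's year.
  const start = new Date(Date.UTC(d.getUTCFullYear(), 0, 1));
  const week = Math.floor((d.getTime() - start.getTime()) / (7 * 24 * 3600 * 1000));
  return `${d.getUTCFullYear()}-W${week}`;
}

// Share of negative ratings per model version per weekly bucket.
export function negativeRateByWeek(events: RatedEvent[]): Map<string, number> {
  const totals = new Map<string, { down: number; all: number }>();
  for (const e of events) {
    const key = `${e.modelVersion} ${weekKey(e.timestamp)}`;
    const bucket = totals.get(key) ?? { down: 0, all: 0 };
    bucket.all += 1;
    if (e.rating === "down") bucket.down += 1;
    totals.set(key, bucket);
  }
  const rates = new Map<string, number>();
  for (const [key, { down, all }] of totals) rates.set(key, down / all);
  return rates;
}

// List buckets where more than the threshold share of ratings is negative.
export function driftAlerts(events: RatedEvent[], threshold = 0.1): string[] {
  return [...negativeRateByWeek(events)]
    .filter(([, rate]) => rate > threshold)
    .map(([key, rate]) => `${key}: ${(rate * 100).toFixed(1)}% negative`);
}
```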
Bad practice #2. Sacrificing privacy for personalization
While personalization can enhance user experiences, sacrificing user privacy for customization is a detrimental pattern. Ignoring UX principles of respecting user boundaries and informed consent can lead to a breach of trust, or worse, regulatory compliance issues. AI systems that collect extensive user data without clear communication and opt-in mechanisms can alienate users and tarnish brand reputation.

What to do about it?
- Clearly communicate data collection and usage policies to users. When a user signs up for an AI-powered app, provide a clear and concise pop-up that explains what data will be collected, how it will be used, and who it might be shared with.
- Offer granular controls over what data users are comfortable sharing. These can take the form of a set of checkboxes that let users choose exactly what information they want to share (see the sketch after this list).
- Implement robust data protection measures and comply with relevant regulations. For instance, if your app targets European users, implement processes that allow users to request their data to be deleted in compliance with GDPR.
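To make the granular-controls and erasure points concrete, here is a minimal sketch of an opt-in consent model and a GDPR-style deletion request handler. The category names, the in-memory store, and the `requestDataDeletion` helper are hypothetical examples, not tied to any particular framework or legal checklist.

```typescript
// Granular, opt-in consent preferences: every category defaults to "off"
// and the user explicitly toggles what they are comfortable sharing.
// Category names and the deletion helper below are hypothetical examples.
interface ConsentPreferences {
  usageAnalytics: boolean;      // anonymous usage statistics
  personalization: boolean;     // use interaction history to tailor output
  thirdPartySharing: boolean;   // share data with partner services
}

const DEFAULT_CONSENT: ConsentPreferences = {
  usageAnalytics: false,
  personalization: false,
  thirdPartySharing: false,
};

// Persist the user's explicit choices; anything not set stays opted out.
export function saveConsent(
  userId: string,
  choices: Partial<ConsentPreferences>,
  store: Map<string, ConsentPreferences>
): ConsentPreferences {
  const prefs = { ...DEFAULT_CONSENT, ...choices };
  store.set(userId, prefs);
  return prefs;
}

// GDPR-style "right to erasure": drop consent state and queue a data purge.
export function requestDataDeletion(
  userId: string,
  store: Map<string, ConsentPreferences>,
  deletionQueue: string[]
): void {
  store.delete(userId);          // consent state removed immediately
  deletionQueue.push(userId);    // backend job purges stored user data
}
```

Defaulting every category to opted out keeps the design consistent with the opt-in principle described above: the user's silence never counts as consent.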