As artificial intelligence continues to influence every corner of the digital landscape, UX/UI professionals face a pivotal opportunity and responsibility to redefine how users interact with digital products. We spoke with Martyna Gołębiewska, founder of UXhands and a seasoned UX researcher and design strategist with over a decade of experience, about how her team integrates AI into design practices, the trends shaping the future, and how industries can prepare for the next era of user experience.
Martyna’s vision for the AI-powered UX future balances innovation with integrity. Her emphasis on continuous validation, co-creation with data science, and ethical design underscores a shift in the industry: AI is no longer a future disruptor – it’s a present catalyst. For organizations ready to embrace it, the transformation begins at the intersection of empathy, data, and adaptability.
How do you see AI reshaping UX/UI design practices over the next 2 to 3 years?
In the next two to three years, we will witness AI becoming deeply embedded in the UX/UI process – not just as a tool but as a co-creator. We’re moving beyond AI for automation toward AI as augmentation. Designers will increasingly use generative AI for ideation, wireframing, and even prototyping in real time, shortening cycles and making experimentation faster and less costly.
We’re already seeing intelligent systems capable of analyzing behavioral data in milliseconds and adapting interfaces accordingly. That’s a dramatic shift – from designing for users to designing with adaptive systems that learn from users constantly. This doesn’t replace UX professionals – it elevates their strategic role. UX teams will focus more on intent modeling, emotional resonance, and systems design rather than pixel-level execution.

What future AI trends are you actively preparing for, such as emotion-aware interfaces, AI copilots, or hyper-personalization?
We’re actively preparing for emotion-aware interfaces and autonomous UX systems. These include AI copilots that assist not just users, but designers themselves during research and prototyping phases. I see the next wave of UX infused with affective computing – systems that recognize and respond to user emotions in real time.
Hyper-personalization is another core area of investment. We’re experimenting with layered personalization – designing systems that dynamically shift user journeys based on data such as user goals, interaction context, and micro-emotions. This requires rethinking architecture from static flows to modular, adaptable patterns.
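To make the shift from static flows to modular, adaptable patterns more concrete, here is a minimal TypeScript sketch of one way such an architecture could be expressed; the signal names, the frustration threshold, and the module shapes are illustrative assumptions, not UXhands' actual implementation.

```typescript
// Hypothetical signals an adaptive system might derive for a session.
interface PersonalizationSignals {
  userGoal: "browse" | "compare" | "purchase" | "support";
  device: "mobile" | "desktop";
  frustrationScore: number; // 0..1, e.g. inferred from backtracking or rage clicks
}

// A journey module: a self-contained flow fragment with its own activation rule.
interface JourneyModule {
  id: string;
  isRelevant(signals: PersonalizationSignals): boolean;
}

// Instead of one static flow, the journey is composed at runtime from the
// modules whose activation rules match the live signals.
function composeJourney(
  modules: JourneyModule[],
  signals: PersonalizationSignals
): JourneyModule[] {
  return modules.filter((m) => m.isRelevant(signals));
}

// Example module: a simplified, guided checkout that activates when
// frustration runs high during a purchase goal.
const guidedCheckout: JourneyModule = {
  id: "guided-checkout",
  isRelevant: (s) => s.userGoal === "purchase" && s.frustrationScore > 0.6,
};
```

The point of the pattern is that new signals or new flow fragments can be added without rewriting the journey as a whole.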
We’re also exploring “zero UI” interactions, where AI anticipates user needs through predictive analytics and sensor data, minimizing the need for direct interaction. That’s particularly powerful in accessibility design.
Which industries are getting the most value from AI-infused UX today, and which ones are still lagging?
E-commerce and fintech are leading the charge. Their business models demand constant optimization and high personalization, and they have both the data and the incentive to push AI-infused UX. We see advanced use cases like predictive cart experiences, adaptive pricing interfaces, and behavioral nudges that optimize conversion in real time.
Healthcare is catching up rapidly, especially in patient engagement platforms, but regulation remains a constraint. On the other end, traditional public sector platforms and legacy enterprise tools are still lagging. Their slow adoption often stems from rigid infrastructures, risk-averse cultures, and a lack of AI governance frameworks.
Ironically, these same sectors stand to benefit the most from AI if adoption is handled responsibly. AI can unlock inclusivity and personalization at a scale legacy UX methods can’t reach.
What are some early indicators you look for that suggest an AI-powered feature will gain traction with users?
We track indicators such as reduced cognitive load, improved task completion time, and signs of emotional trust. If users feel guided – not manipulated – by the AI, we’re on the right track. One very telling metric is feature re-engagement: if users return to the feature unprompted within a short window, it signals perceived value.
Qualitative signals matter, too. If in usability tests users describe the AI interaction as “helpful” or “intuitive” without being directly asked, that’s a powerful indicator of emotional alignment.
Another critical marker is how well the feature adapts over time. Successful AI features become more relevant with use. If we see declining error rates and higher personalization acceptance (e.g., users opting in vs opting out), it confirms that the feedback loop is working.
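As an illustration of how indicators like re-engagement and personalization acceptance might be computed from raw product events, here is a small TypeScript sketch; the event schema, the seven-day window, and the exact metric definitions are assumptions chosen for the example, not the team's analytics pipeline.

```typescript
// Hypothetical event shape from a product analytics pipeline.
interface FeatureEvent {
  userId: string;
  type: "use" | "error" | "opt_in" | "opt_out";
  timestamp: number; // Unix time in milliseconds
}

// Re-engagement: share of users who return to the feature unprompted
// within a window (seven days here) after their first use.
function reEngagementRate(
  events: FeatureEvent[],
  windowMs: number = 7 * 24 * 60 * 60 * 1000
): number {
  const firstUse = new Map<string, number>();
  const returned = new Set<string>();
  const uses = events
    .filter((ev) => ev.type === "use")
    .sort((a, b) => a.timestamp - b.timestamp);
  for (const ev of uses) {
    const first = firstUse.get(ev.userId);
    if (first === undefined) {
      firstUse.set(ev.userId, ev.timestamp);
    } else if (ev.timestamp - first <= windowMs) {
      returned.add(ev.userId);
    }
  }
  return firstUse.size === 0 ? 0 : returned.size / firstUse.size;
}

// Personalization acceptance: opt-ins as a share of all explicit opt decisions.
function optInRate(events: FeatureEvent[]): number {
  const optIns = events.filter((ev) => ev.type === "opt_in").length;
  const optOuts = events.filter((ev) => ev.type === "opt_out").length;
  return optIns + optOuts === 0 ? 0 : optIns / (optIns + optOuts);
}
```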
How do your teams work alongside data scientists or machine learning engineers when shaping AI-based UX solutions?
We work in integrated product squads where UX, data science, and engineering are peers, not handoff stages. From the discovery phase, we involve data scientists to define what is learnable from user behavior and which signals are feasible to capture ethically and accurately.
My team provides experience blueprints and user journey scenarios that inform model design. We also shape how models surface predictions and decisions – essentially, how explainable AI becomes part of the UI.
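One way to picture how explainability can become part of the UI contract is the brief TypeScript sketch below; the payload shape, confidence thresholds, and presentation tiers are hypothetical and only meant to show the pattern.

```typescript
// Hypothetical contract between a model service and the interface layer:
// every prediction ships with a confidence score and human-readable reasons,
// so the UI can decide how, and whether, to surface it.
interface ExplainedPrediction<T> {
  value: T;
  confidence: number; // 0..1
  reasons: string[]; // e.g. "users with similar goals completed this flow"
}

// Low-confidence predictions stay hidden, mid-confidence ones appear as
// dismissible suggestions, and only high-confidence ones are auto-applied
// (with an undo affordance in the actual UI).
function presentPrediction<T>(
  p: ExplainedPrediction<T>
): "hide" | "suggest" | "auto-apply" {
  if (p.confidence < 0.4) return "hide";
  if (p.confidence < 0.8) return "suggest";
  return "auto-apply";
}
```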
A key part of our collaboration is ongoing validation. We conduct co-analysis sessions with ML engineers to compare model behavior with user expectations. That’s essential when you’re designing systems that adapt dynamically – UX must become a continuous practice, not a phase.
How do you future-proof the AI experiences you design, knowing how fast the technology evolves?
We design for adaptability over perfection. That means building modular interfaces, defining UX patterns that can accommodate new logic, and using microservices for AI decision layers so they can be swapped or improved without redesigning the entire experience.
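As a rough illustration of a swappable AI decision layer behind a stable contract, consider the TypeScript sketch below; the DecisionService interface, the endpoint, and the fallback ranker are invented for the example and stand in for whatever service boundary a given product defines.

```typescript
// A thin contract for the AI decision layer, so the model or service behind it
// can be swapped without redesigning the experience that consumes it.
interface DecisionService {
  rank(items: string[], userId: string): Promise<string[]>;
}

// Today: a remote ML microservice (the endpoint is purely illustrative).
class RemoteRanker implements DecisionService {
  async rank(items: string[], userId: string): Promise<string[]> {
    const res = await fetch(`/api/rank?user=${encodeURIComponent(userId)}`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ items }),
    });
    return (await res.json()) as string[];
  }
}

// Tomorrow: a newer model, an on-device ranker, or a rules-based fallback can
// implement the same contract, leaving the interface layer untouched.
class RulesFallbackRanker implements DecisionService {
  async rank(items: string[]): Promise<string[]> {
    return [...items].sort(); // deterministic, explainable fallback ordering
  }
}
```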
We also place a strong emphasis on ethical foresight – what are the long-term implications of this design decision when the AI model evolves? How do we give users transparency and control as the system changes?
Finally, we’re investing in living design systems that evolve alongside product and model updates. These include AI-guided design documentation that stays up to date with real user data and model behavior.
The pace of AI evolution requires that we don’t just ship features – we cultivate ecosystems. Ecosystems that learn, adapt, and scale responsibly.