AI-Powered Learner Assistant

Project Overview

In 2024, I had the opportunity to design an AI-powered learner assistant within AWS Skill Builder. The goal of this project was to enhance the learning experience by providing learners with a tool to deepen their understanding of the training material. The assistant was envisioned as a personal guide, offering immediate answers, hints, and support throughout the learning process, all within a private, judgement-free environment.

My role

As the lead UX Designer, I owned the overall user experience design, conducted the user research, and led usability testing. I also ensured that the designs were accessible by adding accessibility (a11y) notation markup throughout the design process. My role involved collaborating closely with stakeholders, iterating rapidly on design concepts, and ensuring that the final product met the needs of our diverse user base.

Problem Statement

The primary challenge was to improve the learning experience on Skill Builder by offering learners a tool that could provide real-time assistance as they navigated their training. Learners needed a way to ask questions and receive answers instantly, without disrupting their learning flow. The assistant also needed to be approachable and supportive, so learners could seek help without the fear of judgement that is often a barrier in group learning environments.

Research

To better understand the needs of our users, I conducted surveys using UserZoom and facilitated usability testing sessions. Key insights from the research included:

  • Engagement: Most participants (7 out of 9) did not read the assistant's initial messages, highlighting a need for more streamlined and engaging onboarding.

  • Feedback Mechanisms: While participants found it easy to use the ‘thumbs up/down’ icons for feedback, they wanted the option to provide custom feedback.

  • Feature Awareness: The ‘Explain with Skill Builder Assistant’ feature was underutilized because most participants were initially unaware of it, though those who discovered it found it valuable.

  • Response Design: Participants wanted the assistant's responses to strike a balance between brevity and detail, and to use a tone that was friendly yet clear.

Design Process

I adopted an iterative design approach, beginning with wireframes and moving through to high-fidelity prototypes and mockups. I engaged in nearly daily meetings with key stakeholders to ensure rapid progress and alignment with the project’s goals. Throughout this process, I focused on:

  • Onboarding Experience: Simplifying the onboarding process to increase user engagement and comprehension.

  • Feedback Mechanisms: Introducing an “Other” option and modifying the predefined feedback categories to capture more nuanced user sentiment (see the sketch after this list).

  • Interface Design: Ensuring that the assistant’s interface was visually appealing yet unobtrusive, with interactive tutorials and visual cues to guide users.

  • Accessibility: Adding a11y notation markup to the designs so that the experience was inclusive and accessible to all users and complied with accessibility standards.
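
To make the feedback and accessibility changes concrete, here is a minimal TypeScript sketch of how the revised mechanism could be represented in code. It is illustrative only: the category names, the buildFeedback helper, and the label text are assumptions made for this sketch, not the shipped Skill Builder implementation.

type FeedbackRating = "thumbs_up" | "thumbs_down";

// Hypothetical predefined categories shown after a rating, plus the new
// "Other" option that opens a free-text field for custom feedback.
const FEEDBACK_CATEGORIES = [
  "Inaccurate",
  "Too long",
  "Too short",
  "Off topic",
  "Other",
] as const;

type FeedbackCategory = (typeof FEEDBACK_CATEGORIES)[number];

interface AssistantFeedback {
  responseId: string;
  rating: FeedbackRating;
  category?: FeedbackCategory;
  comment?: string; // only collected when "Other" is selected
}

// Screen-reader labels for the rating controls: the kind of text the
// a11y notation markup called out for each interactive element.
const RATING_ARIA_LABELS: Record<FeedbackRating, string> = {
  thumbs_up: "This response was helpful",
  thumbs_down: "This response was not helpful",
};

function buildFeedback(
  responseId: string,
  rating: FeedbackRating,
  category?: FeedbackCategory,
  comment?: string,
): AssistantFeedback {
  // "Other" exists precisely so learners can give custom feedback,
  // so require a comment when it is chosen.
  if (category === "Other" && !comment) {
    throw new Error('A comment is required when "Other" is selected.');
  }
  return { responseId, rating, category, comment };
}

// Example: a thumbs-down with custom feedback.
const feedback = buildFeedback(
  "response-123",
  "thumbs_down",
  "Other",
  "The answer didn't reference the module I was working through.",
);
console.log(feedback, RATING_ARIA_LABELS[feedback.rating]);

Treating “Other” as a category that requires a free-text comment keeps the simple thumbs up/down signal intact while still capturing the custom feedback participants asked for.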

Solution

The final design of the AI-powered learner assistant was well-received during beta testing. Learners appreciated having an assistant that could guide them through their learning journey, answer their questions instantly, and provide support without judgement. The assistant was designed to be approachable and helpful, with a balance between concise and detailed responses, and a tone that was both friendly and clear.

Challenges & Learnings

One of the main challenges was ensuring that the assistant's responses were both informative and concise, as users varied in their preferences for response length. Additionally, raising awareness of the assistant’s features required careful consideration of the onboarding process. From this project, I learned the importance of iterative testing and feedback, particularly in refining complex AI-driven interactions.

Impact & Results

The assistant is set to launch in December 2024, and beta testing has already shown a positive reception from learners. They are excited to have a personal assistant to aid them in their learning journey, and early feedback indicates that the assistant has the potential to significantly enhance the learning experience on AWS Skill Builder.

Conclusion

This project has reinforced my belief in the value of iterative design and user-centered research. By staying closely aligned with user needs and stakeholder goals, we were able to create a product that not only solves a significant problem for learners but also adds value to the AWS Skill Builder platform. I’m excited to see how the assistant will be received upon its official launch.