We’ve found that the first few minutes of a Virtual Reality (VR) experience can entice users to delve deeper or make them lose interest completely.
Join Key Lime Interactive team members, Rick, Levi, and Eugene, for a thirty-minute exploration of how to define the virtual reality relationship early and how to create a user experience that will make your users fall in love at first sight.
User Experience Research (UXR) & Artificial Intelligence Webinar
Artificial Intelligence (AI) is constantly evolving, which means the user experience of the products and services built on it is evolving too. For this reason, Key Lime Interactive’s UX researcher and VP of Client Insights will reveal key UXR insights in the following webinar.
He predicts this evolution, or even revolution, is gaining speed, and that UX researchers, who are known experts in considering the human experience, understand how humans build trust, and are naturally tech-savvy, will help usher in this change, ideally making it more compelling and perhaps helping protect us from ourselves. Technologies like geofencing and mobile networks, along with the ever-present mutually beneficial yet still competitive business relationship, will all play a role as well. Watch a recap of this event below or check out this related white paper.
We’re told artificial intelligence (AI) is all around us, but what is it really? What does it mean when machine learning (ML) is used? Should I fear AI? How are people utilizing these technologies? Join Carol Smith, Sr. UX Researcher at Uber ATG, in exploring these questions and more in this second webinar on Machine Learning and AI.
Watch the Replay of: Death of the 6 Week UXR Project
Working under pressure to meet market demand, in early 2016 Key Lime Interactive (KLI) recognized that we needed to formalize an unconventional research model. Our clients needed research that was efficient: fast, delivered in multiple iterations, and within budgets that had been set before the model shifted. They wanted to be involved, and they wanted to hit the ground running with the next steps for improvement. We saw an opportunity to change the way our clients expected results.
Join us for a 30-minute webinar with our VP of User Research, Eugene Santiago, who will tell the story via a case study of how research is changing, how KLI is ushering along that change, and the impact this is having on product design as a whole.
The death of the 6-week research project
The power of collaborative workshops
Alternative ways to socialize research beyond the traditional PowerPoint Deck
Please join Key Lime Interactive’s Director of Quantitative Research, Dana Bishop, and User Experience Researcher, Phil McGuinness, who have been collaborating on Competitive Benchmarking studies for the past 5 years. In this webinar, they will reveal some of the best practices they have developed over the years.
Topics will range from the importance of taking the proper time to build a strong foundation for your benchmarking study and the key questions you should ask yourself before starting, to participant recruiting, pre-analysis planning, and collecting and analyzing data.
Questions that we’ll answer within this webinar:
Why benchmark yourself? Why include your competitors when benchmarking?
How do I set clear goals and objectives for my benchmarking study?
How do I get the right participants? How many do I need?
More about population, sample size, and statistical significance
Which tools are best for collecting benchmarking data?
What are the Best Practices for analyzing competitive data?
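To make the sample-size question above concrete, the standard two-sample power calculation can be sketched in a few lines. The effect size, significance level, and power used here are common illustrative defaults, not figures from the webinar.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate participants needed per group (e.g. per product
    benchmarked) to compare two means.

    Uses the normal-approximation formula
        n = 2 * ((z_alpha/2 + z_power) / d) ** 2
    where d is Cohen's standardized effect size.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_power = z.inv_cdf(power)          # value for the desired power
    return ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# Detecting a medium effect (d = 0.5) at 5% significance and 80% power:
n_medium = sample_size_per_group(0.5)
# A large effect (d = 0.8) needs far fewer participants:
n_large = sample_size_per_group(0.8)
```

The takeaway for benchmarking: the smaller the difference you need to detect between your product and a competitor’s, the faster the required sample grows.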
In the third and final segment of our body & facial expression webinar series, we will talk about how to automate the analysis of facial responses to understand your user’s emotions. Facial analysis is one of the least intrusive methods for capturing real-time emotional reactions in a highly automated manner. These systems use computer algorithms that take video from a common webcam and provide frame-by-frame emotion metrics as outputs.
Current methods for collecting emotional responses often require a participant to constantly verbalize their thoughts and feelings. This approach is unnatural and may yield unreliable results. Facial metrics can be captured in the background as a participant performs a series of tasks while evaluating a website. The software can automatically pinpoint when a person is feeling certain emotions and tie those moments directly to what the person is doing.
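A minimal sketch of that frame-by-frame pipeline, assuming a hypothetical `classify_emotion` stand-in for the trained model a real facial-coding system would provide:

```python
# The seven universal emotions commonly scored by facial-coding tools.
EMOTIONS = ["happiness", "sadness", "anger", "fear",
            "surprise", "disgust", "contempt"]

def classify_emotion(frame):
    """Hypothetical placeholder classifier: one confidence per emotion.

    A real system would run a face detector and a trained expression
    model on the image; here we derive a dummy score from the frame's
    mean pixel brightness so the pipeline runs end to end.
    """
    brightness = sum(frame) / len(frame) / 255.0
    return {emotion: round(brightness, 3) for emotion in EMOTIONS}

def analyze_session(frames, fps=30):
    """Produce a timestamped emotion metric for every video frame,
    so reactions can be tied back to what the participant was doing."""
    return [{"time_s": i / fps, "metrics": classify_emotion(frame)}
            for i, frame in enumerate(frames)]

# Usage: three fake 4-pixel grayscale "frames" at 1 frame per second.
frames = [[0, 0, 0, 0], [128, 128, 128, 128], [255, 255, 255, 255]]
timeline = analyze_session(frames, fps=1)
```

Because every frame carries a timestamp, the resulting timeline can be aligned with task logs or screen recordings to see exactly which step triggered an emotional reaction.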
36 Virtual Events Hosted & Counting
Watch KLI’s Webinar:
Join our Director/Principal Researcher, Andrew Schall, and CEO of Kairos, Brian Brackeen, to learn about the latest tools and methods used to analyze facial expressions as well as understand the opportunities and limitations of using this technology in user research. During this webinar we will cover:
An overview of available facial analysis tools
Demonstration of facial analysis software
How to integrate facial analysis into your existing user research methods
The pros and cons of using facial analysis
In our first webinar of this 3-part series, we discussed the role of body language in user research. We looked at some examples of different postures, gestures, and facial expressions people often make, and we shared some tips for how to respond as the moderator when you’re in those situations.
In this webinar, we are going to dive deeper into facial expressions specifically and the role they play in user research. We’ll share some common situations in UX research and some of the more relevant facial expressions to pick up on while observing users. We will be discussing the seven universal facial expressions of emotion, which are:
Happiness
Sadness
Anger
Fear
Surprise
Disgust
Contempt
Facial Expressions in UX Research
Obtaining feedback from users during a research study is not always just about listening to what people have to say. As researchers, we should listen to what participants tell us while also focusing on their nonverbal behaviors, such as posture, hand gestures, and facial expressions. This nonverbal data can be coupled with other qualitative feedback you observe and will help you better understand the user’s actual feelings and emotions during the test session.
Recognizing these facial expressions and attributes can help moderators classify a person’s mental and emotional state throughout a test session. Facial recognition can and should be used in combination with established research methods, such as usability testing, to provide a more detailed view of users’ experiences. For example, when testing a financial institution’s new website, you may find that users can complete tasks successfully but are surprised by certain messaging on the site. It’s great that tasks are being completed, but facial expressions are commonly overlooked, and detecting these emotional reactions will help you create the most pleasant experience for users.