This project covers the design and development of Zoom Assistant, a feature built to improve the user experience within Zoom meetings. The work is driven by the goal of addressing challenges attendees and presenters encounter during Zoom meetings, with a focus on efficiency and usability, and by the need for Zoom to stay competitive while upholding industry standards and user satisfaction.
In the era of remote work and virtual collaboration, video conferencing platforms have become indispensable tools for seamless communication.
Zoom, a prominent player in this domain, has seen a remarkable surge in users. However, this surge has illuminated various user experience challenges faced by both attendees and presenters during Zoom meetings.
In today's highly competitive market, several alternatives vie for user attention. To bolster Zoom's competitive edge, this project focuses on enhancing its functionality and user experience.
Before jumping into brainstorming, we identified Google Meet and Microsoft Teams as our main competitors.
Here are the key findings from our analysis:
Product Under Test
The product under test is Zoom Assistant, a chatbot designed to help users navigate and utilize Zoom's features effectively. It provides real-time guidance to make the Zoom experience smoother, especially for those less familiar with Zoom's features.
Business Case
This usability test evaluates how usable and user-friendly the Zoom Assistant is for users who are less familiar with Zoom's presenting features.
Test Objectives:
Participants
5-10 participants who meet the following criteria:
- Have a basic familiarity with Zoom but are less experienced with Zoom's presenting features.
- Represent a diverse user group in terms of age, profession, and technical background.
Equipment
Laptops and phones.
Test Tasks:
Responsibility
Test facilitator/observer: Pei
Dates
All tests will be finished by Sep 7th.
Procedure:
The Zoom Assistant received generally positive feedback from users, with most participants finding it helpful and efficient. On a scale of 1-10, the feature's usefulness was rated between 7 and 10. Users appreciated the guidance and tips provided by the assistant. While there was a consensus that the feature was beneficial, some users had specific suggestions for improvement.
Throughout this project, designing and developing the Zoom Assistant proved to be a rewarding challenge. The primary objective was to simplify the Zoom learning/troubleshooting experience, ensuring smooth navigation for both novices and regular users.
In our usability tests, the Zoom Assistant received positive feedback from users, with a usability rating ranging from 7 to 10 on a scale of 1-10. Users appreciated the guidance and tips provided, marking it as helpful and efficient.
A deeper dive into A/B testing could have provided valuable insights into how users interact with video prompts or other features. Options such as in-app expanded video windows or the prompt for playing the video only when desired were considered, with the latter being chosen based on user preferences.
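An A/B test of the two video-prompt options could be analyzed with a simple two-proportion z-test. The sketch below is illustrative only: the participant counts and preference numbers are hypothetical, not data from this study, and they assume a larger sample than the 5-10 participants recruited here.

```python
from math import sqrt

# Hypothetical counts of users who completed the task successfully
# with each video-prompt variant (illustrative numbers only).
a_success, a_total = 42, 100   # Variant A: in-app expanded video window
b_success, b_total = 55, 100   # Variant B: play the video only when desired

p_a = a_success / a_total
p_b = b_success / b_total

# Pooled proportion under the null hypothesis that both variants perform equally.
p_pool = (a_success + b_success) / (a_total + b_total)
se = sqrt(p_pool * (1 - p_pool) * (1 / a_total + 1 / b_total))

# z-score for the difference in success rates; |z| > 1.96 would suggest
# a statistically significant difference at the 5% level.
z = (p_b - p_a) / se
print(f"Variant A: {p_a:.2f}, Variant B: {p_b:.2f}, z = {z:.2f}")
```

With these made-up numbers the difference falls just short of significance, which is exactly the kind of result that would motivate collecting more sessions before committing to one prompt design.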
Given more time, delving into accessibility features would have been imperative. Making the Zoom Assistant compliant with accessibility standards would ensure inclusivity and a broader user reach.
In summary, this project highlighted the importance of user-centric design, continuous testing, and adaptability. It has provided valuable learnings for future endeavors, emphasizing the need for a holistic approach that considers diverse user needs.