Microsoft Research

My case studies from working with Microsoft as a UX Researcher.

Overview: During my tenure, I was part of the foundational research team and worked on two major projects within the Future of Work portfolio: heuristic evaluations of internal tools and thematic analysis of the hybrid employee experience survey. Key findings and specific details (such as product names and data) are confidential, but I discuss the methodologies used, my research process, and the impact.

Roles: UX Researcher

Projects' Scope: Outreach, Project Planning, Scheduling, Heuristic Evaluation, Report Compilation, Newsletters, Surveys, Data Analysis, Verbatim Theme Analysis, Research Report & Presentation

Timeline: Mar 2022 - Oct 2022

slide from heuristic evaluation report showing traps per tenet/category
key insights page on the hybrid survey Power BI dashboard

Heuristic Evaluations

Background

In order to improve the usability of internal tools, Microsoft created the "UX Health" initiative. The project called for a holistic usability assessment, using heuristic evaluations (HEs) triangulated with other methods such as usability studies with end users, telemetry data, and other user-feedback listening systems.

Goals and objectives of the initiative included: maintaining a healthy (accessible, coherent, user-centered) UX portfolio across Microsoft Digital products, supporting teams without dedicated UXR resources, and advocating for the importance of usability by scaling the process so that anyone at Microsoft (regardless of UXR background) could take part.

My Role

I was the lead UX researcher conducting heuristic evaluations. This included reaching out to stakeholders to schedule and plan meetings, troubleshooting access issues, completing the evaluation process, presenting the report, and filing the usability issues as bugs for resolution.

During my tenure, I led five HEs; one was conducted solo and the rest were conducted with various UX researchers whom I onboarded and trained.

spreadsheet tracking heuristic evaluation progress

Why Heuristic Evaluations?

Benefits:

  • Fast, with a quick turnaround
  • Cheap, with no recruitment costs
  • Anyone can do it (with training), solo or as a group
  • The Tenets + Traps (T+T) method has a manageable number of heuristics and is not overly complex

Limitations:

  • Can be overly conservative (flagged issues may not match the issues real users encounter)
  • Some issues might be missed
  • Different individuals find different issues (ideally you want 3-5 evaluators)

Planning & Execution:

1. We would begin by reaching out to product teams to confirm interest in participating in an HE. Then we would gather information from the team, discuss plans, and determine a timeline through a kickoff meeting. Key details included deciding which core tasks and common tasks we should evaluate in the HE, as well as whether to conduct the HE in production or a testing environment.

2. Then I would conduct the HE, either by myself or with other researchers (when they were available to assist), going through the tool to complete tasks as if I were a new user and logging any UX traps found. While the HE process is a manual one, we created and refined a single, scalable method and process to use across all HEs. We used the "Tenets + Traps" method to find and categorize UX traps, then applied a severity rating formula we developed to minimize evaluator subjectivity.
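The formula itself is confidential, so the sketch below is purely illustrative: it assumes hypothetical impact and frequency scales and an invented weighting, and only shows the general idea of scoring each trap on fixed dimensions so that two evaluators rating the same trap land on the same severity.

```python
# Hypothetical severity rating for logged UX traps. The actual formula we
# developed is confidential; the scales and weighting here are assumptions
# that only illustrate scoring traps on fixed dimensions.
from dataclasses import dataclass

@dataclass
class Trap:
    tenet: str      # Tenets + Traps category (tenet name here is illustrative)
    impact: int     # assumed scale: 1 (cosmetic) to 3 (blocks the task)
    frequency: int  # assumed scale: 1 (rare path) to 3 (every core-task user)

def severity(trap: Trap) -> str:
    score = trap.impact * trap.frequency  # invented weighting
    if score >= 9:
        return "showstopper"
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# A trap that blocks a core task for every user rates as a showstopper.
print(severity(Trap(tenet="Comprehensible", impact=3, frequency=3)))
```

Pinning the rating to explicit dimensions like these is what lets multiple evaluators, including newly onboarded ones, rate the same trap consistently.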

3. After the evaluation was complete, I would present the overall findings to the team, focusing on showstoppers and/or a few examples of high severity traps. This allowed the product team to ask questions about the traps or give our team context on the design decisions behind them. Finally, I would file all the high severity traps as bugs for the product team to address, then share the report and spreadsheet so the team could review the complete findings.

heuristic evaluation spreadsheet where we logged, categorized, rated, and added notes for ux traps

Challenges

  • scheduling conflicts and time constraints
  • access issues with tools we needed to evaluate
  • working with PMs and/or engineers to demo the tool
  • refining the HE process in order to scale it and teach others
  • learning the process quickly in order to onboard and train others
  • explaining findings and responding to feedback from stakeholders
  • following up afterwards to ensure HE bugs were addressed to improve UX

an example of a high severity trap found in a heuristic evaluation, discussed during the read-out

Outcomes & Takeaways

  • Hundreds of usability issues identified
  • Increased awareness of user experience
  • Helping teams without dedicated UXD/UXR support

While I came in with limited experience conducting heuristic evaluations, within a few months I became skilled in the methodology and was even able to teach it to many others. Working with other researchers was equally rewarding: as we collaborated and they shared their expertise and findings, I kept my HE skills sharp and gained something new with each evaluation. I thoroughly enjoyed working on this initiative; it was a rewarding process to learn new research methods and witness how our research positively impacted multiple teams and products.

a slide detailing which types of issues a product team did or did not address after participating in a heuristic evaluation
a slide listing the positive impact a product team experienced from participating in a heuristic evaluation

"[One of the UX traps found in the HE], where customers would always require user education to enable external connectivity. After the HE and now with the new experience, all those tickets are drastically reduced... these small things were overlooked but had a big impact on the accessibility of the product overall."

Senior Software Engineer at Microsoft

Hybrid Survey Analysis

Background

Microsoft seeks to deliver the best hybrid work experience regardless of work site, location, or hours. A core part of this effort is the hybrid employee survey, a rolling survey that gathers both quantitative and qualitative data to shed light on how to improve the employee hybrid work experience.

From the results and findings gathered through employee feedback, the research team provides employee-focused, data-driven recommendations for company policies and culture.

slide showing themes and percentages from open-end responses

My Role

I provided key support to the senior researcher in charge. Initial efforts centered on the core open-ended questions: each month, I used ML tools to run theme analysis on 1,000+ responses and manually coded 300+ open-ended responses for themes. As the project scope expanded, other researchers were brought on and I worked alongside them.

With each month, I took on more responsibilities, including testing multiple tools and methods for theme analysis, expanding our areas of focus to other open-ended questions and regions, creating newsletters and reports, and assisting in updating and publishing the Power BI dashboard.
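The internal ML tools and survey data are confidential, so the sketch below is only a rough stand-in for what ML-assisted theme extraction can look like: a TF-IDF + NMF topic-modeling pass with scikit-learn. The library choice, parameters, and sample verbatims are my assumptions, not the team's actual stack or data.

```python
# Minimal sketch of ML-assisted theme extraction for open-ended survey
# responses. scikit-learn NMF stands in for the confidential internal tools.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

responses = [  # invented examples; real batches were 1,000+ per month
    "I love the flexibility of working from home two days a week",
    "Hard to book meeting rooms when everyone comes in on Tuesdays",
    "Hybrid meetings feel awkward when half the team is remote",
    "Flexible hours help me balance my commute and family time",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(responses)

nmf = NMF(n_components=2, random_state=0)  # theme count tuned in practice
weights = nmf.fit_transform(tfidf)         # per-response theme strengths

# Summarize each machine-found theme by its highest-weighted terms.
terms = vectorizer.get_feature_names_out()
for i, component in enumerate(nmf.components_):
    top = [terms[j] for j in component.argsort()[-4:][::-1]]
    print(f"Theme {i}: {', '.join(top)}")
```

Output like this still needs a human pass: we compared machine-generated themes against manual coding each month before publishing results (see the comparison spreadsheet below).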

Power BI dashboard with data from one of the machine learning tools for theme analysis

Collaboration

As they say, "Collaboration is key!"
With such a large dataset from which we still wanted to pull qualitative insights, this was definitely not a solo or even two-person project. This was particularly exciting for me, as it involved collaborating with three other UX researchers, all of us working at more or less the same level but tackling different aspects of the work to achieve our goal.

Prior to this survey, all my past research studies had either been done solo or had me leading or supporting other UXRs, typically only one. This made it crucial to hold biweekly stand-ups and feedback sessions to ensure we were all on the same page and could adapt the research plan to any new developments as we worked through the research.

spreadsheet comparing results from machine learning tools vs manual coding
slide showing benefits and limitations of verbatim analysis methods

Challenges

  • meeting timelines: a short turnaround every month to get the latest insights out
  • analyzing such a large volume of responses (with concerns about staying accurate and inclusive)
  • calibrating with other researchers to compare and combine findings for more accurate results (see the sketch below)
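One common way to put a number on that calibration is an agreement statistic such as Cohen's kappa, computed over the theme labels two researchers assigned to the same responses. The sketch below is a minimal illustration; scikit-learn and the toy labels are my assumptions, not a record of the team's actual procedure.

```python
# Sketch: quantifying coder calibration with Cohen's kappa, which corrects
# raw percent agreement for agreement expected by chance.
from sklearn.metrics import cohen_kappa_score

# Theme labels two researchers assigned to the same eight responses
coder_a = ["flexibility", "meetings", "meetings", "commute",
           "flexibility", "commute", "meetings", "flexibility"]
coder_b = ["flexibility", "meetings", "commute", "commute",
           "flexibility", "commute", "meetings", "meetings"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 would be perfect agreement

# Low agreement flags theme definitions that need recalibration before
# findings are combined across researchers.
```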

Outcomes & Takeaways

  • improved efficiency of the research process
  • advocating for the employee/user experience
  • impact on company initiatives

Again, I started this project with very little experience in survey analysis, practically none! But I left with another set of skills in my tool belt that I know I will use in future research studies. This experience was especially memorable for me because I am passionate about hybrid work myself. I was thrilled to contribute to such a meaningful project that will no doubt have a lasting positive impact on not just users/employees and the company, but workforce culture as a whole.

snippets from monthly mailer for hybrid survey analyses