
Microsoft Power BI

Sponsored Project for HCDE 517: Usability Studies

Pranali, Akankshya, Akeeksha, and Vaishnavi

Note

This project is covered by an NDA with the Microsoft stakeholders, so most of the information is not included in this case study. The purpose here is to communicate a broad overview of the project and its learnings.

overview

purpose

The purpose of this study was to explore the discoverability, findability, and readability of the content in a new view of the Metrics feature in Microsoft Power BI, taking into account users' experience and familiarity with business analytics platforms.

research questions

RQ1: A. How discoverable is the entry point for the new mode?
     B. Do users understand that they have to enter the mode to customize the visibility and/or order of a setting?

RQ2: What are users' expectations for customization options?

RQ3: How findable/discoverable is the entry point for the new view?

RQ4: How likely are users to understand that changes made to column settings in the new view are also reflected in the familiar view?

methodology

Study Type 

Moderated task-based study and semi-structured interviews.

Participants 

8 participants, recruited using the snowballing technique. Participants had prior experience working with business analytics tools.
Compensation for recruited participants

No compensation was provided to the participants.

Method

It was a remote, moderated study.

Equipment
A. Required by moderator:
Laptop, earphones, screen recorder, diary (for notes)
B. Required by participant:
Laptop, screen recorder

data collection

A. Qualitative Data:
1. Thinking Aloud Technique
2. Pre-Test Questionnaire
3. Post-Task Open-Ended Questions
4. Post-Test Open-Ended Questions

B. Quantitative Data:
1. Task Times
2. Task Ratings
3. Number of Tasks Completed Successfully
4. Number of Clicks
5. Post-Test SUS Questionnaire

analysis

The data was analyzed to capture common themes, identify the usability problems and pain points users faced, surface users' mental models, and note their suggestions.

Qualitative Data:
FigJam, Figma's collaborative whiteboard tool, was used for affinity mapping the qualitative data from each participant. Comments and answers with common themes were clustered together, and usability problems were identified.
The pre-test questionnaire answers about participant backgrounds helped surface mental models that participants had formed through prior experience with certain tools.


Quantitative Data:
Task times, task ratings, number of clicks, task success rates, and post-test SUS scores were all analyzed in the context of the qualitative data in order to score the identified usability problems.
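To illustrate how the Post-Test SUS scores feed into this analysis, below is a minimal sketch of the standard SUS scoring formula in Python. The helper name and the sample responses are hypothetical illustrations, not actual study data.

# Minimal sketch of standard SUS scoring (hypothetical helper; the
# sample responses below are illustrative, not actual study data).
# Each participant rates 10 statements on a 1-5 Likert scale
# (1 = strongly disagree, 5 = strongly agree).

def sus_score(responses):
    """Convert ten 1-5 Likert responses into a 0-100 SUS score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd (positively worded) items contribute (response - 1);
        # even (negatively worded) items contribute (5 - response).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# Example with one hypothetical participant's responses:
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0

Scores above roughly 68 are commonly read as above-average usability, which makes SUS useful context when scoring the severity of individual problems.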

findings

Finding 1: Users had difficulty finding an essential button.
Severity rating: 3 (Minor), as it affected several users, causing minor frustration and increasing the time to complete the task.


Finding 2: Essential mode was not discoverable.
Severity rating: 1 (Critical), as it prevents users from entering the essential mode to edit necessary information.


Finding 3: Users had difficulty locating an essential setting.
Severity rating: 2 (Major), as it prevents users from editing necessary information.


Finding 4: Users are likely to understand that their changes would be reflected across different views.
Severity rating: 4 (Suggestion), as it is a possible enhancement.


 

reflections

The project was successful due to the efficient collaboration between our team members and the Microsoft stakeholders. Honest communication, well-defined goals, and clear timelines were also essential in making this a holistic experience. Personally, I was exposed to a formal work-setting project with stakeholder expectations, plans, and timelines. I learned a lot in terms of industry timelines, realistic targets, communication with stakeholders, and the presentation of findings.

 

things i would do differently

If I were to participate in this project again and had more time, I would consider doing the following:

1. Using a randomized method of participant recruitment instead of snowballing.
2. Developing a controlled quantitative experiment employing eye-tracking equipment.
3. Following this up with qualitative questions to delve deeper into user behavior.

Employing such a mixed-methods approach can leverage the advantages of both qualitative and quantitative data and provide a robust analysis of user behavior.


 
