UX RESEARCH @ THE INTERNATIONAL MONETARY FUND

Finding Workflow Solutions

Standardizing the internal review process with SharePoint Online

The International Monetary Fund (IMF) hosts its internal web content and customized tools in a SharePoint 2016 environment. Many of the tracking tools there were created on a case-by-case basis, leading to a lack of consistency and standardization. As the organization upgrades to SharePoint Online to enhance productivity and efficiency, standardization has become a key priority. As one of the pilot departments for this upgrade, our team was tasked with finding a standardized workflow solution that would scale in the new SharePoint Online environment.

I was brought on board as the sole UX researcher during the initial phase of the project. My contributions included conducting stakeholder interviews and workshops, documenting current practices, performing a technology assessment, and creating user stories and flows for the working prototype. My responsibilities did not extend to the implementation stage.

PROJECT OVERVIEW

Timeline
Sept 2023-Jan 2024

Role
UX Researcher

Research Method
User interviews & synthesis, Gap analysis, Analytics

System
Microsoft ecosystem: SharePoint Online and Power Automate

Key Results

Our user research enabled us to identify 6 key points for automation in the existing flow, aimed at enhancing review cycle efficiency by minimizing manual emails and repetitive notifications. These six points also facilitate the integration of scattered tools, further streamlining the tracking process.

Let’s dive in and see what insights I extracted to build the foundation for the design process!

THE BACKGROUND

Let me give you some basic background information first.

As an international financial institution, the IMF plays a crucial role in shaping the global economy by offering policy advice, providing financial assistance, and promoting intergovernmental dialogues. Much of this work reaches its audience through publications, and all IMF papers go through an internal review system (eReview) before publication.

An internal tool is in place to streamline staff’s review process, and ideally staff would stay within the eReview ecosystem for the entire review cycle. In practice, this is not the case. See the flow chart for the hierarchy and parties involved, and the infographic for the process.

THE CHALLENGE

The current practice isn’t consistent, but our goal here is not to clean up the entire clutter. Our focus is the divisional tracking cycle, steps 5 to 8, where the reviewing division’s admin updates the internal tracker.

We have 9 divisions within the department and, as a result, 9 different systems and methods for internal tracking. The lack of uniformity hinders collaboration and limits transparency, leading to confusion and inefficiency. The department wants to standardize on a single scalable solution as we step into the SharePoint Online environment.

PROBLEM STATEMENT

How might we create a standardized tracking solution that increases efficiency and reduces duplication in the review process?

THE PLAN

I worked in a team of 4 to tackle this challenge. The team consisted of:

  • Johan: an Advisor and the manager for the project, who guides big-picture requirements

  • Emeric: a Senior Information Management Officer, who is a Power Automate Developer and has experience with customized tools in the SharePoint environment

  • Prema: a Senior Information Management Analyst, who has worked with each division’s administrative coordinator on managing the existing tracking solutions

Together, we created a 4-phase plan.

Phases 1 and 2 encompass the Discover and Define phases, along with initial Ideate activities through prototype development and stakeholder feedback. The phases I was not a part of are greyed out.

THE APPROACH

Discover

With a set plan for solving the challenge, we felt confident coming into the problem space. In this section, I want to go in depth about the research methodologies we used to extract valuable insights.

  • User interview

  • Technology assessment

  • Data Analytics

UNDERSTAND THE BUSINESS NEEDS

Let’s take a quick look at the business needs to understand the research scope. SPR is the department our web team operates in.

  1. SPR needs to find a standardized tracking solution that doesn’t conflict with the current workflow at the divisional level

  2. SPR’s new solution needs to be scalable from the divisional level and add value to staff’s day-to-day operations

  3. The new solution needs to comply with the IT department’s technical constraint that it be built with the out-of-the-box Microsoft ecosystem (no customization at this time)

USER INTERVIEWS

direct access to user’s current practice + needs

We conducted 9 interview sessions with 13 admin coordinators, representing all 9 divisions within the department. As the first point of contact for staff inquiries, these key individuals actively manage their divisional trackers and possess deep knowledge of the workflow. Emeric, Prema, and I conducted all nine sessions via Teams calls.

To gain a comprehensive understanding of divisional tracker management by admin coordinators, we conducted semi-structured interviews. The first half focused on guided walkthroughs of their tracking processes, while the second half involved consistent clarifying and follow-up questions to ensure data coherency.

I recorded these interviews and took detailed notes. See below.

I then cross-checked my notes with my team and against the interview recordings to ensure quality, and synthesized them into the following themes, which I presented to both Emeric and Prema. These findings were later used to identify the user stories and the flow for the solution in the later stages.

While 1 division lacks a tracking system entirely, the other 8 divisions have more comprehensive tracking systems, albeit scattered across multiple tools and mediums. This led us to a round of technology assessment because we wanted to better understand the purposes of utilizing different tools and how we can consolidate these functions into one single solution.

TECHNOLOGY ASSESSMENT

valuable insights into existing solutions and finding gaps where new solutions might fill

Remember in the challenge section when I said the goal here is to find a solution for steps 5 to 8, the reviewing cycle at the divisional level? I may have made that cycle sound a little too simple. In this gap analysis, we zoom into the smaller steps involved in completing steps 5 to 8.

Let’s expand that cycle and take a look at the flow of the existing solution. This flow chart is based on insights we extracted from all user interviews and represents the common steps across all divisions. We also discovered previously undocumented steps in this cycle. See below.

For more detailed divisional flows and how they vary, feel free to check out the Figma file below.

Now let’s get back to the synthesis. When you examine the workflow, you may not notice a significant issue because many organizations use multiple tools to manage their workflows as well. Here are the three top gaps we identified in this exercise to help us define the user stories and finalize the new user flow for the solution.

  1. Tools used in this process operate independently of each other. There is no cohesive workflow connecting them;

  2. There are many manual steps involved. There is an opportunity to streamline these steps and create a connection between these tools, making them feel like a single, integrated solution;

  3. Another related issue we observed is the inconsistency in the master tracker's location. The master tracker tool also needs to be standardized so it can trigger automated workflows with the other tools used by different divisions.

DATA ANALYTICS

understanding user behavior and prioritizing research efforts

During our user interviews, we uncovered that admin coordinators also oversee summary dashboards for each staff member's review portfolio. These dashboards comprise basic information, indicating the number of review items a staff member has completed throughout the year. Such summaries play a crucial role in enabling managers to assess staffers' performance at the end of the fiscal year.

Given that this was a recent discovery, we sought to determine the extent of usage of these dashboards at a high level. Upon assessing the analytics for the most widely accessible dashboards related to review tracking, we found that, despite having a department of almost 300 people, these dashboards generated fewer than 100 clicks over the past few months, with very few unique visitors.

This data helped us prioritize our efforts and narrow our solution scope. We decided not to worry about the dashboards in the pilot version.

Define

We came into the define phase with many useful and valuable user insights. I helped define 2 user stories from the admin coordinator’s perspective.

USER STORIES

As an administrative coordinator who manages the internal daily review flow, I want to…

  1. …help my team stay on track with their workflow without duplicating efforts;

  2. …collaborate with my team more efficiently by reducing unnecessary bilateral communication.

USER FLOW

I then created a user flow that initiates when the divisional admin coordinator receives a review request in the mailbox from eReview. I identified points where the process directly interacts with eReview and highlighted in yellow where the divisional tracker generates automated workflows. The key highlight is that, despite the need for some manual labor, all activities are consolidated in one master location—the tracker itself. When action is needed, the tracker will automatically trigger an email notification, eliminating the need for manual emailing processes.
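To make the trigger behavior concrete, here is a minimal Python sketch of the idea (all names, statuses, and templates are hypothetical illustrations; the actual pilot implements this with SharePoint and Power Automate, not custom code): when a tracker row moves into a status that requires action, the tracker itself sends the notification instead of the admin writing a manual email.

```python
from dataclasses import dataclass, field

@dataclass
class TrackerItem:
    """One row in the divisional master tracker (hypothetical model)."""
    title: str
    assignee: str
    status: str = "Received"
    notified: list = field(default_factory=list)

# Statuses that require the assignee to act, mapped to a message template.
ACTION_STATUSES = {
    "Ready for Review": "A new review item is waiting for you: {title}",
    "Comments Due": "Comments are due on: {title}",
}

def on_status_change(item: TrackerItem, new_status: str, send_email) -> None:
    """Update the tracker row and, if the new status requires action,
    auto-send the notification instead of relying on a manual email."""
    item.status = new_status
    template = ACTION_STATUSES.get(new_status)
    if template:
        send_email(to=item.assignee, body=template.format(title=item.title))
        item.notified.append(new_status)
```

The design point is that notification logic lives in one place (the tracker), so every division gets the same behavior for free.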

The user flow above shows the finalized flow that we decided on moving into ideation for the working pilot solution. I want to quickly compare the final flow with the initial flow below that we started in discovery to show you the positive changes that our user research was able to implement.

Our user research enabled us to identify 6 key points (refer to both yellow and blue sections) for automation, aimed at enhancing review cycle efficiency by minimizing manual emails and repetitive notifications. These six points also facilitate the integration of scattered tools, further streamlining the tracking process.

RESULTS


Ideate

With clearly defined user stories and flows, we were ready to find a solution within the Microsoft ecosystem that could make this happen. The key tool we chose to connect all the workflows was Power Automate. The idea is to set up a trigger so that whenever a review email arrives with certain title keywords, Power Automate detects them and generates a review task card.
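The keyword-detection idea can be sketched in plain Python (the subject patterns and card fields below are hypothetical; the real solution expresses this as a Power Automate flow, not custom code): incoming email subjects are matched against review-request patterns, and a match produces a task card for the tracker.

```python
import re

# Hypothetical subject patterns that identify an eReview request email.
REVIEW_PATTERNS = [
    re.compile(r"^\[eReview\]", re.IGNORECASE),
    re.compile(r"request for review", re.IGNORECASE),
]

def is_review_request(subject: str) -> bool:
    """Return True if the email subject matches any review-request pattern."""
    return any(p.search(subject) for p in REVIEW_PATTERNS)

def make_task_card(subject: str, sender: str):
    """If the email looks like a review request, return a task card
    for the tracker; otherwise return None and ignore the email."""
    if not is_review_request(subject):
        return None
    return {
        "title": subject,
        "requested_by": sender,
        "status": "Received",
    }
```

In Power Automate terms, the pattern match corresponds to the trigger condition on "When a new email arrives," and the returned card corresponds to the "Create item" action on the tracker list.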

I joined a few brainstorming sessions on how this might look with the team, but unfortunately, this stage was beyond my scope and my contribution ended at the research stage, with the user flow being my last formal deliverable.

THE REFLECTION

Next steps

If I were to take this project further, I would focus on the following:

  • While our project heavily focused on the admin coordinator’s perspective, it's crucial to involve other key parties in testing the solution and gathering feedback. I would test the prototype with non-admin staff members to ensure the solution applies to a wider group and captures diverse perspectives;

  • Even before the tracker is ready for wider adoption, my focus would be on promoting the tool within the department. Given that staff have an average age of 46, individuals tend to revert to familiar processes; introducing new technology is more challenging and will require additional effort to promote this solution.

Learnings and highlights

The web team is relatively new and lacks a standard UX practice. This gave me the freedom to create my own research process and make recommendations where I deemed necessary. I introduced UX language to the team wherever possible, which they appreciated and are now adopting.

When I initially joined this project, I was also new to the review trackers. I had no idea that the process involved numerous steps from the organizational level down to the individual level. To catch up, I frequently reached out to senior staff members on my team to clarify any questions I had. Additionally, I would set up quick calls with them before presenting my findings to ensure I had the right logic and was headed in the right direction. I learned how to navigate a new project quickly by being proactive and asking a lot of questions.
