
Plexus requests dashboard re-design
A UX/UI project to reimagine the requests dashboard and give our users confidence.
OVERVIEW
Plexus is Macquarie's internal data sharing platform that simplifies the connection and sharing process for data producers and data consumers.
While Plexus is an integral part of risk management at Macquarie, for many key users the web platform was frustrating to use. Specifically, the complaints pointed to usability issues and misalignment with the design system. I joined this project as the sole designer to lead the redesign of one section of Plexus, the requesting and approving workflow for data access, and improve the end-to-end experience.
The site is currently live, with the new design features in build and expected to be rolled out gradually to the existing site. It is only available to internal staff.
MY ROLE
User research
UX/UI design
Prototyping
OTHERS ON MY TEAM
Product Owner/Technical lead
Backend Engineer
Frontend engineer
DISCLAIMER
Due to the private nature of my work, I intentionally left out specific details and minimised images.
THE PROBLEM
The most commonly used features on Plexus today are the request access form and the request dashboard. They were also generating a lot of failure demand, which added to the product engineers' workload.
The platform was littered with UX/UI style bugs and was largely misaligned with the design system, with inconsistent components and page layouts even within a single feature. Together, these discrepancies made for a fragmented and frustrating experience for both requesters and approvers.
I was brought into the team to own the creation of a new experience for these two features.
Following an initial discussion with the PO and team about the feedback they were receiving, such as:
- Users are not filling out the form with enough information.
- There is always too much back and forth off the platform in order to complete a request.
We made two key assumptions:
- Users are not aware of how to fill out the form with adequate information.
- The information needed to action requests confidently is not available to the user, causing uncertainty and unnecessary follow-up.
These had to be validated during user research.

ANALYSING THE CURRENT STATE AND USER INSIGHTS
The first stage was to unpack the current live feature, understand the use cases and what information I could leverage or remove in the to-be design.
Process mapping
The initial stages of this project involved lots of 'sync' meetings to get me up to speed on the requirements of the system and the different ways the data request form can change depending on various factors. I captured this in process maps that I could tick off later while designing, to ensure every stage had a planned design.

Heuristic evaluation
To better understand the specific usability issues of the request dashboard I conducted an informal round of heuristic evaluations with one other designer on the team to identify areas to focus on. After our separate reviews we had a debriefing session to share our results and consolidate any differences. The outcome made it easy for me to choose areas of focus for the user interviews.
Scale used:
0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only: fix if time is available
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, given high priority
4 = Usability catastrophe: fix this before product can be released

Initial user interviews
Whilst I had the log of feedback from users across the groups, I also wanted to explore fresh perspectives through user interviews. In this stage I conducted eight 1:1 user interviews to probe further into the feedback submitted.
Using the Rose-Bud-Thorn method, it was easy to see at a glance where the key pain points (red) were.
The findings were then presented to the team to discuss the 'changes to consider' section, identifying possible reusable components already available in other systems and giving all members a chance to voice their opinions.
The problems became clearer and validated our original assumptions, so I wrote some 'How might we' statements:
- How might we help users feel confident and in control when either submitting or approving requests?
- How might we clearly display the requesting process and status to give users greater visibility?


I made journey maps for each user type to outline their needs and pain points in the requesting workflow.
IDEATION AND PROTOTYPING
After the initial research it was time to start creating. I always begin with a round of simple sketches and work my way towards the final design outcome.
Between each iteration, up to the high-fidelity wireframes, I held guerrilla user testing sessions by scheduling quick calls via Zoom or desk-hopping in my office space. The more feedback I got (especially from the software engineers), the more confident I felt in solidifying aspects of the design.



With each iteration I created higher-fidelity designs.
Along the way I had to pivot from my original designs several times according to feedback from fellow designers at the weekly design sync. A strong reminder that your first design is never your last!
One instance of a pivot in design was re-ordering my original form flow to better reflect the user's thought process when making a request.
Shout out to the designers in the design sync for pointing this out 🎉

PROTOTYPE TESTING AND FEEDBACK
I used both Maze and 1:1 usability interviews (Zoom) to test my prototypes to derive insights that I reported back to my product team.
In the final round I tested with 14 users, scoring 100% CSAT* and a 4.6 CES** on the prototypes.
Maze was useful for quickly pinpointing which screens I needed to work on, whilst the 1:1 interviews allowed me to collect more detailed insights building on the Maze results.
* Customer Satisfaction score calculated by: Number of test users who scored 4/5 or 5/5 divided by the total number of test users.
** Customer effort score calculated by: Average of all the scores out of 5.
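The two score calculations above can be sketched in a few lines of Python. Note the ratings below are hypothetical, not the actual study data:

```python
def csat(scores: list[int]) -> float:
    """CSAT: share of users who rated 4/5 or 5/5."""
    return sum(1 for s in scores if s >= 4) / len(scores)

def ces(scores: list[int]) -> float:
    """CES: average of all scores out of 5."""
    return sum(scores) / len(scores)

# Hypothetical ratings out of 5 from five test users
example = [5, 4, 5, 5, 4]
print(f"CSAT: {csat(example):.0%}")  # CSAT: 100%
print(f"CES: {ces(example):.1f}")    # CES: 4.6
```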

Between formal progress reviews with the team, I used Mural to capture their feedback on my prototypes.
I found a great template in the template gallery (seen on the right) which gave an 'at a glance' view of what was going well and what needed attention.

I also made sure to always assess the accessibility of the designs.
Each feature design was also reviewed in greyscale to check for usability issues for users with low colour vision, and colours were checked with the Stark plug-in in Sketch. If I had more time, I would have liked to test this further.

PROJECT OUTCOME & HANDOVER
I presented my final designs to the product team and relevant stakeholders showing not only the visuals but also the entire process involved to paint the transformation story.
The deliverables were well received as I pinpointed specific components in the design that solved for the original 'How might we' statements:

How might we help users feel confident and in control when either submitting requests or approving requests?
- Informational alerts and tooltips to provide additional guidance when the user is unsure, reducing the extra communications offline.
- An added feature for sending reminders, removing the "awkwardness" of reminding via direct message multiple times.

How might we clearly display the requesting process and status to give users greater visibility?
- An approval path progress indicator to show the exact status of an in-progress request and who/what it is waiting on.
- Consistent and relevant information on each page.
Other positive outcomes of this project included:
- Greater alignment to the central Macquarie design system
- Enhanced UX writing for clarity to the user
- Help components to cater to infrequent users
To guide the implementation of my designs I also organised an additional workshop with the team to prioritise what to build first.
As the product team is quite small and allocates story points across other projects as well, the PO wanted a phased implementation approach to roll out the improvements. I hosted a one-hour workshop with the team, using Mural to map out a priority matrix and determine the order of fixes.
This map compared 'heuristic violation severity' (inputs from the original heuristic evaluation) against 'effort to build'.

A glimpse at the final designs





LESSONS LEARNED
1. Unconscious bias is really hard to beat!
With everything I do I always aim to be as objective as possible, and in my craft this especially means considering all users. The Plexus project showed me that some bias always manages to sneak in! For example, the system had technical terms but also what I thought were simple concepts, such as the 'read only' and 'read and write' permissions. It was only in user testing that I realised users who had never encountered these phrases were uncertain what they meant. Only then did I realise I had to include help text to support them.
2. Understanding the triple diamond approach
From this experience I learnt from my mentor about the triple diamond approach. I appreciate that it incorporates the implementation aspect you need when building in-house products; it was definitely something I had not given much thought to.
I'm currently at that stage now and it's been fun and insightful to work so closely with the engineers to bring the designs to life. I usually do one build review each sprint, and in between the team has been extremely welcoming of my constant poking around the dev environment to record style bugs and inconsistencies as Jira tickets to be picked up when possible.
It just goes to show that a designer's job does not stop when designs are done.

3. Never feel like you're the 'dumbest' person in the room
While I had some exposure from university to basic terms around datasets and front-end development, there were still many technical terms and concepts, such as the different environments, that I didn't know. This made me feel like I was always the 'dumbest' person in the room and reluctant to ask for further clarification.
But seeing my team also ask lots of questions about different design techniques and terms, I quickly recognised that I didn't need to know everything right off the bat. My team was there to support my learning, and in turn I was able to teach them about different design techniques.