Work Experience
Ontario Digital Renewal Service

Working for the MPBSD Cluster (Ministry of Public and Business Service Delivery) in the Ontario Government as part of XDstudio, I was given the unique opportunity to reshape the way millions of Ontarians access government services.


With an initial team of seven User Experience designers, we were tasked with creating a user-centered online platform that would allow Ontario residents to access everything from license renewals to health card renewals in one place. This project, originally started by me and my team, would evolve over time, bringing in dozens of developers, a product manager, business analysts, service designers, a scrum master, and a host of disparate stakeholders across the Ontario government.


The Problem

Much of the digital infrastructure that handles government transactions with Ontario residents is spread across a multitude of disparate websites run by different ministries, with overlapping jurisdictions and separate logins.

An outside agency was tasked with polling Ontario residents on their opinions of these websites and their organization. The results reflected majority dissatisfaction with:

  • Accessing resources and help tools

  • Navigating through websites

  • Finding information

  • Managing multiple log-ins and passwords


The Solution

The CXP Platform


Based on the data gathered through the report, it was concluded that a centralized and user-centered digital platform would be required to address the multitude of needs and wants of the residents of Ontario.

To prove the necessity of such a platform beyond the initial report, our team of seven UX designers crafted a rigorous user research methodology that involved interviewing 25 participants, both online and in person, across a range of government services.

Using this data as our foundation, we crafted personas and journey maps, developed medium and high-fidelity prototypes, conducted walkthroughs to iterate on our designs, and invited disparate stakeholders—from product managers to business analysts and an entire development team—to translate the concept into reality through a series of sprints culminating in an MVP.

User Research

Initial Steps


Our first step was to develop research questions for the user testing phase.

During the research ideation process, we had to adapt our UX approach to meet the expectations of our stakeholders.

Given the project's scale and tight timelines, they insisted that the user testing include a prototype to provide users with context on how the product would function.

Understanding their perspective, we worked to integrate the initial research data to create low-fidelity prototypes, which we incorporated into our research ideation. We then provided these to users for testing alongside our research questions.

LoFi Collaborative Creation


Initially, we collaborated to create a baseline design to work from, as shown in the accompanying images. Leveraging the Ontario Design System, we designed the dashboard, the central hub of the website, to provide seamless access to government services. Based on the initial research, we understood that users prioritized quick, direct access, so we centered the dashboard around this core need. We explored features like search functionality, direct access to action items and status updates, and the visualization of available services. One of the key innovations we formulated was enabling users to view and track all transactions from a single webpage.

On the left is my initial draft, which I presented to my team. It worked off our baseline, including functionalities such as action items and a message center, but my crucial and unique contribution was the idea of "linking services." Through this, a user would be able to personalize which services they wanted to be visible to them immediately on the homepage. This contribution was integrated into all subsequent prototypes and ultimately became a key part of the entire system during the later development of the MVP. We then organized a workshop, bringing in each of our individual low-fidelity wireframes to create a medium fidelity prototype that included many of our key ideas.

Research Questions & MidFi Prototype


Through iteration and collaborative development, my team and I crafted 22 user research questions for our interviewees. The first eight questions primarily aimed to gather more meaningful answers about their current perceptions of the system, the viability of the new centralized platform, and, crucially, their interactions with both online and in-person systems and their preferences. The remaining 14 questions sought to gather user feedback on the medium-fidelity prototype while also building on our initial questions by providing important context on how a potential centralized system would function.


Although the initial report covered some of this, we really wanted to dig deeper into users' perceptions. We sought a multifaceted and nuanced approach that covered the full diversity of Ontario's population. Alongside crafting the questions, we also iterated on our team's medium fidelity prototype until we were able to create a final iteration, visible below.


User Research Analysis

Mapping Research & Next Steps


A teammate and I were tasked with transcribing each of the interviews into Figma for the purpose of affinity mapping. We transcribed from two sets of notes and video recordings. Afterward, we were given responsibility for affinity mapping all the data from the interviews. The image below displays the scale of the data.

Affinity Mapping


Affinity mapping was a complex task. My partner and I were assigned to map all the data we had collected, and I initially found the sheer volume of information to be a considerable challenge.

Our approach involved a lot of conversation and back-and-forth between my partner and me. I was very conscious of my own personality, which, when faced with logic puzzles like the affinity mapping process, tended to dominate the dialogue in my desire to solve them. With this self-awareness, I held back and aimed to interact with my partner in an equitable way, focusing on identifying patterns in the data and clustering them in a way that built consensus through mutual respect for each other’s perspectives. Our project lead also joined the conversation at times, offering advice and sharing their opinions.

Overall, we identified 33 different cluster groupings, with dozens of sub-clusters that specified the users' perceptions. Following that, we summarized each into a series of paragraphs that detailed the most significant data points.

Overall Findings


The information we collated drew us to several conclusions:

1. Users found accessing government services to be complex, with websites that were confusing and tedious

2. Users valued the seamlessness of in-person transactions

3. Users generally had high trust in sharing information on government websites

4. Users felt inundated by information overload and the lack of responsive help features on government websites

5. Users overwhelmingly found value in the concept of a centralized platform

6. Users praised the customization feature of the new platform

7. Users were confused by the scope of the new platform

8. Users viewed the ability to conduct and track all transactions on a single webpage as innovative and helpful

Personas & Journey Maps


The information we captured was then translated by my partner and me into 6 personas and 4 journey maps, a few of which are visible below. Our product manager played a decisive role in this stage, acting as a critic by challenging us to dive deep into the data and reflect it concisely and effectively. The user groups in our personas and journey maps were categorized based on tech literacy, journey complexity, and trust in government, which reflected the diversity of the people we interviewed. We were also tasked with creating a 'typical journey' user to provide context for our stakeholders.


During this stage, I faced the challenge of accurately translating the information. I focused on ensuring the data was represented with integrity, and initially, my instinct was to shape the deliverables in a way that emphasized the desire for a centralized platform. However, I quickly recognized the importance of reflecting user perceptions more objectively, allowing the data to guide the conclusion. While the users were overwhelmingly positive and supportive of the idea of a centralized platform, many were still able to use the existing government webpages to achieve their goals, and that had to be reflected.


An example of this occurred while I was crafting the journey map for "Typical Tuan". As I created the journey map, I unconsciously tried to include more pain points in the persona's experience to validate the need for the platform. However, after becoming aware of my biases through honest, critical feedback from our product manager, I reviewed the data with more nuance and was able to find a balance that accurately reflected user perceptions.


Prototype Iteration

High Fidelity Mockup Creation


Building on the abundant data we gathered from user perceptions of our medium-fidelity prototype, we set out to create a high-fidelity mockup of the entire account registration process for the purpose of user testing. This was a collaborative effort involving everyone on the team. We mapped out the flow of the account registration process, with our product manager and business analysts present to provide crucial context on how security and information safety would function. Then, utilizing the Ontario Design System, we crafted the prototype. In the process, we also developed our own design system for the platform.

With my background in art and graphic design, I was able to leverage my strengths in visual design at this stage. Crafting a new and innovative system was exciting, and adhering to OPS digital guidelines while also creating our own presented a unique challenge.

Through close collaboration, my team and I constructed the registration flow for our high-fidelity mockup, which is viewable through this link. User testing of this mockup surfaced the following findings:
  • User appreciation of features like explore services and saving information


  • Perceived user value in concise, simple and intuitive UI


  • User confusion at features like digital wallet and selfie verification


  • User critique of lack of proactive help features


  • User desire for additional features like 3rd party login and more accessibility


  • Positive user perception of the safety of the platform


Final Iteration

These findings were then incorporated into our final high-fidelity prototype through successive iterations of our designs. Some of the key screens are found below.
Final High Fidelity Prototype

Sprints
Sprint Sessions
Following the creation of our final prototype, our larger team started planning and executing a series of sprints. These involved us, a large development team, business analysts, product managers, content strategists, and quality assurance. These sprint sessions concluded with the development of an MVP for health card renewal.

My team and I took on the 'ready' role, crafting user stories and iterating on the platform processes one sprint ahead of the DevOps team. In the process, we would constantly interact with the development team to bridge the gap on technical feasibility. I found the process to be challenging yet exciting, with the quick windows of development meaning that we were almost always working and iterating on our designs.

Our work allowed development to immediately and quickly start each sprint with a defined vision of how the end stage of the product would look and function. Furthermore, we would regularly hold design review meetings to align the greater team on each sprint. Throughout this process, I gained firsthand experience in how UX works alongside the development team within the agile methodology.

Self Directed Collaborative Ideation Session


During one of the sprints, I was tasked with designing how renewal reminders would function on the platform and was given the freedom to approach the solution in any way I chose.

The first thing on my agenda was a collaborative ideation session with part of the DevOps team and a product manager. I created a board and structured the session with time blocks, including specific periods for brainstorming, researching comparative features on other websites, and ideating on how the feature could look and function. I was able to organize, lead, and complete the session without many issues.

Initially, I felt slightly apprehensive taking the lead, but as the session progressed, that feeling dissipated. I also used this opportunity to gather information on the platform's technical limitations, which helped me discard many of the more unconventional designs.

Independent Iterative Design Within Sprints



Using the information gained from the collaborative ideation session, and drawing inspiration from other websites and systems, I sought to design a notification that was immediately accessible, eye-catching, and aligned with both our platform's design system and Ontario's broader design standards.

I developed several iterations of the tooltip, including positioning it as a banner, as a popup, as an icon, and within its own subcategory on the main page. Each of these either conflicted with an existing feature of the system or proved unviable.

After much iteration, I created the symbol on the right (highlighted in red). Initially, it featured a multicolor design of green, yellow, and red, each color representing a different concept: green for "no action needed," yellow for an "alert or reminder," and red for "expiry." When I presented this to the team, I received critical feedback that the color system might overwhelm and confuse users, though the overall concept was solid. Taking that feedback, I iterated one last time and stripped back the design, ultimately creating the alert seen on the left.

Evaluation & Lessons Learned

This project was long, multifaceted, and pushed my growth as a UX designer in ways I had never conceptualized. Throughout it, I not only gained firsthand experience working as a designer during every step of designing a new product, but I also learned how UX functions in the broader context of development.

I’ll be honest: I entered UX because of my design skills and my intrinsic belief in centering people within systems. However, as I gained more knowledge about UX during my master’s program, I became slightly disillusioned with the field, feeling that UX was often treated as a secondary aspect of development, sometimes even sidelined. This project taught me how wrong that belief was. At every stage, I saw how integral UX is to development: how it provided the parameters and foundation for the entire project and shaped the development process to create an end product that users truly value.

Throughout this project, I not only gained hard skills, such as how to distill vast amounts of information into convenient deliverables, but also the intrinsic soft skills a UX designer needs for their multifaceted role. I learned leadership and organizational skills while also building my capacity for empathy and self-reflection. All of this left me very proud of what I was able to accomplish.