Work Experience
Ontario Property Tax Analysis Modernisation
Software Last Updated in 1998

Working for the Ministry of Finance (MOF), I was onboarded to the newly formed project management unit as a senior project analyst; my role combined the responsibilities of a lead UX designer & specialist with those of a business analyst. My team included a senior project manager, two project managers, a senior project lead and another senior project analyst. The purpose of our unit was to facilitate a large-scale modernization of the Ontario Property Tax Analysis (OPTA) software. This was a daunting task, as many of the system's internal mechanisms were proprietary to the vendor and unavailable for the ministry to examine. Our team effectively had to examine the platform and deconstruct it to gain comprehensive & granular insight into how it functioned. As a UX designer, this meant I had little information on how the software I was required to modernize actually operated.


OPTA was built in 1998 as "a comprehensive, centralized budgetary planning tool and property tax accounting system for Ontario municipalities, the unincorporated territories and government ministries". It effectively functions as the executive tool with which all property tax in Ontario (approx. $12 billion annually) is measured, modeled and levied. The vendor who built the platform has held a monopoly on the system for decades and has not updated it to align with current industry standards. Given the length of the contract, the MOF decided to open a competitive procurement process centered on the need for an updated and modernized property tax software system.

Overview

My primary purpose as a UX designer on this project was to act as a disruptor. I was responsible for creating a series of deliverables (personas, journey maps, and high fidelity wireframes) that were to be embedded within a Request for Bids (RFB). These deliverables were designed to convey to potential vendors the necessity of a system centered on and built around modern user experience principles and an updated software framework.

I was given permission to show the two images on the left from the Education Property Tax (EPT) reports section of OPTA. They accurately depict the UI and functionality of the system as it was before our modernization project.

A Steep Climb



Beyond the outdated software, I faced several other significant challenges over the course of this project. The first and most significant was that users of OPTA unequivocally did not desire a new system. A poll found that 85% of all users expressed satisfaction with OPTA. Many of the individuals who used the old system day to day had been doing so for decades and felt comfortable with & accustomed to it. The need for change was being expressed from the top down rather than the bottom up. Beyond that, critical issues included:

  • Lack of Insight into System Functionality: As mentioned above, because the system was proprietary to the vendor, we had only an external view of how it functioned, what tasks it executed and for what purpose.
  • Complex Domain Knowledge: Just understanding OPTA required deep and intricate tax knowledge, let alone modernizing the entire system.
  • Political and Bureaucratic Hurdles: Political constraints prevented me initially from conducting one-on-one user testing, while bureaucratic inertia within MOF slowed decision-making and progress.
  • Lack of UX Awareness: The branch had limited understanding of UX and its benefits, requiring me to not only design but also consistently advocate for the value of user-centered design principles.
  • Solo Effort: As the sole UX lead, I independently handled every aspect of the UX process, from research to design, without team or wider branch support.
  • Dual Role: In addition to UX design, I also assumed business analysis responsibilities, requiring me to swiftly adapt and balance multiple roles.

In this case study, I will detail how each of these issues challenged me, how I asserted myself and the necessity of UX, and the steps I took to overcome them in order to craft an end product that best centered user needs.

User Research

Background Research


For the first month, I spent each day learning about how property tax functions in Ontario. I learned the basics from a handful of guides included in my welcome package, but as I navigated through OPTA, I realized I didn’t have nearly enough knowledge to begin conducting user research. Additionally, being new to Canada, I was unfamiliar with the local tax and civil law systems, further complicating my efforts.


Recognizing this as a direct impediment to my ability to conceptualize the system, I took the initiative to set up a series of meetings with the department heads of my division, Provincial Local Finance. I prepared question sets for them, with the aim of deepening my understanding of the nuances of property tax in Ontario. Each of them had extensive knowledge of their specialization in property tax law, and provided me with booklets, binders and folders of information along with guidance on where to access further resources. I also sought out colleagues in my branch for more informal discussions. Through these actions, I built a foundational knowledge of property tax that allowed me to understand OPTA at a broad level and create the diagram displayed on the right.

UX Design Process Timeline


My next task was to create a timeline for the UX process. I was working in a dual role as the principal UX designer and as a business analyst, so I recognized the significant workload ahead and that completion of the UX-centered design process was entirely mine to manage. The timeline below allowed me to set personal deadlines that matched our overall project deadlines, aligning with our goal of posting the RFB in April.

Question Crafting & Interview Roadblocks


Although I had gained enough tax knowledge to understand the system in a broad context, I still did not understand the intricacies of its mechanisms, nor how to navigate it as an end user would. Considering this, I decided that the best course of action was to conduct a series of one-on-one interviews. These interviews would include a range of questions followed by a task-based user walkthrough, incorporating a think-aloud protocol, designed to mimic and thus understand the day-to-day tasks users accomplished within the system. Because of my additional role as a business analyst, I included questions about help desk functionality, an area I was also working to transform. I sought to craft the tasks for each user group according to which module they utilized. This defined four user groups:


  • Education Property Tax (EPT) users who primarily worked with education taxes in municipalities
  • Provincial Land Tax (PLT) users who worked in area & service boards within the unincorporated areas (UA)
  • Tax Analysis users who worked to model and set tax rates in municipalities
  • Ministry users who utilized the tool primarily for oversight over municipalities and unincorporated areas

Because of the nuance associated with each of these groupings and the intricacies of their tasks, I knew I required more knowledge of the day-to-day tasks a user might go through when interacting with their module. I did three things: I joined the OPTA training sessions provided by the vendor, which walked through common steps users would take; took detailed notes on each of the 'help' videos OPTA provided; and set up meetings with five Ministry users who had some knowledge of common tasks users undertook on the platform. With this, I was able to formulate a series of questions and a task-based user walkthrough unique to each user grouping. An example can be found below, followed by a simplified sketch of the arithmetic behind the levy-modelling task.

  • Interviewer:
    General questions:
    • What aspects of the OPTA system do you deem to be important?
    Tasks for Tax Analysis Users:
    • For the first task, please input an alternative tax ratio for the commercial optional class.
    ----------
    • For the final task, please model a potential 5% change to the general municipal levy and view how it is distributed across all classes.
    Participant:
    .....
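
For context, the final task above asks a participant to model how a change to the general municipal levy propagates across property classes. The sketch below is a simplified illustration of that arithmetic, assuming the levy is apportioned in proportion to each class's weighted assessment (assessment multiplied by the class tax ratio); it is not OPTA's actual implementation, and all class names, ratios and dollar figures are hypothetical.

```python
# Simplified, illustrative levy distribution across property classes.
# Assumption: the general levy is split in proportion to each class's
# weighted assessment (assessment x tax ratio). All figures are invented.

assessment = {   # total assessed value per class ($)
    "residential": 900_000_000,
    "commercial": 250_000_000,
    "industrial": 50_000_000,
}
tax_ratio = {    # tax ratios relative to the residential class
    "residential": 1.00,
    "commercial": 1.98,
    "industrial": 2.63,
}

def distribute(levy: float) -> dict:
    """Split a municipal levy across classes by weighted assessment."""
    weighted = {c: assessment[c] * tax_ratio[c] for c in assessment}
    total = sum(weighted.values())
    return {c: levy * w / total for c, w in weighted.items()}

current = distribute(20_000_000)          # current general levy
proposed = distribute(20_000_000 * 1.05)  # modelled 5% levy increase

for c in current:
    print(f"{c:12s} {current[c]:>13,.0f} -> {proposed[c]:>13,.0f}")
```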

| First Critical Setback


This point in the project is where I ran into my first critical setback. Our director, following the request of our assistant deputy minister, declined to allow one-on-one user research with the end users of the system because of political considerations. Because users were comfortable with the OPTA system, the Ministry did not yet want to inform municipalities that it would be placed into a competitive procurement process. This decision coincided with the deferment of the next property tax assessment, a significant issue in itself, which led the Ministry to focus on avoiding any tension with Ontario's municipalities.


This obviously left me struggling: how could I craft a user-centric product without interacting with the end user?

Survey & PATMAC Meeting


Through meetings with my director, I was able to emphasize the importance of my work and how UX was the necessary foundation for creating a product that users valued. I crafted a series of presentations explaining how UX was critical to the success of the overall project. This, combined with consistent support from my senior project manager, resulted in a compromise.

I was allowed to:

  1. Create a survey that would be distributed to municipalities and UA users to gain information on their wants and needs.
  2. Conduct a focus group with the PATMAC committee, a group of senior municipal representatives under NDA who could be engaged with freely.

Again, this was not ideal. Countless articles and studies have examined the limited effectiveness of focus groups as a UX methodology. My hope was that the focus group, combined with the comprehensive survey, would elicit the feedback necessary to inform my design process.

| Second Critical Setback


Although I was able to carry out the survey and received 35 responses, there were several issues:

  • The survey lacked diversity: only certain larger municipalities answered, and the majority of those surveyed worked exclusively with the tax tool module. Additionally, no PLT users from the unincorporated areas participated in the survey.
  • Despite gaining broad details of user roles and their relationships with OPTA, I was still missing many of the more granular and specific aspects of users' day-to-day interactions that I could have gained through follow-up and more probing questions.
  • Survey participants also chose not to answer many of the longer-form questions that could have explained specific details of their roles and of the OPTA platform.
  • The survey provided only attitudinal data: users reflected on their experiences and behaviors rather than demonstrating actual behavior within the platform.

I worked with my colleagues to organize the PATMAC focus group meeting, and developed a series of questions for it. Since I was not allowed to speak in the meeting itself, I coached our branch director on the information I aimed to gather, my planned methodology, and how he could help by asking the questions and following up in specific patterns. The meeting, however, proved ineffective, as most members of PATMAC were senior executives from municipalities who did not interact with the OPTA system on a daily basis, or at all. The information I gained from my time and effort was largely anecdotal, reflecting the executives' perceptions of what the platform’s users might do rather than anything concrete.

Conducting One-on-One Interviews & Walkthroughs

With the failure of the PATMAC meeting, I felt desperate, knowing I could not move forward with the user experience process without the key information I required. It was as if I had pieces of a puzzle but was missing the corner pieces to build from. I sought advice from two individuals with much more experience: my UX design manager from my previous job, and my mentor from ADPList. I set up a three-way meeting on my personal time to ask for advice, and they generously provided their insight. I described my dilemma, and together we spent about two hours crafting a series of arguments for the necessity of conducting one-on-one interviews with users, which I then distilled into a presentation.

I presented this to my director, who, with some pressure from my senior project manager, accepted its necessity. My presentation and argument were then passed along to our assistant deputy minister, who agreed, with the caveat that the sessions could not be recorded. Success! It also helped that the Ministry had, by this time, informed the municipalities that the next property tax assessment would be indefinitely delayed.

I had already worked out the structure of the interviews and the questions, adapting them slightly with contextual information from the surveys. I organized the sessions myself, emailing municipalities to ask for volunteers and then choosing participants based on regional diversity, crafting documents to keep track of information, finding & training notetakers among my colleagues, and conducting each of the interviews.


I was able to organize 12 online one-on-one user interviews and walkthroughs. The participants included five education property tax users, two provincial land tax users, and five tax analysis users. I conducted a further three interviews with Ministry users at a later date. I found that much of the apprehension I had felt during previous user testing had faded, and I was able to execute these sessions effectively. On reflection, I believe this was due to the sheer amount of work I was doing as the sole UX lead and the experience I gained through it.


User Research Analysis

Affinity Mapping


I carried out the affinity mapping individually. I was cautious about how I collated the data, primarily because I did not have a partner with an alternative perspective to work through the logic of linking ideas.

Upon finishing, I decided to do two things: schedule a meeting with an Ontario Digital Service advisor I had previously developed a working relationship with, and ask my previous UX design manager for any feedback they could provide. Both validated my approach, with feedback that I was able to integrate into the affinity maps.

Overall, I created nine cluster groupings and 38 sub-clusters that specified user perceptions. These informed me of what improvements could be made and provided the contextual information required to build an alternative platform.

Overall Findings


The information I collated primarily reinforced that a majority of users valued the platform as it was. However, through follow-up questions and by delving into more specific details of the platform, I was able to elicit several features and improvements users desired.


These were:

  1. Quick and specific access to the tools they utilized most often
  2. A more modernized and accessible UI
  3. Updated filtering and data comparison tools
  4. Less dense information architecture
  5. Faster loading speeds and quicker page navigation
  6. Easier access to the knowledge base and tax references

Personas & Journey Maps


While the data still reinforced the explicit message that users did not desire a change of platform, the more intricate trends I noticed reflected a desire for some material changes. In particular, the mentions of a modernized UI, less dense information architecture and faster loading speeds subtly reflected a desire for a new platform, given that OPTA's 1998 origins made it impossible to update to modern standards.

I was able to translate these patterns into Personas and Journey Maps, making sure to correctly emphasize that users currently, and overwhelmingly, valued the platform. These deliverables were organized into the four user groups I had identified earlier: Education Property Tax, Provincial Land Tax, Tax Analysis and Ministry users. Within the MOF, I was also referred to a colleague with experience producing similar deliverables, whom I was able to ask for critique and to confirm that the user groups aligned with Ministry perceptions.

Prototyping

Low & Medium Fidelity Prototypes



My next step was to craft low and medium fidelity prototypes for user testing. I knew that I could not do this alone, and decided that a workshop with my team and the department heads of the Provincial Local Finance division was the best path to formulating concepts of how a modernized OPTA platform could look, feel and function. I organized a one-hour session on a day when everyone could attend in person. Using a FigJam board, I structured the workshop so that my colleagues spent 10 minutes researching, 15 minutes collectively ideating, 15 minutes individually drawing and 20 minutes collaborating. This proved invaluable, as the expertise of the managers helped me define exactly how I could bring user conceptions to actuality.

There was tension within the workshop, as some within the Ministry were unequivocally against the need for modernization, aligning their views with the user base. This reflected an underlying failure within our division to reach consensus even as the project was actively taking place, a failure that would consistently hinder its goals. In the workshop, it manifested as less active participation from some key members. In hindsight, I recognize this as an area where more strategic communication could have strengthened understanding of the value of a UX-centered system.

Through the workshop, I was able to take the collection of ideas, craft a low fidelity prototype and then advance it to a medium fidelity prototype for user testing.

| Third Critical Setback


Due to a return of tensions between the Ministry and municipalities, and in order not to antagonize municipalities further, I was not allowed to conduct user testing on the medium fidelity prototype. It was made clear that the deputy minister had specified this to be Ministry-wide. My director and senior project manager informed me that this directive was firm and inflexible.

Formulation of High Fidelity Prototype



Still requiring critical feedback, I began user testing with the next best proxy: Ministry users of OPTA. As mentioned before, the Provincial Local Finance division had specific sub-groupings headed by senior managers who specialized in different areas of property tax. These specializations aligned with the modules of OPTA, meaning my colleagues in each of those groups had some knowledge of how users utilized each module. This was obviously not ideal; I would have liked end users' direct and critical feedback: how else could I iterate and design to suit their needs? But given the political and time-based constraints I was working under, and the large overlap between municipality and Ministry users, it was the next best alternative.


I set up, organized, and ran 7 user tests of the medium fidelity prototype. The testing was conducted online in 60-minute blocks, with me conducting the sessions and a colleague volunteering to serve as the note taker. My prototype received critical feedback about potential new functionality & features, how wireframes could be optimized to promote information accessibility, how the help functionality could be refined & updated, and users' desire to integrate complementary tools like Tableau & Sequel. The most important functionality users reported wanting was one that aligned exactly with my user research: quick and specific access to tools within the modules they utilized most often.


Using these insights, I reconfigured my entire prototype: integrating features like the ability to select and place specific reports from a variety of modules on the dashboard, adding a chat function linked to an AI-based knowledge base (in line with what my overall team was trying to develop and integrate into a new system), and updating native OPTA features to mimic the structure, formatting and features present in the complementary tools users preferred.

Final High Fidelity Prototype

Request for Bids Published
Post-RFB
Reflection

My work was integrated into the RFB and published within the tender alongside the business requirements. I believe that the UX-centered process behind each of the deliverables (the personas, journey maps, and final high fidelity prototype) was integral to vendors understanding the context of the new system they were tasked with developing and, ultimately, what end users wanted from it.


I also know that the company that won the bid was the same company that had developed the original platform. At first I was somewhat disappointed, feeling as if all the work and struggle of this project had been redundant. A colleague on my team helped me understand that, directly because of my efforts, the system would have to be modernized and altered along the dimensions described in my deliverables and in the overall RFB that aligned with them.


Below is a screenshot of the new OPTA homepage. It reflects how the company has already aligned with aspects of my deliverables in modernizing OPTA. I was informed by my senior project manager that this reflects only the first stage of the modernization effort, with the company promising to integrate more of the features I had suggested (such as customization options), increased legibility & information accessibility, and a completely redesigned UI in future releases.


Business Analyst Experience



While carrying out the UX design work, I was also working as a business analyst (BA) for the first time. Although this is a UX case study, I will include an overview of that work because of the strong connections between business analysis and UX design.


My particular focus as a BA was twofold. My first aim was completing the requirements for each of the modules of OPTA. As mentioned before, our team was tasked with learning about the platform and deconstructing its various intricacies so that they could be included in the RFB. Our requirements for each module of OPTA were centralized in a Business Requirements Document (BRD); my fellow senior project analyst and I structured the documentation and constructed the functional and non-functional requirements. This involved near-constant interaction with various members of the branch who had intricate knowledge of property tax. We distilled that knowledge into the specific requirements that each module and aspect of the system necessitated.


The second aim was a large-scale transformation of the customer help desk. Our unit sought to facilitate the transition of help desk functionality, and the teams that run it, from the vendor into the Ministry. My role was to continually assess the impact of the changes on the Ministry, identify potential risks & challenges, and develop strategies for risk mitigation during the change. I created documentation and presentations that assessed the current state, highlighted the changes, defined the future state, and facilitated stakeholder collaboration. This involved organizing and facilitating workshops, meetings, and one-on-one interactions across the division.


Taking on a role as a business analyst without prior experience was a difficult process. I had taken one course on project management during my master's, which provided a foundational understanding of the role. Before being hired, I reached out to a number of business analysts within the Ontario government and asked for their advice. They provided me with detailed notes and resources I could access to rapidly build my knowledge. After completing my onboarding and background research on property tax, I quickly began crafting requirements. During the first two months, I focused on learning through observation, refining my understanding of the BRD, and iterating on assessments and requirements as I gained more insight, adapting quickly as my competence grew. My teammates and my fellow senior project analyst were instrumental in this process, providing mentorship, ideation, and support as I learned. Through this, I was able to develop proficiency as a BA rapidly, starting from a point of having almost no knowledge of the role.


Evaluation

As mentioned throughout this case study, this project was notable for the many critical issues and setbacks that made completing the UX process particularly difficult. From not being able to interact with municipality users to the lack of Ministry knowledge of how the platform functioned, I was almost constantly pivoting, finding workarounds, and challenging my leadership in order to facilitate the creation of the best possible product for end users. Through my role as UX lead and my constant interaction with stakeholders as a business analyst, I built considerable confidence in my skills as a designer. Through being constantly challenged, I learned to formulate solutions to complex obstacles quickly. To work around the lack of UX awareness within the Ministry, I had to assert myself, my knowledge and the inherent value of UX as a development paradigm. Through my dual role, I gained dynamic organizational skills as a facilitator of workshops, discussions, and collaborations, while juggling the responsibilities of both roles.

I would characterize this project as a kind of "trial by fire", one in which I emerged with significantly enhanced skills as a UX designer, expertise as a Business Analyst, and now, an unfortunately deep knowledge of property tax.