Background

The ADEPT (Awareness of Decisions in Evaluation of Promotion and Tenure) team at Georgia Tech works to give academic professionals, such as professors and other faculty, insight into how promotion and tenure decisions are made.

The existing tool was built around 10 years ago and consists of three activities/games: Simulated Meetings, Navigating Your Career, and Annotated Vitae. Each game primarily targets a particular group of stakeholders.

I worked on a team of four over the past year to redesign and develop these games. I served as the designer, working alongside our advising professor (who oversaw the project), a developer, and a content strategist.

Process

Prompt

How might we redesign an existing Flash-based role-playing game to deliver a better user experience, with a focus on inclusivity for stakeholders within the academic space?

Research

Research time was limited, since we needed to design, develop, and launch the entire tool within a short span. We therefore spent a brief period dedicated solely to research, then continued researching in parallel with design.

Existing Product

The existing product was an outdated Flash game that had to be downloaded to be played. It also didn't comply with the brand guidelines of Georgia Tech, the parent organization behind this initiative. Instead of focusing on what didn't work in the existing product, we decided to concentrate on what did. At this stage, we also needed to consider constraints defined by the customer and the business needs.

Older designs

Primary constraint

The textual content that the games were based on could not change.

What worked

Conversation
The primary content was framed as a conversation rather than a case study; this mirrored the real situation, making the scenarios easier to connect with and understand.

What is the best answer?
There wasn’t always one correct answer to a question, which encouraged the user to explore and discuss multiple scenarios. At the same time, to give the user some indication of whether they were headed in the right direction, the game scored them on multiple aspects of academic life rather than giving one final score.

Learning tools

To begin with, we researched existing learning tools like Duolingo and Coursera to understand what works and what doesn’t when it comes to information architecture, user flows, and task analyses.

Inspirations for design

Define

Personas + User Goals

Based on our target audience in the academic realm, we created two distinct user personas. These guided the design process that followed and ensured we designed for specific users rather than trying to design for everyone.

Design Guidelines

Based on our research, we came up with the following design guidelines:

User experience should allow for full immersion into the game.
Since the game requires a good amount of reading, it is important that the user is fully engaged. Our research showed that letting the user ‘play’ a character or role was essential to achieving this.

Users should be aware of their progress
It is important to constantly display the player’s progress to inform them of how much of the game is completed and how much is left.

Users should have supplementary resources at their disposal
While playing the game, the user needs to make decisions based on a candidate’s past record. It is therefore important that this record is available on screen to refer to at all times.

Validations and alerts
The system should clearly respond to user actions to confirm them, and should show alerts at crucial points in the user flow. This is especially important because, due to time and resource constraints, we could not build a functioning back end for the game; on exiting a game, all progress is lost.
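Because there is no back end, all progress lives only in the browser session. A minimal sketch of an exit alert for this situation might look like the following; the state shape and function names are hypothetical, not taken from the actual ADEPT codebase.

```javascript
// Returns true when leaving the page would lose meaningful progress:
// the user has started a game but not finished it.
function hasUnsavedProgress(state) {
  return state.started && !state.completed;
}

// Wire the check to the browser's unload confirmation dialog.
// Most browsers show a generic "leave site?" prompt in response.
function installExitGuard(state) {
  window.addEventListener('beforeunload', (event) => {
    if (hasUnsavedProgress(state)) {
      event.preventDefault();
      event.returnValue = ''; // legacy trigger for the prompt
    }
  });
}
```

The in-game alerts themselves (e.g. before restarting a game) would reuse the same `hasUnsavedProgress` check.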

Users should be aware of external resources to explore after playing the game
The activities are designed for the user to explore different scenarios concerning evaluation and mentoring, and the statements in the games are meant to open doors to further discussion. The game should therefore be able to lead users to other useful resources in the academic world.

The design should follow Georgia Tech brand guidelines
Since the product is owned by Georgia Tech, it should follow all brand design guidelines while still encouraging inclusivity by inviting scholars from other schools to use it.

Design

Our design phase began with revising the task flow and the information architecture to provide a better user experience. We then moved on to low-fidelity paper prototypes and eventually began the visual and UI design part.

User Flow

Based on the analyses of educational tools that we carried out in the research phase, we created a revised user flow for the game.

User Flow

Low Fidelity Prototypes

Introductory Page
Members’ Bio Option 1
Members’ Bio Option 2
Simulated Meeting Main Screen
Simulated Meeting Transcript
Navigate your Career Main Screen

Characters – Visual Design

Our existing content consisted of conversations between different stakeholders, depending on the game. For example, Simulated Meeting is a discussion between four panel members, the user being one of them. We therefore designed characters to go with the names and scenarios in each game.

Visual Design of Characters

Designing for Diversity: The academic community at Georgia Tech is striving to create a safe and inclusive atmosphere for all its members, and one requirement for this project was to be part of that initiative. Our characters are designed to represent the diversity in race, gender, and more that is seen among faculty at educational institutions across the country.

User Interface

Following brand guidelines

While this product is owned by Georgia Tech, its users would come from multiple educational institutions. It was therefore important to follow Georgia Tech’s brand guidelines while retaining a general visual design that is inclusive of all audiences.

Easily distinguishing participants in the conversation

Following the Gestalt principle of similarity, each character was given a distinct color in the conversation, generally the color of their clothing. This color also served as an accent color for their chat bubbles, letting users easily distinguish who is talking.
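In code, this kind of per-speaker accent color reduces to a simple lookup. The sketch below is illustrative only; the character keys and hex values are hypothetical placeholders, not the actual ADEPT palette.

```javascript
// Hypothetical per-character accent colors for chat bubbles.
const ACCENT_COLORS = {
  chair: '#1f6f8b',
  panelistA: '#b5651d',
  panelistB: '#6a2c70',
  you: '#2e7d32',
};

// Resolve a speaker to their accent color, falling back to a
// neutral gray for unknown speakers.
function bubbleColor(speaker) {
  return ACCENT_COLORS[speaker] || '#757575';
}
```

Tying the bubble accent to the clothing color keeps the association consistent between the character illustration and the transcript.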

Recognize rather than Recall

Users would often need to refer to the candidate’s profile (CV and Case Study) while making decisions. The user first reviews this profile on the introductory page of the game. The same profile is then condensed and conveniently placed on the sidebar, from where it can be accessed at any stage in the game.

Estimated Time

Professors are often very busy during their week, and dedicating an undivided stretch of time to this activity may require some planning. The opening screen tells the user the approximate time it will take to complete the activity, so they can make an informed decision about whether to start the game immediately or come back another time.

Progress Bar

The progress bar constantly informs the user of how much of the game is completed and how much remains.
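Since each game is a linear sequence of steps, the calculation behind such a progress bar is straightforward. The sketch below is a minimal illustration under that assumption; the function name and step model are not from the actual implementation.

```javascript
// Percentage of a linear game completed, given the current step
// (0-based count of completed steps) and the total step count.
// Clamps out-of-range inputs so the bar never under- or overflows.
function progressPercent(currentStep, totalSteps) {
  if (totalSteps <= 0) return 0;
  const clamped = Math.min(Math.max(currentStep, 0), totalSteps);
  return Math.round((clamped / totalSteps) * 100);
}
```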

Testing

We conducted multiple focus groups and usability testing sessions to get feedback. The focus groups consisted of academic faculty who would ideally be using this tool: both young faculty and seasoned professors who would serve as panelists and mentors. We also involved professionals with expertise in accessible design.

Usability Testing

We conducted usability testing with 11 users, both familiar and unfamiliar with the promotion and tenure system. We used a think-aloud protocol, and each user was instructed to play at least one game from start to finish.

We tested for:

  • Understandability: Does the user understand their task?
  • User flow: Since the flow of each game is primarily linear, does the user understand how to get to the next screen?
  • Findability: Can the user easily find content they assume/know is available?
  • Discoverability: Is supplementary content easily discoverable on the screen by the user?

Survey

To better quantify user responses to the new design, users were asked to complete a short survey on the usability of each game. The survey consisted primarily of Likert-scale questions and was created and distributed through Qualtrics.

Focus Groups

We also conducted a casual focus group with the same users who tested the system. As these were academics, it was enough to give them a prompt related to a certain aspect of the game; they would discuss it at length with light moderation from us designers and researchers.

The overall feedback from all the testing sessions was positive. The following are some of the observations and insights we received:

  • Users appreciated the indication of the time required to play each game at the start. This prepared them for what was to come.
  • Conversations that were intentionally negative got two distinct responses from users: one of rage, and the other of disbelief. Regardless, both of them started passionate conversations around the topic, which was the main purpose of the product.
  • Some users treated the product as a test or training exercise, constantly striving to get the correct answer. This goes against the product’s purpose of encouraging the user to explore different scenarios. In response, we plan to move the scoring into the background and put more emphasis on the explanation sections.
  • Users don’t read instructions, especially if they are static. We are currently working on a short onboarding process.
  • While the language of the actual text content should fit academic standards, the instructions should be straightforward, short, and easy to understand.

Takeaways

Keep stakeholders involved always
While a majority of our testing happened in the second half of our project, we had a couple of professors who served as mentors and field experts throughout the process.

Don’t assume user behavior
The first iteration of our design was text-heavy, with detailed instructions written in formal, high-level English. Because our users were professors and academics, we assumed they would read every single word. The opposite turned out to be true: users often skipped crucial instructions. We therefore vastly reduced the amount of instructional text and rewrote it in simpler, more straightforward language.

Define metrics for evaluation
At the start of the process, we defined what a successful redesign would entail, with metrics revolving around time, ease, pleasurability, and so on. This helped us assess our design in later sessions. Due to constraints set by our customers, we also defined what would not qualify as a metric; for example, the text content and each character’s story were not to be changed, so if users had minor issues with them, we had to overlook those.

Collaboration
I worked as the principal designer on the team, alongside a developer, a mentor, and another designer/marketing professional. At the beginning of the project we clearly defined our roles and veto powers; I was responsible for final design decisions, of course taking all team members’ opinions into consideration. We set weekly meeting times and goals, often met for secondary meetings during the week when needed, and constantly kept each other updated on our work through Slack.