UI/UX Designer at Jobscan

Company size

10 - 15 employees


Visual Design, UX Design, Front End Development

How can we help people looking for jobs get past the Applicant Tracking Systems that large companies use to filter out candidates?

Jobscan was created to address this problem by analyzing candidates' resumes, comparing them against job descriptions, and providing a detailed report covering hard skills, soft skills, keywords, and other findings. This puts candidates in a better position to land interviews and get the jobs they're looking for. Jobscan is a self-sustaining startup not backed by investors, currently located in Pioneer Square.

The Problem

An Applicant Tracking System (ATS) is a software application used by companies to handle recruiting needs. Using this software, recruiters can manage applicants in a number of different ways, depending on the type of Applicant Tracking System used. These systems can filter out candidates by searching their resumes for keywords, education, years of experience, and many other criteria.

Many people applying for jobs don't consider this step, yet it's a crucial point where their resume can get lost in a black hole. Each ATS has its own way of managing candidates, so unless someone knows how a specific ATS works, knowing what to change on a resume is nearly impossible. In general, people don't know enough about applicant tracking systems when applying for jobs online. Even if they do, there's uncertainty around what to change in their resume to give them the best chance at landing a job.

Jobscan has researched the top Applicant Tracking Systems used by thousands of companies and built an algorithm based on common patterns among them.

The target users are people with digital resumes who are currently applying to jobs online. It isn't important whether the user knows what Applicant Tracking Systems are, or which ATS the company they're applying to uses; Jobscan figures that out for them. I segmented the target audience by demographics, geography, and behavioral tendencies to focus on refining the experience for certain flows and outstanding usability issues.

Jobscan works in 3 steps: paste your resume, paste the job description, and get a detailed match report with recommended changes.

Designer Goals

  • Create a comprehensive style guide to create cohesion between engineering, marketing, design, and other departments
  • Improve the UX in the resume scan and LinkedIn scan funnels
  • Address general design needs from other departments
  • Perform a UX audit of Jobscan's current website
  • Lead user research
  • Assist with the design of new features
  • Find and address usability issues on the site
  • Lead a bi-weekly "Designer Round Table" where I updated the team on research, visual designs, and other work



Career Coach Interviews

Career Coaches assist jobseekers during their job application journey. They can help with anything from formatting a resume to deciding which industry to focus on in a job search. Career coaches can focus on anything from career exploration, to career changes, to career development. Jobscan has features built specifically for career coaches: coaches can manage clients, track progress, and scan (evaluate a resume against a job description to see a detailed report) their clients' resumes. There were new Career Coach features to roll out, so reaching out to existing Jobscan Coach users was the first step in a user-centered design process. I started by emailing current career coaches, creating an interview guide, and scheduling interviews.

The interview guide was structured around understanding Career Coaches' attitudes towards the current experience and features, pain points they faced, and functionality they would like to see in future versions of the product. I documented findings from each interview, analyzed and categorized the raw data, then turned the insights into design and engineering work items in GitHub.

User Behavior

I analyzed user behavior by studying recordings and heatmaps of users on Hotjar, an excellent user research tool. By watching recordings of users accomplishing tasks, I was able to compare the recording findings against my UX audit findings and validate usability bugs in the product. Hotjar allows designers to view which pages users visit, where their mouse is, and highlights moments where users click to interact with elements, all in a timeline. I also used scroll heatmaps set up on specific pages to determine which elements on a page were getting higher visibility than others.

I used Google Analytics to study demographic information about the users of the site. I was able to determine that relatively few users interacted with the site on mobile, so we prioritized the desktop experience. Google Analytics was especially helpful in seeing which pages were the most highly trafficked, and how those users were initially coming to Jobscan. It also helped show which pages and flows had the highest conversion rates, and how long it took users to convert.

Mixpanel was one of my favorite tools. It was especially helpful in identifying conversion rates for specific flows, so the team could focus on prioritizing specific features. I looked at numbers for different funnels, and measured them against my UX audit findings to address the most pressing usability issues.

Focus Group with Recruiters

I assisted in planning a focus group where we invited recruiters to talk about their experiences, needs, and wants in the industry. I also documented insights during the focus group, and afterwards turned those quotes from the recruiters into design requirements. Turning raw research data into requirements is a two-step process: first analyzing, then synthesizing.

Many recruiters wanted more information about who their candidates were, so they could take a more humanized approach to recruiting that didn't feel so robotic.

I analyzed the data by organizing all user feedback into a document and grouping it into naturally occurring categories. For example, say a high number of recruiters talked about wanting a tool that let them analyze a person's past experience that isn't directly related to their company. I would add all relevant data from the focus group into this category, then name it something like "Valuable Candidate Experience."

Synthesis is a bit more difficult. After synthesizing data, a team should emerge with design requirements; changes to the taxonomy or visual design of the site could follow. Synthesizing should be done as a team, or at the least validated by the team. Going off the example above, the first question is whether "Valuable Candidate Experience" is a problem that warrants solving. A designer and the team should have a list of questions to evaluate research findings against to determine which they address: Does it fit in the product? How high a priority is this change? And so on.

Regarding design requirements, a finding can manifest as something as simple as a color change in a button to adjust hierarchy, or as an entirely new feature. Research findings can take a variety of forms with respect to the constraints of the product. It's important to discuss how and what a feature should accomplish in a product before a designer comes up with mockups or prototypes.

A/B Testing

Jobscan is big on having data inform product decisions, and A/B testing was a great way of quickly testing different versions of a design or copy to determine how variables like conversion rate are affected.
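As a sketch of how such a test can be evaluated, here is a generic two-proportion z-test; this is not Jobscan's actual analysis code, and the function name and numbers are illustrative:

```javascript
// Two-proportion z-test: is variant B's conversion rate significantly
// different from variant A's? (Illustrative; not Jobscan's actual code.)
function abTestZScore(conversionsA, visitorsA, conversionsB, visitorsB) {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  // Pooled conversion rate under the null hypothesis (no difference)
  const pPool = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const stdErr = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / stdErr;
}

// |z| > 1.96 corresponds to significance at the 95% confidence level.
const z = abTestZScore(100, 1000, 150, 1000);
console.log(z > 1.96); // a 10% → 15% lift on 1000 visitors each is significant
```

In practice tools like Google Optimize or Mixpanel run this math for you, but knowing what the significance threshold means helps a designer decide when a test has actually "won."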

Long Term User Research Plan

I developed a comprehensive long-term user research plan focused on understanding target users' motivations. The plan focuses on the "why" of the job search process, rather than the how or the what. Understanding the "why" leads to much more valuable insights, and when incorporated into a product correctly, it gives the experience more value and adds a touch of empathy that users remember.

Style Guide

One of my first projects was to create a style guide for Jobscan. I wanted to completely redesign what they had, as it wasn't comprehensive and many questions emerged as to what colors, type, spacing, etc. were used on different parts of the website. Using Adobe Illustrator, I created a style guide that covered:

  • Color
  • Typography
  • Buttons (hover states, spacing, primary, secondary)
  • Alert Colors
  • Modal Style
  • Tab Style
  • Accordion Style
  • Input Fields (label style, state color, placeholder style, etc.)
  • Other UI elements (bulleted list style, radio buttons, and checkboxes)

I also documented the reasoning behind the style guide change. In order to create a unified system, it's crucial that the design language be thoroughly documented so everyone on the team has access to an updated style guide. This ensures faster development and consistency, and also makes it easy to update Jobscan's design language in the future.
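One concrete way a documented design language speeds development is to express it as shared design tokens that engineering can consume directly. A minimal sketch, where the names and values are illustrative examples rather than Jobscan's actual palette:

```javascript
// Illustrative design tokens; values are examples, not Jobscan's real styles.
const tokens = {
  color: {
    primary: "#1e4f8f",
    alertError: "#c0392b",
    alertSuccess: "#27ae60",
  },
  type: {
    fontFamily: "Helvetica, Arial, sans-serif",
    baseSize: "16px",
  },
  spacing: { sm: "8px", md: "16px", lg: "24px" },
};

// Components reference tokens instead of hard-coded values,
// so a palette update propagates everywhere at once.
function primaryButtonStyle() {
  return {
    background: tokens.color.primary,
    padding: `${tokens.spacing.sm} ${tokens.spacing.md}`,
    fontFamily: tokens.type.fontFamily,
  };
}
```

The design choice here is that components never hard-code a hex value, which is exactly the consistency a written style guide is meant to enforce.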



Word Variations

Problem: When users view a report after scanning their resume, or view the sample report, some variations of words aren't recognized. For example, "design" may be recognized as a skill, but "designers" and "designed" aren't. This leads to people questioning the credibility of Jobscan. Based on the feedback we were receiving, it was in users' mental models that variations of words should automatically be recognized. Implementing this feature would drastically improve the credibility of Jobscan, and also decrease the drop-off rate of new users.

Process: I started by meeting with team members who could give more context around the problem, from engineering to management. We discussed user feedback, framed the current problem users were facing, and discussed ideal solutions. It was important that we understand the problem from users' perspectives, so we could understand what their idea of a solution would be. The issue is complex, because the number of variations changes with every word. How do we display word variations logically to a user within the constraints of the current system?
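The core of the feature is treating "design", "designers", and "designed" as the same skill. Jobscan's actual matching algorithm isn't public, but the idea can be sketched with naive suffix stripping; a real implementation would use a proper stemming algorithm with many more rules:

```javascript
// Naive suffix-stripping sketch of word-variation matching.
// Illustrative only; a production system would use a full stemming
// algorithm (e.g. Porter stemming) rather than four hard-coded suffixes.
function stem(word) {
  return word.toLowerCase().replace(/(ers|ed|ing|s)$/, "");
}

function matchesSkill(skill, resumeWord) {
  return stem(skill) === stem(resumeWord);
}

console.log(matchesSkill("design", "designers")); // true
console.log(matchesSkill("design", "designed")); // true
console.log(matchesSkill("design", "developer")); // false
```

Even this toy version shows why the problem is complex: each suffix rule has exceptions, which is why the engineering work behind the real feature was substantial.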

I brainstormed and sketched low-fidelity ideas of what different interactions could look like. From these ideas, I worked my way up to higher-fidelity mockups made in Adobe Illustrator. I used the approach of having multiple alternatives: evaluating one prototype against another creates a spectrum, which makes it easier for the designer and the team to compare the pros and cons of each design. It also surfaces more insights from brainstorming, and doesn't force a designer to put all their ideas into one solution.

After rounds of feedback, comparing against initial goals, and refining pixels, I presented the final design and we decided to move forward with engineering. Everything from the icon to the way words appear under it was discussed at length with the team. The engineering team did an excellent job building the logic behind recognizing variations of different words. The current design is live on all reports.

Recruiter Findings

Problem: After the focus group with recruiters, we had many insights that we wanted to implement into the product, specifically the match report. For jobseekers, having insight into what recruiters are looking for in candidates gives them a big edge over other jobseekers. As a team, we decided that we wanted to build the scan report to incorporate this data in the future. What type of data would be most valuable to show jobseekers, and how should we contextualize it so they understand it?

Process: I started by talking with team members who had initiated the project, gathering context, research data, and the value proposition behind adding this section to the scan report. We wanted to launch Recruiter Findings at a basic level, so we decided to display education and industry breadth to users.

By gathering this information from a jobseeker's resume and analyzing it, jobseekers get the sense that Jobscan is a smart platform. We also wanted to differentiate this section from the normal categorization of resume data as either good or bad. By highlighting good experience jobseekers have on their resume, it gives users a pat on the back. The "Industry Breadth" section highlights potentially applicable skills a jobseeker has on their resume. A jobseeker may not consider them relevant for that job, but they may actually give recruiters a better sense of a person's skills. Recruiter Findings adds a personal touch to the job application process, where it's easy to feel like your resume is lost in a black hole.

Live Score

Problem: Users are shown a report where they see a list of words and phrases Jobscan recommends they change on their resume. How can we improve the resume editing process, and make it easier for users to change their Match Rate Score?

Process: I worked with engineering, taking an iterative approach to designing what a live resume editing interface would look like. It was important that users understand they can edit their resume here, and that they can see the skills they need to add. The Match Rate Score is updated in the header section of the modal, and as users make changes it increases or decreases accordingly. Users can save their progress in Live Score so they can come back at a later point. There are also formatting options available. I created high-fidelity mockups at different stages of the process after receiving feedback.
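The actual Match Rate formula is Jobscan's own, but the live-updating behavior can be sketched as simple keyword coverage: on every edit, recompute the fraction of job-description skills present in the resume. Function and data names here are hypothetical:

```javascript
// Hypothetical keyword-coverage score; not Jobscan's real Match Rate formula.
// Recomputed on each edit, so the score rises or falls as the user types.
function matchRate(resumeText, jobSkills) {
  const resumeWords = new Set(resumeText.toLowerCase().split(/\W+/));
  const found = jobSkills.filter((skill) => resumeWords.has(skill.toLowerCase()));
  return Math.round((found.length / jobSkills.length) * 100);
}

const skills = ["designer", "prototyping", "figma", "research"];
console.log(matchRate("UX designer skilled in prototyping and research", skills)); // 75
```

A sketch like this also makes the UX constraint visible: because the score changes on every keystroke, the interface has to show users why it moved, not just the new number.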


Notifications

Problem: Users didn't have a means of seeing updates to the state of their profile or changes to the system in general, and Jobscan didn't have a way of notifying users of new updates and features. A notifications system is also a way of promoting engagement, and it can make a platform feel less static. Previously, users were getting updates through email, or just seeing changes on the site without any warning. How should we design a useful notifications system within the constraints of the current platform?

Process: Notifications was a highly desired feature for most members of the team. It was also a large feature that would require many rounds of feedback and design changes. The project started with a mockup I created and showed to the team as a proof of concept. From there, I began working with engineering on where the notification icon would be, what it would look like, how notifications would be displayed, and the other questions that arose with the system.

It was important that we design notifications within the style of Jobscan's current design, and working with engineering on this was easy. The top header of the notifications bar would be dark blue, like the nav bar. The notifications displayed below would be white, separated by a thin grey line. The rest of the style followed the style guide I created.

Considering all states of the notifications bar (and other features in general) is crucial.

  • What does the UI look like when there are no notifications and a user clicks the bell icon?
  • What should the notifications icon be?
  • What should the maximum number of unread notifications be?
  • What should the maximum height of a notification be?
  • What should the color hierarchy be?
  • How many notifications should we display?
  • What constitutes a "read" notification?
  • Should notifications have their own page?

These are a few of the questions we asked when ensuring we were designing for all possible use cases.
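Several of these questions, like the unread cap and what counts as "read," reduce to simple state rules. A hypothetical sketch, where the field names and the cap of 9 are illustrative rather than Jobscan's actual behavior:

```javascript
// Hypothetical notification-state helpers; the cap and field names are
// illustrative, not Jobscan's actual implementation.
function unreadBadge(notifications, cap = 9) {
  const unread = notifications.filter((n) => !n.read).length;
  if (unread === 0) return ""; // empty state: no badge on the bell icon
  return unread > cap ? `${cap}+` : String(unread);
}

// Here a notification counts as "read" once it's explicitly opened.
function markRead(notifications, id) {
  return notifications.map((n) => (n.id === id ? { ...n, read: true } : n));
}

const items = Array.from({ length: 12 }, (_, i) => ({ id: i, read: false }));
console.log(unreadBadge(items)); // "9+"
```

Writing the rules down this way forces the team to answer the edge-case questions (empty state, overflow, read semantics) before any pixels are pushed.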

Front End Development

A portion of my work was making changes to the front-end code of Jobscan's platform. I'm skilled with HTML, JavaScript, and CSS, so this was a fairly easy task for me. I would often file work items that needed to be taken care of, and if engineering was busy on larger work items, I could help take some work off their plate.

UX Audit

One of my other first tasks was to do a UX audit of Jobscan's website. My process was to first create a page in a document for each page on the website. I then measured each page against Nielsen's 10 usability heuristics, making a note when a convention or interaction violated one of the principles. I also documented findings from studying Hotjar recordings of users performing specific tasks, like scanning a resume or connecting their LinkedIn profile to their account. Once the audit was complete, I cross-referenced the audit findings with user research findings from interviews, survey data, and previous research to surface the most important issues.

Findings in the UX audit involved visual design changes as well as UX issues. I focused less on visual design issues, because that area is more subjective; usability issues were the primary focus. In the process I also looked at successful websites like Airbnb, Sketch (sketch.app), and Dropbox for examples of what good design looks like.


Learnings

One major learning of mine was how to approach creating new features, or additions to preexisting features. It's natural for a designer to want to create something new, dynamic, and better than the existing version of a product. But usually, a company doesn't have the time or resources to do so. I believe designing within the complexity of an existing feature is an important skill for a good designer to have.

This means understanding the structure, hierarchy, and functionality of a feature or interface. For example, if a designer were hired at Google and tasked with redesigning Gmail, it would be tempting to start from scratch and build entirely new components. However, the best thing to do would be to use the existing design structure as a template: the layout of individual emails as rows, the ability to select all emails on a page, labels, and all the buttons a user currently has. These are tools that have been tried and tested, so new features should fit into the taxonomy of the product like gears in a machine.

Furthermore, it's important to avoid adding layers of complexity to existing features when possible. This ensures users aren't jarred by updates, and it's good design practice in general to keep products simple. Designing within current interactions, if they're usable, is the ideal route.