Improve the usability and accessibility of the college website for students.
Georgia Tech's College of Computing (CoC) is one of the top computer science schools in the country. However, its website has not been supported by sufficient research and design, and it lacks a clear, usable information structure. CoC reached out to us for a solution. We are a team of five, and my key role in this project was lead designer.
After four months of work, we presented a brand-new CoC website with a reconstructed global navigation bar and information architecture. In particular, we proposed a module-based design strategy that can be applied across all pages. The new website is now under construction and will soon serve over 24,000 students and applicants.
We started with a survey to understand who our users are in the big picture. The survey identified students' areas of interest and the usability challenges they face when using the website. Specifically, we explored four themes: navigation, design, content, and accessibility.
The survey showed that the biggest issue students face is finding the topics they are interested in. Students self-report low mobile use across topics, but access some topics more than others. Across 195 responses we could not find meaningful insights about accessibility, so that theme needed further targeted research.
We wrote the survey results and our insights on Post-it notes and added them to an affinity map.
“Website lacks clear navigation to useful resources for students.”
Director of Academic Programs
“Truly accessible websites require usability and programmatic testing.”
QA Accessibility Analyst @Center for Inclusive Design & Innovation
Meeting with multiple stakeholders and accessibility practitioners shed light on what success looks like for them and how they would benefit from a redesign of the CoC website.
We wanted to gain a deeper understanding of session data, including bounce rates, click rates, and session times, so we could gauge how users were navigating the website, where they were coming from, and how long they were staying.
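As a rough illustration of how metrics like these can be summarized (the rows and column names below are hypothetical placeholders, not our actual analytics export), a few lines of pandas are enough:

```python
# Minimal sketch for summarizing session data with pandas.
# The rows and column names are illustrative placeholders,
# not the real analytics export schema.
import pandas as pd

sessions = pd.DataFrame({
    "visitor_type": ["new", "new", "returning", "returning", "new"],
    "pages_viewed": [1, 4, 2, 1, 6],
    "duration_sec": [15, 210, 95, 8, 340],
})

# Bounce rate: share of sessions that viewed only a single page.
bounce_rate = (sessions["pages_viewed"] == 1).mean()

# Average session time, split by new vs. returning visitors.
avg_duration = sessions.groupby("visitor_type")["duration_sec"].mean()

print(f"Bounce rate: {bounce_rate:.1%}")
print(avg_duration)
```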
The analytics data showed that the most visited topic on the CoC website is the master's degree and its admissions process.
Content Drilldown for new and returning visitors.
Data gathered from January 1, 2018 to January 1, 2019.
With our hybrid card sorting task, we wanted to identify how visitors group the website's content and to validate the current website's global navigation bar.
Based on the card sorting results, we discarded irrelevant content from the website (for example, 'About Atlanta') and renamed tabs whose labels were causing misunderstandings.
One of the card sorting results.
We distilled evaluative criteria from a larger set of widely used principles and heuristics, and applied them in a competitive analysis and a heuristic evaluation to diagnose the current state of the College of Computing website.
Competitive Analysis
Compiling all of our research findings gave us a large amount of information to draw insights from. We categorized it into four main themes: navigation and website organization, content, responsiveness/mobile use, and accessibility.
Within these themes, we formulated user needs, business requirements, and other considerations influencing our work.
We ended up identifying and creating two persona types that we thought best represented the majority of students in the College of Computing.
We wanted to address both breadth and depth in our design. Due to the project's time constraints, however, we focused on the information architecture of the global navigation and the content structure, especially in the 'Prospective Students' and 'Current Students' tabs.
New global navigation bar based on our previous research.
Instead of designing page by page, we designed a system of components that can be reused across the site. Each component is a grouping of UI elements that links contextually to the relevant content.
We audited content in the current ‘Academic’ section to provide insight into what we plan to migrate to the new website’s Content Management System, as well as how that content could potentially be organized in an intuitive and useful way in-page.
Our high-level goal for the first feedback session was to learn how users would intuitively organize the website content; for the second session, we wanted to understand how users navigate from the homepage to specific pages.
In our first feedback session, we printed out the components we had designed and asked participants to organize the in-page components based on their understanding. However, we found that the shapes of the printed blocks were misleading: participants tried to arrange the content so that all the blocks fit together in a grid. We eventually switched from printed blocks to Post-it notes.
Printed components we used in the first trial.
We used Post-it notes to eliminate factors that could influence the user testing.
During the second test, users were asked to click through the website tabs to find a specific piece of information.
We asked users to start from the homepage and find "Master Program" and "Program Advising" on the website.
Click through
After two feedback sessions, we gathered additional qualitative data on the content of each page we had designed, as well as participants’ expectations. We also assessed whether they were able to find the pages we were asking for, and what issues they had along the way.
Based on our earlier research and the feedback sessions, we created responsive prototypes for final user testing and feedback gathering.
Desktop prototype
Mobile prototype
Based on the issues uncovered in user testing, we iterated on the prototype before delivering it to stakeholders. For the overly long mobile pages, we designed components that greatly shorten the page and condensed similar content.
Before user testing.
After user testing (iterated).
To validate our design, we ran A/B tests of the legacy website and the redesigned prototypes on usertesting.com. Participants were asked to complete specific tasks and fill out Likert scales rating efficiency, effectiveness, and satisfaction. We found that the new prototypes received significantly higher ratings from participants.
The user testing results show that users spent less time completing tasks on the redesigned prototypes than on the legacy website. The biggest improvements were in finding information about tuition and international students. The mobile prototype, however, still requires more effort.
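As an aside, because Likert ratings are ordinal, a non-parametric test is a reasonable way to check whether a rating difference like this is significant. The sketch below (the scores are made-up placeholders, not our study data) shows one way such a comparison could be run in Python:

```python
# Hypothetical sketch: comparing 1-5 Likert ratings of the legacy site
# and the redesigned prototype. The scores below are placeholders.
from scipy.stats import mannwhitneyu

legacy_ratings = [2, 3, 2, 4, 3, 2, 3, 2]
redesign_ratings = [4, 5, 4, 4, 5, 3, 5, 4]

# One-sided test: are legacy ratings stochastically lower than redesign ratings?
stat, p_value = mannwhitneyu(legacy_ratings, redesign_ratings, alternative="less")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```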
Before presenting the prototypes to users, we invited four experts in HCI and design-related fields to test them and surface potential UX problems in advance. They also rated the performance of the legacy website and the new prototypes on a scale of 1 to 5.