DO GOOD, BE GOOD
Do Good Be Good, founded by Sharon Tewksbury-Bloom, provides engaging and relatable training and facilitation specifically designed for AmeriCorps programs and State Service Commissions in Arizona.
Role: UX Researcher and UI Developer
Duration: Three weeks
Tools: Figma, Trello, Canva, Google Docs
User Journey Map
The client approached the team with a proposal for a mobile app that would add a new revenue stream by targeting Conservation Corps members, a sector of AmeriCorps that has been unreachable with Do Good Be Good’s current content delivery methods due to the remote nature of their work and unreliable access to wifi. The client proposed a mobile concept that would allow these members to access training content offline.
Do Good Be Good sees an opportunity to create a new revenue stream by developing a mobile product for an untapped segment of active AmeriCorps members who are working in the field and in need of a way to access informational content that supports their post-service plans to use their Education Award.
We agreed on four priority functions that our design solutions would need to incorporate to address both the identified end-user and business pain points and goals:
Access to Content Offline
Engaging Content Experience
Customized Learning Paths
Highlighting Common Mistakes
WHAT IS OUR SCOPE?
Our scope for this sprint was dictated by the constraints of the timeline as well as Do Good Be Good’s business-specific priorities and goals.
In our early meetings with the client, we agreed on the following definitions of scope given the less than three-week timeframe of the design sprint:
Narrow content focus to one of the client’s cornerstone areas concerning the education award.
Determine how gamification might serve as a means of delivering content and achieving outcomes.
Select Conservation Corps members as the primary end-user group for this sprint, in hopes of providing a proof of concept that would entice future investment.
WHAT ARE WE WORKING WITH?
Do Good Be Good has a library of information-rich content and educational media that is currently stored in an online collaboration platform.
Our brief indicated the mobile app would be a restructuring of Do Good Be Good’s existing materials, so we first set off to determine what that content might be. I led the creation of a content inventory cataloging all of the content currently managed and distributed to AmeriCorps leaders and members through Basecamp, an online platform.
A content inventory helped us to understand the breadth of DGBG’s content and pull out relevant content for concept testing.
From this exercise, we noted that the client’s content spans a variety of media, including videos, podcasts, articles, and other interactive activities. One significant portion took the form of scenario-based activities for job hunting and using the education award, which inspired our ideations. We were also able to pull out education-award-specific content for later in the design process.
The client also gave us access to current branding guidelines which included logo iterations, four typefaces, and four primary brand colors.
WHO ARE WE DESIGNING FOR?
Conservation Corps members have several unique habits and behaviors, but they also share similar motivations and goals with AmeriCorps members in other programs.
We benefited from the client’s assistance and extensive network of contacts in screening for candidates who met the criteria of spending a significant amount of time “in the field.” We then conducted 5 in-depth interviews with active AmeriCorps members or recent alumni. We designed a set of questions that would prompt members to speak about their habits, motivations, and current relationship to technology and training content while working in the field.
Excerpts from interviews with AmeriCorps members.
Noting our small sample size, we turned to secondary sources for additional data insights and validation.
YouTube videos: Another team member and I watched, transcribed, and analyzed seven YouTube videos depicting ‘a day in the life’ of a Conservation Corps member, making observations and noting relevant quotes. (Some videos were published by Conservation Corps programs and were weighed accordingly for potential bias.)
Discord: The client allowed us access to an AmeriCorps Discord server, where we took note of the kinds of questions members were asking each other, especially in regard to post-service life.
Scholarly work: I sourced several scholarly articles and studies including “The Voice of Corps Participants” (2010) and an executive summary of Corps member experience prepared by Jayne E. Smith in 2013.
Corps members have little free time during work hours and operate "low-tech" in the field.
4/5 members had questions or sought additional help after training on the award.
Members found provided training to be too broad and didn’t personally identify with the content.
24 | Arizona | Conservation Corps
Brooke already has her Bachelor’s in environmental studies. She joined the Arizona Conservation Corps to help her determine whether she’d like to return to school for her Master’s and pursue a professional career in conservation. Working in the field on trail restoration for days at a time, she has sporadic cell service, and the education her program has provided so far about life after AmeriCorps has not been helpful, because her crew leaders are trying to cover information that applies to every member of her crew.
For Brooke and other corps members, the greatest pain point in terms of content delivery happens after exiting the program when they are confronted with the complicated reality of using their education awards.
The issue Brooke is forced to confront is confusion surrounding the technicalities of the award, which causes frustration and forces her to seek additional assistance from AmeriCorps. From our research, we also know that Brooke's crewmates are likely to contend with other pain points like a lack of awareness as to the award's diverse uses, misapplication, or time issues like missed application deadlines or simply expiration of the award itself.
WHERE ARE MEMBERS LIKE BROOKE CURRENTLY GETTING EDUCATION AWARD INFO?
One factor contributing to the pain points experienced by Brooke and her peers is inconsistency across AmeriCorps programs in the methods and emphasis of education and training on how to use the education award.
Among the members we talked to during this project, means of accessing information about the education award varied from attending optional Zoom trainings offered by their programs, to doing cumbersome personal research, to using the project management systems they already rely on for other parts of their work, to drawing on peer knowledge. Still others reported receiving no education around the award at all.
WHAT RELATED SOLUTIONS ARE OUT THERE?
A few features in other training and development mobile apps aligned with our identified user and business goals and were incorporated into our sketches.
We began exploring existing methods of delivering mobile learning content within the broader training and development landscape. Our goal was to determine if any features of existing apps would suit the lifestyle and habits of Brooke and other active corps members.
We looked at analogous mobile app platforms for learning management systems and workshop facilitation apps for relevant educational features. For gamification and personalization inspiration we also looked at choice/consequence games and games that facilitate content sharing.
A few features, listed here, aligned with the user and business goals we had identified and were subsequently incorporated into our ideas for design solutions.
The apps compared (including Gnowbe, Kahoot, EdCast, Life is Strange, and the Nonprofit Career Coach) have several features that promote learning and continued use.
HOW MIGHT WE VISUALIZE A SOLUTION?
Keeping our four core functionalities in mind, we designed a task flow and held a collaborative design studio.
We designed a task flow based around Brooke’s education goals that maps how she would go through and explore her own learning path, as well as prepare to access the content while offline.
From there, a design studio focused on ideating and visualizing layouts that would integrate the features and task flow we had agreed upon. Following this ideation session, we evaluated the pros and cons of each design and arrived at key design decisions through a weighted voting process.
A sample of sketches from the collaborative design studio shows some of the team's explorations in determining the app’s main screens.
As we translated the design concepts into mid-fidelity grayscale designs, we focused on our four main functionalities:
Customized User Profiles
We facilitated user engagement by embedding a gamification element into the onboarding process through the creation of different avatar trackers. The avatars also leave room for future development: tracking user data and progress as a metric of learning and achievement.
Opportunities for Offline Access
We provided options for accessing offline content both before and during the app experience through a toggle switch and a download icon on specific media content.
Exploratory Content Experience
We created multiple exploratory paths using an action-consequence sequence, so users can tailor the content and information they explore to their specific situation, goals, and needs.
Payment & Data Structuring
An input field for “Start of Service” ensures access to the app as long as a member’s Education Award is valid. Below, an activation code complements the B2B2C revenue model, allowing end users to access the app for free via a code provided by their program director.
To validate these design concepts and gauge their effectiveness, we conducted 6 usability tests with current or former AmeriCorps members.
Participants were asked to complete a series of tasks under the scenario of wanting to continue their education at a Title IV school: create an account; find out more about Title IV programs; read more about tax implications of the award; and prep content for offline access.
All six participants completed all tasks, a 100% success rate, which suggests our task flow was properly structured and that the layout of our design effectively guides users as they navigate the app.
The average ease-of-completion score was also rated 5 out of 5 across all tasks. In follow-up questioning around this result, all users described navigating the app as straightforward and intuitive.
Create Account Discoverability
3/6 users hesitated around or struggled to find the ‘Create Account’ button before figuring out how to proceed through the app. This could be for several reasons, but to test visual design as the cause, we decided to adjust the color of the ‘Create Account’ button and bold the ‘New here?’ copy to increase discoverability.
Incentive to Return
In follow-up questioning, users questioned the purpose of the home screen. Instead of showing ‘downloads,’ we added a feature allowing users to change their learning path over time, thereby giving them an incentive to return to the app beyond first-time use.
HIGH-FIDELITY ITERATION
After testing the mid-fi prototype, we decided to implement changes and bring the app concept to high-fidelity for the client to use as a tool for generating interest and future investment. Three features were validated during testing and transferred into the high-fidelity screens.
All users enjoyed personalizing and tracking their learning paths by picking an avatar of their choice. Once they saw they could proceed using the ‘bee’ avatar, we noted positive behavioral cues like laughs and smiles.
6/6 users expressed appreciation and satisfaction in experiencing customized learning paths.
Users found access to offline content and its functionality highly intuitive.
However, our testing results also led us to the two key iterations described above: improving ‘Create Account’ discoverability and adding an incentive to return.
VISUAL DESIGN DECISIONS
To flesh out the high-fidelity screens, we decided to expand the existing color palette and use imagery that draws on the metaphor of a “path.”
Because the client serves government-adjacent programs, they expressed a desire for the app to be accessible. We expanded on the client’s four main brand colors for a more comprehensive and versatile palette. I created a style guide for the app and ran accessibility color contrast checks to ensure our design interface met WCAG guidelines.
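The contrast checks mentioned above follow the formula published in the WCAG 2.x specification (relative luminance plus a contrast ratio with a 0.05 flare term). As a minimal sketch of how such a check can be scripted, assuming placeholder hex values rather than DGBG’s actual brand colors:

```python
def srgb_to_linear(channel: int) -> float:
    """Convert an 8-bit sRGB channel (0-255) to linear light per WCAG 2.x."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a hex color, 0.0 (black) to 1.0 (white)."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio, from 1:1 up to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def meets_aa(fg: str, bg: str, large_text: bool = False) -> bool:
    """AA requires 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

In practice we used an online checker rather than a script, but the math is the same: each candidate text/background pair from the expanded palette has to clear the 4.5:1 AA threshold for body copy.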
For the purposes of this project, we used Droid Sans as the sole typeface, as it was the closest typeface we could access that relates to DGBG’s brand typeface, Droid Serif. For future development, a font within the brand kit would be recommended.
We recommended several other next steps in our presentation of the project for our client to consider:
Test the effectiveness of iterations and the high-fidelity prototype with another round of usability testing with AmeriCorps members.
Create an onboarding experience that walks users through the capabilities of the app and leads to the “Create Account” page.
Explore the potential for a business use case that leverages the avatar elements as a means of tracking user progress as a KPI.
Iterate on ideas for a user prompt or tip, delivered before loss of cell and wifi service, that guides users to download all content for offline access.