
mLevel Runs Agile With a 5-Day Design Sprint

“We’re gonna build and test a realistic prototype in five days.” This is the mantra of the design sprint, a fast and collaborative design process first brought to life by Google Ventures. They claim you can solve big problems and test new ideas in just five days… sounds great, right? The product team at mLevel agreed, and decided to give this week-long endeavor a shot to see if it could really work. Here’s the detailed, day-by-day report of how it all panned out…

Five days’ worth of work in the “Donkey Kong” conference room

Our first task was to create a problem statement that would drive the events for the week. We decided on some new ideas around the administrator’s experience of our analytics site, which led us to the following:

  • We need a better starting point for our admins to find the data that they need.
  • Our current analytics site provides a mostly course-centric view of our data; we’d like to provide admins with the ability to see this data in a more user-centric view.

Once the general problem was identified, we completed two big tasks before the sprint started:

  • With the help of our sales and customer teams, we secured five mLevel customers willing to talk to us about their analytics needs.
  • We assembled our A (for analytics) team, which consisted of a product manager (PM), a data engineer and two user experience pros. In hindsight, we agreed that adding one or two more team members would have been ideal, but we needed to start somewhere.

Prior to the Monday meeting, we set the ground rules and determined our roles. Our PM would be the “facilitator,” in charge of the schedule. I was deemed the “decider,” charged with making all of the tough decisions to keep us moving forward.

Our first challenge was the schedule. With so much product development work already in progress, how could we possibly fit in five full days of a design sprint? We decided we could drastically reduce the time taken away from our development team by scheduling team-focused sprint events for only two to four hours per day. Our initial schedule looked something like this:

Our version of the five-day design sprint schedule

Day 1 – “Nobody Knows Everything, so We’ll Share Info”

To help guide the process, we agreed that the product team would gather all necessary data from recent customer and persona interviews along with sales and market feedback. We spent the first of the two team-based hours presenting and digesting data points as well as individually capturing important high-level thoughts on sticky notes with a “How Might We” (HMW) approach. For example, “How might we help users find learning gaps?” or “How might we make filtering data easy?”

Journey map with the best “How Might We” notes

Once all of the data was presented to the team, we spent the remaining hour collaboratively creating a journey map that outlined various personas and their unique paths for getting to the data they needed. The time originally allocated for this activity turned out to be insufficient, so we agreed to pick it back up the following morning. We also went away with a small homework assignment for the next day: each of us had to find an inspiring analytics tool that we would like to use.

Day 2 – “Sketch Alone, Together”

On Tuesday, we started the day with each of us briefly presenting a demo (aka a “lightning demo”) of an analytics tool we had used and liked in the past. These included a site that tracks and analyzes runs and bike rides (strava.com), a baseball score-keeping app that tracks all player data (gamechanger.io), and the account statement feature on the American Express site that tracks banking activity.

Lightning demos of other analytics tools

We continued to capture ideas we liked, using the HMW approach again, and then organized all of our notes into similar themes. From there, we ran a round of dot voting, in which each of us placed a limited number of dot stickers on the notes we liked best. We also finished our journey map and integrated the winning HMW ideas into it to help us decide on our target moment. That target ended up being a bit broader than we had initially expected, but ultimately we agreed to tackle an analytics homepage dashboard along with two levels of user data (an aggregate view and an individual profile view).

We were now officially ready to start the “sketching” phase. We began with a 15-minute session of individual note-taking, followed by another 15-minute session of rough ideation, to prepare for the fastest-paced session of the entire sprint: “The Crazy Eights.”

“The Crazy Eights” required each of us to think at a super-simplified level while drawing eight rapid variations in only eight minutes: essentially eight intense one-minute rounds, each restricted to a small 5×4” space and a Sharpie pen. Truth be told, 60 seconds did not feel like enough time to form a rational wireframe or idea. After the eight minutes of mini-chaos had passed, we each admitted that our “design sketches” were not very good. There was even some embarrassment in having to share them with the group, but then something amazing happened….

“The Crazy Eights” and solution sketches

After reviewing the sketches as a team, we began to see some high-level ideas emerge and realized the immediate value of the ridiculous time constraint: it forced everyone to think quickly and not worry about the details. We spent the next 15 minutes working on a solution sketch based on the best ideas, followed by a review and discussion of all the sketches from that day’s session.
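If you want to try the exercise yourself, the only mechanical part is the timing: eight rounds of 60 seconds each. Here’s a minimal timer sketch in Python; the round structure comes from the exercise described above, but the script itself is just our illustration, not part of the official sprint kit.

```python
import time

ROUNDS = 8              # eight sketch variations
SECONDS_PER_ROUND = 60  # one minute per sketch

def crazy_eights_timer():
    """Announce each one-minute sketching round of the Crazy Eights exercise."""
    for round_number in range(1, ROUNDS + 1):
        print(f"Round {round_number} of {ROUNDS}: sketch!")
        time.sleep(SECONDS_PER_ROUND)
        print("Pens down, fold to the next panel.")
    print("Eight minutes up -- time to share your sketches.")

if __name__ == "__main__":
    crazy_eights_timer()
```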

Day 3 – “Make Fast Decisions Without Sales Pitches”

We started the session with a silent review of the information posted on the walls, followed by a speed critique of less than five minutes per idea. Afterwards, we all participated in silent dot voting to create a general heat map of the ideas we liked best. This worked perfectly, bringing the key ideas to the forefront and allowing us to agree on the key concepts we wanted to present and test by the end of the week.
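Mechanically, the “heat map” is just a frequency count of dots per idea. Here’s a minimal tally sketch in Python; the idea names and vote counts are invented for illustration, not our actual results.

```python
from collections import Counter

# Each string is one dot placed by a team member (illustrative data only).
votes = [
    "homepage dashboard", "homepage dashboard", "user profile view",
    "aggregate user view", "homepage dashboard", "user profile view",
]

# Tally the dots and list ideas from most to least voted.
heat_map = Counter(votes)
for idea, dots in heat_map.most_common():
    print(f"{idea}: {'●' * dots} ({dots})")
```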

At this point, the product team spent the remainder of the day storyboarding what our mock-ups would look like, using whiteboards, wall wireframing, sticky notes and scraps of paper taped to the wall. Once we felt comfortable with each screen concept, we finalized our general approach to the testing by determining which screens would be tested as well as how the test would be conducted. This drove our final decision that our “prototype” would not be clickable by the tester, but rather a static page used as a discussion point for open-ended questions. The goal of the test was to get as much information about the desired experience as possible, which allowed us to focus on the big picture and not get bogged down in interaction details until we knew we were headed in the right direction.

Whiteboard storyboarding and wireframing

Day 4 – “A Simple Prototype Is All You Need to Learn From Customers”

On Thursday, we knew that the UX team was going to be doing the heavy lifting from here on out. We kicked off the morning with a quick 30-minute review of some interchangeable widgets that we had worked on the previous night in addition to a recap of the work needed for that day, which included the points listed below.

  • Finalize the mock-ups
  • Create a presentation deck
  • Draft and finalize an interview script

The UX team spent most of the morning and early afternoon working toward these goals. In the afternoon, we had a team check-in to confirm we were hitting our milestones, and we began outlining our testing script to make sure we covered all of the major questions we wanted answered.

My UX colleague cleverly suggested that we implement a smiley-face scale to help gauge the user’s overall feeling about each page we discussed, but her next idea was even better! After the customer rated each page, we’d ask him or her what it would take for us to get that page to the highest rating (a five in this case). This strategy ended up providing some great insight during testing, and we intend to use this technique for future tests as well.

The 1-5 “happiness” rating scale
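As a rough sketch of how those two data points per page might be captured, here’s a small Python structure. The field names and the example answer are our invention for illustration; they are not taken from the actual test deck.

```python
from dataclasses import dataclass

@dataclass
class PageFeedback:
    """One customer's reaction to one prototype page."""
    page: str
    happiness: int     # 1-5 smiley-face rating
    path_to_five: str  # answer to "What would it take to make this a 5?"

# A single recorded answer (hypothetical values).
feedback = PageFeedback(
    page="analytics homepage dashboard",
    happiness=4,
    path_to_five="Let me filter the dashboard by course and by user.",
)
print(f"{feedback.page}: {feedback.happiness}/5 -- {feedback.path_to_five}")
```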

By late afternoon, we realized that we would have some additional homework in order to make sure we were fully prepared for testing the next day. We split up the remaining work and agreed to finish up before meeting for our dry run early the next morning.

Day 5 – “Five Customer Interviews Are Enough to Reveal Big Patterns”

After guzzling down our early morning cups of joe, we spent about 40 minutes ironing out the kinks with some dry runs. It was at that point that it dawned on us that the date was… Friday the 13th!!

Minor scheduling glitch… testing on Friday the 13th.

As it turned out, Friday the 13th passed without a hitch. We had a full day of awesome interviews conducted virtually using GoToMeeting web conferencing and a PowerPoint deck. Each interview ran about 35 to 50 minutes. We used the Voice Memo app to record each session (with permission, of course) and made sure to have at least one observer in each meeting. We even recruited a developer who was not on the design sprint team to participate, and he was very enthusiastic about the outcome!

As the sessions concluded, we compiled the audio recordings and observation notes in our online wiki (we use Confluence). From there, we began identifying trends in the comments and feedback as well as averaging the happiness ratings given for each screen. Lastly, we emailed thank-you notes to the five customers we interviewed, along with a copy of the deck so that they could share it with other members of their team for additional feedback.
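The averaging itself is trivial bookkeeping. Here’s a minimal sketch in Python; the screen names and ratings below are invented for illustration (our real numbers lived in Confluence).

```python
# Happiness ratings collected per screen across the five interviews
# (illustrative values only).
ratings = {
    "homepage dashboard": [4, 5, 4, 3, 4],
    "aggregate user view": [3, 4, 4, 4, 5],
    "individual profile view": [5, 4, 5, 4, 4],
}

# Average each screen's ratings and report them.
for screen, scores in ratings.items():
    average = sum(scores) / len(scores)
    print(f"{screen}: {average:.1f}/5 across {len(scores)} interviews")
```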

The following week, we reviewed the results with our internal stakeholders, who were thrilled with the amount of valuable data we were able to acquire in such a short period of time. Most importantly, the sprint gave us some keen insight into what we need to test next.

Conclusion

Overall, our first design sprint was a runaway success. We learned a lot about our initial assumptions in a very short amount of time, and our customers appreciated being a part of the process. They were also impressed with how quickly progress was being made (especially considering our initial interviews with them had occurred only a few weeks prior to the design sprint).

During our retro, our team happiness rating was 4.75 (out of 5), and the general feeling from the team was that it was a great collaborative experience. This proved that it is possible to solve big problems and test new ideas in just five days. In fact, we’re already planning another design sprint in the coming weeks!

For more information on the process described above, visit http://www.gv.com/sprint/.
