©2019 by Julia

Boston.gov: CMS Improvements

Background

Shopping at Market Basket is an experience. 

 

Low prices, good food, long-term employees, and few frills have defined Market Basket since 1917. Shoppers see neighbors, find deals, and chat with workers.

However.

Market Basket aisles get tight, filled as they are with deal-seeking neighbors and thousands of favored foods. And the friendly checkout lines can get long, often snaking down aisles.

 

The Market Basket Experience is not always what the customer wants. But is online shopping on their list?

 

Tasked with introducing an e-commerce component to the store's website, I decided to find out.

My ideal user for Market Basket's (MB) online shop:

  • is the household's primary grocery shopper;

  • has a shopping routine;

  • is a frequent MB shopper--the company should work to retain their business;

  • also shops at other grocery chains--their business could shift to another company.

I identified and interviewed 10 fitting users through my online screener, asking about their grocery shopping habits, preferences, and issues. As I affinity-mapped the results in Miro, trends emerged:

  • Users had negative feelings about MB due to crowds, lines, and unreliable inventory;

  • While only half of my users had purchased groceries online, all said that they would;

  • Most expected sites to have online ordering;

  • None had satisfying experiences on MB's site.

To better understand why and how MB could integrate online shopping, I conducted a feature audit of their competitors' main pages and shopping processes.

 

MB actually has many of the features my users want on a main page, performing better than some more established online brands. However, MB's lack of online shopping sets them back from competitors. While I couldn't compare shopping processes across all sites, the analysis helped me identify design patterns and pain points to remember.

MB also scored lowest in an evaluation against Jakob Nielsen's 10 Usability Heuristics for User Interface Design.

Seven of the users completed open card sorts of 102 MB item photos to inform my category taxonomy. I learned the most from moderating in-person sorts with OptimalSort:

  • Some sorted down the list, others jumped around;

  • Almost all pictured a grocer while sorting;

  • Some categorized by location, others by meals;

  • Some used MB terms, others made their own.

 

The open sorts created 192 categories, which I standardized to 32 for a hybrid sort with four of my users. While my second round of sorters did use the provided titles, they created many more, resulting in 102 categories. 

 

After further standardization and some discretion, I moved forward with 20 main departments and six sub-categories for my user flows.
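Standardizing open-sort results is largely a matter of normalizing and merging near-duplicate labels before counting categories. A minimal Python sketch of that clean-up step, using made-up labels and a made-up synonym map rather than my actual sort data:

```python
from collections import Counter

# Hypothetical raw labels from an open card sort -- participants often
# invent near-duplicate names for the same department.
raw_labels = [
    "Dairy", "dairy ", "Milk & Cheese", "Snacks", "snack foods",
    "Produce", "Fruits & Veggies", "produce",
]

# Illustrative synonym map, built by hand while reviewing the sorts.
synonyms = {
    "milk & cheese": "dairy",
    "snack foods": "snacks",
    "fruits & veggies": "produce",
}

def standardize(label: str) -> str:
    """Lowercase, trim whitespace, and collapse known synonyms."""
    key = label.strip().lower()
    return synonyms.get(key, key)

# Eight raw labels collapse into three standardized categories.
counts = Counter(standardize(label) for label in raw_labels)
print(dict(counts))
```

In practice the synonym map is the judgment call: it encodes which user-invented names count as "the same" category, which is exactly where discretion entered my standardization.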

Problem Statement
Market Basket shoppers need an intuitive online shopping option, as an alternative to visiting the store, so they can have a more positive grocery experience.

Hypothesis

We believe that a Market Basket e-commerce feature will improve shopper experience and increase loyalty.

We will know this to be true if the service is used consistently.

I drafted user flows to complete three tasks, which integrated my shopper and competitor research:

  • Sign in as an established customer

  • Find items

  • Check out

The flows informed my information architecture and design sketches, which I informally tested with two users. I incorporated feedback for a paper prototype.

Market Basket--a grocery chain of 80 stores in Massachusetts, New Hampshire, and Maine--is known for low prices, good food, and helpful associates. 

However, the stores are also known to get crowded and busy, leading many grocery-seekers to shop elsewhere, especially with the rise in online ordering and delivery services. Market Basket risks losing customers to competitors offering easier, less stressful shopping in-store or online. 

My ideal user for my conceptual e-commerce feature:

  • Is the primary grocery shopper for their household, with established routines for the chore;

  • Is a frequent Market Basket shopper, indicating that the company would especially like to retain their business;

  • Also shops at other grocery chains, indicating their business was at risk of shifting to a competitor.

I interviewed my identified users to learn more about their shopping habits, likes, dislikes, and preferences. Trends included:

  • Most use a list--paper or digital--to plan their shopping

  • Lines and busy aisles dissuade some shoppers

  • Some shoppers go to another store for quick lists or at busy times to avoid the amount of time MB takes

  • Several shoppers had ordered groceries online from other retailers and all were willing to buy at least some Market Basket items online

  • Market Basket shoppers love the brand and are willing to manage some inconveniences to buy from a company they support

Market Basket responded to customer requests by launching a website in 2017. Currently, the site focuses on the famous flyer, store locations, and department information. They do have a shopping list feature, into which you can type items or add them from the weekly flyer.

I wanted to see how the other stores my users reported shopping at use their websites. I conducted a heuristic review of several, looking for features my users said they expected to see on a grocer's front page. I also looked at any online list or shopping feature. The Market Basket site actually met most of the users' needs other than online shopping. This analysis helped me identify components to incorporate as well as layouts and interactions to consider.

Additionally, I conducted a brief evaluation of these sites using Jakob Nielsen's 10 Heuristics. While the Market Basket site performed poorly, Hannaford, Trader Joe's, and Walmart exceeded my expectations. Again, this evaluation helped me determine which sites to reference for different decision points in my design.

A pain point many users mentioned was difficulty finding items in stores with different aisle categories and layouts. If done well, my shopping feature could address this issue.

I documented 100 items from several aisles in the Nashua, NH Market Basket and used an open sort approach to gather grocery categories from my participants. 

The sorts were done through Optimal Workshop and conducted in-person and remotely, with and without my presence.

The open sort results were not always consistent, and the users' mental models likely impacted the tests more than I anticipated. Several users commented that they were picturing a specific grocery store while categorizing items, meaning that they were influenced by that chain. Others also noted that they were specifically trying to think of the words and phrases Market Basket uses on their aisle signs--the taxonomy I was trying to test. Additionally, some suggested categories were popular with users but would not gain corporate approval.

However, I had a starting point to begin sketching, drafting user flows, and prototyping.

Team Roles

Shared: User interviews, contextual inquiries, comparative research, usability tests

Julia: Client liaison, accessibility lead, Drupal research lead, writing, partial prototype

Luca: Visual design lead, high fidelity digital designs, clickable prototype, design cohesion

Morgan: Project manager, task analysis, research lead, findings report and presentation lead

Tools

  • Sketch

  • Miro

  • InVision

  • Slack

  • WebAIM Contrast Checker

Constraints 

  • Three-week project

  • Migrated to newer version of Drupal on Day 3

  • Potential limitations of CMS

Process

User Research (All participants were current users of the system who volunteered through DoIT)

  • Focus Group (5)

  • Interviews (13)

  • Contextual Inquiry (11)

Analysis

  • Comparative Heuristic Analysis (4)

  • User Workflows (4)

  • Affinity Mapping (327 data points)

Prototype 

  • Sketch

  • InVision

Accessibility Testing & Updates

  • Manual review

  • WebAIM Contrast Checker

Usability Testing & Feedback

  • Usability Testing (4)

  • Demo to DoIT Project Team (4)

  • Demo to DoIT staff including developers & designers as well as research participants (15+)

Planning and Research

Comparative Research

While scheduling user interviews, we conducted heuristic analyses of the Drupal 8 Workbench and three similar content creation sites for usability, successes, and areas for improvement:

  • Wix - visually-oriented basic site creator

  • WordPress - most popular site creator platform in the US

  • G Suite - where City of Boston employees create most of their content

We identified the busy, text-heavy design and lack of intuitive wording as issues on Drupal. This research prepared us to conduct informed interviews with users and identify potential improvements to customize from other sites.

Synthesized User Interview and Contextual Inquiry Findings

While interviewing 13 employees, we heard similar statements and problems:

Navigation

  • I want to edit this page, but I can’t figure out how to get to the editor from here.

  • I am confused about how things are named and described.

  • I struggle to find what I am looking for. 

 

Moderation & Preview

  • The new moderation tools seem helpful, but I’m still confused.

  • I want to be able to see and compare revisions, but this way is confusing me.

  • I don’t trust Preview. 

 

Errors & Help

  • Help me avoid errors by telling me the requirements.

  • I want easier access to the DoIT user guide.

  • I rely on DoIT staff when I can’t help myself.

  • I need more support to ensure my content is truly accessible to all of our users.

Workbench User Problem Statement

City of Boston employees producing and editing content for Boston.gov are struggling to use the Drupal workbench and often have to rely directly on DoIT staff because the workbench is inefficient and unintuitive.

Hypothesis

 

We believe that by improving menus, descriptive text, and error-avoidance support, we will make the workbench more intuitive and efficient for content producers, streamline content workloads, and improve their workbench experiences, as well as free up DoIT resources to focus on strategic innovation.

Ideation and Testing

Solutions to Test

While we had many ideas for helping the Boston employees, we focused on the following for broadest impact:

  1. Incorporate visual examples, best practices, and more illustrative descriptions

  2. Improve access to the Drupal Workbench user guide created by DoIT

  3. Improve taxonomy and navigation by using more familiar terms

  4. Clarify the moderation process 

  5. Be clear and consistent in error avoidance and requirements

  6. Guide staff to support all Boston.gov users and meet accessibility requirements

Example of Current Workbench View

Example of Proposed Workbench View

Menu: Reorganize Options and Reintroduce Familiar Names

We believed that by making page-editing actions more visible and intuitive, all users would have an easier time performing core functions of their workflow. Therefore, Luca redesigned the menu items and layout to be more easily accessible and familiar to users. We also added a "Help" section to the top menu, which opens the content creation handbook document that all users asked us to make more accessible from within the Workbench.

Existing Initial Screen and Menu

Proposed Initial Screen and Menu (v1)

Accessibility: Update Color and Font Choices to Meet WCAG AA Standards

While our scope did not include a complete accessibility audit, I manually reviewed several key color choices and checked them with the WebAIM contrast checker. I found that many color combinations for buttons, links, and descriptive text did not have adequate contrast to meet AA standards.

 

I proposed increasing many font sizes and found similar color combinations that would meet contrast standards, many of which were integrated into the final prototype.
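The AA thresholds behind these checks come from a published formula, so contrast can also be verified programmatically. Below is a minimal Python sketch of the WCAG 2.x math that tools like the WebAIM Contrast Checker implement; the hex values in the usage lines are generic examples, not the actual Workbench palette.

```python
def relative_luminance(hex_color: str) -> float:
    """Relative luminance per WCAG 2.x, from an sRGB hex string."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearize(c: float) -> float:
        # Undo sRGB gamma encoding before weighting the channels.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = map(linearize, (r, g, b))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def meets_aa(fg: str, bg: str, large_text: bool = False) -> bool:
    """AA requires 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0, the maximum
print(meets_aa("#777777", "#ffffff"))  # False: this gray-on-white just misses 4.5:1
```

A check like this can be run over an entire palette at once, which is why even a partial manual review tends to surface many failing button and link combinations quickly.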

Previous Colors

Proposed, Accessible Colors

Content Creation: Incorporate Descriptive Text from Guide and Examples 

I researched, designed, and prototyped a new content layout selection page with updated terminology and visual cues to support user decision-making.

Existing Content Layout Detail

Proposed Content Layout Detail (v1)

Feedback on Version 1: Almost!

We tested a combination of wireframes and mid-fidelity prototypes with several of our initial users and heard:

Positive Feedback

  • This is easier to navigate than what I am used to.

  • I would use “View Example” of the content layouts.

  • Moderation seems simpler and easier to get through.

Conflicting Feedback

  • I like this dashboard vs. I want more/different information

  • I want edit options in a pop-up vs. I want edit in a sidebar

Helpful Ideas

  • I want to see published examples as well as templates.

What We Observed

  • Users navigated quickly and easily to where they needed to go.

  • Users seemed excited and more confident in their process.

Iteration

Version 2: More Customizable and Visual

We incorporated user feedback, focusing on making the process more similar to dashboards the employees enjoy and incorporating more visual examples to support decision-making. The accessibility options in the Workbench were unclear, so more work will have to be done to make the examples meet standards.

Initial Screen with View Choices (v2)

Content Creation Detail (v2)

Version 2 Prototype

Additional Recommendations

  • Collaborate with very high-volume departments to develop custom templates

  • More deeply embed help tips and access to the user guide throughout the Editor

  • Provide users with access to the Brand Guidelines from within the Editor

  • Embed either Grammarly or Hemingway in text editor fields to encourage more accessible writing 

  • Provide users with translation capability

  • Implement accessible design recommendations from the City's April 2019 blog post on the front and back end