A more usable Drupal

At the beginning of the D7UX project, the very first screencast by Mark and Leisa revealed confusion over Drupal's installer. Overnight, patches were created to remove the confusion. This shows how fantastic the community is at rising up to fix issues that are backed by data.

Imagine if all issues or design suggestions resolved themselves this quickly. Would the toolbar redesign have taken fewer than 300 comments and less than seven months to resolve if we had instead been given data demonstrating that every tested participant found the redesigned toolbar more effective? Since I don't see this community being argumentative for the sake of being argumentative, I'm betting such data would have shrunk the timeline for the toolbar design considerably. In fact, Acquia is banking on it.

Recently, we hired Dharmesh Mistry as a usability specialist to test Drupal and give the resulting data back to the community. Together, he and I, with validation from the community, will build a matrix of repeatable tests that covers the gamut of Drupal. The goal of the matrix is to track what has and has not been tested by:

  1. Listing all tasks within Drupal core (a very large list)
  2. Defining tests composed of tasks drawn from that list
  3. Documenting the date each defined test was last run (a rough sketch of this structure follows the list)
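
To make the shape of this matrix concrete, here's a minimal sketch in Python. The names and fields (Task, UsabilityTest, usage_frequency) are hypothetical illustrations of the structure, not an actual tool we use.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class Task:
        """A single core task, e.g. 'create a content type'."""
        name: str
        usage_frequency: int = 0               # how often site builders perform it
        changed_since_last_test: bool = False  # flipped when the UI around it changes

    @dataclass
    class UsabilityTest:
        """A repeatable test composed of tasks from the master list."""
        name: str
        tasks: list[Task]
        last_tested: Optional[date] = None     # None means the test has never been run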

Every three weeks, we’ll decide what needs to be tested. The need will be defined by:

  1. Reviewing tests/tasks that have NOT been tested
  2. Prioritizing, within that untested list, the tests/tasks that are most frequently used
  3. Reviewing tests/tasks that have been tested but have changed since the last test (see the sketch after this list)
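
Continuing the hypothetical model above, the three-week selection might look roughly like this; again, a sketch under assumed names, not our actual tooling.

    def plan_test_cycle(matrix: list[UsabilityTest]) -> list[UsabilityTest]:
        """Pick the tests due for the next cycle (illustrative only)."""
        # 1. Tests that have never been run...
        never_tested = [t for t in matrix if t.last_tested is None]
        # 2. ...ordered so tests covering the most frequently used tasks come first
        never_tested.sort(
            key=lambda t: sum(task.usage_frequency for task in t.tasks),
            reverse=True,
        )
        # 3. Tests that have run before but whose tasks have since changed
        stale = [
            t for t in matrix
            if t.last_tested is not None
            and any(task.changed_since_last_test for task in t.tasks)
        ]
        return never_tested + stale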

Armed with the test plan, for each test we'll work with a recruiting firm to find six to seven site builders with varying levels of Drupal experience (depending on the test). We have two site-builder personas: Jeremy, defined by Leisa Reichelt and the Drupal community, and Ben, defined by us at Acquia. Both share the same characteristics.

After each test, we’ll summarize:

  1. The goal of the test, e.g., uncover usability issues with managing content and creating menus
  2. The demographic, i.e., who is to be tested
  3. The tasks included in the test, e.g., create a content type, pages with menus, etc.
  4. The repeatable issues found during the test, supported by video snippets, participant quotes, and logged D.O. issues (see the sketch after this list). When applicable, we'll also provide possible solutions to address the problem
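
Purely as an illustration of what each write-up captures, continuing the sketch above:

    @dataclass
    class TestSummary:
        """Summary produced after each test run (hypothetical structure)."""
        goal: str                      # e.g. "uncover issues with managing content"
        demographic: str               # who was tested
        tasks: list[Task]              # the tasks included in the test
        issues: list[str]              # repeatable issues, each backed by video
                                       # snippets, quotes, and a logged D.O. issue
        proposed_solutions: list[str]  # possible fixes, when applicable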

These tests will be run over time, and as such, we'll be able to measure the user experience of Drupal. For example, we recently ran a test we titled the "content management" test. We chose this test because we wanted to understand how people perform the most basic, frequent tasks in Drupal, like creating, publishing, and managing content and creating pages and menus. As with earlier tests, we found some major issues (see chart below). Because these issues are so severe, we'll look to fix them, apply the fixes in Drupal Gardens, retest, and measure the outcome.

This "content management" test was just completed. Soon there will be D.O issue for each issue and a complete write up of the test. Note the task "invite user" was specific to Drupal Gardens. Such issues will not be posted to D.O. Stay tuned for the rest of the details of this test.