Fresh eyes, a clean slate, and better software
Oct 17, 2014 by Jessica Baumgart

I joined PatientsLikeMe in July to contribute to the Open Research Exchange's quality assurance efforts. As I write, a friend's observation comes to mind: given how the words are commonly used, "quality assurance" is a bit of a misnomer. I'm not here to assure anyone that the Open Research Exchange is of high quality. (I assure you: it's great!) Rather, I'm trying to ensure that it has few technical defects and is reasonable for people to use.
Quality assurance has many different approaches and methodologies. Some people believe only highly technical people who know a product intimately can make useful suggestions about what needs to change. That is not always the case. Fresh eyes and a clean slate can be real assets while you're still a newbie. Sometimes the best tools I bring to exploring software are my own inexperience, naiveté, curiosity, and creativity. One motivation for bringing extra help on board was to get this kind of outside perspective on the product, combined with a QA approach.
"Well, can't we get that from customers?" some of you may be thinking. Part of the purpose of quality assurance is to catch problems before they go to customers. Clients aren't necessarily going to go through a product with a fine-toothed comb in a deliberate fashion to make sure everything works, walk through expected behaviors and common workflows, investigate edge and use cases, test regressions, make lots of notes, and share findings with the developers. While some people are happy to coordinate with a company to improve software, other people have no interest in that. Also, paying patrons may be biased toward how they'd like to use the software or their favorite features whereas quality assurance folks often have to consider and advocate for many different users and potential uses and be familiar with the entire widget and its nooks and crannies. Having someone on staff charged with examining the product means a person is dedicated to taking certain measures and collaborating with the engineers to make the product the best it can be.
When this person starts with fresh eyes, the team can learn a lot about what the new-client experience is like. Mixed in with the notes on broken functionality, many of my jottings from my first sprint here are usability and user experience comments: "This menu is confusing because ..." "I struggled to figure out how to ... because ..." "This feature doesn't make sense on this tab because similar features live elsewhere." Once someone knows where to go and how to accomplish a task, it's easy to stop considering whether the paths or actions make sense. These observations are valuable precisely because they fade with familiarity.
Some teams are so eager to get a new QA person up to speed that they want to train her immediately, missing a valuable period when she could be furthering the group's knowledge of what new customers go through. I've been left alone to learn software on my own many times and find the process very informative. As a hands-on learner, I find it easier to recall things I discovered for myself. Stopping to think "Would someone new know what to do here?" is easier when I've recently been the newbie. And as I progress, I run into the issues that people who are just getting familiar with the program will also encounter.
That's not to say no one was around to help me. I asked lots of questions, some of which got others thinking, too. My untainted viewpoint led to many change requests that improved the product. In some cases, I heard, "Yes, we know that's an issue. Maybe we need to raise the priority on that ticket." Or "What would you change, and how?" And, naturally, there were a few "That's only a problem because you're unfamiliar with the tool." Deciding whether that's an excuse or the truth can be subjective. There's often a balance to strike between making something easy for new folks and having it make sense for advanced users.
As I become more familiar with the Open Research Exchange and its corners, the kinds of issues I flag, and how I flag them, have changed. I've thoroughly explored some areas I was advised to avoid at first. I can go more in-depth now that I know more about what's behind some of the design decisions, what clients want to do and when, and the technical guts beneath what I see on my screen, all while staying mindful of our clients and their different skill levels.
Oh, yeah, and because you're probably some kind of engineer, too, I should give you a number: 131. Our ticketing system says I've filed 131 issues since July. I'm not going to dig around to figure out whether it can tell me how many issues slipped through my fingers unfiled; it's likely at least twice that. (Ten tickets a week? Twenty?) In another few months, it'll be amusing to see whether I've kept that pace.
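For the engineers who like to check the math, here's a back-of-envelope sketch. The week count is my own assumption (early July through mid-October is roughly 15 weeks); only the 131 comes from the ticketing system.

```python
# Rough estimate of my ticket pace. WEEKS is an assumption (early July
# to mid-October, about 15 weeks), not a figure from our ticketing system.
FILED = 131          # tickets actually filed since July
WEEKS = 15           # assumed elapsed weeks

filed_per_week = FILED / WEEKS
encountered_per_week = 2 * filed_per_week  # the "at least twice that" guess

print(f"filed:       ~{filed_per_week:.1f} tickets/week")       # ~8.7
print(f"encountered: ~{encountered_per_week:.1f} issues/week")  # ~17.5
```

So the guesses in the parenthetical above bracket it nicely: roughly nine a week filed, and maybe closer to twenty a week encountered.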