Domain knowledge and software testing

Today's weekend testing mission was to test a mobile app that functioned as a sort of exercise documentation tool and to determine how our domain knowledge (or lack thereof) affected our ability to test the product. The idea of domain knowledge is interesting to me because of how vague it is. I've been reading Rethinking Expertise, so I thought I could explore some of the relevant parts I've read thus far.

According to Collins & Evans, there are two main categories of expertise: ubiquitous tacit knowledge and specialist tacit knowledge. Ubiquitous tacit knowledge can be split into beer-mat knowledge, popular understanding, and primary source knowledge. Specialist tacit knowledge can be split into interactional expertise and contributory expertise. These categories form a sort of spectrum in which the farther to the right (contributory expertise) you are, the deeper your understanding and the greater your ability to perform a task.

Beer-mat knowledge is the type of thing you read in passing about some topic. Popular understanding is the type of expertise you gain from following the common media or popular books on a topic. Primary source knowledge is gained from reading primary sources of information.

On the specialist tacit knowledge side, interactional expertise is the ability to communicate in the language of a field, while contributory expertise is the ability to actually contribute to a field. It should be noted that without actually seeing a person do work, these two types of expertise are indistinguishable.

So, back to domain knowledge. In terms of software testing and what is commonly referred to as domain knowledge, what Collins and Evans present is a pretty good representation. The type of expertise needed will depend on the role the tester is filling and the type of software.

BUT we discovered today that domain knowledge beyond the first level, beer-mat knowledge, is not necessary to do meaningful testing. Domain knowledge can be beneficial for understanding the value specific people will get from a product, but a lack of it should not prevent meaningful work. In my opinion, developing skill as a tester is more important than acquiring domain knowledge, because those skills transfer to many different contexts, and most of the domain knowledge needed can be learned in a short period of time.

Weekend Testing Americas with Wikipedia

Join Weekend Testing Americas on Saturday, April 6th at 12pm CST for a special session with Chris McMahon from the Wikipedia test team. This session will focus on testing new features in the Wikipedia product designed to enhance the user experience for new users. A secondary goal of this session will be to spend time having a public discussion about how Wikipedia pages on software testing can be improved.

Here are some of the pages to consider for this discussion:
http://en.wikipedia.org/wiki/Software_testing
http://en.wikipedia.org/wiki/Black-box_testing
http://en.wikipedia.org/wiki/Exploratory_testing
http://en.wikipedia.org/wiki/Session-based_test

Test charters for the weekend testing sessions can be found here:
Test charter for session

Wikipedia uses radically open software and a community of testers to test a product used across the globe. Not only are the tools open source, but the implementation is completely open as well, from the source code to the Jenkins configuration to the real-time test results. Anyone may contribute, from performing exploratory testing, analyzing test failures, and adding scenarios to be tested, to writing code.

Here are some links of interest for participants:
Wikimedia feature testing
Wikimedia weekly testing goals
Wikipedia software deployments
Wikipedia mail list

As usual, to join the session, send a message to weekendtestingamericas on Skype just prior to the session.

WTA 37: experience report

I ventured into the latest edition of Weekend Testing Americas in a different role than the one I normally fill. This time it was as a facilitator, and a first-time facilitator at that. Since this was my first time, I learned a few important lessons and owe some gratitude to JeanAnn and Michael.

JeanAnn spent time with me refining my session idea into a practical skill development session. Over the span of a few days and some (many) emails, JeanAnn helped me expand a basic idea into the outline that we based the entire session on. She was a great help.

Michael gave me the opportunity to facilitate a community event that he normally facilitates. I have a feeling he used some boy scout-ish leadership methods.

Anyway, on to the experience report!

I broke the session into a few segments for a couple of reasons: this is how I have experienced weekend testing in the past, and thinking about a few attributes of usability created a fairly tidy way to divide the session up. I had some preconceived notions of how I could break up the time for each segment. My plan was to prompt for a topic, learnability for example, give attendees 10 minutes or so to work in the software, and then round everyone up to discuss their testing and what was learned. I more or less did this, though I might do it differently next time. My concern was getting through what I had planned for the session in a reasonable amount of time, but this didn't really allow for much think time.

In hindsight, I would have followed the energy of the group a little more. This is difficult, though. It can be hard to tell whether a lull in the conversation means people are thinking or people are ready to move on. I think this may be a little easier to manage with a slightly larger group than we had.

Some of the main ways I found myself adding to the session were in asking clarifying questions. I think it is important to be able to clearly express ideas, so asking questions about ambiguous words and phrases, as well as asking people to discuss a word or phrase to come to a shared meaning, was a big aspect of this session. You'll see examples of this in the transcript when we got to the topic of efficiency. I wish I had done this exercise purposefully with each topic before beginning the hands-on part, but alas. I was able to use lessons from my coaching session with Ann Marie around the Socratic method as a way to encourage critical thinking.

Facilitating, like anything else, is a skill that must be practiced to be developed. I'm looking forward to future opportunities to do that.

A full transcript of the session can be found here.

Join me for Weekend Testing Americas #37!

So you want experience testing mobile applications? One of the biggest problems in testing mobile applications is understanding how to apply usability testing techniques. Usability testing on a mobile device is very different from usability testing on a desktop application. Mobile testers must develop a deliberate strategy to achieve thorough coverage.

The next Weekend Testing Americas session will focus on usability testing techniques for mobile applications on various mobile devices. This session will take place 3/2/2013 at 1PM EST/10AM PST and will be facilitated by me, Justin Rohrman.

Use as many different mobile devices as you have for this session.

To join this session, please do the following:
1. Add “weekendtestingamericas” to your Skype contacts if you haven’t already.
2. Fifteen minutes prior to the start of the session, please message “weekendtestingamericas”
and ask to be added to the chat session. Once we see you, we will add you to the session.

We want you all to join in on the fun, as we have a full agenda planned. Now is the time to gain experience you can put on your resume.

Check the Weekend Testing website for details not only on this session but also on upcoming sessions, as well as experience reports of past Weekend Testing sessions.

www.weekendtesting.com

Here is a session outline

Suggested Mission and Charters: Explore the Facebook Mobile interface and specifically test for usability

Assumption: Attendees must have Facebook running in a mobile environment. Native or mobile browser versions are fine.

Topic: Learnability
Pick an area of the app you are not familiar with (pair up however makes sense). Spend some time getting familiar. Discuss your learning experience. How did you learn to use the software? What made the experience difficult? What made the experience easy? What would make the software easier to learn?

Topic: Memorability
For your group, describe the part of the software you just learned. Was the functionality easy to recall? Why or why not? What would make the software more memorable?

Topic: Error Rate
Describe the errors you made while learning. What led you to make these errors? What helped you avoid errors? What improvements would help reduce the user error rate?

Topic: Efficiency
Were you able to use the software in an efficient manner? Describe what made the product efficient or inefficient for you. How would you improve its efficiency?

Retrospective/Debrief:
What did we learn in this session?
Would anyone describe usability testing differently after the hands-on exercise?
What kinds of tests can you apply when you go back to work?
Was this a useful session?