The UX team has a very nice spreadsheet it uses when conducting usability studies, and I spent quite a bit of time adapting it to my needs and filling it in with my scenario and tasks. It has a macro that lets the facilitator easily record time on task, which was helpful. It was also easier than I expected to take notes in the spreadsheet while administering a test. To that end, I conducted my first two tests this week, and they both went well and took the right amount of time. There was only one failure of one task, and I think it was due more to issues with the prototype than with the system design; the user just didn't notice that the squares I was using to indicate checkboxes were checkboxes. Once I pointed this out, she had no problem completing the task. So I'm not sure if that really counts as a task failure or just a prototype failure.
These are supposed to be checkboxes, but they are a little too large!
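For anyone curious what a time-on-task recorder boils down to, here is a minimal sketch of the idea in Python. This is purely my own illustration of the concept, not the UX team's actual spreadsheet macro (theirs lives in the spreadsheet itself):

```python
# A minimal sketch of the time-on-task idea -- my own illustration,
# not the UX team's actual spreadsheet macro.
import csv
import time

class TaskTimer:
    """Records elapsed time per task and appends results to a CSV log."""

    def __init__(self, logfile="time_on_task.csv"):
        self.logfile = logfile
        self.start_time = None
        self.task_name = None

    def start(self, task_name):
        # Facilitator starts the clock when the participant begins the task.
        self.task_name = task_name
        self.start_time = time.monotonic()

    def stop(self, outcome="success"):
        # Facilitator stops the clock; the result is logged immediately.
        elapsed = time.monotonic() - self.start_time
        with open(self.logfile, "a", newline="") as f:
            csv.writer(f).writerow([self.task_name, f"{elapsed:.1f}", outcome])
        return elapsed

# Example: one keypress starts the clock, another stops it.
timer = TaskTimer()
timer.start("Task 1: locate the checkbox settings")
input("Press Enter when the participant finishes the task...")
print(f"Time on task: {timer.stop():.1f} seconds")
```

The nice thing about this pattern is that the facilitator never has to look at a stopwatch; a single action starts or stops the clock, leaving attention free for note-taking.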
It was nice to start my testing with iSchool students and to be able to do in-person testing to work out more kinks, practice running through the tasks, and get better at reading everything aloud and then asking questions. (It made me wonder after the fact if I was supposed to get IRB approval, but since these tests are for a company and not really for UT, I think I'll be okay.) The rest of my tests will be administered remotely, which has all kinds of potential for problems. I ran through a technology test to try out Webex and see what kind of lag I was getting. I think it worked out okay. I've included a short video of how the test will appear to me while the remote user has control of my screen. The audio quality is pretty terrible, so I will avoid using my cellphone for these tests.
Next week will be more testing!