Today technically marks milestone two of my project, but there was no set deliverable for it. I had originally planned to be done with my user testing by now, but decided instead to build in a week of buffer between testing with students and employees and testing with customers, so that I would have a chance to tweak the prototype for any major issues discovered. I met with Robin and Tanya on Monday to debrief them on the user tests I have completed so far and to get clarification on the final deliverable. I will be creating a usability study report using a standard template, plus a PowerPoint deck presenting my findings and suggestions to the User Experience team and the WEM product managers. While the presentation is not strictly part of the project requirements, and I might end up giving it after the actual project end date, I think it will be good experience to talk through my project before the Capstone poster session.
I also asked Tanya about the strange, complicated formulas I noticed at the bottom of each spreadsheet tab in the usability test template, and she told me they were calculations of the confidence interval for each task. She said she doesn't bother to calculate it every time, and I told her I would look into it. I spent some time researching this and quickly got frustrated, as statistics and I do not get along (I was really worried about taking the Intro to Research Methods course). What I could glean is that the margin of error is half the width of the confidence interval, and that with small sample sizes like mine the intervals come out so wide you have to take them with a grain of salt. For a while I couldn't figure out how to actually calculate one. At least Excel has a formula for standard deviation!
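After more digging, I think the calculation boils down to mean ± t × (standard deviation ÷ √n), where t comes from the t-distribution (which is the right choice for small samples, since it widens the interval). Here's a little Python sketch with made-up task times; I'm assuming the template was computing a 95% interval on time-on-task, which I haven't actually confirmed:

```python
import math
import statistics
from scipy import stats  # only used for the t critical value

# Made-up task times in seconds for five participants; NOT real study data
times = [34.2, 41.5, 28.9, 39.0, 45.3]

n = len(times)
mean = statistics.mean(times)
sd = statistics.stdev(times)  # sample standard deviation, like Excel's STDEV

# 95% confidence interval using the t-distribution,
# which inflates the margin appropriately for small samples
t_crit = stats.t.ppf(0.975, df=n - 1)
margin = t_crit * sd / math.sqrt(n)

print(f"mean = {mean:.1f}s")
print(f"95% CI = {mean - margin:.1f}s to {mean + margin:.1f}s (margin of error ±{margin:.1f}s)")
```

If I'm reading things right, Excel's CONFIDENCE.T function does the same margin-of-error calculation, which may be what those spreadsheet formulas were spelling out by hand.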
I learned a valuable lesson this week about scheduling usability test participants: always schedule more than you think you need, or have alternates on standby. On Tuesday, one of my customer participants for next week canceled on me. I have since emailed two more customers but haven't received responses. The good news is that I don't strictly need customers for my tests, though it would have been nice. If I don't get responses by Monday, I will just ask someone else to participate; I know several former employees who have experience with older versions of the product. Also, the two current employees I already tested are actually considered customers, just internal customers, since they use OT products on a daily basis.
I spent much of this week adding functionality to the prototype in anticipation of the customer tests. The first change was making the "add folder" and "add channel" buttons appear active all the time, not just when a user has a parent channel highlighted. While I think this is how these actions should work in the real product, it might complicate the test, since the prototype still only lets the user actually add a new folder (task 2) when the correct parent folder is highlighted; there are just too many possible variations of this task to code them all correctly. The other change was letting users "uncheck" the boxes for channel selections they made during task 3 when I ask them to clear all the channel associations in task 5. Before, they had to find and use the "remove all" button, but I noticed some participants really wanted to undo the check boxes instead.
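Mostly for my own notes, here's the gist of the two changes as a little Python sketch. This is purely illustrative; the prototype isn't actually written in Python, and all the names below are invented:

```python
# Illustrative sketch of the two prototype changes; all names are made up
# and this is not how the prototype is really implemented.

def add_buttons_enabled(selected_item) -> bool:
    """State of the "add folder" / "add channel" buttons."""
    # Old: enabled only when a parent channel was highlighted:
    #   return selected_item is not None and selected_item.is_channel
    # New: always appear active, the way the real product should behave.
    return True

def uncheck_channel(checked: set[str], channel: str) -> set[str]:
    """New for task 5: unchecking one box removes just that association."""
    return checked - {channel}

def remove_all(checked: set[str]) -> set[str]:
    """Still available: the "remove all" button clears everything at once."""
    return set()
```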
This week, I leave you with another video, this time of a pseudo-usability test of Windows 8 that was linked in the latest issue of Good Experience, the user experience newsletter I've been following for years. I haven't actually seen Windows 8 for myself, but this video doesn't give me a lot of confidence.
I laughed immediately upon reading your first line. I'm reading six blog posts right now, some on Blogger, some on WordPress, some where my password works, some where for some unknown reason it does not, and none where the damn thing actually remembers who I am, so I keep deciphering those #!@# two words in a box to prove I am the idiot they think I am. On a brighter note, great that you are getting so much work done and still learning new things (e.g., confidence intervals). Great stuff.
Ugh, CAPTCHAs are the worst! They're a real accessibility concern, and they annoy everyone. I think checking data integrity should be the site owner's responsibility, not something users have to deal with.
I am still learning a lot, things I could not have even anticipated :)