Monday, February 14, 2011

Watson: Man Vs. Machine on 'Jeopardy!'

(Title linked.) Also, read this because it's cool and it deals with metadata/controlled vocabulary.

Learn to write, computer geeks!

I read a lot of posts from computer geeks. Heck, I interact with a lot of computer geeks.

I could post discussions here of RAID arrays, *nix commands and scripts, or the differences between PostgreSQL and MySQL that might illustrate a point I want to get across. (I really could; it's not that I don't know how.) It's that my reader (I hesitate to put an "s" there because I think Carol is the only person who reads me!) would be bored to tears and tell me to post something interesting.

INSTEAD, I will post about communication. (It seems to be what most interests me, so I'll use my soapbox for that.)

I was stalking some computer geeks'/writers' blogs today, or what they called blogs, and it hit me: "Day-um, this is crappy writing." I would link examples here, but I don't want to hurt anyone's feelings, because these people are blogging about programming for open source projects--I consider their endeavors valuable, useful, and important to humanity's collective knowledge. I'm a big believer in open source, so don't assume that my critiquing the writing style of open source programmers means I dislike open source or these people's programming.

I began this blog a few years ago with the lofty goal of improving open source application documentation (specifically, to improve DSpace documentation, though I've since suffered a loss of faith in DSpace), on the premise that I'd be a good candidate to do so because I understand technology and I know how to string words together. Sadly, real life intervened, and I realized I had little time to commit to an endeavor like that. Instead, I continue to follow the near-incomprehensible musings of a bunch of appie-smiths who don't know the difference between a comma and a semicolon, and I grit my teeth every time I see a comma splice.

But seriously, I came across one blog this evening that I expected to be slightly more comprehensible, and all I could think was, "What drugs was this guy doing when he wrote this?" While his code seemed solid (and apparently, he's a rock star of a programmer--this is not someone I know personally; I state that here because I don't want any of my computer geek friends seeing this and getting insulted by mistaken assumptions), his explanations of the projects for which he'd created the code were poor, to say the least.

So, computer geeks, learn to write! I'm a geek--granted, not hard-core, but I can follow code well enough to troubleshoot it and can dig through logs to figure out where something went wrong--and I _can_ communicate. I know a lot of geeks aren't big into the whole human-contact thing, but I have to admit, when a geek knows how to communicate, the job possibilities are endless, and the potential for job satisfaction is higher.

Geeks, go learn to write. Then in your cover letters say, "I'm a geek who knows how to write. I also use deodorant." (Okay, you don't really need that part.) You'll get a job for the writing part alone.

Minor rant. Conclude. End of file.

Friday, January 28, 2011

Usability Testing: What gets tested, what gets usabilified?

Should usability research be performed through grounded theory--where the results of a usability test determine how the product gets changed--or through traditional empirical theory (whether experimental, quasi-experimental, etc.), where the researcher begins with a specific question or hypothesis and tries to answer it through usability testing? What is tested? In usability testing, is the item being tested? Yes. Is the user's ability to test the software being tested? Absolutely not.

How, then, does the usability researcher proceed to determine A) the product of the usability test (end result) and B) where the user fits into the usability test?

The product of a usability test should always be the tested item's improved capability to fulfill its intended function. But how often should the intended function be adapted when users--who are, generally, very smart human beings--use the item in a fashion its creator never intended?

Usability testing relies upon de-centering the user: the user should feel no anxiety about how s/he performs and should have no leeway to consider whether or not s/he has a place in the product testing. Let me offer a sentence: "The user's working with an item provides feedback on the effectiveness of that item's design toward fulfilling an intended purpose." "Working with an item" is the subject of this sentence--NOT the user; similarly, working with the item is the subject of a usability test, and combining multiple workings with the item provides results from which conclusions may be drawn (grounded) or that will answer the questions asked.

I am fairly new to being on the testing side of usability testing; I've performed multiple usability tests as a user for various friends, and I've often discussed iterative design and usability in my work, simply because these are incredibly important issues in digital curation. But when I move to the testing side of things, I have to de-center myself, to recognize that I am not performing usability testing for any purpose beyond what the software is doing; the user and I (as the tester) are irrelevant and replaceable. What matters are the results of working with the item, and how I as a usability researcher structure those results back toward item redesign. There is no third product that emerges from this research--for instance, improved software training, as a friend and I were discussing; there is no room for a third product because usability testing is about the action performed with the item, and about how combined actions contribute to an improved item.

Sunday, January 23, 2011

Learning through observation

I am enrolled in a usability research course this semester, and my first assignment for class was to conduct a site visit, to observe someone doing his/her job. This goes directly to usability research because, to perform usability testing, one must learn to observe users, must learn to pay attention to what users do--only to take in, not to assign value, criticize, or pass judgment. Just to observe.

I greatly enjoyed the site visit, and I will post my write-up here if I get permission from the person I visited. I visited one of my former colleagues to observe her as she created catalog records for materials; when we worked together, I never had time to do this--I was always simply a beneficiary of her labor, and I had no need to pay attention to how the metadata in my system originated with this person. Now I have learned the process that goes into creating a record--both a record for our own university's search engine, and a record with OCLC that can be localized and adapted to other libraries' holdings. (I have lots of librarian friends who are welcome to post on the actual process that I watched; I'm just stating what I observed, in my own words. Please feel free to say, "You were watching this person do XYZ.")
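For non-librarian readers, here is a toy sketch (in Python, since that's what I reach for) of the shape of a descriptive record like the ones I watched being created. Real cataloging happens in MARC through OCLC's tools; every field value below is invented, and the Dublin-Core-style field names are my simplification, not the actual record format.

    # A toy sketch of a descriptive catalog record. Real records are
    # MARC records created through OCLC; these Dublin-Core-ish fields
    # and all the values are invented for illustration.
    record = {
        "title": "Proceedings of the Example Historical Society",
        "creator": "Example Historical Society",
        "date": "1911",
        "subject": ["Local history", "Societies"],
        "identifier": "oclc:000000000",  # hypothetical control number
        "description": "Bound volume of meeting minutes and papers.",
    }

    # The local catalog and the OCLC union catalog each get a version
    # of this same description; "localizing" means adjusting the
    # holdings-specific bits without redoing the intellectual work.
    for field, value in record.items():
        print(f"{field}: {value}")

The point of the sketch is just this: describe the thing once, carefully, and then many catalogs can reuse that description.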

What hit home even more for me was the goal of user observation. Understanding what people are doing and recognizing how they make things happen--in whatever job they do, it matters not; it could be making smoothies or making metadata--means developing a habit of shutting one's critical brain off, that part of the brain that says, "This user did this badly, and should have done this instead, and that's how I would fix it because I know better." Getting into the habit of watching what people do and respecting their work as people who use tools teaches us how to perform usability research. Usability research is not at all about, "Here's what would fix this product." No. It's about, "Here is how this user uses the product. Here is how the product was designed," and then letting whoever hired the usability researcher know about user habits and design intents; it's not about making judgments and criticisms.

Iterative design--the constant re-evaluation of how users use a product in order to redesign that product--requires constant observation, constant taking in, constant understanding of what people are doing with a product. Iterative design in software and websites is becoming increasingly important, particularly for library/archive websites, which are constantly gaining new information, new data, and which aren't necessarily being redesigned to accommodate that new data. For instance, at ALA in San Diego this year--which I did not attend, merely heard about--OCLC announced a new tool that harvests data from digital archives and makes their materials available in OCLC, which basically means that OCLC now recognizes digital materials (those with proper metadata and OAI-PMH harvest capability) as holdings at local libraries, the same way it recognizes physical materials. This is huge--and now it is incumbent upon local libraries to design their sites to be even more usable, and to constantly revisit their designs--to iteratively design based on how users use their information--so that they can support the high traffic that will begin to appear.
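Since I mentioned OAI-PMH: for anyone curious what "harvest capability" looks like in practice, here is a minimal sketch of an OAI-PMH harvest in Python. The repository URL is hypothetical; the ListRecords verb and the oai_dc (unqualified Dublin Core) metadata format are standard parts of the protocol that any compliant digital archive exposes.

    # A minimal sketch of an OAI-PMH harvest: the protocol that lets a
    # service like OCLC pull metadata records out of a digital archive.
    # The repository URL below is hypothetical; ListRecords and oai_dc
    # are standard to the protocol.
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE_URL = "https://archive.example.edu/oai"  # hypothetical endpoint
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    # Ask the repository for its records in unqualified Dublin Core.
    url = BASE_URL + "?verb=ListRecords&metadataPrefix=oai_dc"

    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)

    # Each <record> pairs a header (identifier, datestamp) with the
    # descriptive metadata a cataloger or archivist created.
    for record in tree.iter(OAI + "record"):
        identifier = record.find(f"{OAI}header/{OAI}identifier")
        title = record.find(f".//{DC}title")
        if identifier is not None and title is not None:
            print(identifier.text, "-", title.text)

That's the whole trick: a repository that answers requests like this one can have its records pulled into an aggregator without anyone re-keying the metadata--which is exactly why metadata quality matters so much.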

It's an exciting time. Questions of quality metadata come heavily into play, and quality metadata will rely upon strong observation of how people search. Not observation with the intent of, "I can fix this whole situation because it's not currently usable," but observation with the intent of, "Here is how users are using this right now. This is what I'm learning and taking in."