Website Usability Workshop

As of 23 May 2022, this website is archived and will receive no further updates.

understandinguncertainty.org was produced by the Winton Programme for the Public Understanding of Risk, based in the Statistical Laboratory at the University of Cambridge. The aim was to improve the way uncertainty and risk are discussed in society, and to show how probability and statistics can be both useful and entertaining.

Many of the animations were produced using Flash and no longer work.

Notes on “Finding out whether your web site is usable: four techniques”

Persona development was the first technique used to test a website’s usability. The working groups came up with several personas they thought would be using a particular website. What I found most interesting in this exercise were the assumptions made about the personas: granted, they were generalized portraits, but some of those assumptions could have been completely wrong.

For example, Margaret was an experienced university professor, and the group assumed that people would be head-hunting her, or contacting her directly in some other way, so it was unlikely she would visit the university website’s job section. If this were true, then tailoring the website for her would matter less than making changes so that other identified personas, such as students, could get the most out of it.

Further discussion might have revealed that the university gains more from attracting one Margaret than from having 100 students viewing the website. The group’s decision to discard Margaret as a user is therefore noteworthy: she may not have been a frequent visitor, but the few times she used the site could produce very important results for the university, and further tailoring might engage even more Margaret-types, whom the university badly wants to attract. We rarely discussed during the workshop the varying demands on a website, such as attracting many visitors versus attracting visitors of the right kind.

I learned three important lessons:

• Do not design a website according to whom you think is most likely to visit it, at any stage of development. Instead, first decide upon, or revisit, whom you want to attract.
• Establish which of the personas we develop will be our most important users: whose needs do we want the website to serve most, according to project goals, demands, etc.?
• Back up the team’s ideas about personas with independent research, once it is established that they are both who we think will visit the website and who we need to visit it.

We then explored the university website with the personas we had identified as likely to be visiting it. By viewing the job section through their eyes, we saw several changes that could be made:

• Some titles and links were confusing.
• Insiders were advantaged by the terminology. Viewers outside Cambridge would not recognize the terminology used, as it is very specific to Cambridge life; not even all those within would. For example, the “Academic-related” link may seem an all-encompassing term to many visitors, but those within Cambridge University, or at least its administrators and managers, know it can exclude “research” and “academic” staff. Viewers would likely click on the “All” link instead, which defeats the work that went into structuring the rest of the site and serves poorly those outside Cambridge University, supposedly a large part of the website’s audience.
• The search box was likely to be underused. Instructions or guidance would have been helpful.

Workshop participants noted that the graphics on the webpage were engaging.

We moved on to discuss the expert review, which follows essentially the same process as above, with additional questions about the user’s relationship with the website. This method is used when developers want fast results from their testing. Heuristics were introduced in this section; we were provided with several sets, including Nielsen’s. These guidelines do answer questions we have asked ourselves at UU:

• Provide consistency and standards: actions and words should mean the same thing everywhere, and the viewer shouldn’t have to guess what to do next.
• User control and freedom are important in website design. Consider easy “emergency exits” in case the viewer lands on a page they didn’t mean to be on (support undo and redo). Don’t make the viewer recall information when options and actions can be made visible.
• Every extra unit of information competes with the relevant units and diminishes their relative visibility.
• A system can cater to both inexperienced and experienced users of the website. Accelerators, unseen by the novice, may speed up the interaction between the expert user and the website.

http://www.usit.com/papers/heuristic/heuristic_list.html

We learned about “platform consistency” and mandated/accepted standards, which are valuable when building websites because “on the web, most people spend their time on web sites other than yours”. Conforming to the conventions of other popular websites, such as Google or Amazon, or a site closer to UU’s academic aspirations, may help make our site effective and familiar to visitors. It was good to find out that the question “how have others done it?”, which we have been consistently asking ourselves at UU, has a strong basis in website development practice. Here we were encouraged to look at the “competition” thoroughly, with specific questions:

• What other site or products do our users see?
• Who else asks similar questions?
• Are there any conventions that we should be exploiting?
• How are those sites organized?

Next, card sorting was explained. This method helps website developers organize information on a website while testing it on users. The exercise lets users group the website’s content under headings they identify themselves, then organize the grouped information on a flipchart with index cards. The sorting shows developers which pieces of information users grouped together and which headings they defined, indicating which topics should appear together on a page and which subjects need to be linked from separate pages (a user may have put a topic card under two headings).

There is software, Syntagm’s SynCaps, that computerizes this analysis. Results of the card sorting are entered into a matrix; close associations receive high scores, while outlying items that fall outside the clear groupings need more attention.
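The core of that matrix analysis can be sketched in a few lines. This is not SynCaps itself (its input format is not described here); it is a minimal illustration, with made-up card sorts and invented card names, of counting how often participants place the same pair of cards in one group:

```python
from itertools import combinations
from collections import defaultdict

# Hypothetical card sorts: each participant groups topic cards under
# headings of their own choosing. All names here are illustrative.
sorts = [
    {"Jobs": ["vacancies", "salary scales"], "Study": ["courses", "admissions"]},
    {"Working here": ["vacancies", "salary scales", "admissions"], "Courses": ["courses"]},
    {"Recruitment": ["vacancies"], "Students": ["courses", "admissions", "salary scales"]},
]

def cooccurrence(sorts):
    """Count how often each pair of cards lands in the same group."""
    counts = defaultdict(int)
    for sort in sorts:
        for group in sort.values():
            # Sort the pair so (a, b) and (b, a) are counted together.
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return dict(counts)

matrix = cooccurrence(sorts)
# High-scoring pairs belong on the same page; low-scoring or absent
# pairs may only need a cross-link between pages.
print(matrix[("salary scales", "vacancies")])  # grouped together by 2 of 3 participants
```

Pairs that most participants place together get high counts and are candidates for the same page, echoing the matrix interpretation above.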

The fourth method of usability testing is user testing: having the person in front of you and tracking their responses, watching how they “use” the site rather than what they learn from its content. Here, defining the scope of the test is key to analyzing the user’s experience. Testers may ask a few questions and give general direction (“tell us what you think” questioning), or give the user tasks on the website to gauge their interpretation and understanding of the site, and thus its effectiveness.

The user’s responses are noted during their perusal of the website, by the session facilitator and/or a note-taker, or using eye-tracking hardware and software. In such a session it is important to ask no leading questions and to offer options when asking the user to consider their next move.

The workshop participants discussed making the eye-tracking software available to their respective working groups, and seemed willing to act as test subjects in each other’s future website developments. UU may find this useful, and we might consider initiating an event to have them help us.

Lastly, we discussed reporting after having accomplished testing. Elements to include in the report are:

• Observations about your users, e.g. recorded notes; were the users different from the ones you wanted?
• Recommendations and areas for further research
• Details of methods, in optional appendices

Based on these recommendations and observations, changes to a website are best scoped:

• When investigating feasibility
• During requirements gathering
• When there are ideas or a prototype
• When an early design or early version is available

Good to see we are on track!