William Hudson

User Experience Strategist / Educator at Syntagm Ltd / The Open University

Abingdon, United Kingdom

Publications

Publication period start: 2000
Number of co-authors: 0

Hudson, William (2005): A tale of two tutorials: a cognitive approach to interactive system design and interaction. In Interactions, 12 (1) pp. 49-51.

Hudson, William (2005): The cost of more: psychology of choice in interaction design. In Interactions, 12 (2) pp. 71.

Hudson, William (2005): Fitts at 50: for link design, size does matter. In Interactions, 12 (3) pp. 57.

Hudson, William (2005): Playing your cards right: getting the most from card sorting for navigation design. In Interactions, 12 (5) pp. 56-58.

Hudson, William (2004): Foraging a la carte: an appetite for popup menus? In Interactions, 11 (1) pp. 63-64. https://dl.acm.org/doi/10.1145/962342.962360

Hudson, William (2004): Applying research to design: bridging a widening gap. In Interactions, 11 (2) pp. 85-86. https://dl.acm.org/doi/10.1145/971258.971290

Hudson, William (2004): My place or yours: use and abuse of research facilities. In Interactions, 11 (3) pp. 45-46. https://dl.acm.org/doi/10.1145/986253.986270

Hudson, William (2004): Inclusive design: accessibility guidelines only part of the picture. In Interactions, 11 (4) pp. 55-56. https://dl.acm.org/doi/10.1145/1005261.1005278

Hudson, William (2004): Breadcrumb navigation: there's more to Hansel and Gretel than meets the eye. In Interactions, 11 (5) pp. 79-80. https://dl.acm.org/doi/10.1145/1015530.1015573

Hudson, William (2004): Attentional gambling: getting better odds from your web pages. In Interactions, 11 (6) pp. 55-56. https://dl.acm.org/doi/10.1145/1029036.1029054

Hudson, William (2003): Don't make me read: use and abuse of text in Web page design. In Interactions, 10 (4) pp. 55-56.

Hudson, William (2003): Books and mortar: the science of Web shopping. In Interactions, 10 (5) pp. 47-48.

Hudson, William (2003): Enterprise information architecture: strategies for the real world. In Interactions, 10 (6) pp. 53-55.

Hudson, William (2002): Syntagm. In Interactions, 9 (2) pp. 95-98.

Hudson, William (2000): The whiteboard: metaphor: a double-edged sword. In Interactions, 7 (3) pp. 11-15. https://www.acm.org/pubs/articles/journals/interactions/2000-7-3/p11-hudson/p11-hudson.pdf

Hudson, William (2009): Reduced empathizing skills increase challenges for user-centered design. In: Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems, 2009, pp. 1327-1330. https://doi.acm.org/10.1145/1518701.1518901

Hudson, William (2014): Card Sorting. In: Lowgren, Jonas, Carroll, John M., Hassenzahl, Marc, Erickson, Thomas, Blackwell, Alan, Overbeeke, Kees, Hummels, Caroline, Spence, Robert, Apperley, Mark, Holtzblatt, Karen, Beyer, Hugh R., Kjeldskov, Jesper, Burnett, Margaret M., Scaffidi, Christopher, Svanaes, Dag, Hook, Kristina, Sutcliffe, Alistair G., Schmidt, Albrecht, Cockton, Gilbert, Kaptelinin, Victor, Christensen, Clayton M., Hippel, Eric von, Tractinsky, Noam, Challis, Ben, Shusterman, Richard, Hudson, William, Mann, Steve, Whitworth, Brian, Ahmad, Adnan, de Souza, Clarisse Sieckenius, Fishwick, Paul A., Grudin, Jonathan, Poltrock, Steven, Gallagher, Shaun, Dix, Alan J., Nielsen, Lene, Randall, Dave, Rouncefield, Mark, Bowman, Doug A., Kock, Ned, Cairns, Paul, Few, Stephen, Dautenhahn, Kerstin, Paterno, Fabio, Cyr, Dianne, Mortier, Richard, Haddadi, Hamed, Henderson, Tristan, McAuley, Derek, Crowcroft, Jon, Crabtree, Andy, Stephanidis, Constantine, Stappers, Pieter, Giaccardi, Elisa, Blandford, Ann, Zimmerman, John, Forlizzi, Jodi (eds). "The Encyclopedia of Human-Computer Interaction, 2nd Ed." The Interaction Design Foundation .

How to Screen Research Participants. Screening is crucial to ensure you recruit suitably qualified participants for your user research studies. Learn how to craft effective screeners to maximize the benefits of UX research.

Correlation in User Experience. When collecting data for user research, it can be tricky to establish a correlation between different datasets. Learn more about correlation and how it applies to UX.
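
By way of illustration only, the Python sketch below computes a Pearson correlation between two hypothetical UX measures; the participant numbers are invented and scipy is an assumed dependency.

```python
# A minimal sketch (not from the video): Pearson correlation between two
# hypothetical UX measures, e.g. task completion time and satisfaction rating.
# All numbers are invented for illustration; scipy is assumed to be installed.
from scipy.stats import pearsonr

completion_time = [35, 42, 28, 61, 50, 33, 47, 55]  # seconds per participant
satisfaction = [6, 5, 7, 3, 4, 6, 4, 3]             # rating from 1 (low) to 7 (high)

r, p_value = pearsonr(completion_time, satisfaction)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
# A strongly negative r would suggest longer tasks go with lower satisfaction,
# but correlation alone does not tell us which, if either, causes the other.
```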

Why and When to Use Surveys. Surveys are a relatively inexpensive tool for user research, as long as you use them wisely. Learn when surveys are appropriate in UX research, and when to use alternatives.

Writing Good Questions for Surveys. To get the most from your surveys, ensure your questions are clear and easy to understand. Here are the best practices on what good questions look like and how they should be presented to respondents.

Ensuring Quality. Questions that appear obvious to you can easily be misunderstood by your survey respondents and ruin your research efforts. Find out how you can ensure quality in your survey results in this video.

Early-Design Testing. First-click testing and tree testing are great methods to test your designs early and minimize extra work later in the design process.

Getting Started with Early-Design Tests. As with every research method, you need to decide what you're trying to find out and with whom to conduct your early-design tests. Here's more on getting started, participant recruitment and screening.

Tree Testing. Tree testing provides goal-oriented verification of a navigation hierarchy. Learn how to get started with tree testing in this video.
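
For illustration, here is a minimal Python sketch of how logged tree-test paths might be scored for success and directness; the task, node labels and paths are assumptions made up for this example.

```python
# A minimal sketch, assuming each participant's clicks through the text-only
# hierarchy were logged as a path of node labels. The task, node names and
# paths below are invented for illustration.
CORRECT_NODE = "Accessories"   # node where the task should end
DIRECT_PATH_LENGTH = 3         # shortest route from the root for this task

paths = [
    ["Products", "Laptops", "Accessories"],             # direct success
    ["Support", "Products", "Laptops", "Accessories"],  # success after backtracking
    ["Support", "Downloads"],                           # failure
]

successes = [p for p in paths if p[-1] == CORRECT_NODE]
direct = [p for p in successes if len(p) == DIRECT_PATH_LENGTH]

print(f"Success rate:    {len(successes) / len(paths):.0%}")
print(f"Directness rate: {len(direct) / len(paths):.0%}")
```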

First-Click Testing. In first-click testing, users are given a task and are asked to click on a design to indicate where they’d start. Here is some practical advice on how to get started with first-click testing.
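
A hedged Python sketch of tallying first clicks follows; it assumes each click has already been mapped to a named region of the design, and the regions and counts are invented.

```python
# A minimal sketch, assuming each participant's first click has already been
# mapped to a named region of the design. Region names and the intended
# target are invented for illustration.
from collections import Counter

first_clicks = [
    "main navigation", "search box", "main navigation", "hero banner",
    "main navigation", "footer links", "search box", "main navigation",
]
INTENDED_TARGET = "main navigation"  # where the task was expected to start

counts = Counter(first_clicks)
for region, n in counts.most_common():
    print(f"{region}: {n}")

hit_rate = counts[INTENDED_TARGET] / len(first_clicks)
print(f"Clicked the intended target first: {hit_rate:.0%}")
```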

Analytics Data Types. As a UX designer, you will encounter several types of data, such as bounce rates, conversion rates and search behavior. Let's look at these and more in this video.
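
As a rough Python sketch, two of these measures can be derived from session records along the following lines; the field names here are assumptions rather than any particular analytics tool's API.

```python
# A minimal sketch deriving bounce rate and conversion rate from hypothetical
# per-session records. The field names are assumptions for illustration and
# do not reflect any particular analytics tool's API.
sessions = [
    {"pages_viewed": 1, "converted": False},  # single-page visit: a bounce
    {"pages_viewed": 4, "converted": True},
    {"pages_viewed": 2, "converted": False},
    {"pages_viewed": 1, "converted": False},  # another bounce
    {"pages_viewed": 6, "converted": True},
]

bounce_rate = sum(s["pages_viewed"] == 1 for s in sessions) / len(sessions)
conversion_rate = sum(s["converted"] for s in sessions) / len(sessions)

print(f"Bounce rate:     {bounce_rate:.0%}")      # visits that left after one page
print(f"Conversion rate: {conversion_rate:.0%}")  # visits that reached the goal action
```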

When and Why to Use Analytics. In this video, we look at getting started with analytics and how best to apply them.

How to Fit Quantitative Research into the Project Lifecycle. Quantitative research methods can fit into the project lifecycle at different stages. Learn where they fit and which methods to apply in your UX project.

Why Care about Statistical Significance? There is an element of error involved in measuring anything. So, when we want to compare measurements, how do we decide whether any difference is due to the things being measured or due to error?
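
One common way of answering that question is a significance test. The Python sketch below applies a chi-squared test to hypothetical conversion counts for two designs; the figures are invented and scipy is an assumed dependency.

```python
# A minimal sketch, not a prescription: a chi-squared test on hypothetical
# conversion counts for two design variants. The figures are invented for
# illustration; scipy is assumed to be installed.
from scipy.stats import chi2_contingency

#           converted, did not convert
design_a = [48, 452]   # 500 visitors, 9.6% conversion
design_b = [67, 433]   # 500 visitors, 13.4% conversion

chi2, p_value, dof, expected = chi2_contingency([design_a, design_b])
print(f"p = {p_value:.3f}")
# A small p (conventionally below 0.05) suggests the difference is unlikely to
# be due to chance alone; a larger p means the data cannot rule out error.
```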