CIS Human-Computer Interaction Exam

The following theories draw from ICS 664, ICS 667, ICS 668 and the CIS HCI Exam Reading List.

Jump to:    Secondary Exam Reading List    Theories Leading to Distributed Cognition / CSCW    Topics Defining The User    Design Techniques / Frameworks    Methods for Description and Analysis    Methods for Evaluation

Links:    To-Do List / Meeting Schedule    Exam Rules


HCI Secondary Exam Reading List:

Norman, D. A. (1988). The Design of Everyday Things.
A popular book that will motivate the importance of human factors in the design of everything we use. This reading is also included as an introduction to concepts such as "affordances" and "knowledge in the world" versus "knowledge in the head" (but see Norman's later work). [DS]

About the Author / Reason for Writing the Book
Chapter 1: The Psychopathology of Everyday Things
Chapter 2: The Psychology of Everyday Actions
Chapter 3: Knowledge in the Head and in the World
Chapter 4: Knowing What to Do
Chapter 5: To Err is Human
Chapter 6: The Design Challenge
Chapter 7: User-Centered Design

 

Preece, J., Rogers, Y., & Sharp, H. (2002). Interaction Design: Beyond Human-Computer Interaction.
A typical undergraduate level textbook to introduce you to the field, including both scientific background and usability design methods. One of the few that adequately addresses affective measures. [DS & DN]

Ch. 1: Intro to Interaction Design
Ch. 2: Conceptual Models and Interface Metaphors
Ch. 3: Cognition and Mental Models
Ch. 4: Conversational, Coordination, Awareness Mechanisms and Collaboration
Ch. 5: Affective Aspects of Interfaces and User Frustration
Ch. 6: The Process of Interaction Design and Life-Cycle Models
Ch. 7: Needs and Requirements (Scenarios, Use Cases, Essential Use Cases, HTA)
Ch. 8: Prototyping, Conceptual Design and Physical Design
Ch. 9: Ethnography and Participatory Design
Ch. 10: Introduction to Evaluation
Ch. 11: Evaluation Paradigms, DECIDE Framework
Ch. 12: Observing Users
Ch. 13: Interviews, Questionnaires, Inspections and Walkthroughs
Ch. 14: User Testing, GOMS, KLM, Fitts' Law
Ch. 15: Example Applications of Interaction Design
 

Carroll, J. M. (Ed.) (2003). HCI Models, Theories and Frameworks: Toward a Multidisciplinary Science.
This collection of tutorial articles is an appropriate survey for the graduate level student. [DS]

Ch. 1: Introduction
Ch. 2: Design as Applied Perception (visual senses)
Ch. 3: Predictive vs. Descriptive Models, Fitts' Law
Ch. 4: GOMS, KLM, Advanced / Modified GOMS
Ch. 5: Cognitive Dimensions of Notations Framework
Ch. 6: Users' Mental Models
Ch. 7: Information Foraging Theory, Optimal Foraging Theory, Scatter/Gather (evolutionary)
Ch. 8: Collaborative Technologies / Distributed Cognition
Ch. 9: Cognitive Work Analysis (CWA)
Ch. 10: Clark's Common Ground Theory (as used in CMC)
Ch. 11: Activity Theory
Ch. 12: CSCW Research (Psychological foundations)
Ch. 13: Ethnography, Situated Action, Ethnomethodology
Ch. 14: Computational Formalisms
Ch. 15: Design Rationale as a Theory
 

What is HCI?  http://sigchi.org/cdg/cdg2.html

HCI Resources:  http://www.id-book.com/starters.htm

 

Functional Goals:  main aims of the system to be designed:  those things that it needs to be able to do

Usability Goals (come first):  Easy to learn, easy to remember how to use, safe, efficient to use, effective to use, good utility

User-Experience Goals (come second):  fun, enjoyable, motivating, helpful, aesthetically pleasing, rewarding, satisfying, support creativity, helpful, entertaining, ...

(top)


Theories leading to Distributed Cognition and CSCW:

Cognitive Science / Cognitive Psychology / Social Psychology
- among the fields that contribute most to HCI.  HCI draws from the social sciences as well as engineering disciplines

Activity Design / Activity Theory (see Carroll Ch. 11)
- Nardi's articles:  Five main points of Activity Theory (see end of Carroll Ch. 11)
- Scandinavian history:  rose from Participatory Design

- focuses on the relationships among actors, tasks, and community by analyzing artifacts, rules, and division of labor (the interconnected triangle diagram)
- the unit of analysis is the activity itself

- Nardi on Activity Theory:  outlines five principles of Activity Theory:

Mediation is folded into each of the other four principles, resulting in four categories of concerns.

Suchman's "Situated Action" and Activity Theory (the Scandinavians / Nardi) helped provide some of the foundation for the emerging theory of distributed cognition, which is the main support of CSCW (Computer-Supported Cooperative Work).  Situated Action differs from Activity Theory in that it is concerned with the human ability to react to sudden situations and improvise.  The unit of analysis moves from the activity to the 'situation in context', and the actor is equipped only with their thoughts as they wander into the new situation.

Lucy Suchman:  helped precipitate the adoption of CSCW

Situated Action (Suchman)

This provides the motivation for ethnomethodology, which aims to survey the social environment in which we work.  It is a sociological approach that examines the in situ production of social order.

Ethnography is a method to gain a 'structural' orientation to this environment (not concerned with the actual content of the work)
 

Clark's common ground theory (see Carroll Ch. 10)
-
Production + Comprehension = Communication
- a proposition is common ground if:  all the people conversing know the proposition, and they all know that everyone else knows the proposition.

- Grounding is the process of making sure that another person sufficiently understands you.  We pick up on whether we are understood by monitoring nonverbal behavior or by being questioned; if understanding breaks down, further grounding is needed.
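The two-part definition of common ground above can be sketched as a toy check.  This is only an illustration (the person names, fact strings, and the 'knows:X:p' encoding are invented for the sketch; true common ground recurses beyond the two levels tested here):

```python
# Toy illustration of the common-ground definition: a proposition is
# common ground if (1) everyone knows it and (2) everyone knows that
# everyone else knows it.  Only the first two levels are checked;
# the data and encoding below are hypothetical.

def is_common_ground(proposition, knowledge):
    """knowledge maps each person to the set of facts they hold;
    the string 'knows:X:p' means that person knows X knows p."""
    people = list(knowledge)
    everyone_knows = all(proposition in knowledge[p] for p in people)
    everyone_knows_others_know = all(
        f"knows:{other}:{proposition}" in knowledge[p]
        for p in people for other in people if other != p
    )
    return everyone_knows and everyone_knows_others_know

knowledge = {
    "ann": {"meeting-at-3", "knows:bob:meeting-at-3"},
    "bob": {"meeting-at-3", "knows:ann:meeting-at-3"},
}
print(is_common_ground("meeting-at-3", knowledge))  # True
```

If either person lacked the second-level fact, the check would fail: the proposition would be mutually known but not yet grounded.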
 

Distributed Cognition (see Preece Ch. 4; Carroll Ch. 8)
- draws from Cognitive Psychology and Social Psychology
- CSCW (see Carroll Ch. 12, draw from ICS 668 readings)
 

Ethnography / Ethnomethodology / Ecological Design (see Carroll Ch. 13)
- very important to, and used by Distributed Cognition / Group Collaboration / CSCW

(top)


Topics Defining the User:

Conceptual Models / Users' Mental Models (see Preece Ch. 2 & 3; Carroll Ch. 6)
-
Norman: 
There are three aspects to mental models:  the design model (the designer's conceptualization of the system), the user's model (the understanding the user develops through interaction), and the system image (what the system's appearance and behavior actually convey)

- ideally, a well-designed system is one where the user's mental model closely matches the designer's model
 

Affective Learning (see Preece Ch. 5)

Information Foraging Theory (see Carroll Ch. 7)

Errors (mistakes vs. slips) (see Norman Ch. 5)

Ten Usability Heuristics by Jakob Nielsen (listed below under Heuristic Evaluation)

(top)


Design Techniques / Frameworks:

User-Centered Design (see Norman, Ch. 7)
-
users should
be able to (1) figure out what to do (2) tell what is going on - at any given time
 

Interaction Design (Preece)
 

Participatory Design
- a form of design where users join the design team, whether for a short time or the length of the project, part-time or full-time.
- users provide subjective insight into the problem domain because they are familiar with it - although a user who joins the design team full-time can become separated from the work situation
 

Constantine & Lockwood:  USAGE-Centered Design (UCD) (vs. Data-Centered)
- scenarios on user categories (housewife, retired worker, etc.  NOT Mary, John, etc.)
- Allen's five components
- Stakeholders (methods of Interviewing, etc)
- Usage-Centered Design is a model-driven process:  user-role models (capture aspects of the relationship between users and the system being designed), task models (represent, typically through use cases, the things that users try to accomplish using the system), content models (represent the organization of the user interface, apart from its appearance and behavior)
 

Rosson & Carroll:  Scenario-Based Design (SBD)
- Analyze:  Problem Scenarios:  Describe the problem of current technology
- Design:  Activity Scenarios:  Describe the problem being solved by a new design- often corresponds to problem scenario
        Information Scenarios:  adds to activity scenario aspects of cognitive processes and affective information
        Interaction Scenarios:  further develops information scenario adding task / goal achieving information
- Claims Analysis can be done on all of the scenarios
- Prototype and Evaluate
 

Task-Artifact Framework (see Carroll Ch. 15)
 

Cognitive Dimensions of Notations Framework (see Carroll Ch. 5)
- http://homepage.ntlworld.com/greenery/workStuff/Papers/introCogDims/index.html

(top)


Methods for Description and Analysis:
- these methods were from notes and work in ICS 668.  For further description and example usage of some of these methods, please see my final project for the course (A Telementoring Workspace, a project I worked on with Viil Lid)

These methods are mainly a combination of SBD (Scenario-Based Design by Rosson & Carroll) and UCD (Usage-Centered Design by Constantine & Lockwood)

Requirements Analysis:  Gathering Information / Requirements so that we understand the work (so we can offer useful functionality) and the people (so we can make it suitable for them)

Root Concept: a shared understanding of the project's high level goals used to guide the field study and initial design.  Vision:  what are we trying to achieve?  Rationale:  why will technology help?  Stakeholder groups:  who has a vested interest?  Assumptions and Constraints:  what decisions have already been made, and what requirements have already been imposed?

Field Study / Ethnography:  'become one with the users' - get out into the problem domain and see what's going on

Workplace Themes:  we can write observations down on sticky notes and group them.  These are our workplace themes.

Artifact Analysis:  what artifacts are being used, and in what way? (not always as intended)  What does the artifact tell you about the task it supports?

Stakeholder Analysis (Roles):  who has a vested interest in the project, and what are their properties (this could be transformed into a stakeholder map)

User Role Model / User Roles Map:  a way to profile our users (User Role Model Form) and then show their interrelationships (User Roles Map)

Interviews:  direct questioning of stakeholders; can be structured, semi-structured or unstructured, with open or closed questions.  Can also use group interviews or questionnaires / surveys (such as online ones to profile a large group of stakeholders)

Problem Scenarios:  a written narrative that describes a hypothetical stakeholder carrying out actions to achieve a goal.  These highlight some problem with the current system (to be solved later through the design process)

Claims Analysis:  a follow-up method to the problem scenarios, in which a key feature is highlighted and then analyzed in terms of pros and cons (+/-) and the implications of that feature (this measures the associated tradeoffs)

Structured Representations:

Functional Requirements:  describe what the system as a whole should do.  Can be developed iteratively, and can be represented as a Data Flow Diagram

Data Requirements:  describe the structure of the system's data, typically through an Entity-Relationship Diagram (E-R Diagram), which describes artifacts and their relationships (1:1, 1:M, 0:M, M:M, etc.)

Hierarchical Task Analysis:  describes users' goals by breaking down the activities performed to achieve each goal into a hierarchy of tasks and subtasks.  Good at showing task structure, but does not explain much about the cognitive processes behind the tasks.
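The goal-task-subtask decomposition described above can be sketched as a small tree.  A minimal illustration (the 'borrow a book' goal, the task names, and the plans are hypothetical, not from the source):

```python
# Minimal sketch of a Hierarchical Task Analysis as a nested tree.
# Goal, task names, and plans are hypothetical illustrations.

class Task:
    def __init__(self, name, subtasks=None, plan=None):
        self.name = name                  # goal or task label
        self.subtasks = subtasks or []    # ordered decomposition into subtasks
        self.plan = plan                  # note on when/how subtasks are done

    def outline(self, depth=0, prefix="0"):
        """Return the hierarchy as an indented, numbered outline."""
        lines = ["  " * depth + f"{prefix}. {self.name}"]
        for i, sub in enumerate(self.subtasks, 1):
            lines.extend(sub.outline(depth + 1, f"{prefix}.{i}"))
        return lines

borrow = Task("Borrow a book from the library", [
    Task("Go to the library"),
    Task("Find the book", [
        Task("Search the catalog"),
        Task("Note the call number"),
        Task("Locate the shelf"),
    ], plan="do 1-3 in order; ask a librarian if the search fails"),
    Task("Check out the book"),
], plan="do 1-3 in order")

print("\n".join(borrow.outline()))
```

Note that, as the text says, this captures task structure only; nothing in the tree records the cognitive work behind each step.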

Activity Design:  Generate lots of ideas and start comparing them

Activity scenarios:  transform current activities to use a new design through a narrative
- use problem scenarios / requirements analysis as the foundation (they set the scene) to envision new activities that achieve those objectives
- can use metaphors, tools to support new designs
[Metaphors (click for more info)]
- judge on effectiveness, comprehension, satisfaction

Claims analysis:  (+/-) of the features being described in the scenarios; do this hand in hand with the activity scenarios
 

Information scenarios:  add presentation details and stakeholder reactions to the activity scenarios (the cognitive processes - reactions, interpretations - of those in the scenario are added in); also elaborate the claims analysis to include these

Interaction scenarios:  further develop the information scenarios, focusing on mapping user tasks to system goals
 

Use Cases / Use Cases Map:  an interaction between a user and a system, that captures some user-visible function and achieves a goal for the user

Essential Use Cases:  a type of use case that describes a meaningful / well defined task, comprised of user intentions and system responsibilities, that is described in an abstract, implementation-free form using external users in role
- these are 'essential' to the system in that they are needed for the system and are still abstract with no implementation details yet

Contexts:  a visual representation of the essential use cases, often done using sticky notes or other easily modified medium.  It pulls content areas and actions out of the essential use cases and turns them into something visual, while still being abstract and free of implementation details.  Often the 'nouns' become contents and the 'verbs' become tools.  It is one step closer to implementation, but still doesn't step into it.

Context Navigation Map:  map / diagram that shows how the contexts inter-relate with each other - how you move between contexts

(Content Model is comprised of Contexts, Navigation Map)

Abstract Layout:  taking the contexts and arranging them logically, making more of a mock-up out of them

At this point, we are ready to translate this into a prototype!

Prototypes:  see the 'evaluation' section below for more information.  The tradeoff: low-fidelity prototypes may be thrown away (wasted work), while high-fidelity prototypes risk premature commitment.  See below for more benefits of using prototypes.

(top)


Evaluation Methods and Techniques:
- done by users or experts?

The following two methods can be used for Requirements gathering as well as for Evaluation:

Interviews / Questionnaires
- Interviews:  Can be structured, semi-structured, unstructured, or group interviews, with open or closed questions
- Questionnaires:  a good way to reach large amounts of stakeholders


- testing done by users or experts?  Empirical vs. Analytic Evaluation

Empirical Testing:  Users test the system

User Testing (see Preece Ch. 14) - done on / by users
- can be "quick and dirty" involving 1-2 users for quick feedback
- often involves 5-10 users, where performance times and usability are measured
- use the DECIDE framework to help structure this
- can be costly and time consuming

More on Empirical Testing / User Testing
- testing of users in a lab setting.
- as opposed to analytic testing (like GOMS, heuristic evaluation, and other predictive models)
- equipment used depends on what is being analyzed:  can use audio, video, note taking, eye tracking, or any combination of the above.  Approaches differ in the role of the researcher and how much they interact with the users.
- important to deal with ethics:  user consent, anonymity, description / why doing study, provide results of study, etc.
- often followed by user-satisfaction surveys
 

Analytic Evaluation:  Testing is done by expert analysis

Inspections / Cognitive Walkthroughs (see Preece Ch. 13)

- Inspections:  experts commonly inspect a portion of the interface or code to assess its quality
- Walkthroughs:  can be a Pluralistic Walkthrough or a Cognitive Walkthrough.  A cognitive walkthrough involves a designer and an expert walking through a predetermined series of tasks that the interface is supposed to support, examining at each step whether the user will know what to do and will understand the response, and noting problems in the interface.  Good for identifying problems with the interface metaphor or interaction paradigm used, at relatively low cost.
 

Prototyping (low vs. high fidelity)
- low-fidelity prototypes are good for exploring alternative designs and are often used early in system design to explore alternative design implications.  They are low-fidelity because they are cheap and quick and don't have much detail.  Can be done by drawing storyboards on note cards, or by building a very simple mock-up.  Typically these are not done in software, because the software solution CONSTRAINS the design itself: the look and feel of the software package can add constraints, limiting the overall design potential.
- high-fidelity prototypes are usually done later in a project and often look much more like the end product.  They are good at giving feedback to the designers after many of the design decisions have been made. 
 

Usability Evaluation / Heuristic Evaluation (using heuristics like Norman's or Nielsen's):  done by experts

Norman's 6 Design Principles:

  • Visibility - functions can be seen

  • Feedback - necessary part of interaction

  • Constraints - ways of restricting what kinds of interaction can take place

  • Mapping - relationship between controls and what happens

  • Consistency - similar operations / use similar elements for achieving similar goals

  • Affordance - attribute of an object that allows people to know how to use it

Nielsen's 10 Usability Principles:

  • Visibility of System Status

  • Match between system and real world

  • User control and freedom

  • Consistency and standards

  • Help users recognize, diagnose and recover from errors

  • Error prevention

  • Recognition rather than recall

  • Flexibility and efficiency of use

  • Aesthetic and minimalist design

  • Help and documentation

 

GOMS / KLM / Fitts' Law (see Preece Ch. 14; Carroll Ch. 4) - done by HCI experts
- variations of GOMS include CPM-GOMS (based on MHP), CMN-GOMS (based on serial stage model), and KLM (time to execute = sum of keypresses, mousing, clicking, dragging, system response time...)
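The KLM sum and the Fitts' law prediction can be sketched numerically.  A minimal sketch, assuming the commonly cited KLM operator-time estimates (Card, Moran & Newell); the task sequence and the Fitts' law coefficients a and b are hypothetical illustrations:

```python
import math

# Keystroke-Level Model sketch:  predicted execution time is the sum of
# operator times.  The values below are the commonly cited KLM estimates
# (Card, Moran & Newell); treat them as assumptions, not measurements.
KLM_TIMES = {
    "K": 0.20,   # keystroke (average skilled typist)
    "P": 1.10,   # point at a target with the mouse
    "B": 0.10,   # mouse button press or release
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def klm_time(operators):
    """Predicted execution time (seconds) for a sequence of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical task: think, reach for the mouse, point, click, type "HCI"
sequence = ["M", "H", "P", "B", "B", "K", "K", "K"]
print(f"KLM predicted time: {klm_time(sequence):.2f} s")

# Fitts' law (Shannon formulation):  movement time grows with the index
# of difficulty log2(D/W + 1), where D is distance to the target and W
# its width.  a and b are device-specific constants; the defaults here
# are illustrative assumptions.
def fitts_time(distance, width, a=0.1, b=0.15):
    return a + b * math.log2(distance / width + 1)

print(f"Fitts' law MT for D=200, W=20: {fitts_time(200, 20):.3f} s")
```

Doubling the target width W lowers the index of difficulty by roughly one bit, which is why large, close targets are fastest to acquire.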
 

Cognitive Dimensions of Notations Framework (see Carroll Ch. 5)
- http://homepage.ntlworld.com/greenery/workStuff/Papers/introCogDims/index.html
- a set of DISCUSSION TOOLS that aid designers in discussing DESIGN DECISIONS and TRADEOFFS through a SHARED VOCABULARY
- Notational Dimensions that are discussed:  Viscosity, Visibility, Hidden Dependencies, Abstraction Level, Premature Commitment (to name a few... see Carroll Ch. 5 for a complete list)
 

(top)

