Final Report University of Waterloo Library Gateway Evaluation
A Report Submitted in Partial Fulfillment
Of the requirements for SYDE 348
Prepared by Team 3:
Minh Cao |
96081101 |
Ryan Fnnie |
98180408 |
Robert Snow |
97018134 |
Greg Stefan |
98117308 |
Justine Yau |
98062608 |
Faculty of Engineering
Department of Systems Design Engineering
March 27, 2001
Course Instructor: Professor C. MacGregor
TABLE OF CONTENTS:
- Introduction to the Problem
  1.1 Interactive Systems Problem Statement
  1.1.1 Problem Statement
- Project Boundaries
  2.1 Project Constraints
  2.2 Project Requirements
  2.2.1 Functional Requirements
  2.2.2 Data Requirements
  2.2.3 Usability Requirements
- Process Overview
- Phase 1 UCD Methods
  4.1 Summary of Phase 1 Methods
  4.2 Impact of Phase 1 Methods on Prototype Development
- Phase 2 UCD Methods
  5.1 Summary of Phase 2 Methods
  5.2 Impact of Phase 2 Methods on Prototype Development
- Phase 3 UCD Method
  6.1 Introduction: Criteria
  6.2 Methodology
  6.2.1 Participants
  6.2.2 Procedures
  6.2.3 Measures
  6.3 Results
  6.3.1 Lab Usability Testing Results: Quantitative Data
  6.3.2 Usability Questionnaire Results
  6.4.1 Criticism of Phase 3 Methodology
  6.5 Recommended Improvements to Current Prototype
- Final Specifications
  7.1 Target Audience
  7.2 CNAG's Original Concerns
  7.3 Critique of Current UW Library Gateway
  7.4 Technical Features For New Design
  7.5 General Features
  7.6 Navigation
  7.7 Logos or Banners (Graphics)
  7.8 Content
  7.8.1 Main Menu
  7.8.2 Left Navigation Bar
  7.8.3 Top Navigation Menu
  7.8.4 Bottom Navigation Links
  7.9 Prototype and Original CNAG Concerns
- Future Work
- References
Appendix A: Summary of Client Meetings
Appendix B: Team Project Task Allocations
  Phase One
  Phase Two
  Phase Three
Appendix C: Team Learning
Appendix D: Questions Asked for the Discount Usability Method (Iteration One) – Phase 2
Appendix E: The Test Monitor Scripts
Appendix F: Calculations for Keystroke Level Analysis for the 10 Questions Used in Usability Testing
Appendix H: Testing the Usability of Our Gateway Compared to the Current Gateway
Executive Summary
The purpose of this project was to study the usability of the University of Waterloo's Library web site in general, and its main Gateway in particular. The problem was reviewed and solved using the spiral design methodology, a user-centered design process, carried out over three phases. The project goal was: to design a University of Waterloo Library Gateway web page that is easy to understand, easy to navigate, and useful for study and research, allowing effective searching and retrieval of information by novice to expert UW students, faculty/staff and non-UW persons.
The first phase focussed on evaluating the usability of the current UW Library Gateway. The evaluation methods performed were heuristic evaluation, walkthroughs, task analysis and comparative analysis, used to understand users' abilities and limitations in using the current Gateway. The main outcomes found in this phase were:
Positives:
- Clean looking—no clutter [Heuristic evaluation, Competitive analysis]
- Help clearly available on main page. Help links give a descriptive idea of what type of assistance can be obtained with that particular link. [Heuristic evaluation, Competitive analysis]
- Statement on the page that this is the University's Library Gateway (provides context) [Competitive analysis]
Negatives:
- Ambiguous titles and terms (Find It, Get It, E-data, E-text…) not very descriptive or suggestive [Heuristic evaluation, Design Walkthroughs, Hierarchical Task analysis, Competitive analysis]
- Too much library jargon (TRELLIS, ILL, TUG, …) [Heuristic evaluation, Design Walkthrough, Competitive analysis]
- Inconsistent page and heading styles and content on pages after the gateway [Heuristic evaluation]
- No good navigation tools at top or side of pages [Heuristic evaluation, Competitive analysis]
- Groupings of terms not always logical, terms are not simple understandable wording [Design Walkthroughs, Hierarchical Task analysis]
- Information buried and no obvious path to get there (ex. Renewing books on-line) [Hierarchical Task analysis]
- Inflexible search capabilities (search not available on main gateway) [Heuristic evaluation, Hierarchical Task analysis, Competitive analysis]
- Links from mouse-overs may not suggest they are clickable options [Heuristic evaluation, Hierarchical Task analysis]
- No feedback for user selection (colour change, etc) [Heuristic evaluation]
- Important information not salient (visible) (Text version, etc) [Heuristic evaluation, Hierarchical Task analysis, Competitive analysis]
The usability issues found in the first phase facilitated the drafting of design requirements, in particular usability requirements. A low fidelity prototype was created to illustrate these usability specifications.
In the second phase of the project the focus was to assess the usability of two low fidelity prototypes. The two prototypes differed in how the concepts were delivered: one version explicitly displayed the terms on screen (www.ftplinc.com/sd348/prototype_dc.html) and the other displayed them in a pop-up fashion (www.ftplinc.com/sd348/prototype_alt.html). The evaluation methods used included two iterations of both the card sorting method and discount usability methods. Some of the main findings in this phase were:
First/Second Card Sort Resulting Categories:
Find Books, Journals, Data |
E-Services |
Your Patron info |
Special Services |
About the Library |
Help (for site) |
First Discount Usability evaluation (walkthroughs):
- Users preferred the pop-up prototype; however, they had greater efficiency and accuracy with the explicit prototype.
- E-Services and Special Services were ambiguous: there was some confusion over which tasks could be accomplished under each group.
- The quick links were used often when applicable
- In the pop-up prototype, users would try to click on the options in the Crest.
Third Card Sort Resulting Categories:
Find Books, Journals, Data… |
Reference Shelf |
Services For |
About the Library |
Help |
Second Discount Usability:
- Improved performance on task evaluation
- Users still tried to use description as links
The final phase assessed the last prototype using keystroke level analysis and formal lab-based usability testing. The results of the user testing were statistically analyzed using chi square analysis and t-tests. The t-tests showed that the users were unable to meet the KLA requirements. However, when average times were compared to the current library’s KLA scores for inexperienced users, 6 of the 10 questions did in fact exceed the requirements. Chi square analysis revealed that users found this prototype significantly more user-friendly in terms of keywords, graphics, navigation bar and overall layout than the current gateway.
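The chi-square comparison described above can be sketched as follows. This is a minimal illustration of a 2x2 Pearson chi-square test; the observed counts are hypothetical placeholders, not the study's actual questionnaire data.

```python
# Sketch of a chi-square test comparing user-friendliness ratings of the
# prototype vs. the current gateway. The counts below are HYPOTHETICAL.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]  # row totals
    col = [a + c, b + d]  # column totals
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            # Expected count for each cell: (row total * column total) / n
            exp = row[i] * col[j] / n
            chi2 += (obs - exp) ** 2 / exp
    return chi2

# Rows: prototype, current gateway; columns: rated "user-friendly", "not"
observed = [[8, 1],
            [3, 6]]
stat = chi_square_2x2(observed)
print(round(stat, 2))  # prints 5.84
```

With 1 degree of freedom the critical value at alpha = 0.05 is 3.84, so a statistic of this size would indicate a significant difference between the two gateways.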
The report concludes with some final recommendations based on the third phase of design, covering the overall site layout, labels, jargon, navigation issues and layout consistency. The report also discusses future recommendations for the prototype, including improved visibility of the navigation bar, improved mouse sensitivity of the mouse-overs, tags providing additional information on the actions of links, and extensive testing with a more representative sample of users.
- Introduction to the Problem
Information resources are key assets of universities. Nowhere is the amount of potential knowledge that undergraduate students, graduate students and faculty can gain from books, journals, dissertations and countless reference tools greater than at university libraries. Furthermore, with the Internet, university library users have access to these resources from anywhere, at any time.
- Overall site layout
- Graphics on the Gateway
- Mouse-over vs text layout
- Dead spaces (white spaces)
- Labels: language and jargon
- Navigation – issues
- Layout consistency (navigation)
- Clarity of top navigation bar
- Students
- Faculty
- Visually-impaired
- Undergraduate/Graduate students
- Staff
- Librarians
- Non-UW students/researchers
- Novice and experts
- Persons with disabilities
- Members of the community
- Searching
- Retrieving
- Navigating
- Contacting
- Reading
- Learning
- Mouse movements (including mouse-overs)
- Scanning
- Mouse clicking
- Typing
- Useful for study
- Useful for Research
- Main window to info
- Easy to understand
- Easy to navigate
- Special services for persons with disabilities
- Quick for experts
- Provide quick links
- High visibility of key information
- Easy access (from remote computer)
- Availability of on-line tutorials
- Forgiving mouse-overs
- Variety of languages
- Web page (Gateway)
- Graphics
- Text version
- Mouse-over version
- Minimal dead space
- Navigation bars
- Fast loading
- Useable anywhere
- Browser compatible
- Students: the project directives clearly state that the Library Gateway will be for students.
- Faculty: the project directives clearly state that for our purposes, the Library Gateway will be evaluated for students only.
- Visually impaired: Question 13 of the Web Usability Questions indicates that a user may be visually impaired.
Assumptions
- Undergraduate/Graduate students: we assumed that by students CNAG refers to both undergraduate and graduate students.
- Staff: we assumed that UW staff may also be interested in using the Libraries’ services.
- Librarians: we assumed that Librarians will be using the Gateway to help others in finding resources.
- Non-UW students/researchers: we assumed that students and researchers from other universities/colleges, especially those in our area (Wilfrid Laurier, UGuelph, Conestoga College), would like to seek additional resources from our Library, as would students from other countries on academic exchange at UW.
- Novice and experts: we assumed that the users may range from novice to experts in both using UW’s library Gateway and in using an on-line library web site.
- Persons with Disabilities: although the Web Usability Questions only addressed visually impaired persons, we assumed that persons with other disabilities may want to use the library services as well.
- Members of the Community: we assumed that members of the community may also want to use library services or find out about news/exhibits/events happening at the library.
USER ACTIVITY
Facts
- Searching: almost all Web Usability Questions involved searching for information, be it books, journals, maps, or library hours.
- Retrieving: Questions 10, 17 and 18 from the Usability Questions indicated users would like to retrieve books or journal articles from other Tri-University Group libraries (WLU, UGuelph).
- Navigating: the project directives indicate that users would need to navigate through the site.
- Contacting: Question 3 of the Usability Questions indicates that users may want to contact librarians.
- Reading: Question 9 of the Usability Questions indicates that users would like to read, for example, a journal article without coming to the library.
- Learning: Question 20 of the Usability Questions indicates that users may want to learn about the library services and how to use them through orientation sessions or workshops.
- Mouse-movements: the project directives indicate that the user may need to perform mouse-movements for mouse-over interaction styles.
Assumptions
- Scanning: we assumed that users will be scanning the web page (Gateway) for relevant information.
- Mouse-clicking: we assumed that with mouse-overs users will have to perform mouse clicks to navigate through different links.
- Typing: we assumed that to search for specific books or journals, etc, users will have to type in the title, author, publication, etc.
LEVEL OF SUPPORT
Facts
- Useful for Study: the project directives clearly state that the web site should be useful for study.
- Useful for Research: the project directives clearly state that the web site should be useful for research.
- Main window to information: the project directives clearly state that the web site should be the main window to information for users.
- Easy to understand: the project directives indicate that there may be language (jargon) issues, thus the site has to be easy to understand.
- Easy to navigate: the project directives indicate that there may be navigation issues, thus the site has to be easy to navigate.
- Special services for persons with disabilities: Question 13 of the Usability Questions indicates that special services should be offered to persons with disabilities.
Assumptions
- Quick for experts: for the website to be useful to all users, we assumed that it should be flexible to allow for expert users to quickly access the information they need.
- Provide Quick Links: this leads directly from the last idea. To allow for users to quickly access information, Quick links should be provided.
- High visibility of key information: to make navigation easy, salience of key information is important.
- Easy Access: for the website to be useful to all users, including those at their home computers, library information should be easily accessible from anywhere.
- Availability of on-line tutorials: for the website to be useful and easy to understand and navigate, extensive help functions such as on-line tutorials should be available.
- Forgiving mouse-overs: for the website to be easy to navigate, mouse-overs should be easy to use if a mouse-over layout is used.
- Variety of languages: to help a wide variety of students on academic exchanges and members of the community access information.
FORM OF SOLUTION
Facts
- Web page: the project directive clearly states that this project will evaluate and re-design a web page gateway.
- Graphics: the project directive indicates that graphics are an area of concern. Appropriate graphics should be included in the site.
- Text version: the project directives indicate that a text version of the site is a possible form of solution.
- Mouse-over version: the project directives indicate that a mouse-over version of the site is a possible form of solution.
- Minimal dead space: the project directives indicate that dead space may be an area of concern. Dead space should be minimized, however maintaining aesthetics.
- Navigation bars: the project directives indicate that navigation bars should be part of the solution.
Assumptions
- Fast loading: for accessibility to home users, we assumed that the web site pages should be fast loading for those with modem access.
- Useable anywhere: for accessibility to all users, we assumed that the web site pages should be accessible from anywhere with Internet access (that is, not limited to school labs).
- Browser compatible: for accessibility to all users, we assumed that the web pages will be functional for all browsers (Internet Explorer, Netscape) and for older versions (within reason).
- Project Boundaries
- Navigation tools
- The new gateway shall have a search tool on the main/front page of the gateway
- The new gateway shall have a request for the type of search required up front
- Patron Info and Renewals shall be included as quick links on the Gateway
- "How to find" help function shall be visible on the main/front page of the gateway
- Clear and common language used
- FIND IT and GET IT terminology shall be changed to action words that describe tasks, each followed by a noun naming the object the task is performed on
- The terms Trellis, TUGdoc and ILL shall be expanded for clarification (e.g. Trellis: our book catalogue)
- Visibility
- The new gateway shall provide feedback when making a mouse selection by means of colour change or highlighted box
- The system shall make the action of selecting an option into the text box more obvious and visible to the user by providing better mapping (eliminating the appearing hourglass)
- The text version link shall be more visible, with higher contrast against the background, and the JavaScript link shall be located in the same spot as the text version link
- The site map link shall be more visible because of its importance in use
- Consistency
- All menu bar options shall behave in the same manner (i.e. all top menu bar options will provide a pull-down menu, while all centre menu options will appear in the crest)
- All slashes (/) shall be replaced with "&" for consistency when appropriate
- A consistent colour, font and wording scheme shall be used throughout the gateway
- Help
- Help links shall give a descriptive idea of what type of assistance can be obtained with that particular link
- The new gateway shall give a complete list of possible help links through a search option
- If the help needed is not available, a method of contacting the library shall be available through "Ask Us/Tell Us"
- Process Overview
A spiral design methodology was adopted as the user-centered approach (refer to Figure 1). The process begins with an analysis of the current library gateway to determine the design requirements in terms of functionality, data and usability. The analysis tools include a heuristic evaluation, an interview, a design walkthrough, hierarchical task analysis and competitive analysis. From these, a low-fidelity design was created using the Paint Shop drawing program to show the basic layout ideas, and was implemented for testing. The testing and evaluation process consisted of two iterations, each containing the card sorting method and discount usability methods. The information gathered from an initial card sort was analyzed, and headings were formed for the library gateway page that reflected the preferences of the users. These predefined headings were used in a second round of the card sorting method. The organized concepts and terms were incorporated into our low fidelity prototypes, which underwent further user testing in a discount usability evaluation consisting of design walkthroughs with users and heuristic evaluation by the design team. The result was a medium fidelity prototype created using an actual programming language, which increased the realism of the prototype. The last iteration ends the spiral design process and involved lab-based usability testing with a range of users. The implementation of the medium fidelity prototype was evaluated using strict protocols that are outlined to ensure consistency and efficiency.
- Phase 1 UCD Methods
- Phase 2 UCD Methods
- Shuffle index cards so that all concepts and terms are randomized
- Give index cards to user(s) and ask them to group concepts into two groups. Allow users to give headings for groups. Record results after each phase.
- Shuffle up index cards, and then ask user(s) to group concepts into 5 groups
- Shuffle up the index cards, and ask user(s) to group cards under the following headings: 1) Find Books, journals, data… 2) Search for journal articles 3) About the library 4) Services for Faculty, Students… 5) Internet Resources
- Shuffle up the index cards, and ask user(s) to freely group the concepts into as many piles as they feel appropriate
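The groupings collected in the card-sort steps above can be summarised with a simple co-occurrence tally: pairs of cards that users repeatedly place in the same pile become candidates for a shared category heading. The sketch below illustrates this; the pile data are invented for the example, not the team's actual records.

```python
# Minimal card-sort analysis: count how often each pair of cards
# lands in the same pile across participants. Pile contents below
# are HYPOTHETICAL, using card labels from this report.
from itertools import combinations
from collections import Counter

def cooccurrence(sessions):
    """sessions: list of card-sort results, each a list of piles,
    each pile a list of card labels. Returns pair -> count."""
    counts = Counter()
    for piles in sessions:
        for pile in piles:
            # Sort so each pair has one canonical (a, b) ordering
            for a, b in combinations(sorted(pile), 2):
                counts[(a, b)] += 1
    return counts

# Hypothetical piles from two participants
sessions = [
    [["TRELLIS", "Journal Indexes"], ["Patron Info", "Renewals"]],
    [["TRELLIS", "Journal Indexes", "Renewals"], ["Patron Info"]],
]
counts = cooccurrence(sessions)
print(counts[("Journal Indexes", "TRELLIS")])  # 2: grouped together by both users
```

High-count pairs suggest cards that belong under one heading, which is how the resulting categories (e.g. "Find Books, Journals, Data") can be justified from the raw sorts.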
- Mouse-over versus text layout: the prototype combined the two interaction styles to get the advantages of reduced clutter by hiding some information (mouse-over) and direct information enabling concurrent processing of available options (text)
- White spaces (dead spaces): the prototype maintains some white space (to maintain little visual clutter) but less white space than the current gateway to take advantage of the display space available to communicate information
- Labels-language and jargon: the prototype attempts to clarify the labels with short descriptors of examples of what the category label may contain.
- Navigation issues: a consistent side navigation bar has been proposed for the prototype. As well, bottom links throughout the Library website pages.
- Graphics: the UW crest was chosen over the Davis Centre arch as it was decided that the crest better represented all of UW library users.
- Overall site layout: the site layout was addressed by the card sorting method and the resulting categories. As well, the Quick Links and side navigation bar allow for quick access to frequently used pages of the site.
- Layout inconsistencies: currently the prototype only addresses the Gateway page. However as a recommendation, all library website pages should have the side navigation bar and bottom links to maintain a consistent look and navigation throughout the site.
- Phase 3 UCD Method
Instructor's Note: This section had significant problems. Students were instructed to address these problems before submitting the final electronic version. While some of it was cleaned up, many problems remain, particularly with interpretation of the statistical analysis. Treat this section (Section 6) with caution.
- 6 males, 3 females
- The average age was 22.2 years (standard deviation of 1.4 years); two participants withheld this information
- 5 participants had a technical background, 4 of whom had a science-based background
- 8 of 9 participants were in the co-op program
- 5 background questions are summarised in Table 7.1
- The participant was asked to review and sign a consent form.
- The participant was then asked to answer a background questionnaire about their computer knowledge and their knowledge of the UW Library.
- The participant was then asked to carry out the 10 timed tasks, in either Version A or Version B. The only difference between the two versions was the order of the questions.
- The participant was then asked to reflect upon the experience, to evaluate the team's designed gateway on 5 different usability criteria, and to give additional commentary on the designed web page.
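The timing requirements against which the 10 timed tasks were judged came from keystroke-level analysis (Appendix F). As a rough illustration of how such a prediction is built, here is a sketch using the commonly cited Keystroke-Level Model operator times; the operator sequence shown is invented for the example, not one of the report's actual task models.

```python
# Keystroke-Level Model (KLM) sketch: predicted expert task time is the
# sum of standard operator times. The values are the commonly cited KLM
# estimates in seconds; the example sequence is HYPOTHETICAL.
OPERATOR_TIME = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Predicted time for a string of KLM operators, e.g. 'MPBB'."""
    return sum(OPERATOR_TIME[op] for op in sequence)

# e.g. think, point at a Quick Link, then click (button press + release)
print(round(klm_time("MPBB"), 2))  # prints 2.65
```

Observed participant times for each task can then be compared against the corresponding KLM prediction, which is the comparison the t-tests in Section 6.3 perform.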
- Final Specifications
The design specifications outline exactly what the web site gateway should have. The specifications were produced as a result of the design requirements (needs) gathering. Needs of the site were gathered using usability evaluation methods such as those performed in Phases One and Two and the final usability testing. This outline, along with a visual sample, should enable another team to reproduce the main design elements of the recommended gateway page.
- Target Audience
- CNAG’s Original Concerns
- Critique of Current UW Library Gateway
- Technical Features for New Gateway
- General Features for New Gateway
- Navigation for New Gateway
- Logos or Banners (Graphics) for New Gateway
- Content for New Gateway
- Prototype and original CNAG concerns
- Undergraduate students: varied level of experience with Library offerings
- Research: books, journals, web resources
- Library workshops
- Graduate students
- Research: books, journals, web resources, dissertations, government publications, special collections (rare books, archives), geo-spatial data
- TA responsibilities
- Library workshops
- Library contacts
- Faculty/Staff
- Research: books, journals, web resources, dissertations, government publications, special collections (rare books, archives), geo-spatial data
- Course preparation
- Library workshops
- Library contacts
- Persons with Disabilities
- Accessibility (Adaptive Technology Learning Centre)
- Perkins brailler
- print-enlarging service
- TTY/TDD
- Hours and locations of UW’s libraries
- How to obtain information from other libraries (University of Guelph, Wilfrid Laurier University and more)
- Department phone/email directory, policies, reports, manuals, development
- News/Events/Exhibits announcements and other news items.
- Links to other relevant sites
- Distance Education
- Using the UW Library, resources available
- Using other libraries
- Alumni
- What resources are available to alumni
- Other university students (graduate and undergraduate)
- Using the UW Library, resources available
- Other university faculty
- Using the UW Library, resources available
- Business and Community
- What resources are available to community members
- Directory of Library Department for outside researchers.
- Library announcements (news) and links (service to community)
- Information for prospective donors (library development)
- Overall site layout of Library Gateway
- Graphics on the Gateway
- Mouse-over versus text layout
- Dead spaces (white spaces)
- Labels-language and jargon
- Navigation issues
- Layout consistency (navigation)
- Clarity of top navigation bar
- Clean looking—no clutter [Heuristic evaluation, Competitive analysis]
- Help clearly available on main page. Help links give a descriptive idea of what type of assistance can be obtained with that particular link. [Heuristic evaluation, Competitive analysis]
- Good eye focus, main information attracts the eye, is centrally located [Competitive analysis]
- Statement on the page that this is the University's Library Gateway [Competitive analysis]
- Viewable with ANY browser out there (Netscape, IE, ...) [Competitive analysis]
- Banner provides some form of consistent look and feel (for some pages) [Heuristic evaluation]
- Loads quickly [Competitive analysis]
- Ambiguous titles and terms (Find It, Get It, E-data, E-text…) not very descriptive or suggestive [Heuristic evaluation, Design Walkthroughs, Hierarchical Task analysis, Competitive analysis]
- Too much library jargon (TRELLIS, ILL, TUG, …) [Heuristic evaluation, Design Walkthrough, Competitive analysis]
- Inconsistent page and heading styles and content on pages after the gateway [Heuristic evaluation]
- No good navigation tools at top or side of pages [Heuristic evaluation, Competitive analysis]
- Groupings of terms not always logical, terms are not simple understandable wording [Design Walkthroughs, Hierarchical Task analysis]
- Information buried and no obvious path to get there (ex. Renewing books on-line) [Hierarchical Task analysis]
- Inflexible search capabilities (search not available on main gateway) [Heuristic evaluation, Hierarchical Task analysis, Competitive analysis]
- Links from mouse-overs may not suggest they are clickable options [Heuristic evaluation, Hierarchical Task analysis]
- No feedback for user selection (colour change, etc) [Heuristic evaluation]
- Important information not salient (visible) (Text version, etc) [Heuristic evaluation, Hierarchical Task analysis, Competitive analysis]
- Page width: should fit in the lowest common denominator browser (640x480 total)
- Must be usable with older browser versions (available to a wide range of users)
- Quick loading. This doesn't mean instantly over a modem, but it shouldn't take more than a few seconds until the page is at least readable (a picture may take longer to finish). Most university students currently access the Internet from home using a modem.
- Javascript mouseovers (Upon mouse-over of a category title, category options appear in the crest (Main Menu), or a menu pops-up (Top Navigation bar menu)). However:
- must be usable with non-javascript enabled browsers
- must not be essential to navigation (provide a text-version)
- The following shall be avoided: blinking text (too distracting), animated GIFs (too distracting), frames (possible navigation issues) and Java applets (too unstable and inconsistent)
- The focus of the screen is the Main Menu with four (4) category links. To its right is the University of Waterloo crest that would contain the pop-up menu of clickable options for the moused-over category
- Mouse-over of the Main Menu titles provides many options on one screen without the frustration of clicking back and forth between the gateway and wrong screen selections. Discount usability testing showed that the mouse-over version was aesthetically preferred over the explicit category descriptions (too much clutter).
- Menu options provide mouse-over feedback through text colour change and background colour change
- For users who do not want to use the mouse-over Crest options then clicking on the Category title would lead to an intermediary page with the same options
- Short descriptors for Main Menu category titles help users find keywords while scanning the screen for information. This facilitates the user in selecting the correct category to view for options, and avoids "guessing" which category a term belongs to. Discount usability testing found the descriptors reduced the information "search" time compared to category titles without descriptors.
- Side Navigation bar containing Quick Search and Quick Links (of frequently accessed pages to the Library resources)
- Top Navigation bar with titles that when moused-over will drop menus for selection
- Bottom of screen contains key links to non-Library resource related pages (Site Index/Search, News/Events/Exhibits, AskUs/Tell Us, UW Home page)
- Upon mouse-over of any link, the link will turn a different colour and is also underlined to provide feedback as to the user’s selection (which is not available in current site)
- Flexibility for expert users: Quick Links to frequently accessed pages
- Consistent look and feel throughout the site. So you know you are still in the UW Library site.
- Easy to navigate web site, including a search engine from the gateway.
- Uncluttered gateway (but not so sparse that it is difficult to guess where to find info)
- Left hand navigation bar consistent throughout the web site, making frequently accessed information available: Quick Search, Quick Links. (Recall that the hierarchical task analysis and competitive analysis found Quick Links and Quick Search to be important usability requirements.)
- Library banner across the top of each page for consistent feel (visual momentum)
- As well, top menu bar available on all pages to access secondary links
- Bottom links available on all pages to access frequently used but not directly-related to library links
- Use of UW colours: black, gold, red
- UW crest
- Banner contains "The Library" (consistent with Library naming)
- Should have a basic (smaller?) version that can be incorporated on pages controlled by individuals or groups in the Department
- Find Books, Journals, Data,….
- Books in our Library (TRELLIS)
- Books in other Libraries
- Journal Indexes
- On-line Journals
- Electronic Full-Text Resources
- Electronic Data Services
- Course Reserves
- Reference Shelf
- Reference Tools
- Internet Search Tools
- Services For
- Faculty and Staff
- Graduate Students
- Undergraduate Students
- Persons with Disabilities
- Distance Education
- Alumni
- Business and Community
- Help
- Trellis Help
- On-line Instructions
- Site Index/Search
- How to Connect from Home
- Research Guides
- Quick Search
- Search by: title, subject, author, journal, etc
- Quick Links:
- Hours and Locations
- Patron Info
- Renewals
- Course Reserves
- TRELLIS: our catalogue
- Journal Indexes
- On-line Journals
- About the Library
- Hours/Locations
- Guide to the Libraries
- Staff/Administration
- Library Development
- Library Classes
- Adaptive Learning
- How Do I….?
- Find:
- Books
- Journal articles
- Research in my department
- Electronic maps
- On-line Journals
- On-line resources
- Obtain resources from:
- UWaterloo
- UGuelph/WLU/Annex
- Other Libraries
- Connect from Home
- Find:
- Library Forms
- TUG doc
- InterLibrary Loan (ILL)
- CISTISource
- Self Registration
- Reserves
- Buy It
- Local Library Sites
- Laurier Library
- Guelph Library
- TUG Homepage
- Kitchener Public Library
- Conestoga Library
- UW Homepage
- News/Events/Exhibits
- Ask Us/Tell Us
- Site Index and Search
- Overall site layout: the site layout was addressed by the card sorting method and the resulting categories. As well, the Quick Links and side navigation bar allow quick access to frequently used pages of the site.
- Graphics: the UW crest was chosen over the Davis Centre arch because it was decided that the crest better represented all UW library users.
- Mouse-over versus text layout: the prototype combined the two interaction styles, gaining the reduced clutter of hiding some information (mouse-over) and the direct information that enables concurrent processing of available options (text).
- White space (dead space): the prototype maintains some white space (to keep visual clutter low) but less than the current gateway, taking advantage of the available display space to communicate information.
- Labels, language, and jargon: the prototype clarifies the labels with short descriptors giving examples of what each category may contain.
- Navigation issues: consistent side and top navigation bars have been proposed for the prototype, as well as bottom links throughout the Library website pages.
- Layout inconsistencies: currently the prototype only addresses the Gateway page; however, it is recommended that all library website pages carry the side navigation bar and bottom links to maintain a consistent look and navigation throughout the site.
- Clarity of top navigation bar: the prototype took advantage of the prime location of the top navigation bar by using pop-up menus to allow greater information storage without the clutter.
- Future Work
Many of the issues found through this project could not be completely resolved due to time constraints or the limited technical ability of the group members. Future recommendations for the prototype include: improved visibility of the navigation bar, improved mouse sensitivity of the mouse-overs, tags providing additional information on the actions of links, and extensive testing with a more representative sample of users.
- References
- Funato, K., Matsuo, A., & Fukunaga, T. (2000). Measurement of specific movement power application: Evaluation of weight lifters. Ergonomics, 42(1), 40-54.
- Heinrichs, K., Perrin, D., Weltman, A., Gieck, J., & Ball, D. (1995). Effect of protocol and assessment device on isokinetic peak torque of the quadriceps muscle group. Isokinetics and Exercise Science, 5, 7-13.
- Nielsen, J. (1994). Usability Inspection Methods. New York: John Wiley and Sons, Inc.
- Norman, D. (1988). The Design of Everyday Things. New York: Doubleday/Currency.
- Stanton, N. (1998). Human Factors in Consumer Products. Bristol, PA: Taylor & Francis Ltd.
- Wichansky, A. (2000). Usability testing in 2000 and beyond. Ergonomics, 42(7), 998-1006.
Students invest a great deal of money in attending university to gain knowledge, so none of the channels through which they gain that knowledge should be blocked; in particular, library resources should be accessible. The accessibility of information on the World Wide Web (WWW) depends on the usability of the library web page.
The University of Waterloo Library’s Community Needs Assessment Group (CNAG) expressed concern about possible usability issues with the University of Waterloo (UW) Library Gateway. This problem has been reviewed using a user-centred design process. By assessing the users’ abilities and limitations, recommendations for design changes have been made. The goal of this project is to design a library gateway page that provides students and staff with effective access to an essentially limitless body of information.
To understand the problem, a multi-disciplinary team consisting of Kinesiology and Systems Design Engineering undergraduate students was formed to examine the users of the interface and the tasks those users perform. A spiral design methodology comprising four iterations was used to tackle the problem. The first phase examined UW’s current web page using the usability inspection method of heuristics and the usability evaluation methods of walkthroughs and task analyses. A comparative analysis of the Gateway against other library gateways was also performed to study the usability of these library sites. Usability issues found through these four evaluations helped determine the design requirements for phase two and facilitated the creation of a low-fidelity prototype. The scope of phase one therefore covered the first iteration of the analysis, design (planning), and implementation phases of the spiral design method.
The second phase involved an evaluation of the low-fidelity prototype to complete the full cycle of the spiral design methodology. Evaluation is an important stage that allows for user input, which can be used in the design of a medium-fidelity prototype. The usability assessment tools used were the card sorting method and discount usability methods, which involved a heuristic evaluation and a design walkthrough. As a result, further design changes were made and revised specifications proposed.
The final phase involved a lab-based usability test, which received ethics approval, whose purpose was to evaluate the medium-fidelity prototype. The design team followed a strict protocol for the usability testing to ensure consistency and accuracy across users. Resolution of the presented problem is offered in the form of recommendations to the CNAG for a new, revised, and improved UW Gateway page design based on all usability testing.
1.1 Interactive Systems Problem Statement
The initial tests should concentrate on the Library top gateway. Additional tests will concentrate on site navigation issues and usability of individual pages within the site.
Important aspects to test:
(From the Team Project Handout (SD348))
As well, twenty (20) Web Usability Questions were provided highlighting some of the key services that the Library provides, and information that users may need.
The key terms and ideas have been underlined in the above instructions and will be used to develop the interactive systems problem statement (ISPS). Key terms have also been pulled from the Usability Questions for the same purpose.
1.1.1 Problem Statement
To design a University of Waterloo Library Gateway web page that is easy to understand, easy to navigate, and useful for study and research, for effective searching and retrieving of information to be used by novice to expert UW students, faculty/staff and non-UW persons.
Table 1 is a summary of the key terms grouped into components of the ISPS. Following the table is a discussion of each of the components and terms. The last statement of this section will thus be the Problem Statement itself.
Table 1: Components of Interactive Systems Problem Statement
| | Facts | Assumptions |
|---|---|---|
| Users | | |
| User Activity | | |
| Level of Support | | |
| Form of Solution | | |
Components and Terms
USERS
Facts
2.1 Project Constraints
The standards (rules and regulations) set by the clients are an organizational factor that may restrict the potential of a design solution, since these standards must be met. The design team is also constrained in its ability to perform heuristic evaluations by its lack of experience and exposure to usability testing. As well, the programming capability of the multi-disciplinary team was challenged by the length of time needed and the fact that some members were untrained in programming. The team has attempted to overcome all the project constraints presented, and has completed and resolved the problem to the best of its ability.
2.2 Project Requirements
2.2.1 Functional Requirements
2.2.2 Data Requirements
2.2.3 Usability Requirement

Figure 1: Spiral Design Methodology applied to Evaluation of Library
4.1 Summary of Phase 1 Methods
The first usability evaluation method used on the library gateway was a heuristic evaluation based on Norman’s and Nielsen’s heuristics. Heuristic evaluation is a usability inspection method that sets guidelines to serve as a reminder of important aspects of design during the inspection of consumer products or software systems. The expert analyst makes judgements to ensure compliance with recognized usability principles, or heuristics. Heuristic evaluations are advantageous because they bring awareness to user-centred design issues and suggest solutions in a cost-effective way. The intent of this method is to provide a solid starting point and a quick evaluation of the problem.
Contributions to the heuristic evaluation performed on the Library gateway were made by all five team members through individual evaluations followed by a group brainstorming session. Afterwards, one team member completed a more thorough heuristic evaluation using two specific sets of usability heuristics: one adapted from Nielsen's ten usability guidelines, and the second, Norman’s principles of design for understandability and usability (Norman, 1988).
The heuristic evaluation provided insight into the abilities and capabilities of the users and the usability requirements of the new design. Several usability problems were identified with the current Gateway. There were problems with feedback from mouse-overs and when selecting items, and some of the terminology was ambiguous. The site map and search options were not visible, and there was a lack of consistency in the way items were laid out; for example, there was inconsistent use of slashes and "&" signs.
The second method used in the evaluation of the current library gateway was a Design Walkthrough. Design walkthroughs are a data collection method where the user is given a task with a specific goal and a tester or evaluator watches how the user attempts to complete the task. The evaluator watches for and records user errors and discusses with the user particular problems they may have with the system. The intent of this method was to see how users were interacting with the current gateway and to assess how the users go about completing tasks. The most obvious way of collecting data about a person’s interaction with a device is watching and recording the interaction (Stanton 1998). This method was to give insight to specific areas that were problematic to users.
Two group members were assigned to the task of performing walkthroughs. Each group member tested four participants, two experienced and two inexperienced. The walkthroughs took place one on one with the participant in a private setting. The evaluator recorded errors, whether the task was completed and the time it took to complete the task. The tasks of the walkthroughs were based on the twenty gateway questions provided by the library staff. The participants were read the question and upon completion of the question the evaluator began timing. Observations of errors were recorded as well as a verbal protocol to give an indication of the user’s thought process while attempting to complete the task.
The walkthrough method revealed some specific problems, including the fact that users did not correlate the task of finding books with the Trellis option. In addition, there seemed to be confusion surrounding the options Find It and Get It; the terms are too general and too similar. Participants also found it difficult to find electronic resources such as dictionaries, and had difficulty linking patron information and renewals with the term Get It. Finally, the walkthroughs demonstrated that participants were unsure of how to find information on connecting from home.
The third method for assessing the usability of the gateway was Task Analysis. Task analysis is an evaluation method that is used to gather design requirements, in particular usability requirements. Usability requirements outline what the Library Gateway needs to be like for the users. A task analysis is a technique to assess the tasks users carry out to achieve a goal. The hierarchical task analysis (HTA) is a top-down decomposition of the goal. That is, the goal is divided into sub-tasks, with each level representing a lower level of abstraction.
The group chose to evaluate 3 goals. 1) A general task analysis for Find It; and two more task-specific goals: 2) Obtain a book and 3) Discover when books are due, corresponding to UW Library CNAG’s Web Usability Questions #1 and #12, respectively. Question #1 (Obtain a book) was chosen to be evaluated by a task analysis because the group felt that since this is one of the main reasons users go to the Library web site, it should be analyzed comprehensively. Question #12 (Discover when books are due) was chosen because it was not specifically covered by another method, and to experience performing a task analysis with another genre of task (Get It as opposed to Find It for obtaining a book). The general Find It category was chosen for evaluation because it contains the most used tasks for the Library gateway.
After performing the task analysis, it was discovered that users might not know that there is an interaction with the pop-up menu; it appears that the user can click on the heading and be taken to a new page. The task analysis also indicated that there was too much library jargon on the gateway: terms such as Trellis, catalogue, and barcode have little meaning to an inexperienced user. Another problem was that, to get to patron info, the user must click through three pages rather than following a direct link. Finally, this method helped to illustrate the need to change the terms Find It and Get It.
The final usability analysis tool used to assess the gateway was a competitive comparison between the University of Waterloo and five other library gateway pages. A competitive comparison, or benchmarking, ranks multiple products against one another to identify strengths in their designs. Six gateway pages were compared under a heuristic evaluation and ranked against its guidelines by four of the members of the group. For the benchmarking evaluation, each of the six library gateway pages was ranked on each of the ten given heuristics, with the highest-ranking gateway page receiving a rank of 1 and the lowest a rank of 6. The overall rank for each category was determined by averaging the ranks across the separate evaluations.
The competitive analysis revealed that the Dalhousie University Library gateway was the top-ranked library gateway. Some key characteristics caused gateway pages to be ranked high. Top-ranked sites tended to have clean, clear pages with no cluttering of information, and to group information into logical categories with simple, descriptive headings. Top-ranked sites also had search tools visibly located on the main page, as well as a help page that was clearly located on the main page and contained a help search option.
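The rank-aggregation step can be sketched as follows. The evaluator ranks and the unnamed site labels below are hypothetical stand-ins, not the team's actual data:

```python
from statistics import mean

# Hypothetical ranks (1 = best, 6 = worst) assigned by four evaluators to six
# library gateways for one heuristic; the real evaluation data differed.
ranks = {
    "Dalhousie": [1, 2, 1, 1],
    "Waterloo":  [4, 3, 5, 4],
    "Site C":    [2, 1, 2, 3],
    "Site D":    [3, 5, 3, 2],
    "Site E":    [6, 4, 4, 6],
    "Site F":    [5, 6, 6, 5],
}

# The overall rank for a heuristic is the mean of the evaluators' ranks;
# sorting by that mean gives the final ordering for the heuristic.
overall = sorted(ranks, key=lambda site: mean(ranks[site]))
print(overall[0])  # Dalhousie (lowest mean rank under these hypothetical scores)
```

The same averaging, repeated over all ten heuristics, produces the overall standings described above.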
4.2 Impact of Phase 1 Methods on Prototype Development
All the methods used indicated that there should be some sort of search tool on the main page to allow quick, direct searches. All methods also indicated a need to clarify the distinction between the terms Find It and Get It. These terms were replaced by the term "Find books, journals, data…", and the information contained in the "Get It" category was reorganized under other headings. In addition, the design walkthroughs and the task analysis revealed that the term Trellis needed to be expanded and clarified. Another design change influenced by all the design methods was the need for increased feedback when making a mouse selection. This included mouse-over pop-ups and changing the colour of link text when the mouse passes over it.
Three out of the four methods (Task analysis, walkthroughs, and competitive analysis) revealed a need for a quick link option for tasks such as patron info, reference tools, renewals and a "How to Find" help function. The competitive analysis influenced the decision to change the location of the button bar from the top of the screen to the bottom. The competitive analysis also influenced the decision to give a complete list of possible links for help through a search option.
The heuristic evaluation influenced a number of design changes. First, the inconsistencies in format needed to be addressed. The text version link was in the upper left corner when using the JavaScript version, while the JavaScript link was in the upper right in the text version. These links should be in the same location, and both were placed in the upper right corner of the first phase design. Other consistency issues identified by the heuristic evaluations included replacing the slashes with "&" signs and making all the links on the button bar take the user to a new page. For an illustration of how the design changes came together to create the first phase design, see the copy of the first phase, low-fidelity prototype in Figure 2.

5.1 Summary of Phase 2 Methods
The card sort method is a categorization method where users sort cards depicting various concepts into several categories. The concepts may be functions or terms, and the categories are groupings that make sense to the user (UCD Course notes, 2001). This enables designers to capture the users’ mental models of how the system should work (the links between the functions/tasks). This technique is best used in the early stages of the design process to gain an understanding of how the user sees the system working, and apply this to the design.
To facilitate the design of category headings and the items they should contain, two card sorting methods were performed. The design of the category headings and groupings is an integral component of the Library website’s usability. It was consistently found in previous evaluations of the present library web page that the current category terms were ambiguous and were not suggestive of what the categories contained. The first phase of the card sort method was used to determine headings for the Gateway that the users found more descriptive and intuitive. The headings discovered from the first round of card sorting were used to perform a second card sort. The purpose for performing a second card sort using different users was to find out the location that each subheading should belong to within the predefined heading. The second phase was also performed to evaluate initial changes, therefore rendering evidence to maintain changes or suggestions to further modify the existing ones. The card sort method can provide a fresh perspective of how the terms can be associated with each other, and provide insight into more intuitive groupings of similar concepts. The combination of the two phases of card sorting was used to produce a prototype to be tested using discount usability.
Discount usability enables designers and developers to utilize the benefits of user testing with minimal resources. Nielsen (1994) emphasizes that developers often want to use the "best" methods for usability testing to achieve the "best" results, but it is often the case that when the goals are set too high, no method is used at all. He states that it is better to use some form of usability testing rather than none. Usability testing methods can be highly expensive due to cost of equipment and the salary of usability experts, and its complexity can be intimidating to those who have not been exposed to it. Therefore, discount usability methods are employed to reduce cost and intimidation.
Discount Usability is a rapid spiral method of design evaluation. This method uses heuristic evaluation and user walkthroughs to quickly identify areas of concern within a design. After areas of concern are identified, quick design changes are made to the designed interface.
The evaluation technique of discount usability greatly assisted the design of the Library gateway. Allowing key users to interact with the design, listening to their criticisms, and observing any difficulties they experienced gave the team further insight into the design. This insight allowed the team to iteratively improve the design, producing a superior design to the one brought into this stage, as well as the confidence needed to proceed to the final prototyping stage.
Application of the card sort method required each team member to perform a card sorting method on one experienced and one inexperienced user (may be done in pairs of people with same level of experience and resulting in a collaboration of end data) using the following standard procedure:
The second card sort was consistent with the initial one, except that users were asked only to group the concepts into the following headings: 1) Find Books, Journals, Data… 2) E-Services 3) Your Patron Info 4) Special Services 5) About the Library 6) Help.
The discount usability method was a two-iteration procedure. For the first iteration, three users were tested on two separate designs, referred to as the Pop-up design and the Explicit design.
Before the start of each evaluation, the user was told briefly that the designs were for the Waterloo Library Gateway and that the work was being done for academic credit in SYDE 348. This was not scripted; the information was passed on in a colloquial format. Each user completed six questions, stated in Appendix D. These six questions were asked with respect to both designs, and each user evaluated both designs sequentially. Two users started with the Pop-up design, and the third with the Explicit design.
The team member administering the evaluation was not given any extra information regarding the designs during the evaluation. The prototype designs used in the evaluation were paper-based. The Explicit design was on one sheet of paper, while the Pop-up design was on multiple sheets, with the team member interchanging the sheets for the user to view depending on the actions taken. At the end of the evaluation, the users were asked about their previous experience with the Waterloo Library gateway and were given a rating of high or low experience. Subsequently, Nielsen's heuristics were applied and the information was recorded.
The procedural approach to the second iteration of discount usability testing was very similar to the initial testing. Deviations from the primary testing consisted of having users evaluate one online prototype instead of the paper mock-ups used in the initial iteration. The updated prototype combined the explicit descriptors with pop-up descriptions. Categories now had short descriptions to aid the user in mousing over the appropriate category, and a full list of what the category contains pops up to validate and solidify the user's choice before s/he commits to the link and leaves the Gateway.
There was some difficulty in analyzing the card sorting data. The most useful data came from the free-form task because it allowed the users to group the items as they wished under headings of their own choosing. There were also some common trends in the free-form task, such as the creation of headings like help, services, Internet services, and search tools. The results of the second-stage card sort were used to reorganize the gateway information. Redundancy was found in that participants put some terms in two or more categories; for example, Course Reserves was placed in both ‘Find Books, Journals, Data…’ and ‘Your Patron Info’.
From this discount usability iteration, several areas of concern were identified and addressed in a new prototype. It was found that the users preferred the pop-up prototype but had greater efficiency and accuracy with the explicit prototype. Thus a compromise between function and aesthetics was reached by combining the two interaction styles. Another issue was that some category names were found to be ambiguous, for example the group headings E-Services and Special Services in the pop-up prototype; there was some confusion over which tasks could be accomplished under each of these two groups. Heuristic findings suggested that specific terms were also misunderstood. Specifically, for E-text and E-data, we recommend that these terms be clarified as Electronic Full-Text Resources and Electronic Data Services, respectively. Walkthroughs also found that all three users tried to use the description text in both prototypes as linking text, which was not the team's intention. From this we realized that we will either have to ensure that the description text in no way looks like an active hyperlink, or make all descriptions active hyperlinks. At this point, we decided to pursue the first option, with the understanding that this may change in the future. On a positive note, the quick links were used heavily when applicable and are strongly recommended at this stage.
Within the second iteration of discount usability, the outcome of the evaluation was very positive. One area that could still be addressed is a clearer distinction between what is a link within a page and what is not: again, users attempted to use the descriptions of each group as detailed hyperlinks to the desired pages. This is a possible area of modification for future design changes. The other point of interest lies with the graphical crest with descriptive text. It was noticed that the inexperienced users were unaware that this text changes when mousing over each separate menu grouping. This could be addressed in terms of visibility, with possible colour changes of the text per menu choice, or with the layout or font of the text within the crest.
5.2 Impact of Phase 2 Methods on Prototype Development

Figure 3: Proposed prototype design for Phase Three
www.ftplinc.com/sd348/prototype_PhaseTwo.html.
The category links were titled according to the analysis of the card sort method. The categories remain similar to the current gateway groupings because it was found that it was not the categorization of terms that hindered users from finding information but the fact that category titles were vague. For those category titles that were not self-explanatory (e.g. Reference Shelf, Services For, About the Library), short examples were displayed to help the user in the decision-making process as to which category s/he should select to complete the desired task. These examples provide hints about the type of information contained in the category. Scanning for keywords on one screen proved to be faster than scanning for keywords on multiple (pop-up) screens, as found in the first iteration of the discount usability testing.
Several users expressed their preference to a minimalist design with less clutter (the pop-up design), although they performed the tasks much faster with explicit descriptions on the screen. Thus a compromise was reached between function and aesthetics. The pop-up design was maintained to reduce text on the screen and thus increase white space (reduce clutter/information overload). Short examples give a hint as to what the category may contain and reduce time required for mental preparation by the user in deciding/choosing which of the categories may contain the desired task/term.
In the current library gateway, selecting a link from the main category title (Find It, Get It, etc.) does not provide any feedback to the user. This was found to be a usability problem through the heuristic and task analyses performed in the first phase of the project. The pop-up description should clearly indicate that the options are not links. Currently it only provides validation to the user as to whether s/he has selected the right category before committing to that link and leaving the library gateway (Figure 4: Proposed prototype: cursor moused-over Reference Shelf category title).

Figure 4: Proposed prototype: cursor moused-over Reference Shelf category title
An added "Help" capability was included in the navigation bar: a menu listing frequently asked questions of the form How do I…? (e.g. How do I find a book? How do I get a book from the University of Guelph? How do I connect from home?) It was found through card sorting and discount usability testing that this was needed; help is an important component of usability. Thus, further to the Help provided in the main frame, the user has access to answers to frequently asked questions that may not be directly or explicitly contained in the current Help options. Note: not all the possible How Do I…? questions have been listed; the menu is presented to get the design idea across, but the Help capabilities are not fully functional.
At that time, the Quick Links were placed at the bottom because the menus had to be above them; otherwise the screen would cut off the menu options. This was not a design intention but a technical limitation. The Quick Links should have been closer to the top, since they would be used more often and people tend to read from the top down; this has since been rectified in later prototypes. As well, the drop-down menus were not of good design quality, and better ones were designed. We hope to add a counter to each link on the gateway page to keep track of the most used links, which will ultimately determine the order in which the quick links are listed on the side navigation bar.
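The proposed link counter could work along the following lines. This is only a sketch: the click log below is hypothetical, and a real implementation would record each Quick Link selection server-side.

```python
from collections import Counter

# Hypothetical click log; the real gateway would log each Quick Link
# selection as users click it.
clicks = ["TRELLIS", "Renewals", "TRELLIS", "Hours and Locations",
          "TRELLIS", "Renewals", "Course Reserves"]

counts = Counter(clicks)

# Order the Quick Links most-used first, as proposed for the side
# navigation bar.
quick_links = [link for link, _ in counts.most_common()]
print(quick_links[0])  # TRELLIS
```

Re-sorting the list periodically from the accumulated counts would keep the most frequently used links at the top of the bar.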
The bottom screen contains other links that are important to keep visible but are not necessarily frequently accessed. It is recommended that these links should also be accessible through each library site page.
All links change colour, as well as become underlined upon mouse-over to provide feedback as to user selection.
Thus the prototype has attempted to address the client (CNAG)’s original concerns:
6.1 Introduction
Usability testing is a widely used technique for evaluating user performance and acceptance of products and systems. Although usability testing may not be the most efficient technique for the discovery of usability problems, it is a relatively quick and reliable method for quantitatively estimating users' performance and subjective satisfaction with products. One of the major trends in usability evaluation is website testing (Wichansky, 2000). When testing websites for usability, one of the main aspects evaluated is the graphical user interface.
Test Protocols are the methodology or formatted structure upon which a set of experiments/evaluations are based. This structure for evaluation gives the researcher an advantage of ensuring that variance is not entered into the data collected through information delivery to the user. A script is a predetermined set of instructions, and actions, which the evaluator follows while administering an experiment/evaluation. With this script the researcher can guarantee to a higher level of certainty that all participants within an experiment were given equal instruction and prompting while completing the task.
6.2 Methodology
6.2.1 Participants
For the study, there were 9 participants. Basic demographic information follows:
| Question | Average | Standard Deviation |
|---|---|---|
| Familiar with Computers | 4.5 | 0.7 |
| Familiar with Internet | 4.4 | 0.8 |
| Design of WebPages | 2.9 | 1.4 |
| UW Library WebPages | 3.1 | 1.3 |
| UW Library Services | 2.8 | 1.1 |
6.2.2 Procedures
A summary of the procedures is as follows.
6.2.3 Measures
A Keystroke-Level Analysis (KLA) was completed for each of the tasks asked of the user. The KLA for each task can be found in Table 2, with detailed information on how each of the times was calculated in Appendix F. Each KLA was calculated as if the users were unfamiliar with the website (which happened to be the case). The KLA was adjusted for the reading time needed to find the correct link. The assumptions are that each link has approximately 2 words and that each word takes 0.5 s to read; therefore, one second is added to the keystroke-level analysis for every link that must be scanned. Scanning ceases once the user finds the link.
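The reading-time adjustment can be written out as a small calculation. The base time and link count below are hypothetical illustrations, not figures taken from Appendix F:

```python
# A sketch of the reading-time adjustment described above. The base time and
# link count used in the example are hypothetical.

def adjusted_kla(base_time_s: float, links_scanned: int) -> float:
    """Keystroke-level time plus 1 s of reading per link scanned."""
    read_time_per_link_s = 2 * 0.5  # ~2 words per link x 0.5 s per word
    return base_time_s + read_time_per_link_s * links_scanned

# A task whose pure keystroke-level model gives 4.1 s, where the user scans
# 12 links before finding the correct one:
print(adjusted_kla(4.1, 12))  # 16.1
```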
Table 2: Summary of Times for Keystroke Level Analysis

| Question | Exp. (sec) | Adv. (sec) |
|---|---|---|
| 1 | 16.1 | - |
| 2 | 13.55 | - |
| 3 | 18.1 | - |
| 4 | 7.1 | 2.65 |
| 5 | 14.1 | - |
| 6 | 12.1 | - |
| 7 | 11.1 | 2.65 |
| 8 | 10.1 | - |
| 9 | 2.65 | - |
| 10 | 9.1 | - |
6.3 Results
6.3.1 Lab Usability testing Results—Quantitative Data
Table 3: Results of times (in seconds) to complete each question

| Participant # | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | AVG. |
|---|---|---|---|---|---|---|---|---|---|---|
| Version | A | A | A | A | A | B | B | B | B | |
| Q.1. Department site | 39 | 2.4 | 132 | 5 | 15 | 105 | 12 | 5.9 | 20 | 37.37 |
| Q.2. Journal from WLU | 18.4 | 44.4 | 31.5 | 35 | 40 | 21 | 109 | 32 | 12 | 38.14 |
| Q.3. Alumni | 3.1 | 20 | 4.4 | 10 | 5 | 9.2 | 8.2 | 3.9 | 3 | 7.42 |
| Q.4. Hours, Archives | 13.6 | 9.7 | 3.1 | 4.5 | 45 | 5.2 | 2.2 | 1.5 | 3 | 9.76 |
| Q.5. Kitchener Library | 43.1 | 18.6 | 16.3 | 25 | 45 | 52.8 | 9.5 | 8.5 | 185 | 44.87 |
| Q.6. E-Dictionary | 25.1 | 10.1 | 5.6 | 6.8 | 7 | 10.2 | 11 | 3.2 | 7 | 9.56 |
| Q.7. Andrew Pyper | 7.2 | 7.2 | 6.2 | 3.6 | 18 | 7.1 | 5.7 | 1.8 | 33 | 9.98 |
| Q.8. Connecting fr. home | 5.5 | 2.7 | 5.6 | 10.1 | 5 | 8.6 | 10.8 | 6.8 | 18 | 8.12 |
| Q.9. Renew | 3.7 | 2.1 | 3.1 | 1.7 | 3 | 40.5 | 6 | 1 | 2 | 7.01 |
| Q.10. Liaison Librarian | 45.1 | 44.6 | 44 | 35.7 | 35 | 9.3 | 36.1 | 64 | 88 | 44.64 |
As a result of the lab-based usability testing with nine users, a t-test was used to analyse the significance of the difference between the averaged times and the keystroke-level analysis results (refer to Table 6-4). There was high variability in the times to complete each question across participants, which is reflected in the large standard deviations.
Table 6-4: Calculated t-test values for each question (version A)

| Question | AVERAGE | KLA | StdDev | StdError | T-test |
| --- | --- | --- | --- | --- | --- |
| 1 | 37.37 | 16.1 | 47.75 | 2.65 | 8.02 |
| 2 | 38.14 | 13.6 | 28.55 | 1.59 | 15.50 |
| 3 | 7.42 | 13.1 | 5.41 | 0.30 | -18.89 |
| 4 | 9.76 | 7.1 | 13.79 | 0.77 | 3.47 |
| 5 | 44.87 | 14.1 | 54.97 | 3.05 | 10.07 |
| 6 | 9.56 | 12.1 | 6.34 | 0.35 | -7.23 |
| 7 | 9.98 | 11.1 | 9.74 | 0.54 | -2.07 |
| 8 | 8.12 | 10.1 | 4.51 | 0.25 | -7.89 |
| 9 | 7.01 | 2.65 | 12.64 | 0.70 | 6.21 |
| 10 | 44.64 | 9.1 | 21.65 | 1.20 | 29.55 |

T critical (df = 8, α = 0.05) = 2.31
Comparing the average times to the KLA times for each question, all t-tests are significant except for one question (Does the UW library have a copy of Andrew Pyper's Lost Girls?). The decision is therefore to reject the null hypothesis for every t-test except the one for that question. There is a 5% chance that the results could have occurred by chance, i.e. the confidence level is 95%. The significant difference between the average times and the KLA values indicates that the participants were unable to come close to the benchmark times. Of the nine significant questions, three had negative t-test values, meaning that the average times actually fell below the expected KLA times. Questions 3, 6, and 8 of version A (Q#3: Can alumni borrow books? Q#6: Does the library have electronic dictionaries? Q#8: Where can you find instructions for connecting from home?) were, on average, performed faster than the calculated benchmark times for an inexperienced user.
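The t statistics used here can be reproduced from the reported mean, benchmark, and standard error for each question. The sketch below is illustrative, not part of the original report; the values for Question 1 (version A) are taken from the results table, and the standard error is used exactly as reported there.

```python
# One-sample t-test comparing the observed mean completion time for a question
# against its KLA benchmark, as in the analysis above.

def t_statistic(mean, benchmark, std_error):
    """t = (observed mean - KLA benchmark) / standard error of the mean."""
    return (mean - benchmark) / std_error

# Question 1 (version A): mean 37.37 s, benchmark 16.1 s, SE 2.65
t_q1 = t_statistic(mean=37.37, benchmark=16.1, std_error=2.65)
print(f"t = {t_q1:.2f}")        # prints t = 8.03 (reported as 8.02; rounding)

T_CRITICAL = 2.31               # two-tailed, df = 8, alpha = 0.05
print(abs(t_q1) > T_CRITICAL)   # prints True: the difference is significant
```

A positive t value means the observed times were slower than the benchmark; a negative value (as for Questions 3, 6, and 8) means the users beat it.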
Repeated t-tests were then performed on the three questions that exceeded the benchmarks for inexperienced users, this time using the benchmarks for experienced users. The results were t (two-tailed, df = 8) = 7.72, 12.06, and 12.05 respectively, p < 0.05, against a t critical (df = 8, α = 0.05) of 2.31. The results remain significant, but the values are positive in this case, indicating that on average the users did not meet the requirements as outlined by the KLA for experienced users.
In order to determine whether the performance of the "average" user meets the requirements, confidence intervals were calculated. With a probability level of 0.05, only one question (#7, version A) meets the requirements as calculated by the KLA. From Table 6-5, it is evident that the benchmark values for all questions but #7 fall outside the minimum and maximum values; therefore we can conclude that the users were unable to obtain the benchmark values as we had hoped.
Table 6-5: Confidence Intervals for each question (version A)

| Question & Description | Xmin | Xmax | Benchmark | Meets the req'ments? |
| --- | --- | --- | --- | --- |
| Q.1. Department site | 31.25 | 43.48 | 16.1 | NO |
| Q.2. Journal from WLU | 34.49 | 41.80 | 13.6 | NO |
| Q.3. Alumni | 6.73 | 8.12 | 13.1 | Exceeds |
| Q.4. Hours, Archives | 7.99 | 11.52 | 7.1 | NO |
| Q.5. Kitchener Public Library | 37.82 | 51.91 | 14.1 | NO |
| Q.6. Electronic Dictionary | 8.74 | 10.37 | 12.1 | Exceeds |
| Q.7. Andrew Pyper | 8.73 | 11.22 | 11.1 | YES |
| Q.8. Connecting from home | 7.54 | 8.70 | 10.1 | Exceeds |
| Q.9. Renew | 5.39 | 8.63 | 2.7 | NO |
| Q.10. Liaison Librarian | 41.87 | 47.42 | 9.1 | NO |
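The interval bounds and the three-way judgement above can be reproduced as follows. This is an illustrative sketch, not from the original report; it assumes the half-width of each interval is t-critical (2.31) times the reported standard error, which matches the tabulated Xmin/Xmax values.

```python
# Confidence interval around each mean time, and the classification used above:
# "Exceeds" when even the slowest plausible average beats the benchmark,
# "YES" when the benchmark lies inside the interval, "NO" otherwise.

T_CRITICAL = 2.31  # two-tailed, df = 8, alpha = 0.05

def classify(mean, std_error, benchmark):
    x_min = mean - T_CRITICAL * std_error
    x_max = mean + T_CRITICAL * std_error
    if x_max < benchmark:
        return "Exceeds"        # users were faster than the benchmark
    if x_min <= benchmark <= x_max:
        return "YES"            # benchmark falls within the interval
    return "NO"                 # users were slower than the benchmark

# Values (mean, SE, benchmark) from the version A results:
print(classify(37.37, 2.65, 16.1))   # Q1  -> prints NO
print(classify(7.42, 0.30, 13.1))    # Q3  -> prints Exceeds
print(classify(9.98, 0.54, 11.1))    # Q7  -> prints YES
```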
To examine how the averaged times for the nine users compare to the benchmark times of the current gateway, t-tests were performed to check for significant differences (refer to Appendix G). All t-test values were found to be significantly different from the required score using a t-critical (df = 8, two-tailed, p = 0.05) of 2.31. Confidence intervals were also calculated for each of the 10 questions, and 6 of the 10 questions were found to exceed the requirements. This means that the average times from the lab-based testing are significantly faster than the benchmarks; in other words, performance on the prototype version is quicker than performance on the current gateway for those six questions.
6.3.2 Usability Questionnaire Results
Table 6-6: Chi-Square Test Results (refer to Appendix H for raw data):
χ² critical (df = 2, α = 0.05) = 5.99
The critical value of 5.99 was obtained from a lookup table. The results indicate that all criteria show a significant difference between the observed preferences of the users: χ² (df = 2, p < 0.05) = 12.67, 18.00, 8.33, 6.67, and 18.00 respectively. We can conclude that users prefer the usability of the new gateway with respect to key words, major links, graphics, the navigation bar, and the overall layout.
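The χ² values can be reproduced from the raw counts in Appendix H. In this illustrative sketch (not part of the original analysis), the expected count of 3 per category comes from splitting the 9 responses evenly across the three response categories under the null hypothesis of no preference.

```python
# Chi-square goodness-of-fit test for one questionnaire criterion.
# With 9 participants and three categories, the expected count is 3 each.

def chi_square(observed, expected=3.0):
    """Sum of (O - E)^2 / E over the response categories."""
    return sum((o - expected) ** 2 / expected for o in observed)

# "Key Words" counts from Appendix H: [not user-friendly, user-friendly, neutral]
key_words = [0, 8, 1]
x2 = chi_square(key_words)
print(f"chi-square = {x2:.2f}")   # prints chi-square = 12.67

CRITICAL = 5.99                   # df = 2, alpha = 0.05
print(x2 > CRITICAL)              # prints True: preference is significant
```

The same function applied to the other criteria yields the remaining values (e.g. a unanimous 9-0-0 split gives 18.00).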
6.4 Discussion:
The lab-based usability test showed that 6 of the 10 questions were performed faster on the prototype than the benchmarks of the current library gateway. It is possible to conclude that these 6 questions are more intuitive and that the usability issues around them are better addressed in the prototype version. However, the calculated KLA benchmark values may have been overestimated through an over-emphasis of the reading time needed to scan the page. The longer a participant is exposed to the gateway, the less reading and scanning time is required to accomplish a task: as each question is read to the participant, they build a visual picture, or mental model, of the web page, so less time is needed to read it over. Reduced scanning time results in progressively faster completion times as exposure to the web page increases. A learning effect across the questions may need to be analyzed further.
The background of the participants indicates that they are familiar with computers (average of 90% familiarity) but lack familiarity with the current UW library web page (average of 60% familiarity) and services (average of 56% familiarity). This suggests that the participants may be unfamiliar with library terms and "models": they were learning about the library while also trying to work out the usability of the gateway, which may have increased the time to complete a task.
It should also be mentioned that keystroke-level analysis was not originally designed for graphical user interfaces, and so may not reflect benchmark times very well.
The analysis comparing the prototype version of the gateway to the current gateway was tremendously positive. The analyses indicate that our participants found the prototype version to be user-friendly in terms of keywords, major links, graphics, the navigation bar, and the overall layout. An open-ended question was asked to obtain comments and feedback, which will help in the improvement of the final prototype.
6.4.1 Criticism of Phase 3 Methodology
Looking at each of the packages the following can be seen:
Within the Information and Consent Form: There is a question on whether or not the participant would like to receive a summary of the results of the study. This summary, however, may not be meaningful to the participant, as it would come from a technically dense report. The form assumes that the course administration will be responsible for summarizing this information, and that the summary will cover all six project groups.
Within the Background Questionnaire: This is a subjective questionnaire, in that people are asked to judge their own experience level in different aspects of the study. As such, it relies on each participant to give an accurate representation of their skills relative to the norm, which in turn depends on the participant's confidence. A more accurate understanding of each participant's experience level could be obtained through a standardised test. It must be understood that all of the pertinent background information gathered is of a subjective nature.
Within the Test Monitor Scripts: The test monitor scripts make no allowance for answering questions during the evaluation. If an inexperienced library user is being tested, they may not understand the concept behind the question being asked; a level of library knowledge is therefore assumed on the part of the participant that could be inaccurate. In addition, at no point in the task script is the participant actually required to find the requested information through the designed Library Gateway page; participants were free to obtain the desired end result any way possible.
Within the Data Collection Form: There was no standard method for the evaluator to enter comments for each completed task. As a result, the comments and descriptions of how each task was completed vary greatly between evaluators. This could be countered by using only one evaluator, or by agreeing on a group consensus regarding the format and content of comments before testing.
Within the Usability Questionnaire: This section directly reflects the team's success in developing a user-friendly product, which in itself raises no concern. However, when the participants are chosen from among our peers, there is a strong possibility that their rating of the design will be skewed by friendship. Knowing that the evaluator was also the designer could likewise make participants reluctant to offend, again skewing the scores.
6.5 Recommended Improvements to Current Prototype
There are three main types of design requirements: functional (what the system needs to do), data (what information needs to be available) and usability (what the system needs to be like for the user) requirements.
This section summarizes design features, both functional and technical, that we recommend the University of Waterloo Library Gateway web pages to have. The section may be used as a stand-alone document for the web site specifications.
Overview
7.1 Target Audience
Internal to UW:
7.2 CNAG’s Original Concerns
7.3 Critique of Current UW Library Gateway
Positives:
7.4 Technical Features for New Design
Recommended
7.5 General Features
7.6 Navigation
It is not in the scope of this project to make design changes to pages beyond the main gateway; however, it is recommended that a collaborative project involving all owners of the Library site pages be undertaken to give every site page consistent navigation tools and content layout.
7.7 Logos or Banners (Graphics)
7.8 Content
Various usability evaluation methods have led to a model of how the site information should be laid out, as well as the naming of terms. The Card Sort method facilitated the grouping of terms, while design walkthroughs and hierarchical task analysis highlighted which terms should be renamed or clarified.
The indented lists are those terms in the sub-menu that pops-up upon mouse-over of the category title.
7.8.1 Main Menu
7.8.2 Left Navigation Bar
7.8.3 Top Navigation Menu
7.8.4 Bottom Navigation Links
7.9 Prototype and Original CNAG Concerns
Appendix A: Summary of Client Meetings
Topics of Discussion
Client Meeting #1 – (informal)
By: Design Team #3
Date: Thursday, January 25, 2001 Time: 10:30am
Place: Dana Porter Cafe
Present: Annette Dandyk, Minh Cao, Ryan Finnie, Rob Snow, Greg Stefan, Justine Yau
- Questions directed to Annette:
- What is your experience with the Library Gateway?
- What is the best way to get a hold of you?
- Of the other Library Pages, which do you prefer?
- Why did you like the chosen page?
- What are your expectations of the project? Of us?
- Our approach to redesigning the current Library Gateway
- Explained usability testing and how it will be conducted
- Addressed the time frame of when components will begin and be completed
- Issues that need to be explored further:
- Orientation sessions of library; how many people attend? Are they effective?
- Which sites are most visited?
- Technical aspects of Gateway set-up
- Requests from Annette
- Minutes from meetings sent to Annette
- Would like visibility on testing done
- Will review reports before submittal
Appendix B: Team Project Task Allocations
Phase One
| Task | Contributors |
| --- | --- |
| Summary | Minh |
| Introduction | Minh |
| Interactive Systems Problem Statement | All team members |
| Heuristic Evaluation | Primary: Justine; Secondary: All team members |
| Interview with Expert User | Primary: Justine |
| Design Walkthroughs | Primary: Ryan, Greg |
| Task Analysis | Primary: Minh |
| Competitive Analysis | Primary: Rob; Secondary: All team members |
| Proposal for Changes and Draft Specifications | Primary: Justine; Secondary: All team members |
| Future Team Goals | Greg |
Phase Two
| Task | Contributors |
| --- | --- |
| Introduction | Justine |
| Process Overview | Justine |
| Description of low fidelity prototype | Minh |
| Prototypes were created by | Minh |
| Medium Fidelity prototype Justification | Minh |
| Card Sorting method | Justine, Ryan, Greg; testing carried out by all members |
| Introduction (What, why) | Minh |
| Procedure (how) | Justine |
| Results | Justine, Ryan, Greg |
| Analysis/conclusion | Ryan |
| Discount Usability | Minh/Rob; testing carried out by all members in 2nd iteration |
| Introduction | Rob |
| Procedure | Rob/Greg |
| Results | Team effort |
| Analysis/conclusion | Minh/Rob |
| Overall Conclusion | Ryan |
| References | Rob |
| Task Allocation | Greg |
Phase Three
| Component | Contributors |
| --- | --- |
| A. Executive Summary | Ryan |
| B1. Introduction to the Problem | Justine |
| B2. Revised Interactive Systems Problem Statement (and details) | Minh |
| C1. Project Constraints | Justine |
| C2. Project Requirements | Justine |
| D. Phase 1 UCD Methods | Ryan |
| Summary of Phase 1 Methods (Intent and Method Specifics) | Ryan |
| Impact of Phase 1 Methods on Prototype Development | Ryan |
| E. Phase 2 UCD Methods | Greg |
| Summary of Phase 2 Methods (Intent and Method Specifics) | Greg |
| Impact of Phase 2 Methods on Prototype Development | Greg |
| F. Phase 3 UCD Method (Lab-Based Usability Testing) | Rob |
| Intent of Usability Testing | Rob |
| Methodology | Rob |
| Results & Discussion | Justine/Rob |
| G. Final Specifications (Recommendations & Justification) | Minh |
| H. Summary of Client Meetings | Greg |
| I. Team Project Task Allocations | Greg |
| J. Team Learning | Greg |
| K. Process Overview/Editing/Formatting | Justine/Minh |
- The tasks were delegated in the above manner to maximize individual group member expertise and experience.
Appendix C: Team Learning
The design process to improve the usability of the University of Waterloo's library gateway page has provided significant learning to all group members. This project promoted learning in the context of doing: it gave our team hands-on opportunities to learn about user-centered design (UCD) and evaluation. Applying heuristic evaluations, walkthroughs, task analysis, and competitive analysis provided valuable experience in assessing human performance issues and their relation to design. Applying UCD principles and techniques to human-computer interaction (HCI) fostered the learning of empirical research methods and procedures that can effectively evaluate systems and their interfaces. Relevant methods used were the card sorting method and the discount usability method; these led to the collection of information, the practice of problem-solving skills, and ultimately the production of a fully functional prototype. Multiple iterations of the latter methods emphasized the importance of a spiral design approach and the need for it to be implemented early in the design process.

Additionally, as group members we developed and fine-tuned our group dynamics. At the beginning of the project, we were a diverse group of individuals with separate areas of experience and competence. Over the course of the project, we molded our diversity into an effective multi-disciplinary team that could tackle a variety of situations and problems. We focused on a common goal, promoted interaction, shared leadership roles, and practiced individual and group accountability. In hindsight, our initial dedication to working as a high-performance cooperative learning group was very high, but as midterms and other projects arose, our intensity declined. Alternatively, this could have been a response to the weighting of the marks not equaling the effort being put forth.
Overall as students, this project provided the opportunity to develop practical and procedural skills that prepare us for becoming human factor practitioners.
Appendix D: Questions Asked for the Discount Usability Method (Iteration One) –Phase 2
1. How would you find if the library has Margaret Atwood's Alias Grace? |
2. I need to contact the librarian for my department - how do I find his/her phone or e-mail? |
3. Does the library have any electronic dictionaries? |
4. Where can I find a database in which to locate articles on Anthropology subjects? |
5. How can I discover when my UW library books are due? |
6. Where can I find instructions about connecting from home? |
Appendix E: The Test Monitor Scripts
TEST MONITOR SCRIPTS
Note to Test Monitors:
Please read the text in Times-Roman (this font) out loud to each participant. The text in capitals (Arial) consists of instructions to you and should not be read out loud.
LIST OF USER-TESTING TASKS:
| Instructions to Participant, Information/Consent & Background Questionnaire | (5 minutes) |
| Library Tasks using current design or redesign | (10-20 minutes) |
| Usability Questionnaire | (5 minutes) |
EQUIPMENT YOU NEED TO BRING TO TESTING SESSIONS:
- testing script
- data collection forms
- a watch to time events
- clipboard (or something to write on)
- extra paper for making notes
Step 1: INSTRUCTIONS TO PARTICIPANTS.
NOTE: MAKE SURE THAT THE WINDOW FOR THE WEB PAGE IS MINIMIZED SO THAT THE PARTICIPANT CANNOT SEE IT UNTIL THE TRIALS ARE READY TO BEGIN.
Thank you for agreeing to participate in our study entitled "Usability Testing of the UW Library Gateway Web Page".
This study is being carried out by the class members of SD 348 (User-Centred Design) as part of our course requirements. In order to make sure that all participants receive the same information, I am going to read to you from this script.
We will start by going over the information letter for this study. It will explain the objectives of this study and the tasks that we will ask you to perform.
GO OVER INFORMATION LETTER, AND ASK IF PARTICIPANT HAS ANY QUESTIONS.
ASK PARTICIPANT TO READ OVER AND SIGN CONSENT FORM.
Step 2: BACKGROUND QUESTIONNAIRE
Before we have you use the UW Library Gateway Web Page we would like to ask you some general questions about yourself and your computing experience.
ASK STUDENT TO FILL IN THE BACKGROUND QUESTIONNAIRE.
Now for the main part of the study.
Step 3: LIBRARY TASKS
Before we start, I need to remind you that we are testing the usability of the web page design and not your ability to use the system. I will be taking notes as you work through the tasks and recording how long it takes someone to find the appropriate links and where people may have difficulties using the web page. This will help us to improve upon our design before our presentations next week.
I will be asking you to work through a series of tasks that people normally associate with libraries. We will be going through each task one right after the other and then I will give you an opportunity to comment about the web page and tasks once they are all done.
For each task, you will start with your hand on the mouse and the cursor at the bottom centre of the screen. I will then read the task out loud and once I am finished you are to try to find the appropriate link on the web page as quickly as possible. I will tell you when you have found the appropriate link which will end that task. I will then remind you to move the cursor to the bottom centre of the screen before I read the next task.
- BRING UP WINDOW AND ASK PARTICIPANT TO PLACE HAND ON MOUSE AND MOVE CURSOR TO BOTTOM CENTRE OF SCREEN.
- READ FIRST TASK FROM SHEET.
- START TIMER.
- STOP TIMER WHEN PARTICIPANT HAS CLICKED ON APPROPRIATE LINK THAT MOVES USER OFF OF THE GATEWAY.
- NOTE ANY PROBLEMS ON OBSERVATION SHEET BEFORE MOVING TO NEXT TASK.
- REMIND PARTICIPANT TO MOVE CURSOR TO BOTTOM CENTRE AND KEEP HAND ON MOUSE WHILE YOU READ NEXT TASK OBJECTIVE.
ONCE ALL TASKS ARE COMPLETED, ASK PARTICIPANT TO FILL OUT USABILITY QUESTIONNAIRE.
Step 4: USABILITY QUESTIONNAIRE
As a final task we would like you to answer some questions about the usability of the UW Library Gateway Web-Page design that you just used.
Appendix F: Calculations for Keystroke Level Analysis for 10 Questions used in Usability testing.
Q1. Where would you go to find general help with a research topic in your department?
KLA- Experienced user with current phase three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (4 word chunks) | 4.0 s |
| 3. | Scan middle menu (3 word chunks) | 3.0 s |
| 4. | Point to link (Services for…) | 1.10 s |
| 5. | Mentally prepare (as reads menu); assume hand remains on mouse | 1.35 s |
| 6. | Scan options (3 word chunks) | 3.0 s |
| 7. | Point to link (Undergraduate) | 1.10 s |
| 8. | Click on link (Undergraduate) | 0.20 s |
| | Total | 16.1 s |
Q2. How do you get an article from a journal that is available at WLU?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (2 word chunks) | 2.0 s |
| 3. | Point to menu option (How do I …) | 1.10 s |
| 4. | Mentally prepare (as reads new menu); assume hand remains on mouse | 1.35 s |
| 5. | Scan top menu (2 word chunks) | 2.0 s |
| 6. | Point to option (Obtain Resources) | 1.10 s |
| 7. | Mentally prepare (as reads new menu); assume hand remains on mouse | 1.35 s |
| 8. | Scan menu (2 word chunks) | 2.0 s |
| 9. | Point to link (Guelph/WLU/Annex) | 1.10 s |
| 10. | Click on link (Guelph/WLU/Annex) | 0.20 s |
| | Total | 13.55 s |
Q3: Can alumni borrow books from UW?
KLA- Experienced user with Phase Three prototype

| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (4 word chunks) | 4.0 s |
| 3. | Scan middle menu (3 word chunks) | 3.0 s |
| 4. | Point to link (Services for…) | 1.10 s |
| 5. | Mentally prepare (as reads new menu); assume hand remains on mouse | 1.35 s |
| 6. | Scan menu (1 word chunk) | 1.0 s |
| 7. | Point to link (Alumni) | 1.10 s |
| 8. | Click on link (Alumni) | 0.20 s |
| | Total | 13.1 s |
Q4: What are the hours for the University Archives?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (1 word chunk) | 1.0 s |
| 3. | Point to link (About the Library) | 1.10 s |
| 4. | Mentally prepare (as reads menu); assume hand remains on mouse | 1.35 s |
| 5. | Scan menu (1 word chunk) | 1.0 s |
| 6. | Point to link (Hours/Locations) | 1.10 s |
| 7. | Click on link (Hours/Locations) | 0.20 s |
| | Total | 7.1 s |
KLA- Advanced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Point to link (Hours/Location QuickLinks) | 1.10 s |
| 3. | Click on link (Hours/Location QuickLinks) | 0.20 s |
| | Total | 2.65 s |
Q5: What books are available at the Kitchener Public Library?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (4 chunks) | 4.0 s |
| 3. | Point to link (Local Sites) | 1.10 s |
| 4. | Mentally prepare (as reads menu); assume hand remains on mouse | 1.35 s |
| 5. | Scan menu (5 chunks) | 5.0 s |
| 6. | Point to link (Kitchener Public Library) | 1.10 s |
| 7. | Click on link (Kitchener Public Library) | 0.20 s |
| | Total | 14.1 s |
Q6: Does the Library have electronic dictionaries?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (4 word chunks) | 4.0 s |
| 3. | Scan middle menu (2 word chunks) | 2.0 s |
| 4. | Point to link (Reference Shelf) | 1.10 s |
| 5. | Mentally prepare (as reads menu); assume hand remains on mouse | 1.35 s |
| 6. | Scan menu (1 word chunk) | 1.0 s |
| 7. | Point to link (Reference Tools) | 1.10 s |
| 8. | Click on link (Reference Tools) | 0.20 s |
| | Total | 12.1 s |
Q7: Does the UW library have Andrew Pyper’s Lost Girls?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (4 word chunks) | 4.0 s |
| 3. | Scan middle menu (1 word chunk) | 1.0 s |
| 4. | Point to link (Find Books, Journals, Data) | 1.10 s |
| 5. | Mentally prepare (as reads new menu); assume hand remains on mouse | 1.35 s |
| 6. | Scan menu (1 word chunk) | 1.0 s |
| 7. | Point to link (Books in our Library: Trellis) | 1.10 s |
| 8. | Click on link (Books in our Library: Trellis) | 0.20 s |
| | Total | 11.1 s |
KLA- Advanced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Point to link (Trellis: Our Catalogue QuickLinks) | 1.10 s |
| 3. | Click on link (Trellis: Our Catalogue QuickLinks) | 0.20 s |
| | Total | 2.65 s |
Q8: Where can I find Instructions for connecting from home?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (2 word chunks) | 2.0 s |
| 3. | Point to arrow of menu (How do I …) | 1.10 s |
| 4. | Mentally prepare (as reads new menu); assume hand remains on mouse | 1.35 s |
| 5. | Scan menu (3 word chunks) | 3.0 s |
| 6. | Point to link (Connect from Home) | 1.10 s |
| 7. | Click on link (Connect from Home) | 0.20 s |
| | Total | 10.1 s |
Q9: Where would you go to renew books online?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Point to link (Patron Info QuickLinks) | 1.10 s |
| 3. | Click on link (Patron Info QuickLinks) | 0.20 s |
| | Total | 2.65 s |
Q10: What is the contact information for the Liaison Librarian assigned to your department?
KLA- Experienced user with Phase Three prototype
| Step | Operator | Time |
| --- | --- | --- |
| 1. | Mentally prepare (after hearing sentence) | 1.35 s |
| 2. | Scan top menu (1 word chunk) | 1.0 s |
| 3. | Point to arrow of menu (About the Library) | 1.10 s |
| 4. | Mentally prepare (as reads new menu); assume hand remains on mouse | 1.35 s |
| 5. | Scan menu (3 word chunks) | 3.0 s |
| 6. | Point to link (Staff/Administration) | 1.10 s |
| 7. | Click on link (Staff/Administration) | 0.20 s |
| | Total | 9.1 s |
Appendix G: Lab-based Usability testing Results Comparing Averages to Current Library Gateway benchmarks
| Question | AVERAGE | KLA | StdDev | StdError | t-test value |
| --- | --- | --- | --- | --- | --- |
| 1 | 37.37 | 18.50 | 47.75 | 2.65 | 7.11 |
| 2 | 38.14 | 15.50 | 28.55 | 1.59 | 14.28 |
| 3 | 7.42 | 21.50 | 5.41 | 0.30 | -46.83 |
| 4 | 9.76 | 16.50 | 13.79 | 0.77 | -8.81 |
| 5 | 44.87 | 22.50 | 54.97 | 3.05 | 7.32 |
| 6 | 9.56 | 21.50 | 6.34 | 0.35 | -33.94 |
| 7 | 9.98 | 13.50 | 9.74 | 0.54 | -6.51 |
| 8 | 8.12 | 22.50 | 4.51 | 0.25 | -57.33 |
| 9 | 7.01 | 17.50 | 12.64 | 0.70 | -14.94 |
| 10 | 44.64 | 18.50 | 21.65 | 1.20 | 21.74 |

T critical (df = 8, α = 0.05) = 2.31

Confidence Intervals:

| Question & Description | Xmin | Xmax | Benchmark | Meets the req'ments? |
| --- | --- | --- | --- | --- |
| Q.1. Department site | 31.25 | 43.48 | 18.50 | NO |
| Q.2. Journal from WLU | 34.49 | 41.80 | 15.50 | NO |
| Q.3. Alumni | 6.73 | 8.12 | 21.50 | Exceeds |
| Q.4. Hours, Archives | 7.99 | 11.52 | 16.50 | Exceeds |
| Q.5. Kitchener Public Library | 37.82 | 51.91 | 22.50 | NO |
| Q.6. Electronic Dictionary | 8.74 | 10.37 | 21.50 | Exceeds |
| Q.7. Andrew Pyper | 8.73 | 11.22 | 13.50 | Exceeds |
| Q.8. Connecting from home | 7.54 | 8.70 | 22.50 | Exceeds |
| Q.9. Renew | 5.39 | 8.63 | 17.50 | Exceeds |
| Q.10. Liaison Librarian | 41.87 | 47.42 | 18.50 | NO |
Appendix H: Testing the usability of our gateway compared to the current gateway
A: Raw data: results of Usability Questionnaire when divided into three categories

| Participant # | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Key Words | 2 | 2 | 3 | 2 | 2 | 2 | 2 | 2 | 2 |
| Major Links | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 |
| Graphics | 2 | 2 | 3 | 2 | 3 | 2 | 2 | 2 | 2 |
| Navigation Bars | 2 | 1 | 2 | 2 | 3 | 2 | 2 | 2 | 2 |
| Overall Layout | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 |
Legend:
1 = Rated not at all user-friendly (rating of 1-2)
2 = Rated very user-friendly (rating of 4-5)
3 = Rated Neutral (rating of 3)
B: Chi-Square results for evaluating usability of prototype compared to gateway
|
Not User-friendly |
User-friendly |
Neutral |
|||||||||
Criteria |
Observed |
Expected |
c 2 |
Observed |
Expected |
c 2 |
Observed |
Expected |
c 2 |
||
Key Words |
0 |
3 |
3.00 |
8 |
3 |
8.33 |
1 |
3 |
1.33 |
||
Major Links |
0 |
3 |
3.00 |
9 |
3 |
12.00 |
0 |
3 |
3.00 |
||
graphics |
0 |
3 |
3.00 |
7 |
3 |
5.33 |
3 |
3 |
0.00 |
||
Navigation Bars |
1 |
3 |
1.33 |
7 |
3 |
5.33 |
3 |
3 |
0.00 |
||
Overall Layout |
0 |
3 |
3.00 |
9 |
3 |
12.00 |
0 |
3 |
3.00 |
||
C: Sum of χ² for all three categories

| Criteria | χ² Summary |
| --- | --- |
| Key Words | 12.67 |
| Major Links | 18.00 |
| Graphics | 8.33 |
| Navigation Bars | 6.67 |
| Overall Layout | 18.00 |

χ² critical (df = 2, α = 0.05) = 5.99
Appendix I: Forms completed for the Office of Research Ethics, University of Waterloo
UNIVERSITY OF WATERLOO
OFFICE OF RESEARCH ETHICS
APPLICATION FOR ETHICS REVIEW OF RESEARCH INVOLVING HUMAN PARTICIPANTS
IN UNDERGRADUATE COURSE PROJECTS
-
Title of Project: Usability Testing of UW Library Gateway Web Page
-
Faculty Supervisor(s) Department Ext: e-mail:
Carolyn MacGregor Systems Design Eng 2897 cgmacgre@engmail
-
Student Investigator(s) Department e-mail: Local Telephone Number:
SD 348 Winter 2001 students - see attached list
-
Level of Project: Undergraduate Course Specify course and number: ___SD 348_____
-
Indicate the anticipated commencement date for this project: __March 22, 2001___
Indicate the anticipated completion date for this project: __March 30, 2001____
- Purpose and Rationale for Proposed Research
Briefly describe the purpose (objectives) and rationale of the proposed project and include any hypothesis(es)/research questions to be investigated.
We have been working on the redesign of the UW Library Gateway page in conjunction with the UW Library Community Needs Assessment Group (CNAG). The members of CNAG serve as clients for the 6 teams of students (5-6 students per team). All teams are working on the same assignment - the redesign of the UW Library Gateway page through the use of user-centred design methods. The methods and redesign activities to date have involved the students and the CNAG members. As the final phase of the design projects, the students will be carrying out formal "lab-based" usability testing of their final designs and the current UW Gateway page.
The project does not involve a research question per se. The overall objective of the usability testing is to have the students experience a more formal method of evaluating their designs. The intention of the project is to generate recommendations for alternative designs for the UW Gateway page that can then be more rigorously tested by CNAG.
-
Methodology/Procedures
- Which of the following procedures will be used? Provide a copy of all materials to be used in this study.
[ ] Survey(s) or questionnaire(s) (mail-back) Are they standardized? All [ ] Some [ ] None [ ]
[ ] Survey(s) or questionnaire(s) (in person) Are they standardized? All [ ] Some [ ] None [ ]
[X ] Computer-administered task(s) or survey(s) Are they standardized? All [ ] Some [ ] None [ X ]
[ X ] Interview(s) (in person)
[ ] Interview(s) (by telephone)
[ ] Focus group(s)
[ ] Audiotaping
[ ] Videotaping
[ ] Invasive physiological measurement (e.g. venipuncture, muscle biopsies, catheter insertions, etc.)
[ ] Non-invasive physiological measurement (e.g. exercise, heart rate, blood pressure, electromyography, muscle stimulation, balance/movement, force exertion, CO2 or altered O2 breathing, lower body negative pressure, etc.)
[ ] Unobtrusive observations
[ ] Analysis of secondary data set (no involvement with human participants)
[ ] Analysis of human tissue, body fluids, etc. only
Other (specify)
-
Provide a brief, sequential description of the procedures to be used in this study.
All users will be asked to read and sign the Information/Consent Letter. See Appendix A.
All users will be asked to fill out a brief Background Questionnaire concerning their familiarity with the main UW libraries and the UW Library website (e.g. frequency of visits) and web pages in general. See Appendix B.
Test monitors will read from a set script, and each user will be asked to perform a set of tasks related to library activities using either the current UW Library Gateway page or the redesigned Gateway page. Users will be reminded that the objective of the study is to test the usability of the design (and not to test the skills of the user). See Appendix C.
Test monitors will time how long it takes the users to complete the tasks, and will record when users encounter difficulties in completing them (e.g. selecting a link that does not lead to the appropriate resource).
See Appendix D.
Participants will be asked to fill out a Usability Questionnaire once the tasks have been completed.
See Appendix E.
All users will be given a feedback letter. See Appendix F.
-
Participants Involved in the Study
- Indicate who will be recruited as potential participants in this study.
UW Participants:
[ X ] Undergraduate students
[ X ] Graduate students
[ ] Faculty and/or staff
Non-UW Participants:
[ ] Children
[ ] Adolescents
[ ] Adults
[ ] Seniors
[ ] Persons in Institutional Settings (e.g. Nursing Homes, Correctional Facilities, etc.)
Other (specify) _________________________________________
-
Describe the potential participants in this study including group affiliation, gender, age range and any other special characteristics. If only one gender is to be recruited, provide a justification for this.
For the purpose of this design project, participants can be anyone who is a current UW student. In order to test the robustness of their designs, teams will be encouraged to recruit a range of participants (e.g. mix of males and females, undergrads and grads, different disciplines). The only requirement is that the participant has used web pages before.
-
How many participants are expected to be involved in this study? 6 design groups X 6-10 participants
- Recruitment Process and Study Location
- From what source(s) will the potential participants be recruited?
[ X ] UW undergraduate and/or graduate classes
[ ] UW Psychology Research Experiences Group
[ ] Other UW sources (specify) _______________________
[ ] School Boards (not including local school boards)
[ ] Kitchener-Waterloo Community
[ ] Agencies
[ ] Businesses, Industries
[ ] Health care settings, nursing homes etc.
Other (specify) _Students may recruit through friends, roommates, and classmates
- Describe how and by whom the potential participants will be recruited.
Provide a copy of any materials to be used for recruitment (e.g. posters(s), flyers, advertisement(s), letter(s), telephone and other verbal scripts).
Since the majority of the recruitment will be done directly by the SD 348 students, a short "email" message has been prepared that they can send to personal contacts. See Appendix G.
- Where will the study take place?
[ X ] On campus Location __EL 108 (computer lab)
[ ] Off campus Location __________________________
- Compensation of Participants
Will participants receive compensation (financial or otherwise) for participation? Yes [ ] No [ X ]
If Yes, provide details:
-
Feedback to Participants
Briefly describe the plans for provision of feedback. Where feasible, a letter of appreciation should be provided to participants. This also should include details about the purpose and predictions of the study, and if possible, an executive summary of the study outcomes. Provide a copy of the feedback letter to be used. See Appendix F.
- POTENTIAL BENEFITS FROM THE STUDY
-
Identify and describe any known or anticipated direct benefits to the participants from their involvement in the project.
The participants will experience usability testing and their feedback and efforts will contribute to the evaluation and redesign of the UW Library Gateway Page.
-
Identify and describe any known or anticipated benefits to society from this study.
Ultimately usability improvements to the UW Library Gateway Page will allow the larger community greater ease in accessing UW Library information and resources.
-
-
POTENTIAL RISKS FROM THE STUDY
-
For each procedure used in this study, provide a description of any known or anticipated risks/stressors to the participants. Consider physiological, psychological, emotional, social, economic, legal, etc. risks/stressors. A study-specific medical screening form must be included when physiological assessments are used and associated risk(s) to participants are minimal or greater.
[ X ] No known or anticipated risks
Explain why no risks are anticipated: Participants will be asked to carry out brief tasks using a webpage. All participants will be familiar with using webpages. Participants will be reminded that their performance is being observed to test the effectiveness of the designs - not to test the skills of the users.
[ ] Minimal risk
Description of risks:
[ ] Greater than minimal risk
Description of risks:
- Describe the procedures or safeguards in place to protect the physical and psychological health of the participants in light of the risks/stresses identified in D1.
All test monitors will follow the test monitor script to ensure that all participants are instructed in the same manner.
-
- INFORMED CONSENT PROCESS
- What process will be used to inform the potential participants about the study details and to obtain their consent for participation?
[ X ] Information letter with written consent form; provide a copy
[ ] Information letter with verbal consent; provide a copy
[ ] Information/cover letter; provide a copy
Other (specify) ________________________________________________________________
-
If written consent cannot be obtained from the potential participants, provide a justification.
- ANONYMITY OF PARTICIPANTS AND CONFIDENTIALITY OF DATA
- Explain the procedures to be used to ensure anonymity of participants and confidentiality of data both during the research and in the release of the findings.
All participants will be assigned a participant number which will be used on data collection forms.
Data will be aggregated such that no participant will be identified in any of the reports.
- Describe the procedures for securing written records, questionnaires, video/audio tapes and electronic data, etc.
Teams will be responsible for keeping data collection forms until the data collection and analysis is complete.
Information/consent forms with participant names will be kept in a file separate from the data collection forms to protect anonymity. Once data collection is finished and aggregated, raw data (i.e. consent forms, data collection sheets and questionnaires) will be turned over to Prof. MacGregor.
-
Indicate how long the data will be securely stored and the method to be used for final disposition of the data.
[ ] Paper Records
[ ] Confidential shredding after ______ years
[ ] Data will be retained indefinitely in a secure location
[ X ] Data will be retained until completion of specific course.
[ ] Audio/Video Recordings
[ ] Erasing of audio/video tapes after ______ years
[ ] Data will be retained indefinitely in a secure location
[ ] Data will be retained until completion of specific course.
[ ] Electronic Data
[ ] Erasing of electronic data after ______ years
[ ] Data will be retained indefinitely in a secure location
[ X ] Data will be retained until completion of specific course.
[ ] Other (Provide details on type, retention period and final disposition, if applicable)
Researchers must ensure that all supporting materials/documentation for their applications are submitted with the signed, hard copies of the ORE form 101/101A. Note that materials shown below in bold are required as part of the ORE application package. The inclusion of other materials depends on the specific type of projects.
Please check below all appendices that are attached as part of your application package:
[ X ] Recruitment Materials: A copy of any poster(s), flyer(s), advertisement(s), letter(s), telephone or other verbal script(s) used to recruit/gain access to participants.
[ X ] Information Letter and Consent Form(s)*. Used in studies involving interaction with participants (e.g. interviews, testing, etc.)
[ ] Information/Cover Letter(s)*. Used in studies involving surveys or questionnaires.
[ ] Parent Information Letter and Permission Form*. For studies involving minors.
[ ] Medical Screening Form: Must be included for all physiological measurements and tailored for each study.
[ X ] Data Collection Materials: A copy of all survey(s), questionnaire(s), interview questions, interview themes/sample questions for open-ended interviews, focus group questions, or any standardized tests.
[ X ] Feedback letter *
[ ] ORE Form 102: To be submitted by applicants who wish access to students and/or teachers from the local school boards.
[ ] Other: _____________________________________________________________________________
INVESTIGATORS’ AGREEMENT
I have read the Office of Research Ethics Guidelines for Research with Human Participants and agree to comply with the conditions outlined in the Guidelines. In the case of student research, as a Course Instructor, my signature indicates that I have read and approved the application and proposal, deem the project to be valid and worthwhile, and agree to provide the necessary supervision of the student(s).
_____________________________ _March 15, 2001________
Signature of Course Instructor                Date
____________________________________ _________________________
Signature of Student Investigator(s)          Date
____________________________________ _________________________
Signature of Student Investigator(s)          Date
____________________________________ _________________________
Signature of Student Investigator(s)          Date
FOR OFFICE OF RESEARCH ETHICS USE ONLY:
_____________________________ _________________________
Susan E. Sykes, Ph.D., C. Psych. Date
Director
Office of Research Ethics