
Evaluation of Medium Fidelity Prototype

A Final Report Submitted in Partial Fulfillment of the Team Project Requirements for SYDE 348

Cecilia Chung, 96138814
Tim Fillier, 98054316
Danny Ho, 97140232
Scott Nisbet, 98128538
Anna Tran, 98132372

Faculty of Engineering
Department of Systems Design Engineering
March 27, 2001

Course Instructor:  Professor C. G. MacGregor

Executive Summary

This report outlines the process and results of the usability study performed by the team on the current University of Waterloo (UW) Library Gateway.  The overall goal of the project was to determine the usability of the UW Library Gateway and to generate design recommendations to make the site more usable by students and faculty.

The design team performed three design iterations.  In the first iteration, the team analyzed the usability of the current gateway, which was also compared to the gateways of other university libraries.  Three usability evaluation techniques were used – heuristics evaluation, design walkthrough, and hierarchical task analysis.  Problems with the current gateway were identified and design recommendations were made, falling mainly under four categories – use of icons, help tool, clarity of language, and colour combination.  With these recommendations in mind, the team generated a low-fidelity prototype, which it then refined into two slightly different prototypes.

The two prototypes created in the first iteration were evaluated for usability in the second iteration.  The techniques used were discount usability (a combination of heuristics evaluation and design walkthrough), hierarchical task analysis, and card sorting.  Heuristics evaluation and hierarchical task analysis were helpful in determining problems with process flow, while design walkthrough and card sorting produced valuable results for the organization and phrasing of the headings and options on the design prototype.  The area with major problems was the organization and wording of the headings and menu options available on the prototype.  The problems identified by these evaluation techniques were translated into design recommendations, and with these, the team chose one of the two prototypes and made design changes and refinements.

This revised prototype was then subjected to usability testing designed by Professor Carolyn MacGregor in the last iteration of this project.  The team used keystroke-level analysis to create benchmark numbers and compared them with the actual performance of users.  General comments made by users were also recorded.  Based on the benchmark numbers and general comments, the team made another set of design recommendations.  The resulting prototype is the final prototype to be presented to the clients.
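Keystroke-level benchmarks of the kind described above are computed by summing standard operator times. The sketch below is illustrative only: it uses the commonly cited operator times from Card, Moran, and Newell, and the operator sequence shown is hypothetical, not the team's actual data.

```python
# Illustrative keystroke-level model (KLM) benchmark calculation.
# Operator times are the commonly cited values from Card, Moran & Newell;
# the task sequence below is a hypothetical example, not the team's data.

OPERATOR_TIMES = {
    "K": 0.2,   # keystroke or mouse-button press
    "P": 1.1,   # point the mouse at a target
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_benchmark(operators):
    """Return the predicted expert task time (seconds) for a sequence
    of KLM operator codes such as ["M", "P", "K"]."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical task: think, point at a heading, click; then repeat once.
task = ["M", "P", "K", "M", "P", "K"]
print(round(klm_benchmark(task), 2))  # 5.3
```

In iteration 3, predicted times of this kind would be compared against the measured task times of test users.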

Table of Contents:

Executive Summary
Table Of Contents
List Of Tables
List Of Figures

  1. Introduction
  2. Interactive Systems Problem Statement
  3. Project Scope, Constraints, Criteria And Design Requirements
    3.1 Project Scope
    3.2 Project Constraints
    3.3 Evaluation Criteria
  4. Methods, Findings, And Resulting Prototype From Iteration 1
    4.1 Summary Of Methods
    4.2 Summary Of Overall Findings And Recommendations
    4.3 Proposal Of Design Alternative
  5. Methods, Findings, And Resulting Prototype From Iteration 2
    5.1 Design Alternatives
      5.1.1 Design Alternative #1
      5.1.2 Design Alternative #2
    5.2 Summary Of Methods
    5.3 Summary Of Findings And Recommendations
      5.3.1 Heuristics Evaluation And HTA
      5.3.2 Design Walkthrough
      5.3.3 User Testing With Card Sorting Findings
    5.4 Proposal For Medium-Fidelity Prototype
  6. Method, Findings And Results From Iteration 3
    6.1 Summary Of Method
    6.2 Summary Of Findings And Results
  7. Design Recommendations
    7.1 Aesthetics
    7.2 Label Changes
    7.3 Organization
    7.4 Order
    7.5 General Recommendations
References

Appendix A – Detailed Findings From Iteration 1
  Nielsen’s And Norman’s Heuristics
  Design Walkthrough
  Hierarchical Task Analysis
Appendix B – Detailed Findings From Iteration 2
  Detailed Findings From Hierarchical Task Analysis
  Detailed Findings From Card Sorting
Appendix C – Detailed Findings From Iteration 3
  Keystroke-Level Analysis Benchmarks
Appendix D – Summary Of Task Allocation And Learning
  Summary Of Task Allocation
  Team Learning
Appendix E – Summary Of Client Meetings
  Meeting #1
  Meeting #2
  Meeting #3
Appendix F – ORE Form [link no longer exists]

List of Tables
Table 1: Details of ISPS
Table 2: Summarized Grouping of Data from Card Sorting
Table 3: Average Level of Familiarity with Computing
Table 4: Results of Chi-Square Test for "User-Friendliness"
Table 5: Potential Problems for Task 1
Table 6: Potential Problems for Task 2
Table 7: Potential Problems for Task 3
Table 8: Potential Problems for Task 4
Table 9: Potential Problems for Task 5
Table 10: Potential Problems in Task 1 from Design Alternative #1
Table 11: Potential Problems in Task 2 from Design Alternative #1
Table 12: Potential Problems for Task 3 from Design Alternative #1
Table 13: Potential Problems for Task 4 from Design Alternative #1
Table 14: Potential Problems in Task 1 from Design Alternative #2
Table 15: Potential Problems in Task 2 from Design Alternative #2
Table 16: Potential Problems for Task 3 from Design Alternative #2
Table 17: Potential Problems for Task 4 from Design Alternative #2
Table 18: Task Allocation Strategy for Iteration 2

List of Figures
Figure 1: Current Library Gateway
Figure 2: Design Alternative from Iteration 1
Figure 3: Design Alternative #1
Figure 4: Design Alternative #2
Figure 5: Recommended Design from Iteration #2
Figure 6: Medium Fidelity Prototype from Iteration #2
Figure 7: HTA for Task 1
Figure 8: HTA for Task 2
Figure 9: HTA for Task 3
Figure 10: HTA for Task 4
Figure 11: HTA for Task 5
Figure 12: HTA for Task 1 from Design Alternative 1
Figure 13: HTA for Task 2 from Design Alternative #1
Figure 14: HTA for Task 3 from Design Alternative #1
Figure 15: HTA for Task 4 from Design Alternative #1
Figure 16: HTA for Task 1 from Design Alternative #2
Figure 17: HTA for Task 2 from Design Alternative #2
Figure 18: HTA for Task 3 from Design Alternative #2
Figure 19: HTA for Task 4 from Design Alternative #2

  1. Introduction

    "Ask and it will be given to you; seek and you will find; knock and the door will be opened to you."
                                                                                                     - Matthew 7:7

    For those willing to knock, libraries have always been a primary door to information. However, since the introduction of the world-wide-web, information has literally become available at one's fingertips. Gone are the days of searching through paper library catalogues for Dewey numbers and flipping through stacks of books on a general topic in hopes of finding something relevant. Now, one needs only to type in a keyword, and with the click of a mouse, volumes of information retrieved from databases across the globe are instantly displayed. Rather than admitting defeat to the Goliath of information that the internet has become, libraries have adopted the wisdom of David by not only facing the giant but making it work for them, thus standing victoriously firm in their position.

    The University of Waterloo’s (UW) Library is an example of how a library is employing the powerful faculties of the world-wide-web to enhance its services.  The particular project which we, as members of the Winter 2001 SYDE 348 class, have taken on attempts to aid the UW Library's goal of making its website the main door to information for both students and faculty.  The UW Library's Community Needs Assessment Group (CNAG) has recognized the importance of usability in making this door easier to access and thus more desirable to pass through.  Through this project, we therefore aim to increase the usefulness of the site for students and faculty conducting study and research.

    This final report of our SYDE 348 team project focuses on the evaluation of the third-iteration prototype of our proposed design for the UW Library gateway. This evaluation is based on the formal, lab-based usability testing conducted according to the procedures specified by Professor MacGregor. Our arrival at the medium fidelity prototype was not without justification – a battery of previous assessments of several earlier prototypes, each since improved upon, supports the usability of our design. Hence, a summary of the methods, findings, and resulting prototypes from the first and second iterations is also provided. Based on the assessments from all three iterations, specifications for a more user-centred design of the UW Library Gateway are outlined and a final prototype is proposed.

  2. Interactive Systems Problem Statement

    Stated more succinctly, in an interactive systems problem statement (ISPS), our task in this project was as follows:

    To design a Library Gateway that is walk-up-and-use (self-sufficient) for searching library resources and general library information to be used by undergraduate and graduate students, faculty, library staff, and researchers from other schools, institutions, or businesses.

    The details of the ISPS are outlined in Table 1:

    Table 1: Details of ISPS

      (Columns: Human Activity / Users / Level of Support / Form of Solution)

    Facts
    • Human Activity: searching for books, journals, info; researching a particular topic; library information; finding information on library services
    • Users: undergraduate & graduate students; library & other staff; faculty
    • Level of Support: online help; self-explanatory headings; ease of walk-up-and-use
    • Form of Solution: redesign of gateway page only; web page (available online); remote access

    Assumptions
    • Human Activity: want a printout of the listings
    • Users: users from other schools, institutions or businesses; novices & experts; know how to use a mouse; possess basic computer skills; technical support staff
    • Level of Support: ease of error recovery despite having a slow connection; consistency between menu options and the titles of the pages they actually link to; shortcuts for experts; users will continue to use the system if they find it easy to use
    • Form of Solution: all other pages underneath will remain the way they are; no limitations in terms of the format of the design proposal; pages that don't need to be reloaded following each error in menu choice
    Additional Comments/Resources

    • The assumptions provided above come from the structure of the existing library gateway
    • There is an extensive number of activities that can be performed from the library gateway.  Since an exhaustive list of these activities would not be feasible to present here, only the most common are listed.
    • The users as specified in the Team Project Handout are listed as facts, while other potential users are listed as assumptions, since these were not specifically stated.
    • Our client stated that the Library Forms shortcut and Trellis shortcut were used frequently by library staff.
    • The no limitations assumption was as stated by our client, Susan Routliffe (see Summary of Client Meetings)
    Requirements
    • Users must be able to find library information from the first layer of the page
    • Layout consistent with other pages in the library’s site
    • Print option
    • Access from remote terminals
    • Major headings and subheadings to support the specific activities
    • The design must support both novice and expert users.
    • Shortcuts for experts
    • Use terminology that is understood and correctly interpreted by all users
    • Follow Norman’s Principles of Design for Usability (i.e. visibility, mapping, conceptual model, feedback)
    • Include shortcut to Trellis and Library Forms
    • Follow the components of usability, and be tested with users on each component
    • Help section for novice users that is readily available from the gateway
    • Self-explanatory headings
    • Menu options must link directly to corresponding pages
    • Web-based design methods
    • Simple and straight-forward access from remote terminals

  3. Project Scope, Constraints, Criteria and Design Requirements

    3.1 Project Scope

    This project is limited to the redesign of the library gateway, meaning that only the presentation pages of the library gateway can be changed. With this in mind, the team had assumed that all secondary pages would remain the same in terms of layout and content. Also, the team would only have until the end of the academic term to finish this project and the design. The timeline and required deliverables for this project were specified by Professor MacGregor. There would be three iterations. Each iteration would consist of usability evaluation, proposal and recommendations of design changes, and the implementation of those design recommendations.

    3.2 Project Constraints

    Constraints are determined to be conditions that must be satisfied for a design to be considered feasible and acceptable.  For the design of the library gateway, the team has defined the following constraints.

    • The new design must be compatible with different types of web browsers.
      Users of the library gateway vary greatly from students to staff, from faculty to members of the community. Just as the types of users vary, the types of web browsers used to view the gateway page will vary as well. Therefore, the new design must be viewable on different kinds of web browsers such as Netscape Communicator, Internet Explorer, etc. However, the team will exclude web browsers on cellular phones and other types of wireless devices at this time. The team believes that the number of users using wireless devices will be extremely small.
    • The new design must contain all currently available links.
      Although the client never explicitly specified that all currently available links must remain available, the team felt uncomfortable with removing any links that are currently available from the library gateway. The team felt that without any investigation into the frequency of use of those links, it is unreasonable to remove any links that may be key to the user population.
    • There will be no technical or budget constraints.
      Upon discussion with the client, the team realized that there would be no technical or budget constraints. The design can potentially use any programming language or technique. The complexity of the design is not limited by budget or resources, whether human or financial, but only by the team's own design talent.

    3.3 Evaluation Criteria

    Several criteria would be used to evaluate the current library gateway and subsequent designs generated by the team. A summary of the criteria is included as follows:

    • Usability of the page
      Usability is the key factor in evaluating the gateways. It will be measured by different usability evaluation methods; the details of the methods used for each iteration of this project are discussed in Sections 4, 5, and 6.
    • Aesthetics
      Aesthetics is regarded as a secondary factor after usability. It will be measured by the subjective comments of user-testing subjects. It can be argued that aesthetics and usability are intertwined; however, if the two factors conflict, the team will value usability over aesthetics.

  4. Methods, Findings, and Resulting Prototype from Iteration 1

    4.1 Summary of Methods

    During Iteration 1, the team used heuristics evaluation, design walkthrough, and Hierarchical Task Analysis (HTA) to evaluate the existing UW Library Gateway. The UW Library Gateway was also compared against other library gateways chosen by the client. A picture of the existing Library Gateway is shown in Figure 1.

    Current Library Gateway
    Figure 1: Current Library Gateway

    Heuristics are general “rules of thumb” used by experts to help discover or evaluate specific aspects of a design.  The heuristics evaluations performed by the team in iteration #1 used Donald Norman’s Principles of Good Design (Norman, 1989) and Jakob Nielsen’s Ten Usability Heuristics (Nielsen, 1985).  The team examined each principle to see whether the website followed or violated it.  By performing heuristics evaluations, the team was able to make an overall assessment of the current UW library gateway and to compare it with the gateways of other universities.
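    As a minimal sketch of how such a checklist evaluation can be recorded, the snippet below pairs heuristic names (a subset of Nielsen's published list) with per-heuristic verdicts; the verdicts shown are invented examples, not the team's actual findings.

```python
# Sketch of recording a heuristics-evaluation checklist.
# The verdicts below are hypothetical examples, not the team's findings.

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "Consistency and standards",
    "Recognition rather than recall",
    "Aesthetic and minimalist design",
]

def summarize(verdicts):
    """verdicts maps heuristic -> 'follows' or 'violates';
    returns the list of violated heuristics."""
    return [h for h, v in verdicts.items() if v == "violates"]

verdicts = {h: "follows" for h in NIELSEN_HEURISTICS}
# Hypothetical violation, e.g. unexplained jargon such as "TRELLIS":
verdicts["Match between system and the real world"] = "violates"
print(summarize(verdicts))  # ['Match between system and the real world']
```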

    Design walkthroughs are accomplished by having the user go through a specified set of tasks using concurrent verbal protocols, where users voice out loud everything they’re thinking while performing the task.  The design walkthroughs for iteration #1 were conducted on UW’s site using the usability questions provided by the client.  The team members gathered users to perform the tasks and recorded the steps taken and any comments made.

    Hierarchical task analyses (HTAs) specify a particular goal, then break the tasks and subtasks required to accomplish that goal down into successive levels.  The HTA for iteration #1 was conducted on five tasks taken from the usability questions provided by the client.  Each team member was given one task and broke it down into the smallest components possible.
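    The decomposition described above can be sketched as a small tree structure; the goal and subtasks below are hypothetical examples, not taken from the team's actual HTAs.

```python
# A hypothetical HTA fragment as a nested tree:
# each node is (task, [subtasks]); leaves have no subtasks.

hta = ("Find a book in the library", [
    ("Open the Library Gateway", []),
    ("Search the catalogue", [
        ("Select the catalogue link", []),
        ("Enter the book title", []),
        ("Submit the search", []),
    ]),
    ("Record the call number", []),
])

def leaf_steps(node):
    """Walk the tree and return the lowest-level actions in order."""
    task, subtasks = node
    if not subtasks:
        return [task]
    steps = []
    for sub in subtasks:
        steps.extend(leaf_steps(sub))
    return steps

print(len(leaf_steps(hta)))  # 5
```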

    4.2 Summary of Overall Findings and Recommendations

    Upon completing all three evaluations, a number of immediate concerns and problems were identified.  After careful review, the team grouped the problems into four main categories: the use of icons, the clarity of language, the colour combination, and the help tool.  Further analysis translated these four problem areas into four groups of recommendations and specifications for a new design.  Detailed findings for the individual methods can be found in Appendix A.

    Use of Icons
    The use of icons would be a welcome addition to the UW Library Gateway.  Descriptive yet simple icons for each heading would illustrate the meaning of the heading very well.  The icons should all be fairly similar to each other in size, colour, and style to show that they have similar functions as links to another page or menu.  This would provide conceptual mapping for the user.  The addition of these icons would reduce the need to use words to explain each link.  Using icons would also increase the visibility of decisions to users, following one of Norman’s principles.  Having six carefully chosen icons should not violate the principle of minimalist design, and it may increase the understanding of many novice users.

    Clarity of Language
    In future designs of the Gateway, the clarity of language must be emphasized.  To help users find the right starting point, the language used must be easily understood and correctly interpreted.  Unambiguous and self-explanatory wording would be very helpful to users.  The user population for this site is very diverse, with a wide range in the ability to comprehend English; the wording of the headings and menu items should reflect this and be accessible to most users.  The headings should clearly explain what functions and options are available in the menu.  Numerous problems were discovered with phrases like “Find it” and “Get it”.  In addition to the headings, the menu items should also be self-explanatory.  Phrases like “E-Journal” may not convey the meaning as well as “Electronic Journal”.  Moreover, terms specific to the University should be explained, e.g. TRELLIS and TUG.  To provide better feedback and mapping, an attempt should be made to match the headings of webpages to the text of the links that point to them.

    Colour Combination
    The use of colours was found to be confusing during the evaluation.  The red border that turns blue when activated does not seem to give users appropriate feedback.  Since red is a brighter colour than blue, it catches the user’s attention more effectively.  The team suggests that the colours be reversed, meaning the borders would start blue and turn red when the mouse-over is activated.  Colour changes upon activation provide valuable feedback to users and should be kept.  The mouse-over menu options are currently black, with no colour change on mouse-over.  This may falsely suggest that all the options have the same function.  To show that each option leads to a different page, each option should be clearly distinguished from the others in the mouse-over; a colour change may be used to accomplish this.  These recommendations on colour combinations may seem minor, but they could produce significant benefits to the gateway’s ease of use.

    Help Tool
    During the team’s evaluation of the UW Library Gateway, the team members had several problems with the help tool, or the lack of one.  Although a “Help” heading is available, the menu options are not illustrative, and novice users may have difficulty selecting the appropriate option.  The task of categorizing their question into one of the menu options was found to be very difficult for less experienced users.  The site index and search can only be accessed through the “Help” heading, which assumes that users have mastered the mouse-over menus.  Therefore, the team recommends a “Help Search” tool on the Gateway; alternatively, a link to the site map should be provided on the Gateway.  The “Help Search” tool would allow novice users to find the basic instructions they need, while expert users would also be able to find the more advanced or less publicized contents of the website.

    Other Recommendations
    Lastly, the team also proposes a few other recommendations that would make the website more user-friendly. The background graphic on the current Gateway is overpowering, and it covers up the “Text Version” link. If a background picture is used, it should be carefully chosen: it should be very faint and must not interfere with text or links. Also, more quick links should be provided for experienced users, giving them faster access and more flexibility.

    4.3 Proposal of Design Alternative

    After generating the design recommendations and specifications, the team proposed one design alternative. The design alternative is presented in this section along with descriptions of the features.

    Design Alternative from Iteration 1
    Figure 2: Design Alternative from Iteration 1

    The low fidelity design alternative is a simplified version of what is to come with the later stages of prototype development. The main menu buttons located vertically along the right side of the page are going to become icons in the later stages of development. The team decided it would be too hasty to assign icons to menus this early in the design, as the team acknowledges that different users will assign different meanings to different icons. Further testing would be necessary before icons can be chosen.

    The team proposed to provide a "Quick Search" field at the top left corner of the Gateway page. The option to search the site should be available throughout the various layers of the web site, in order to be consistent, and to provide the expert user with a detour around the numerous explanatory steps that a novice user would take.

    The phrasing of the main menu options has been designed to appeal to any user, expert or novice. The phrase "How Do I?" was used instead of the standard "Help". This change was the subject of much debate. On the one hand, "How Do I?" is a much more natural statement and illustrates good visibility: when the user has a problem, he is most likely thinking just that, so the system provides a good mapping between the user's intentions and the system's functions. On the other hand, "Help" is a standard term used in many computer applications. The question of "How Do I?" versus "Help" will be resolved by user testing during the next phase of development.

    This design alternative provides the user with fewer options than the current UW Gateway page, which cuts down on the decision time required. The number of choices in the main menu was reduced from six to five. The pop-up menus that appear on mouse-over on the current UW Gateway page were replaced by a "Description of Function" section, which also reduces the number of choices the user is required to make. The user has the choice of reading the description of the menu functions, or just choosing an item. This accommodates both expert and novice users by eliminating a step for experts and providing more information to help novices form a correct conceptual model.
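    The claim that fewer options cut decision time can be illustrated with Hick's law, which models choice time as growing with the logarithm of the number of options. This is purely illustrative: Hick's law was not part of the team's analysis, and the coefficient used below is an assumed value.

```python
import math

# Illustrative only: Hick's law, T = b * log2(n + 1), sketches why
# reducing six menu choices to five should shave decision time.
# The coefficient b = 0.2 s/bit is a hypothetical assumed value.

def decision_time(n_choices, b=0.2):
    """Predicted decision time (seconds) for n equally likely choices."""
    return b * math.log2(n_choices + 1)

saving = decision_time(6) - decision_time(5)
print(f"{saving:.3f} s")  # 0.044 s
```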

  5. Methods, Findings, and Resulting Prototype from Iteration 2

    5.1 Design Alternatives

    For the second iteration, the low-fidelity prototype generated from Iteration 1 was expanded to two separate versions. The two new prototypes are described below.

    5.1.1 Design Alternative #1

    Design Alternative #1
    Figure 3: Design Alternative #1

    Design Alternative #1 is basically an elaboration of the low fidelity prototype described in the previous report. The mouse-over functions have been eliminated from the Gateway. Icons, each with a short phrase, are used to represent possible links; these are found along the left-hand side of the page. Beside the icons there is a more detailed description of each link. After clicking an icon, the user is led to a new page. For instance, a user who clicks "Search Library Holdings" is led to a page where he/she can select whether to search for books, journal indexes, e-journals, etc. This new layer eliminates the need for mouse-over and may make the page much easier for users to navigate.

    Along the top of Design Alternative #1 there are several quick links, which lead to the UW homepage, TRELLIS, and back to the Gateway page. This top section should stay the same on all library webpages. The quick links provide a more consistent look to the site, and users can navigate back to the listed pages easily. Moreover, instead of being embedded in a graphic, the links are all visually separated. Lastly, the "Site Search" function was moved from the second layer to the top of the Gateway; it performs functions identical to the existing site search.

    5.1.2 Design Alternative #2

    The screenshot of Design Alternative #2 is shown in Figure 4. In this design alternative, the site is divided into two sections. The top section contains the quick links to the University of Waterloo homepage, TRELLIS, and the Gateway page. The top section also contains the "Site Search" function described in design alternative #1.

    Design Alternative #2
    Figure 4: Design Alternative #2

    In the bottom section, a list of headings is provided along the left-hand side.  These headings are active links.  Instead of using a mouse-over like the existing UW library gateway, the links must be clicked for the menu to be displayed.  When users click the heading on the left, the menu options corresponding to the link will be presented on the right hand side.  The background colour of the heading and the menu options will change.  This will highlight and connect the heading and the menu items together.

    5.2 Summary of Methods

    During the second iteration of the project, the team used discount usability (which includes heuristics and design walkthrough), HTA, and card sorting to determine the usability of the two designs illustrated in Section 5.1.

    Discount usability evaluation is a usability evaluation method developed by Jakob Nielsen (1989).  This method combines heuristics evaluations and scenario testing (design walkthrough with verbal protocol), and provides a quick and easy evaluation of the user interface.  Heuristics evaluation was performed on both design alternatives.  The team decided that design walkthrough would be used in this iteration to test the contextual groupings of the headings and menu options.  For example, electronic dictionaries, web searches, and other web-related references may be more suitable in an “Electronic Resources” section.  Since both design alternatives used the same set of headings and menu options, the design walkthrough was only performed on one of them.

    HTA is used again in iteration #2 to evaluate the usability of the design alternatives.  The tasks chosen were almost identical to the tasks chosen from iteration #1.  This enabled a comparison with the results obtained from the first iteration to see if improvements were made. 

    The final technique used in iteration #2 was card sorting, a technique used to delineate the user’s mental model of a system.  The user groups a list of functions/items according to the relationships the user perceives between them (Nielsen and Sano, 1995).  One particular problem found during iteration #1 with the existing UW Library Gateway was the grouping of the headings.  Therefore, in this iteration, the team attempted to find out what the appropriate groupings and headings for the menu options were.
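    A common way to summarize card-sorting data is a pairwise co-occurrence count: the number of participants who placed two items in the same pile, where high counts suggest the pair belongs under one heading. The items and piles below are invented examples, not the team's data.

```python
from itertools import combinations
from collections import Counter

# Hypothetical card-sorting results: each participant's piles,
# where every pile is a set of item labels. Invented examples only.
sorts = [
    [{"TRELLIS catalogue", "E-Journals"}, {"Hours", "Contact us"}],
    [{"TRELLIS catalogue", "E-Journals", "Hours"}, {"Contact us"}],
    [{"TRELLIS catalogue", "E-Journals"}, {"Hours", "Contact us"}],
]

def co_occurrence(sorts):
    """Count how many participants placed each pair of items in the
    same pile; pairs are stored in sorted order for a stable key."""
    counts = Counter()
    for piles in sorts:
        for pile in piles:
            for pair in combinations(sorted(pile), 2):
                counts[pair] += 1
    return counts

counts = co_occurrence(sorts)
print(counts[("E-Journals", "TRELLIS catalogue")])  # 3
```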

    5.3 Summary of Findings and Recommendations

    5.3.1 Heuristics Evaluation and HTA

    Design Alternative #1

    The major improvements incorporated in this design are the addition of icons and the clarified language used for the headings.  Some improvements were also made to the mapping, by using a top navigation bar that would be common to all of the pages in the website.  The mouse-over function was removed to avoid problems or confusion for novice users.

    Although eliminating the mouse-over reduces problems and confusion, it may cause other visibility problems at the same time.  To see exactly what each heading allows, the user must go to a second page.  If the headings clearly inform the user what the links are for, this change should not be a problem.  However, if the user is unsure where to look, they may be required to navigate back and forth to find the proper heading.  Nonetheless, if an incorrect choice were made, the top navigation bar would make it very easy for users to return to the gateway.

    Design Alternative #2

    As with Design Alternative #1, most of the problems that were identified in the original gateway were fixed.  The language used to describe the links has been clarified.  Some improvements were made to the mapping as well, by adding a top navigation bar that would be common for all of the pages in the website.  Instead of using a mouse-over, the menu options under each heading would only be activated after the user clicks the heading.  This design also increases mapping by providing a colour change indicating the connection between the heading and the menu options.  The section on the left listing the headings will be common to all of the second level pages.  If the incorrect heading is selected first, the user would not need to navigate backwards to choose another heading.  This design may also benefit from using icons to increase mapping.

    The negative aspects associated with Design Alternative #2 seem minor when compared with those of Design Alternative #1.  Users are bound to make mistakes, and the second prototype eliminates the extra actions needed to correct them.  For this reason, heuristic evaluation points to the second prototype as the better design.

    5.3.2 Design Walkthrough

    The detailed steps taken by users when going through the twenty tasks are included in Appendix B.  From the design walkthrough, the following observations and comments were obtained from the users.

    • The library catalogue name TRELLIS is still unfamiliar to many users.
    • Users are unfamiliar with the forms used for inter-library loans.
    • The added site search tool was used frequently and appropriately.
    • The desired functions were easier to locate.
    • The viewing of the different category headings and choice selection was clearer to the user.
    • Context-based errors were dramatically reduced compared to the walkthrough done on the current library gateway design; links generally did what their labels said.

    From these observations, it was concluded that additional information must be displayed to inform the user of terms specific to the Waterloo library site and its operations. These notices should be placed in a location where the user can read them before making a menu selection.

    Terminology

    Terms such as TRELLIS, TUG, ILL, etc. cannot be used haphazardly.  The user may be misled if they are not informed of the terminology while reading.  In one case in the walkthrough, the user simply needed to search for a book title, but was hesitant to choose the “TRELLIS catalogue” link.  Because the link was listed amongst E-Journals, E-Data, and other non-book-related terms, the term TRELLIS did not suggest the search they desired.  It is crucial that appropriate explanations of terms be available to the user.

    Menu Instructions

    If the name of a menu selection is a strong abstraction of the task’s purpose, e.g., getting library materials from Guelph or other libraries, a clearer explanation of the task or its elements needs to be presented before the user actually chooses it.  For example, since obtaining material from Guelph requires TUGdoc services and obtaining material from other libraries may use CISTI or the Inter-Library Loan Service, these should be subheadings within the same menu, accompanied by a brief explanation of each service’s purpose.  This way, the user has a clearer understanding of the steps involved in getting material and can make a more appropriate choice to suit the desired task.

    5.3.3 User Testing with Card Sorting Findings

    The results from the card sorting tasks were used to arrange the objects into groups.  The detailed groupings identified by test subjects are listed in Appendix B.  Table 2 shows the summarized grouping of data that resulted from the card sorting exercises.  Labels to the menu groups were assigned by the team to ensure consistent phrasing throughout the Gateway page.

    Table 2: Summarized Grouping of Data from Card Sorting

    Group 1: Search UW Library Holdings
    • TRELLIS: Our Catalogue
    • Journal Indexes
    • By Subject
    • Course Reserves

    Group 2: Services For . . .
    • Faculty & Staff
    • Graduate Students
    • Undergraduates
    • Persons with Disabilities
    • Distance Education
    • Alumni
    • Business & Community
    • Staff/Administration

    Group 3: Search Other Library Holdings
    • From Waterloo
    • From Guelph/Laurier/Annex
    • From Other Libraries
    • Laurier Library
    • Guelph Library
    • Kitchener Public Library
    • Conestoga College Library

    Group 4: Library Information
    • Hours/Locations
    • Library Development
    • Accessibility
    • Guide to the Libraries
    • News/Events/Exhibits
    • Tours & Workshops
    • Renewals
    • View Your Record

    Group 5: Search Tools
    • Internet Search Tools
    • Reference Tools
    • Online Instruction
    • Site Index & Search
    • UW Home Page
    • TUG Home Page

    Group 6: Electronic References
    • E-Journals
    • E-Texts
    • E-Data
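The team consolidated the individual participants' sorts into the shared groups above by inspection.  As an illustrative sketch only, a majority-rule consolidation can be approximated programmatically by counting how often participants placed pairs of items together; the toy data, item names, and threshold below are assumptions, not the team's actual procedure.

```python
from itertools import combinations

def cooccurrence_groups(sorts, threshold):
    """Group items that at least `threshold` participants sorted together.

    `sorts` has one entry per participant; each entry is a list of
    groups, and each group is a list of item names.
    """
    # Count how many participants placed each pair of items in the same group.
    pair_counts = {}
    for sort in sorts:
        for group in sort:
            for a, b in combinations(sorted(group), 2):
                pair_counts[(a, b)] = pair_counts.get((a, b), 0) + 1

    # Merge items whose pair count meets the threshold (union-find).
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    items = {item for sort in sorts for group in sort for item in group}
    for (a, b), count in pair_counts.items():
        if count >= threshold:
            parent[find(a)] = find(b)

    groups = {}
    for item in items:
        groups.setdefault(find(item), set()).add(item)
    return sorted(groups.values(), key=lambda g: min(g))

# Toy data: three participants, majority threshold of two.
sorts = [
    [["TRELLIS", "Journal Indexes"], ["E-Journals", "E-Texts"]],
    [["TRELLIS", "Journal Indexes", "E-Journals"], ["E-Texts"]],
    [["TRELLIS", "Journal Indexes"], ["E-Journals", "E-Texts"]],
]
groups = cooccurrence_groups(sorts, threshold=2)
# groups: [{"E-Journals", "E-Texts"}, {"Journal Indexes", "TRELLIS"}]
```

Raising the threshold splits groups apart; lowering it merges them, which mirrors the trade-off the team faced when deciding how many headings to keep.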

    5.4 Proposal for Medium-Fidelity Prototype

    After performing the discount usability evaluation and hierarchical task analysis, the team gathered and examined all the problems encountered.  A discussion was held to determine the probable cause of each problem and possible solutions.  These solutions were translated into design changes to the design alternatives.  The team as a whole felt that Design Alternative #2 was superior to Design Alternative #1 because it kept all the existing information on one page while addressing the problems found with the original UW library gateway.  Design Alternative #1 required an additional webpage, which could be a potential source of usability problems.  After the team discussed their findings, a new design was generated and a medium-fidelity prototype was created.

    This new design is shown in Figure 5.  It is very similar to Alternative #2.  There is a top frame with the “University of Waterloo Library” heading, along with quick links to pages such as TRELLIS, the UW homepage, and the library gateway.  The “Text Version” link and the “Library Forms” dropdown, which the team had omitted from Design Alternative #2, were restored.  Since this design uses frames, the “Text Version” link is necessary because some users may have browsers that do not support frames.  The “Library Forms” dropdown is included because it provides flexibility for experienced users to get to the forms directly.  The “What’s New” link has been added back to the top frame as well; the library can use this link to introduce new features and highlight existing functions.  As with Design Alternative #2, a “Site Search” mechanism is included in the top frame, allowing users to search the website for keywords.

    Recommended Design from Iteration #2

    Figure 5: Recommended Design from Iteration #2

    The selection process of the new design is identical to that of Design Alternative #2.  When users click on one of the headings on the left hand side, a menu would come up on the right hand side.  The background colour of the heading and the menu would change to convey to users that they are connected.  An example of the menu selection screen is shown in Figure 6.

    Medium Fidelity Prototype from Iteration #2

    Figure 6: Medium Fidelity Prototype from Iteration #2

    The new prototype uses new groupings of menu options and headings, derived from the results of the card sorting performed by the team.  During card sorting, the test subjects were presented with words and phrases currently used in the UW library gateway.  Some of these phrases do not convey the meaning of the pages they link to, and as a result the team felt that some of the subjects’ groupings were not logical.  The menu selections were therefore modified to reflect better contextual grouping.

  6. Method, Findings and Results from Iteration 3

    6.1 Summary of Method

    User testing for the third iteration consisted of a lab-based testing protocol that was designed to suggest final changes to our design.  This process was very structured in nature.  A keystroke-level analysis (KLA) was performed for 10 tasks that were representative of common tasks performed from the UW Library Gateway web page.  This KLA was performed on the prototype generated at the end of iteration #2.  The details of the KLA are included in Appendix C.  Due to the horizontal structure of the page, each of the tasks required approximately the same amount of time to complete.  After performing the KLA to obtain benchmark numbers, a pilot test was then performed, using the team’s client as the user.  Pilot testing helped the team to rehearse the actual testing protocol that would be used with the pool of subjects during the lab-based testing. 

    The team followed the protocol outlined in Appendix F to perform the lab-based testing.  Prior to performing the tasks, the participants read and signed an information/consent letter, the purpose of which was to delineate the nature of the study, define its objectives, and describe the tasks that would be performed.  Inclusion of a user in the study was dependent upon the participant’s consent, so this was a necessary step in the lab-based testing.

    A background questionnaire was then administered to establish the participants’ level of familiarity with the UW libraries, the current UW Library Gateway web page, and related system components.  The participants were asked to rate their aptitude and/or familiarity with everyday computing activities, webpages, webpage design, UW’s Library webpage, and UW’s Library services.  A copy of the “Background Questionnaire” can be seen in Appendix F.  The questions were closed-ended in order to facilitate responses from the participants, and the verbal anchors ranged from “not at all familiar” to “very familiar”.  The questions were structured so participants could choose one of five points on a familiarity scale, ranging from 1 to 5.

    After finishing the background questionnaire, participants were asked to complete a set of tasks requiring the use of our UW Gateway web page prototype. The elapsed time taken for each participant to complete each task was recorded on a data collection sheet, along with any comments regarding the steps taken to achieve each task. These times were to be compared to the times established during the KLA.  A usability questionnaire was then given to each participant to gather the opinions of each participant.

    KLA is a submodel of the GOMS (goals, operators, methods, and selection rules) model of analysis.  The GOMS model assumes that “...users formulate goals and subgoals that they achieve by way of methods and selection rules” (Wickens et al., 1998).  The keystroke-level model allows the design team to determine “...how much time a user would take to accomplish a given task with a given computer system without an explicit analysis of goals and selection rules” (Chi et al., 1996).  The goal of an analysis at the keystroke level is to predict how fast expert users can perform the operations available in the system; the time it takes experts to perform a task is thus determined by the time it takes to perform the keystrokes.  Standard execution times for operators such as typing, pointing to a location with a pointing device, and mental preparation are clearly defined in the literature.  Using these operator execution times, the design team can design the system to operate within these limits by reducing the depth of the menu structure, optimizing the visual layout, and improving overall usability.  This was done during the analysis of our prototype at the keystroke level.  The tasks that we established execution times for were very similar in nature, and an analysis of each produced similar execution times.  Results obtained from the KLA are included in Appendix C.
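A keystroke-level estimate is the sum of standard operator times over the sequence of physical and mental operations a task requires.  The sketch below uses the commonly cited Card, Moran, and Newell operator values; the team's actual operator times and task sequences are in Appendix C, and the gateway task sequence shown here is hypothetical.

```python
# Keystroke-level model operator times in seconds, after the commonly
# cited Card, Moran, and Newell estimates (illustrative only; the
# team's actual values are in Appendix C).
KLM_TIMES = {
    "K": 0.28,  # keystroke or mouse-button press (average skilled typist)
    "P": 1.10,  # point to a target on screen with the mouse
    "H": 0.40,  # "home" the hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def klm_estimate(operators):
    """Predicted expert execution time for a sequence of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical gateway task: decide on a heading (M), point to it and
# click (P K), then decide on a menu option (M), point to it and click (P K).
seconds = klm_estimate(["M", "P", "K", "M", "P", "K"])
# seconds = 2 * (1.35 + 1.10 + 0.28) = 5.46
```

Because every gateway task in the prototype follows the same heading-then-menu pattern, the estimates come out nearly identical, which matches the observation above that the horizontal page structure gives all tasks approximately the same completion time.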

    Pilot testing the new UW Gateway prototype was necessary to formalize the protocol to be followed during the lab-based phase.  This protocol is supplementary to the protocol established by Professor MacGregor, which can be seen in Appendix F.  The team agreed on the following supplementary protocol.  The test monitor would read the “Test Monitor Scripts” in a monotone voice, placing no emphasis on any phrase that could lead the user to misinterpret the nature or purpose of the task.  The user would search the UW Gateway until he/she clicked on a link that could result in successful completion of the task.  At this point, the test monitor would say “stop” and reset the mouse to its original position.  It was also decided that there would be no communication between the user and the test monitor during the test, beyond what was required by the experimental protocol.

    Three males and seven females were tested.  Each participant was a student currently studying at the University of Waterloo.  The average age of participants was 20.6 years.  The participants had a wide range of experience with computing, web-based activities, and the UW Library.  Average levels of familiarity were measured via the background questionnaire.  Table 3 summarizes the average level of familiarity computed from the results of the background questionnaire.

    Table 3: Average Level of Familiarity with Computing
    (1 = not at all familiar, 5 = very familiar)

    Objects or Tasks                 Average Level of Familiarity
    Computing Activities             4.7
    Using Webpages                   4.2
    Designing Webpages               2.6
    Current UW Library Gateway       2.4
    UW Library Services              2.3
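Each value in Table 3 is the mean of the ten participants' 1-to-5 ratings for that item, rounded to one decimal place.  A minimal sketch of that calculation follows; the individual ratings below are invented for illustration and are not the team's data.

```python
def mean_rating(responses):
    """Mean of 1-to-5 Likert responses, rounded to one decimal as in Table 3."""
    return round(sum(responses) / len(responses), 1)

# Invented ratings from ten participants for one questionnaire item.
computing_activities = [5, 5, 4, 5, 5, 4, 5, 5, 4, 5]
avg = mean_rating(computing_activities)  # 47 / 10 = 4.7
```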

    6.2 Summary of Findings and Results

    The lab-based testing revealed some unanticipated problems with the team’s design.  Many of the concerns voiced by the users dealt with the aesthetics of the design, while other concerns illuminated more serious usability problems, such as the wording and location of menu options.  Each participant’s comments were taken into consideration when modifying the final functional iteration of the design.

    Statistical Results

    The team tested the design alternative for overall “user-friendliness” using the results from the “Usability Questionnaire”.  A chi-square test was performed on the results to test for significance; the findings can be seen in Table 4.  Since the observed X² value exceeded the critical X² value, the team can safely conclude that the design alternative is “user-friendly”.

    Table 4: Results of Chi-Square Test for "User-Friendliness"

    Category             O     E     O - E   (O - E)²   (O - E)²/E
    Not user-friendly    8     20    -12     144        7.2
    User-friendly        42    30    12      144        4.8

    X² = 12.0

    X²crit = 3.84, with df = 1 and α = 0.05 (from table)
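The statistic in Table 4 is the Pearson chi-square: the sum of (O - E)²/E over the response categories, compared against the critical value for one degree of freedom.  A minimal sketch of that arithmetic, using the observed and expected frequencies from Table 4:

```python
def chi_square(observed, expected):
    """Pearson chi-square statistic: sum of (O - E)^2 / E over categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [8, 42]   # "not user-friendly", "user-friendly" (from Table 4)
expected = [20, 30]
stat = chi_square(observed, expected)  # 144/20 + 144/30 = 12.0
critical = 3.84                        # chi-square critical value, df = 1, alpha = 0.05
significant = stat > critical          # the statistic exceeds the critical value
```

Note that with O = 42 and E = 30, the second row contributes (42 - 30)²/30 = 4.8, so the statistic comfortably exceeds the 3.84 critical value either way.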

    General Observations

    The participants indicated that the prototype was not aesthetically pleasing.  Participants indicated that the design “...(needed) some graphics”, and that “...maybe a little more colour would be great”.  Although aesthetics was not the primary concern of the design team, we concluded that it was important to usability: if users are not pleased with the look and feel of a design, they will be reluctant to use it.  The mean graphics score for the design was 2.9 on a scale of 1 to 5 (1 to 3 being user-friendly, 4 and 5 being not user-friendly), indicating a need for improvement in this aspect of usability.  Another concern with the aesthetics of the design was the “...big white screen...” on the Gateway.  One participant suggested changing the colours of the web page to match those of the University of Waterloo.  Participants did, however, like the change in colour on mouse-over of menu objects.  One participant stated, “...(it’s) cool how the buttons go red when you move over them”.  The comments from the usability questionnaire suggested a need for serious improvements to the aesthetic component of the design.

    Participants had trouble finding electronic dictionaries.  Five of the ten participants did not locate the electronic dictionaries even after an exhaustive search of the menu options.  Those who did find them did so with a mean time of 46.6 seconds, well above the minimum execution time determined by the KLA.  Many participants expected to find electronic dictionaries under the “Electronic References” heading; when “electronic dictionaries” was not found there, they chose one of the menu options under that heading, such as “E-Texts”.  The question was phrased in a way that led the user toward this menu option, and we suggest that if the question had been worded differently, there would have been less of a problem.  For example, a search for “online dictionaries” would have encouraged the user to put more thought into their navigation, since the word “electronic” would not be there to bias them towards a particular choice.  Electronic dictionaries were actually found in “Reference Tools” under the “Search Tools” menu.  Participants suggested that this option be moved under the “Electronic References” heading.

    The team believed that there were problems with the testing protocol that interfered with the results of the lab-based testing.  Participants had problems with some of the test questions.  One such question was, “What are the hours for the University Archives?”.  Participants were misled by the word “archives”, because not everyone was sure of its meaning.  If the question had been phrased, “What are the hours for the University Library?”, the team believes participants would have taken less time to answer.  There was also a problem with participants using trial and error to answer the test questions, rather than a logical progression of thought.  This may explain why some of the task completion times were less than the times determined through the keystroke-level analyses.  Participants should have been informed prior to testing that they were expected to use a logical thought process rather than “hit and miss”, and to try to answer each question correctly on the first attempt.  Another limitation of the lab-based testing was that the participants were not representative of the general population of users, limiting the testing’s external validity.  Most of the participants were friends of the team members, which introduced a bias into the results; friends were likely less critical than the general population would have been during testing.

    There was a contradiction between the participants’ reported familiarity with the current UW Library Gateway page and their willingness to compare the two web pages in terms of usability. Many participants indicated moderate to high levels of familiarity with the current UW Library Gateway page, yet indicated that comparing the two pages in terms of usability was not applicable, due to a lack of familiarity with the current UW Library Gateway page.  This observation illustrated the potential weaknesses of background questionnaires.  Since many users did not sufficiently answer the question, the team was reluctant to generalize the responses of a few individuals.

  7. Design Recommendations

    After performing user testing, the team reviewed the task performance and other results obtained.  A number of problems were discovered.  These problems can be translated into design changes to the team’s design alternative. 

    7.1 Aesthetics

    A number of users commented that the web page lacks graphics.  With this in mind, the team decided to include a picture of the Dana Porter Library and the Davis Centre Library in the top two corners of the page.  This will add some colour and visual interest to the page.  The pictures will be kept small, maintaining the simple and uncluttered appearance.

    Comments were also made about the lack of “visual appeal” on the gateway.  Users did not like the large amount of white space on the web page when it first opens.  Therefore, the team decided to add a watermark of the University of Waterloo coat of arms to cover the white space.  This watermark would appear on all of the pages with menu options.  The team realized that adding the watermark may create visibility problems for the instructions; to counteract this, a larger and more noticeable font will be used.

    Some of the users did not like the overwhelming blue background that appeared when they clicked on a link.  They pointed out that blue is not one of the school colours, making it inconsistent with other University of Waterloo web pages.  Since the connection between the headings and the menu options is essential to the user’s conceptual model, the team decided that the colour should be changed to a lighter, less overpowering one.  The team agreed that yellow would work well, since it is a school colour and is not as dark as the blue currently being used.

    7.2 Label Changes

    Some participants had problems associating the headings with the menu options they contain.  As a result, it was decided that simplifying the headings may alleviate this problem.  First, the team removed the word “holdings” from the heading “Search UW Library Holdings”, leaving it as “Search UW Library”.  The heading “Search Other Libraries” was also simplified to just “Other Libraries”.  The heading “Electronic References” was changed to “Online Resources” to reduce the confusion and ambiguity caused by the word “electronic”.

    Confusion was also observed when the users were trying to understand the meaning of the subheadings.  For this reason the subheading “Research Guide” was changed to “Search by Discipline”.  Although these are small changes, they should make a significant difference in the associations that users make with what is contained in these headings and subheadings.

    7.3 Organization

    A few problematic areas were encountered while carrying out the usability testing with the participants.  It was observed that a lot of users went to the “UW Home” menu option to try to find certain areas.  The users did not realize that the link would take them to the UW homepage.  It was decided that there is no reason for having “UW Home” as a menu option when it is already available in the top navigation bar.  Therefore, it was removed to avoid the confusion.

    It was also noted that “Online Instruction” does not seem to fit under any of the headings.  “Online Instruction” is a help function for the web page; therefore, the team decided that it should be highly visible and easily accessible.  The best place for this link would be the navigation bar located at the top of the web page.

    The newly named menu option “Search by Discipline” is another label that does not fit logically under just one of the headings.  The team therefore decided that redundancy would be the best way to place this link: it will be added under both the “Search Tools” and “Search UW Library” headings.  These two headings were frequently visited by users during testing, and by putting “Search by Discipline” under both, users stand a better chance of finding this menu option efficiently.

    Finding electronic dictionaries also proved to be a problem for all of the participants.  Everyone seemed to look at the heading “Electronic References”.  This suggests that the “Reference Tools” subheading should be located under the “Electronic References” heading which is now called “Online Resources”.

    7.4 Order

    As the team proceeded through the testing procedure, the team members realized that the headings are not in optimal order.   After discussing the alternatives, the team decided to change the order to be as follows:

    1. Library Info
    2. Search UW Library
    3. Services for…
    4. Other Search tools
    5. Online Resources
    6. Other Libraries

    This puts the headings in order of frequency of use and importance, from top to bottom.  This ordering is appropriate since people read from left to right and from top to bottom.

    The heading “Library Info” contains a large number of menu options.  As a result, it took a long time for users to read all of the options during testing.  The users often ended up scanning the list without reading the menu options carefully.  They would miss the correct link and proceed to click on another heading.  A small change in the order of the menu options according to the frequency of use should improve the usability.  The team, with the feedback from the client, decided that the first two subheadings will stay the same but “Connecting from home” will be moved up to the third position. “Staff and Administration” will be moved up to the fourth position. 

    7.5 General Recommendations

    In addition to the recommendations specific to the design alternative tested during Iteration #3, the team is proposing several other design recommendations.  First of all, the helpfulness of icons must be investigated.  Due to time and resource constraints, the team was unable to perform any user testing of icons.  Icons may be very useful in representing the headings on the gateway page, so the option of using icons should be kept open for further usability investigations.

    Secondly, the team recommends that all secondary pages have a consistent layout and phrasing, which makes it much easier for users to navigate through the pages.  Thirdly, the team believes a thorough study should be done to determine whether users of the current library gateway actually use all the available links.  One of the major constraints of this project was that all links currently available had to be kept in the new design, which made grouping and organization more difficult.  CNAG should evaluate the value of putting the “extra” links on the gateway page.

    Lastly, the team proposes that CNAG continue user testing.  Due to time and resource constraints, the team was only able to perform user testing with small groups of users.  CNAG should conduct user testing with a larger user population; the results obtained would likely be much more representative than those obtained during the course of this project.

     

References

Chi, C., & Chung, K. (1996). Task analysis for computer-aided design (CAD) at a keystroke level. Applied Ergonomics, 27(4), 255-265.

Norman, D. (1989). The Design of Everyday Things. New York: Doubleday.

Nielsen, J., & Sano, D. (1995). SunWeb: User interface design for Sun Microsystems’ internal Web. Computer Networks and ISDN Systems, 28, 179-188.

Smith, K. (2000). Project Management and Teamwork. United States of America: McGraw-Hill.

Wickens, C.D., Gordon, S.E., & Liu, Y. (1998). An Introduction to Human Factors Engineering. New York: Longman.

 

Appendix A – Detailed Findings from Iteration 1

Nielsen’s and Norman’s Heuristics

Using Nielsen’s and Norman’s heuristics, many strengths and weaknesses of the current UW Gateway were identified. The characteristics can be summarized into three main aspects: the mouse-over, the language, and the general look-and-feel.

The Mouse-Over

The mouse-over offered both advantages and disadvantages to the design.  For instance, the mouse-over is one way to avoid cluttering the Gateway with phrases and links: only the relevant menu is shown at any given time, and users can clearly see the options provided.  However, the team identified many problems associated with the mouse-over.  Firstly, users may not be able to associate the mouse-over menu with its heading because the two are far apart.  Secondly, users unfamiliar with the options in the mouse-over menus are forced to remember all the options available in each menu before they can decide where to click.  There is also a lack of feedback when the user clicks on the headings; users unaware of the mouse-over would be confused by this.  The situation can be illustrated as follows: a new user wants to use the online help features to learn about the site, but cannot interpret the mouse-over functions.  The user would likely be very confused and frustrated.

The Language

A clear attempt was made to make the language simple and easy to understand.  However, during the heuristic evaluation, the language used in the Gateway was found to be confusing, vague, and misleading.  Headings like “Find It” and “Get It” can be interpreted in many different ways, and there is a lack of explanations and descriptions for the headings and the options in the mouse-over menus.  Confusion arises as a result; for instance, “Find it…by subject” actually refers to a research guide rather than a search by topic.  In addition, the options in the Library Forms dropdown are almost cryptic.  Although the dropdown is targeted at experienced users, it may be unreasonable to expect even experienced users to recall the meaning of all the options.

The General Look-and-Feel

The Gateway uses a minimal amount of text, which avoids cluttering the screen, and the headings are nicely spaced, suggesting that they are different options.  However, the colour scheme used in the Gateway may be a source of problems.  The mouse-over menus use blue as the highlight colour and red as the normal colour, which contradicts most people’s experience with web page links.  Moreover, the picture used as a background graphic masks some of the links, e.g. the “Text Version” link.  The links contained in the top navigation bar of the Gateway are not obvious to users at all; users will likely treat the top navigation bar as just a graphic.  Lastly, there is a lack of “Quick Links” aside from the Library Forms dropdown menu.

Design Walkthrough

From the design walkthrough, the following common observations and comments were obtained from the users.

  • There was a lot of scrolling and contemplation.  In most cases, the user had no idea where to go, and most users tried to understand the features of the Gateway by trial and error.
  • Users typically did not read the words carefully after a mistake was made.
  • Most users looked at the headings to see if the task would fit under a heading.  Many problems were encountered because the users could not extrapolate from the headings.
  • Users did not notice the TRELLIS shortcut on the top navigation bar in cases where they were explicitly told to find a book or a journal article.
  • On many occasions, the site index had the link the user was looking for.
From these observations, it was concluded that there were major problems with several design aspects.  These are design challenges that must be overcome. The site designers should consider the following walkthrough aspects carefully:

Visibility

The correct action must be sufficiently evident to the user. When the user initially navigates the gateway, he/she must be immediately aware of how to activate the mouse-over menus by placing the cursor over the headings. Also, the four links in the top navigation bar must be made more obvious.  In many cases, the user did not know these were usable links.

Action Association

One challenge for the user is to associate the desired task with the functional headings of the gateway menu.  Will the user connect the correct action description (label, dialog box) with what they are trying to do?  For example, when a user attempts to find Margaret Atwood’s Alias Grace, they follow the link Find It - By Subject.  By intuition, that traversal is interpreted directly as finding resources by subject.  However, that section actually lists “Research Guides by Subject”.

Feedback

When the user follows a link, the connected site should provide appropriate feedback to indicate the user’s destination.  Due to inconsistency in the gateway, some destinations do not accurately reflect the user’s intended path. For example, in Find It - By Subject, the destination link reads “Research Guides by Subject” and listed are research guides by subject headings.  The “By Subject” link perhaps should read “Research Guides” instead, to reflect the set of material that is being searched.

Hierarchical Task Analysis

Task 1: How can I find electronic maps?

HTA for Task 1

Figure 7:  HTA for Task 1

Table 5: Potential Problems for Task 1

Box #   Potential Problem
2.1.2   If the user does not move the mouse around, they may never know that there are options under each one of the headings.
2.1.3   The user may not know that electronic maps would be located under “Find It”.
2.2.2   If the user accidentally moves the cursor to another heading, the options in the mouse-over menu change.
2.3     There is no reason for someone to know that electronic maps would be under the heading “Reference Tools”.

 

Task 2: Guelph has a book I need.  How do I have it brought to Waterloo?

HTA for Task 2

Figure 8:  HTA for Task 2

Table 6: Potential Problems for Task 2

Box # Potential Problem
2.1.1 User may not know that “Get It” is the correct choice from the list of headings; the user may think that “Find It” is the right heading.
2.2.1 User will not necessarily know that “TRELLIS” is the name of the library catalogue.
2.2.3 User may not be sure which search method to use once they have entered TRELLIS.
3.2.1 The Request option is quite vague.

 

Task 3: Where do I find directions for connecting from home?

HTA for Task 3

Figure 9:  HTA for Task 3

Table 7: Potential Problems for Task 3

Box # Potential Problem
2.1.2 Users may have a tendency to click on the “Help” heading; if they do, they may be confused by the lack of feedback.
2.1.3 If the cursor is moved outside the allowable area of “Help”, the menu options change to those of the heading above or below.
2.2.2 It is not obvious that menu options need to be clicked, because the border hugs the list of options too closely.
2.2.2 The options almost appear as one paragraph instead of separate, distinct items to be clicked.

 

Task 4: As a new undergraduate student, how can I familiarize myself with the library’s services and procedures?

HTA for Task 4

Figure 10: HTA for Task 4

Table 8: Potential Problems for Task 4

Box # Potential Problem
2.1 New students may need highly visible links to lead them to the help or tutorial page.
2.2 New students may have problems with mouse-over and other navigation issues.
2.2 Users may have problems remembering all the options available under each heading.
2.3 New students may not be able to recognize which menu to choose.
2.4 Users may lose the mouse-over menu if they move the cursor to another heading.

 

Task 5: I need to contact the librarian for my department – how do I find his/her phone number or email?

HTA for Task 5

Figure 11: HTA for Task 5

Table 9: Potential Problems for Task 5

Box # Potential Problem
2.1 User may not know where to find the information.
3.1 User may not know how to copy and paste into an existing window.

 

Appendix B – Detailed Findings from Iteration 2

Detailed Findings from Hierarchical Task Analysis

Design Alternative #1

Task 1: Where can I find electronic maps?

HTA for Task 1 from Design Alternative 1

Figure 12:  HTA for Task 1 from Design Alternative 1

Table 10: Potential Problems in Task 1 from Design Alternative #1

Box # Potential Problems
2.2 The user may not know that electronic maps are located under “Reference Tools” in “How do I?”.
2.2 If the user chooses the wrong heading or menu option, recovery requires loading and reloading pages, which is time-consuming.
2.2 Menu options are not immediately available; the user must load a new page first, and must therefore hold the menu options for each heading in memory when unsure which option to choose.

 

Task 2: How can I discover when my UW library books are due?

HTA for Task 2 from Design Alternative #1

Figure 13: HTA for Task 2 from Design Alternative #1

Table 11: Potential Problems in Task 2 from Design Alternative #1

Box # Potential Problems
2.1 User may not know that Patron Info is available only under Trellis.
2.2 The Patron Info tab may be difficult to locate because it is so small, and users may not realize it is a button.

 

Task 3: Guelph Library has the book I need.  How do I have it brought to UW?

HTA for Task 3 from Design Alternative #1

Figure 14: HTA for Task 3 from Design Alternative #1

Table 12: Potential Problems for Task 3 from Design Alternative #1

Box # Potential Problems
2.1 Loading and reloading pages is too time-consuming and does not promote good error recovery.
2.0 Requires users to retain a set of instructions instead of simply taking them to the correct page.
2.1 If the user already knows which book they want from Guelph, it may not make intuitive sense to “Search” under the “Search Inter-Library Holdings” heading.

 

Task 4: I need to contact the librarian for my department - how do I find his/her phone number?

HTA for Task 4 from Design Alternative #1

Figure 15: HTA for Task 4 from Design Alternative #1

Table 13: Potential Problems for Task 4 from Design Alternative #1

Box # Potential Problems
2.1 Loading and reloading pages is an issue; it is too time-consuming.

 

Design Alternative #2

Task 1: Where can I find electronic maps?

HTA for Task 1 from Design Alternative #2

Figure 16:  HTA for Task 1 from Design Alternative #2

Table 14: Potential Problems in Task 1 from Design Alternative #2

Box # Potential Problems
2.2 User may not intuitively know that electronic maps are under the heading “Reference Tools” in “How do I?”.

 

Task 2: How can I discover when my UW library books are due?

HTA for Task 2 from Design Alternative #2

Figure 17: HTA for Task 2 from Design Alternative #2

Table 15: Potential Problems in Task 2 from Design Alternative #2

Box # Potential Problems
2.1 User may not know that Patron Info is available only under Trellis.
2.2 The Patron Info tab may be difficult to locate because it is so small, and users may not realize it is a button.

 

Task 3: Guelph Library has the book I need.  How do I have it brought to UW?

HTA for Task 3 from Design Alternative #2

Figure 18:  HTA for Task 3 from Design Alternative #2

Table 16: Potential Problems for Task 3 from Design Alternative #2

Box # Potential Problems
2.0 Requires users to remember a set of instructions for a task rather than simply taking them to the correct page to perform it.
2.1 If the user already knows which book they want from Guelph, it may not make intuitive sense to “Search” under the “Search Inter-Library Holdings” heading.

 

Task 4: I need to contact the librarian for my department - how do I find his/her phone number?

HTA for Task 4 from Design Alternative #2

Figure 19:  HTA for Task 4 from Design Alternative #2

Table 17: Potential Problems for Task 4 from Design Alternative #2

Box # Potential Problems
2.1 Users may not know that “Find Out About the Library” includes information on library staff.

 

Detailed Findings from Design Walkthrough

The following are the results from design walkthrough sessions conducted by the team members with novice users.

  1. How would you find if the library has Margaret Atwood’s Alias Grace?
    • Clicked on Library Holdings
    • Became confused by TRELLIS name
    • Required TRELLIS explanation, and realized that was the correct choice

  2. Is the University Map and Design Library open on weekends?
    • Clicked on Find Out About…
    • Clicked on Hours/Locations

  3. I need to contact the librarian for my department – how do I find his/her phone number or email?
    • Clicked on Find Out About…
    • Clicked on Staff/Administration

  4. Does the Library have any electronic dictionaries?
    • Clicked on Related Sites
    • Thought an e-dictionary was a library “related site”
    • Backtracked to the main page
    • Used Site Search to find results

  5. Where can I find electronic maps?
    • Used Site Search to find results

  6. Where do I find the URL for Yahoo?
    • Went to UW Homepage to use their internet search tools.
    • User felt an internet search engine like Yahoo belonged under Related Sites.
    • NOTE: Further research of Related Sites is necessary.

  7. Where can I find a database in which to locate articles on Anthropology subjects?
    • Clicked on Search Library Holdings
    • Clicked on By Subject

  8. Where can I find information on how to cite web sites?
    • Did not understand that Style Manuals was a method for citing web sites
    • Did not find the solution to this problem
    • A Site Search for “style manual” would have yielded the result

  9. Can I read an article of a journal without coming to the library?
    • Clicked on Search Library Holdings
    • Clicked on E-Journals

  10. Guelph library has books I need.  How do I have them brought to UW?
    • Clicked on Inter-Library Holdings
    • Clicked on From Guelph

  11. How can I discover if any materials have been placed on Reserve for my courses?
    • Clicked on How Do I…
    • Clicked on Course Reserves

  12. How can I discover when my UW library books are due?
    • Clicked on How Do I…
    • Clicked on View Record

  13. I am visually impaired – does the library offer any special services for me?
    • Clicked on Find Out About…
    • Clicked on Accessibility

  14. Where can I find the exam timetables?
    • Clicked on Related Sites
    • Clicked on UW Homepage link

  15. How can I find out if my course textbook is in stock at the UW Bookstore?
    • Same path as question 14

  16. Where can I find instructions about connecting from home?
    • Clicked on How Do I…
    • Clicked on Connect From Home

  17. Getting a copy of a book from a remote library
    • Clicked on Inter-Library Holdings
    • Clicked on Other Library Catalogues

  18. Getting an article from a journal at University of Guelph
    • Clicked on Inter-Library Holdings
    • Clicked on From Guelph

  19. Where to start to do research on a subject
    • Clicked on How Do I…
    • Clicked on TRELLIS Help

  20. What library orientation sessions are offered for grad students this term?
    • Clicked on Find Out About…
    • Clicked on Events
    • Clicked on Library Workshops

Detailed Findings from Card Sorting

The following are the groups identified by the subjects during card sorting.

Subject #1

  1. By Subject, E-Texts, Site Index & Search, From Guelph/Laurier/Annex, E-Data, E-Journals
  2. Alumni, Distance Education, Staff/Administration, Faculty & Staff, Undergraduates, Graduate Students, Persons with Disabilities
  3. Laurier Library, Guelph Library, Kitchener Public Library, Conestoga College Library, From Other Libraries, Guide to the Libraries
  4. Library Development, Hours/Locations, Accessibility, Business & Community, Other Library Catalogues, News/Events/Exhibits, Guide to the Libraries
  5. Library Development, By Subject, Accessibility, Business & Community, Other Library Catalogues, News/Events/Exhibits, Tours & Workshops
  6. From Waterloo, Course Reserves, TRELLIS Help, Research Guides, View Your Record, Reference Tools, Renewals, TRELLIS: Our Catalogue
  7. UW Home Page, TUG Home Page, Online Instruction, Connect from Home, Internet Search Tools, Site Index & Search
Subject #2
  1. Journal Indexes, E-Journals, E-Texts, E-Data
  2. Hours/Locations, Renewals, Other Library Catalogues, View Your Record
  3. Reference Tools, Course Reserves, Research Guides, Internet Search Tools, By Subject, Site Index & Search
  4. Guide to the Libraries, Business & Community, Distance Education, News/Events/Exhibits, Tours & Workshops, Library Development
  5. From Other Libraries, From Guelph/Laurier/Annex, From Waterloo, Guelph Library, Conestoga College Library, Other Library Catalogues, Laurier Library, Kitchener Public Library
  6. Online Instruction, Connect from Home
  7. TUG Home Page, TRELLIS: Our Catalogue, TRELLIS Help, UW Home Page
  8. Undergraduates, Graduate Students, Alumni, Persons with Disabilities, Staff/Administration, Faculty & Staff
Subject #3
  1. Laurier Library, Other Library Catalogues, Conestoga College Library, Guelph Library, TRELLIS: Our Catalogue, Guide to the Libraries, Kitchener Public Library, Course Reserves
  2. Hours/Locations, Accessibility, Tours & Workshops
  3. From Waterloo, By Subject, Renewals, View Your Record, From Other Libraries, From Guelph/Laurier/Annex
  4. TUG Home Page, Connect from Home, UW Home Page
  5. Graduate Students, Alumni, Undergraduates, Persons with Disabilities, Faculty & Staff, Distance Education, Staff/Administration
  6. Online Instruction, TRELLIS Help, Internet Search Tools, Research Guides, Reference Tools, Site Index & Search
  7. Library Development, News/Events/Exhibits, Business & Community
  8. Journal Indexes, E-Texts, E-Data, E-Journals
Subject #4
  1. Site Index & Search, Internet Search Tools, News/Events/Exhibits, UW Home Page
  2. E-Texts, By Subject, Reference Tools, Journal Indexes, E-Data, E-Journals
  3. Graduate Students, Alumni, Distance Education, Undergraduates, Persons with Disabilities, Business & Community, Faculty & Staff, From Waterloo
  4. Accessibility, Research Guides, Library Development, Connect from Home, Course Reserves, Tours & Workshops, Hours/Locations, View Your Record, Renewals, Online Instruction
  5. From Other Libraries, Guelph Library, Guide to the Libraries, Other Library Catalogues, Laurier Library, Conestoga College Library, From Guelph/Laurier/Annex, Kitchener Public Library
  6. TRELLIS Help, TRELLIS: Our Catalogue, TUG Home Page
Subject #5
  1. Faculty & Staff, Graduate Students, Undergraduates, Persons with Disabilities, Distance Education, Alumni, Business & Community
  2. From Waterloo, From Guelph/Laurier/Annex, From Other Libraries, Laurier Library, Guelph Library, Kitchener Public Library, Conestoga College Library
  3. E-Journals, E-Texts, E-Data
  4. TRELLIS: Our Catalogue, Other Library Catalogues, Journal Indexes, By Subject, Course Reserves
  5. Hours/Locations, Library Development, Accessibility, Guide to the Libraries, News/Events/Exhibits, Tours & Workshops
  6. Research Guides, TRELLIS Help, View Your Record, Renewals, Connect from Home
  7. Online Instruction, Internet Search Tools, UW Home Page, TUG Home Page, Site Index & Search, Reference Tools
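One common way to interpret card-sort data such as the groups above is to build a co-occurrence matrix counting how often subjects placed each pair of cards in the same pile; pairs with high counts are candidates for the same menu heading. The sketch below illustrates this analysis with a small, hypothetical set of groupings (the card names are drawn from the actual card set, but the piles shown are invented examples, not the team's data):

```python
from itertools import combinations
from collections import Counter

# Abridged, hypothetical groupings from three subjects; each inner
# list is one pile of cards a subject sorted together.
sorts = [
    [["E-Journals", "E-Texts", "E-Data"], ["Renewals", "View Your Record"]],
    [["E-Journals", "E-Texts"], ["E-Data", "Renewals", "View Your Record"]],
    [["E-Journals", "E-Texts", "E-Data", "Renewals"], ["View Your Record"]],
]

# Count how many subjects placed each pair of cards in the same pile.
co_occurrence = Counter()
for subject in sorts:
    for pile in subject:
        # Sort each pile so every pair is keyed in a canonical order.
        for a, b in combinations(sorted(pile), 2):
            co_occurrence[(a, b)] += 1

# The most frequent pairs are the strongest grouping candidates.
for pair, count in co_occurrence.most_common(3):
    print(pair, count)
```

With all five subjects' real data fed in, the same counts would show, for example, how consistently the E-Journals/E-Texts/E-Data cards were grouped together.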

 

Appendix C – Detailed Findings from Iteration 3

Keystroke-Level Analysis Benchmarks

The following tables summarize the benchmarks determined from keystroke-level analysis performed on the ten questions provided by Professor MacGregor as user-testing tasks.

Assumptions

  1. The test monitor reads the question to the user, and that the trial starts once the question has been read.
  2. The user has his/her hand on the mouse as the question is being read (i.e. no initial “home to device” time needs to be included).
  3. The cursor is at the centre bottom of the screen at the start of the trial.
  4. The user is an average skilled typist (i.e. 0.20 sec per keystroke).
  5. The user is experienced (i.e. does not make errors).
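Under these assumptions, each benchmark is the sum of standard keystroke-level model operator times: M (mentally prepare) = 1.35 s, P (point to target) = 1.10 s, and a button click = 0.20 s. A minimal sketch of this calculation (the function and operator names are illustrative, not part of the study materials):

```python
# Keystroke-level model (KLM) benchmark calculation using the operator
# times assumed in this analysis:
#   M = mentally prepare (1.35 s)
#   P = point to a link   (1.10 s)
#   B = click the mouse   (0.20 s)
OPERATOR_TIMES = {"M": 1.35, "P": 1.10, "B": 0.20}

def klm_time(operators):
    """Predicted task time for a sequence of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Typical two-level menu path: prepare, point, click, prepare, point, click.
print(f"{klm_time(['M', 'P', 'B', 'M', 'P', 'B']):.2f} s")  # 5.30 s

# Direct shortcut, e.g. the Trellis link in the top navigation bar.
print(f"{klm_time(['M', 'P', 'B']):.2f} s")  # 2.65 s
```

This reproduces the 5.3 s benchmark shared by most questions below, and the shorter 2.65 s alternative for Question #7.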
Question #1
Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Search UW Library Holdings)                        1.10 s
3.    Click on link (Search UW Library Holdings)                        0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Research guides by discipline)                     1.10 s
6.    Click on link (Research guides by discipline)                     0.20 s
Total                                                                   5.30 s

or

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Search Tools)                                      1.10 s
3.    Click on link (Search Tools)                                      0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Online Instructions)                               1.10 s
6.    Click on link (Online Instructions)                               0.20 s
Total                                                                   5.30 s

Question #2

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Search Other Library Holdings)                     1.10 s
3.    Click on link (Search Other Library Holdings)                     0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Obtaining Materials from U of G, Laurier, etc.)    1.10 s
6.    Click on link (Obtaining Materials from U of G, Laurier, etc.)    0.20 s
Total                                                                   5.30 s

Question #3

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Services for…)                                     1.10 s
3.    Click on link (Services for…)                                     0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Alumni)                                            1.10 s
6.    Click on link (Alumni)                                            0.20 s
Total                                                                   5.30 s

Question #4

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Library info)                                      1.10 s
3.    Click on link (Library info)                                      0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Hours, Locations)                                  1.10 s
6.    Click on link (Hours, Locations)                                  0.20 s
Total                                                                   5.30 s

Question #5

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Search Other Library Holdings)                     1.10 s
3.    Click on link (Search Other Library Holdings)                     0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Kitchener Public Library)                          1.10 s
6.    Click on link (Kitchener Public Library)                          0.20 s
Total                                                                   5.30 s

Question #6

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Search Tools)                                      1.10 s
3.    Click on link (Search Tools)                                      0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Reference Tools)                                   1.10 s
6.    Click on link (Reference Tools)                                   0.20 s
Total                                                                   5.30 s

Question #7

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Search UW Library Holdings)                        1.10 s
3.    Click on link (Search UW Library Holdings)                        0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Trellis: Our Catalogue)                            1.10 s
6.    Click on link (Trellis: Our Catalogue)                            0.20 s
Total                                                                   5.30 s

or

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Trellis, top navigation bar)                       1.10 s
3.    Click on link (Trellis, top navigation bar)                       0.20 s
Total                                                                   2.65 s

Question #8

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Library info)                                      1.10 s
3.    Click on link (Library info)                                      0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Connecting from home)                              1.10 s
6.    Click on link (Connecting from home)                              0.20 s
Total                                                                   5.30 s

Question #9

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Library Information)                               1.10 s
3.    Click on link (Library Information)                               0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Renewals)                                          1.10 s
6.    Click on link (Renewals)                                          0.20 s
Total                                                                   5.30 s

Question #10

Step  Operator                                                          Time
1.    Mentally prepare (after hearing sentence)                         1.35 s
2.    Point to link (Library info)                                      1.10 s
3.    Click on link (Library info)                                      0.20 s
4.    Mentally prepare as menu is read (hand assumed to remain on mouse) 1.35 s
5.    Point to link (Staff and Administrators)                          1.10 s
6.    Click on link (Staff and Administrators)                          0.20 s
Total                                                                   5.30 s

 

Appendix D - Summary of Task Allocation and Learning

Summary of Task Allocation

Iteration 1

The team had three main goals when allocating tasks.  First, the work had to be divided evenly among the team members.  Second, each team member should be exposed to each of the user-centered design evaluation methods.  Lastly, the task allocation should take advantage of each team member’s strengths.  Achieving these goals turned out to be very challenging, and each team member had to compromise on the rest of the team’s behalf.

The first goal of dividing the work evenly was made out of respect for all team members.  All team members were busy with other commitments, and an unfair distribution would have caused conflicts among them.  This was a very easy goal to agree upon, as the idea was mutual within the group.

The second goal was agreed upon because the team members believed that exposure to each evaluation method would be beneficial in the long run.  Each team member participated in heuristics evaluation, design walkthrough, and hierarchical task analysis (HTA).

For heuristics, the team members used either Nielsen’s Heuristics or Norman’s Principles to perform a competitive analysis between UW’s Gateway and the other Gateway pages identified by the client.  For design walkthrough, each member was responsible for four of the twenty questions provided by the clients.  The team members observed novice users perform the designated tasks.  Finally, for HTA, each team member completed one HTA with one of the twenty questions provided.

The team members’ strengths were taken into consideration for the write-up of the assignment.  Some members have more experience with computer applications while others have better writing skills.  This allowed for highly efficient work as well as high quality results.  Overall, all of the goals were accomplished and the team members worked well together.

Iteration 2

The team had a very limited timeframe for this iteration of the project.  Some team members had made plans for reading week, and some were going through midterms.  These circumstances made task allocation more difficult.  For this iteration, the team had three main goals when allocating tasks.  First, the team had to be considerate of each member’s commitments to other courses and their schedules.  Second, team members should be able to choose tasks they were interested in.  Lastly, the task allocation should take advantage of each team member’s strengths.

Although the team was only required to perform one form of usability evaluation in addition to the discount usability evaluation, the team felt that the card sorting technique would be highly beneficial in the long run.  Therefore, the team members decided collectively to take on discount usability evaluation, hierarchical task analysis (HTA), and card sorting.  The team decided to have only one person working on each task because the team members had already been exposed to, and were familiar with, the different usability methods (heuristics evaluation, design walkthrough, and HTA).  Therefore, there was no longer a need to ensure that each member got hands-on experience with each method.

Each team member was then asked to express his or her preferences in terms of task allocation.  With these preferences in mind, the team decided to divide up the task as follows:

Table 18: Task Allocation Strategy for Iteration 2
Task Team Member
Discount Usability – Heuristics Evaluation Scott Nisbet
Discount Usability – Design Walkthrough Danny Ho
Hierarchical Task Analysis Anna Tran
Card Sorting Tim Fillier
General Report Write-up Cecilia Chung

For heuristics evaluation, the team members decided to use Norman’s Principles to evaluate the two prototypes created.  For design walkthrough, the twenty questions used previously in Iteration 1 were used again.  For HTA, the same tasks were used as in Iteration 1.  Finally, for card sorting, the existing phrases on the UW Gateway were used as an initial set of words.

Iteration 3

For the last iteration of the project, the team was provided with the “TEAM PROJECT UW LIBRARY GATEWAY WEB PAGE PHASE 3” handout.  On the handout, the testing procedure and sections required for the final report were provided.  With this handout, the team allocated the tasks in the following manner:

Each team member was responsible for finding two test subjects for user testing.  Each team member needed to familiarize him/herself with the test procedure and be ready to test his or her subjects.  Danny and Tim were chosen by the team to present in class.  As for the final report write-up, the team divided up the sections according to the team members’ preferences and time constraints.  With the recommendations of Professor MacGregor and Lora Bruyn in mind, the team agreed that Cecilia and Anna would be the editors in charge of the report.  Lastly, Danny was responsible for incorporating the design changes into the final report, since he is most familiar with HTML programming.

Cecilia Chung    ____________________                Anna Tran    ____________________

Danny Ho          ____________________                Tim Fillier       ____________________

Scott Nisbet      ____________________

Team Learning

Cecilia Chung

In five years of university, this school project is the first time that I am part of a team that has a real sense of structure and team spirit.  We designated a time and place for weekly meetings.  We set group norms at the beginning of the project.  We took turns writing meeting minutes.  And most importantly, we treated each other respectfully and considerately, as equals.  From reading the book by Smith (2000), I started thinking about group dynamics differently.  I used to only think about teamwork in terms of whether it went well or not and whether everyone did their share or not.  I have always had a pretty good idea of what my role tends to be in a team.  From Smith’s book (2000), I learned about how to judge effectiveness, how to evaluate the roles we play, and why challenges arise.  I learned to be a better team member because I found out more about myself in terms of my role and behaviour as part of the team.  I also learned to appreciate the team’s success more than ever before because I now realize how difficult it can be to become an effective and cooperative team.

Tim Fillier

The UW Library Gateway project helped further my understanding of user-centered design, as well as team dynamics. By allowing us to apply the knowledge obtained through course material, this project allowed each of us to take ownership of the various methods of testing and design. From heuristic evaluations to keystroke-level analysis, this project fostered our understanding of the progression of testing throughout each design iteration. More importantly, the UW Library Gateway project facilitated the development of team skills, a necessary asset in the working world.

Many user-centered design methods were applied during the redesign of the UW Library web page. I was personally responsible for heuristic evaluations, design walkthroughs, prototype design, HTAs, card sorting, and KLAs. Through using each of these usability methods, the conclusion I arrived at is that none of these methods should be relied on exclusively. Each of these methods should be used to build on the previous method, and the results from one method should not be seen as the bottom line. This was especially apparent during the card sorting exercise. The results from the card sorting exercise suggested groupings of objects that were not similar in functionality, so our design team was forced to make some subjective changes in order to keep the options consistent. This is just one example of the modifications that had to be made during our design process. Results from each testing stage had to be examined critically. This is where the diverse viewpoints of the collective team helped the most. Through each step of the design stage, the team was the most important functional unit. Individual contributions were important, but in the end, the collective design team made each decision final.

Danny Ho

This project, having gone through three iterations in the spiral design model, is the first successfully iterated project I’ve worked on in university so far. Having said that, I’m impressed that this was finally accomplished, yet slightly disappointed that it took four years to happen. Learning usability in a team environment and through a group project is, I think, the most effective way of learning, and I know I’ve gotten a lot out of working with my group on the various tasks and working towards the various deadlines throughout the term.

In terms of group dynamics, this was not something new since we’ve had a lot of experience working with groups in Systems Design. However, the interdisciplinary aspect of the group was very interesting to observe. Two out of the five group members were from engineering,  so we were reasonably well divided. I felt that there was pleasant cooperation and understanding between all group members. We helped to pull each other through, and made sure that everything was done right.

In the context of learning course material, I believe the course content was sufficiently learned through practice. I am pleased with the opportunity given to try out the different methods. However, it was also interesting to note how some methods were more tedious and less exciting than others (e.g. heuristics evaluation versus the more interactive KLA).

Overall, I’m pleased with the course.

Scott Nisbet

I have learned many practical skills as an individual throughout the development of this project.  I have had a chance to use each of the usability testing methods outlined in class to assess designs that already exist as well as new designs.  Proper use of the spiral method was included and the important advantages were seen in this real world example.  I was also able to identify my strengths and how they can be used to help my group as a whole. I think I am good at critical analysis of ideas and the identification of possible problems with them, which was useful in our team meetings.  As a team, we have learned many important skills.  We learned how to, not only produce quality work for ourselves, but also for the whole group.  This included experience in identifying everyone’s strengths and using them to the advantage of the group.  Compromise was a very important concept in this group atmosphere.  It was used in working around everyone’s busy schedules, which involved picking up some work when others are having problems with their workload.  Compromise also had to be made to divide tasks evenly through the group and in deciding which ideas are the best and will be used in the final product.

Anna Tran

The experience and learning I gained this term from working on the SYDE 348 team project has been an extremely valuable and rather unforgettable one.  First of all, not only was it my first time working in a design team, but it was my first time working as a designer, as well as my first time working on designing a webpage.  I must admit that I began with minimal knowledge about the design of webpages, let alone the usability of one.  My initial thoughts about webpages were that they were really complicated things to put together, but that they would be relatively simple to make user-friendly.  However, with all the assessing and testing behind me now, I've come to learn that putting together a webpage was the easy part.  It was the usability part that was complex.  The spiral method of design made intuitive sense, upon learning about it in class as a student.  The implementation of it was a different story.  As a designer I've learned that there are ideals that we all strive for, but that are rarely attained.  We would have liked to have been able to go through each of the iterations more thoroughly, complete with user-testing at each level and lengthy team meetings to discuss and implement all the findings in the best way possible.  Unfortunately, we do not live in an ideal world, so as students and not full-time designers, we are left with having to make the best of our resources given the limitations of time and energy, since all members of the team had at least two other major projects to juggle. 

The Team

As a team, we felt that we worked extremely well together.  We were able to make the transition from being a co-operative learning group to becoming a high-performance co-operative learning group by the time we began our third iteration.  We were really fortunate to have been placed in a group where everyone got along right from the start.  Not only were we all committed to the project, but our deeper sense of responsibility to one another drew us to become committed to each other as people as well.  Because we were considerate to one another as individuals, we felt that as a team we were able to draw the best out of one another as we were often willing to take on more than our share of the workload if one member was having trouble.  Hearing rumours about the conflicts that arose in other groups, we noted to each other more than once just how fortunate and possibly even strange it was that no conflicts arose among us, considering we did not even have the choice to work together, but were merely assigned to.  Our unique experience as a team has taught me that it is not always necessary to have a leader, especially when everyone is doing their share of the work and there is no social loafing going on.  Each person recognized his/her own strengths and fortunately, as individuals in a team, our strengths did not overlap but worked in a synergistic manner.  We were able to establish our norms based on each person's unique contributions and thus become effective as a team in reaching our goals and getting our work done.

 

Appendix E - Summary of Client Meetings

In this section, summaries of the client meetings that took place for the project undertaken are provided.  The team’s client is Susan Routliffe from the UW Library’s CNAG.  The date and time of the client meetings are as follows:

Meeting #1 – January 20, 2001 at 11:30 a.m.

Meeting #2 – February 27, 2001 at 11:30 a.m.

Meeting #3 – March 20, 2001 at 11:30 a.m.

Meeting #1

In our first meeting with Susan Routliffe, we discussed the preliminary aspects of the project for the term.  As suggested in the Team Project Handout, we agreed to meet three times over the course of the term: a second time on February 27 at 11:30 a.m. in the multi-media room on the 4th floor of Dana Porter, and a third time near the end of the term.  The date, time, and place of the third meeting would be confirmed following the second meeting.  We discussed the background and current status of the project, with Susan emphasizing that the first goal for the team was to determine how usable the current gateway was.  Her operational definition of “usable” was that users are able to make the correct decision for a starting point in their navigation.  According to Susan, we were to assume no constraints or limitations in our design proposal, as the CNAG would be willing to consider all possible design suggestions.  Lastly, Susan suggested that we focus on the language, the mouse-overs, and the organization of the links and categories in our analysis of the gateway, as those were the problematic areas identified by the CNAG.

Meeting #2

In our second meeting with Susan, we gave her an update on our procedures and findings from the first iteration of the project.  We explained to her the objectives and procedural steps of the heuristics evaluation, design walkthroughs, and task analyses, and described our major findings based on those methods.  We also gave her a summary of the strengths and weaknesses we found in our competitive analysis of six other library gateways.  Susan expressed her interest in and satisfaction with our progress to date, and was anticipating our proposed design alternative.  The meeting ended with us scheduling our final meeting for March 20 at 11:30 a.m. in the same location.

Meeting #3

Our last meeting with Susan consisted mainly of pilot testing our most recent prototype.  We went through the test package provided by Professor MacGregor, taking Susan through the various tasks to familiarize her with our proposed gateway.  Following the testing session, Susan expressed her preference to refrain from filling out the feedback questionnaire provided in the test package.  She explained that the brief exercise conducted was not adequate for her to become as familiar with our proposed gateway as she felt she would need to be in order to answer the questions.  She did, however, provide us with excellent feedback on both the positive and negative aspects of our design, which we took into consideration in our final iteration.

 

Appendix F – ORE Form [link no longer exists]
