Usability Evaluation of a Web Design Interface
by Karen D. King, RDH, MHeD, and Dr. Rosalee Seymour, Associate Professor, EdD, RN

Abstract

This report presents the results of a usability evaluation of the Web design interface for an instructional unit prototype on Herpes Simplex and Apthous Ulcers. Usability is defined as the measure of a product's potential to accomplish the goals of its users (Dumas, 1999). The unit and the Web interface were designed to deliver instruction to undergraduate dental hygiene students. The three randomly selected users/subjects for this evaluation were from an undergraduate class of dental hygiene students. This report describes the usability evaluation planning, implementation, data analysis methods, and results. The results demonstrate that conducting usability evaluations helps to determine the organization and ease of navigation of an interactive, Web-based, instructional unit.

Usability Evaluation of a Web Design Interface

Computers are used to educate, in many instances, with conventional interfaces that include those used to create documents and manipulate data. A Web interface, which was tested in this case, is very different from a conventional one. The Web is a domain that must be instantly usable and support many communication modalities. Web designers must focus on the computer user whose goal is to gather information rather than to create documents or manipulate data (Rajani & Rosenberg, 1999). It is critical that the accomplishment of the users' goals be the primary objective of a usability evaluation (UE) of Web site interface design. Users will not be able to access correct pages unless the constructed site reflects their needs and contains a navigation scheme that allows easy access to the desired information (Nielsen, 2000a). In Web interface designs the properties of color, sound, navigation, and placement must be considered from a different perspective than with conventional interfaces.

Usability evaluation purposes. The faculty of the Department of Dental Hygiene, where this evaluation was conducted, developed an oral pathology course for undergraduate students in dental hygiene and wanted to deliver it via a Web design interface. The instructional unit on Herpes Simplex and Apthous Ulcers is the prototype for nine instructional units to follow. It was anticipated that conducting a UE on the prototype instructional unit Web interface would enable identification of any usability issues or problems relevant to this Web interface before the construction of subsequent instructional units.

In keeping with Rajani and Rosenberg (1999), the primary purposes of this UE were agreed upon as: 1) to determine if the Web-based Herpes Simplex and Apthous Ulcer prototype is easy to navigate and meets the goals of undergraduate dental hygiene students, 2) to use any identified problems to revise this unit, 3) to make recommendations on the construction of additional units based on this prototype, 4) to save faculty time, and 5) to ensure students' goals will be met in the Web interface format.

The Literature

Usability evaluations include a range of methods for identifying how users actually interact with a prototype or completed Web site. Planning of a UE begins with a statement of the overall purpose and objectives for the investigation and a clear identification of the problem (Hom, 1999; Instone, 1999). In a typical approach a UE is conducted while users perform tasks and a moderator watches, listens, and records for later data analysis and reporting of results (Fichter, 2000). The next steps are the identification of the subject/users and the design of the study. Graham (2000) describes many ways to get feedback about the usability of a Web site. Graham (2000) recommends that a moderator observe a user representing the site's target audience as they navigate the site. Graham (2000) also cautions moderators against interrupting the subject/user while conducting observations. Nielsen (2000a) likewise recommends that the user/subjects be representative of the target audience and not colleagues or others who may know too much about the site. Nielsen (2000a) recommends that user/subjects perform specific tasks during a UE as opposed to asking them to just play on the test site. These test tasks need to be representative of the types of tasks that users will actually perform on the Web site within the Web interface being tested.

Nielsen (2000a) suggests that the moderator solicit comments from users as they progress through to task completion to help determine their thought process. Hom (2000) refers to this encouragement of user comments during the evaluation as the 'think aloud protocol'. Hom (2000) describes this technique as one in which the user verbalizes any thoughts, feelings, and/or opinions while interacting with the test site. The inclusion of the 'think aloud protocol' allows the moderator to qualitatively measure how the user approaches the Web interface and what considerations they keep in mind when using it. For example, a user verbalizing that the sequence of steps dictated by a task is different from what was expected could demonstrate an interface problem (Hom, 2000). Hom (2000) recommends using the qualitative 'think aloud method' in conjunction with performance measures. The performance measures add to the data collected, noting such things as: 1) the time it takes for a user to complete a task, 2) the number and type of errors per task, 3) the number of users completing a task successfully, and 4) the satisfaction of the user with the site (Nielsen, 2000a).
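As an illustration (not part of the original study), these quantitative measures, together with the moderator's think-aloud notes, could be captured in a simple per-task session log. The sketch below is a minimal example; the structure and field names are assumptions introduced here for clarity.

```typescript
// Minimal sketch of a per-task usability-session log (hypothetical structure).
interface TaskRecord {
  taskId: number;          // e.g., one of the navigational tasks on the task list
  seconds: number;         // time spent on the task
  errors: number;          // navigation errors observed
  completed: boolean;      // finished successfully within the allotted time
  thinkAloudNotes: string; // moderator's record of verbalized thoughts
}

interface SessionSummary {
  totalMinutes: number;
  totalErrors: number;
  tasksCompleted: number;
  completionRate: number;  // fraction of tasks completed successfully
}

function summarize(records: TaskRecord[]): SessionSummary {
  const totalSeconds = records.reduce((sum, r) => sum + r.seconds, 0);
  const totalErrors = records.reduce((sum, r) => sum + r.errors, 0);
  const tasksCompleted = records.filter((r) => r.completed).length;
  return {
    totalMinutes: totalSeconds / 60,
    totalErrors,
    tasksCompleted,
    completionRate: records.length ? tasksCompleted / records.length : 0,
  };
}

// Example: two logged tasks from a hypothetical session.
const session: TaskRecord[] = [
  { taskId: 1, seconds: 45, errors: 1, completed: true, thinkAloudNotes: "Typed URL into the search box first." },
  { taskId: 2, seconds: 30, errors: 0, completed: true, thinkAloudNotes: "Unsure which link to click first." },
];
console.log(summarize(session));
```

A log of this kind keeps the quantitative measures and the qualitative notes side by side, which simplifies the later search for patterns across users.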
You read "Usability Evaluation of a Web Design Interface" in category "Papers" org/602/usability. htm criteria. Rubin (1994) describes six basic elements of a UE: 1) a clear statement of the problem and/or evaluation objectives, 2) a sample of users, which may/may not be randomly cho sen, 3) a setting representative of the actual work environment, 4) observation of users who either use or review a representation of the product, 5 ) a collection of quantitative performance and qualitative preferences measures, and 6) an analysis leading to recommendation for design of the product evaluated. When analyzing data from having conducted a UE, rather than supporting hypotheses one is looking for patterns to identify common problems, in the remarks or observations, between use rs (Dumas, 1999; Hom, 1999). Performance data is statistically analyzed while qualitative data, collected by observing the userââ¬â¢s actions and opinions, is analyzed for trends. The data analysis results should lead to identification of strengths and recommendations for improving the site or pro duct (Nielsen, 2000a; Spool, et al. 1999; Hom, 1999; Dumas, 1999). Usability Evaluation: The Case This UE was conducted because usability problems, within any prototype, are important to discover prior to the costly, time consuming, construction of a web interface for additional i nstructional units. The specific purpose of this UE was to determine if the Web interface presented the H erpes Simplex and Apthous Ulcer prototype interactive educational unit in a way that allowed underg raduate dental hygiene st udents to successfully achieve unit outcomes. Specific objectives for this usability evaluation were to determine: 1) navigational and/or organizational problems with the Web interface, 2) the presence of any confusing term inology in the site, 3) if the site meets the goals of the user, 4) if the users can complete the as signed tasks, and 5) userââ¬â¢s attitudes toward the Web site. Methods A description of the UE environment, user selection criteria and profiles, usability evaluation process, the task list, and evaluation measures for this study follow. Usability evaluation environment. The UE took place in the moderatorââ¬â¢s private campus office. This is a quiet, well-lit room with a comfortable temperature, equipped with a Dell computer workstation, which was used for the evaluation. A sign reading ââ¬Å"Usability Evaluation in Session. Please Do Not Disturbâ⬠was posted on the closed office door to prevent interruptions and distractio ns. The UEs were conducted on July 2, 2001, at 1:00 p. m. , 2:00 p. m. , and 3:00 p. m. Subject/users interacted with the Herpes Simplex and Apthous Ulcers Web interface one at a time. Each subject /user had 20 minutes to complete the usability evaluation. Subjectuser selection and profile. Three randomly selected undergraduate dental hygi ene students, from a target population of 24 (class of 2002), became subjectusers. All 24 students will be required to take the oral pathology courses including the instructional units reflect ing the results of this UE. Alphabetical order by userââ¬â¢s last name determined the order of subjectuser participation. In order to be selected the subject/users must have met the following criteria: 1) be an undergraduate 13/3/2013 0:50 Usability Evaluation of a Web Design Interface 4/12 http://ojni. org/602/usability. 
htm ental hygiene student, 2) have successfully completed one academic year of the Denta l Hygiene Program, 3) have previous experience with the Internet, and 4) have previous experien ce with Web browsers The demographic characteristics of the users for this evaluation were that: 1) they all were female, 2) ages 25, 22, and 43, 3) all had successfully completed on e academic year in the Dental Hygiene Program, 4) all had pr evious experience with the Internet, and 5) all had between 1 and 3 years experience with Web browsers. Administration protocol. Prior to the UE a training packet and session of 30 minutes were provided to each subjectuser. The training session included a brief description of the UE proce ss, purpose and objectives, and the UE protocol instructions. Each subject/user was given an opportun ity to review the packet and ask any questions before agreeing, by signing a consent form, to be a voluntary participant. The UE packet included: 1) a user profile questionnaire, 2) a task list , 3) a statement of the purposes of the evaluation, 4) evaluation instructions, and 5) a consent form. Prior to each actual UE every subjectuser was again given a 10-minute review of the UE instructions and opportunity to ask questions. Subjects/users were told it would take one hour to complete the entire UE process; 20 minutes to complete the task list. According to Nielsen (2000a) , a UE time of 30 minutes or less is adequate to conduct a UE. An additional 15 minutes allowed time for the user to verbalize about the Web interface and to complete a follow up questionnaire to de termine their attitude towards the Web interface. The remaining 15 minutes of the hour the moderat or used to review notes of comments and observations and to make corrections so that no misunder standing would occur later in interpreting results. Shneiderman (1998) suggests the moderator rewrite UE notes as soon as possible, reducing moderator errors in note interpretation later. The subjectusers were required to use theââ¬â¢ think aloud methodââ¬â¢ (Hom, 2000) to provid e subjective data in conjunction with the collection of various performance measures. The performa nce measures included: 1) the time it took the user to complete the task list; 2) the number of er rors per task, 3) the number of users completing the task list successfully in the allotted time, and 4 ) the attitude of the user toward the Web interface. In addition, the moderator collected qualitative data by observing each user during completion of each task and taking notes regarding their f acial expressions, opinions expressed, and verbalized thoughts while completing UE. The mo derator made notes on the opinions and thoughts of the user following UE. Finally, the subje ctusers completed a questionnaire to describe their attitudes about the Web interface. Implementation Piloting the UE administration protocol. A Department of Dental Hygiene professor, fa miliar with the Internet, Web browsers, and oral pathology pilot tested the UE administration protoco l one week prior to testing subjectusers. The moderator provided the pilot test user with the s ame pre UE instructions and task list that would be given to subject/users. The pilot test resul ted in no problems with the UE administration protocol. The moderator observed the pilot test subject/us er and collected the same quantitative and qualitative data that was to be collected from th e research 13/3/2013 0:50 Usability Evaluation of a Web Design Interface 5/12 http://ojni. rg/602/usability. 
The results of the pilot test showed that the UE protocol could be used with subject/users without revision.

Pre-training for UE. At 12:30 p.m. on July 2, 2001, the subject/users arrived for the pre-UE training session. The moderator distributed the UE packet and described the purpose and procedures of the UE. The users were given an opportunity to review the UE packet and to ask questions. Each of the three subject/users signed consent forms before leaving the pre-UE training.

Administration of UE. Each of the three subject/users arrived at the moderator's office for the UE. The moderator reviewed the evaluation instructions and gave time for any additional questions to be answered. The following sequence of events occurred for each of the three users: 1) began the UE, 2) completed the task list, 3) responded to questions about the evaluation experience, 4) added thoughts or opinions regarding interaction with the Web interface, and 5) left the moderator's office within 45 minutes. The administrator used the remaining 15 minutes of each of the three hours to rewrite portions of notes taken during observation in preparation for the UE report of results.

Task list and description. The tasks were identified using the purposes and objectives of the UE. The task list includes 10 primary tasks for subject/users to perform in navigation of the Web site interface for the Oral Herpes Simplex and Apthous Ulcers prototype. The task list, beginning with accessing the Web site via the interface and progressing through the instructional unit, follows. Because many of the 10 primary tasks were repeated, the actual count of performed tasks is 31.

Task 1 – With the browser open, go to www.etsu.edu/cpah/dental/dcte760. This task was chosen to determine if users, indicating they had between 1 and 3 years of experience with a Web browser, would have a problem accessing a Web site when given only a Web address without a direct link.

Task 2 – Read the instructions on the first page of the Web site and click on the link that it directs you to go to first. This task was to determine the clarity of the Web interface in providing instructions for beginning the instructional unit.

Task 3 – Click on Assignment 1.

Task 4 – Access the discussion forum and enter your name and email address. This task helped evaluate the Web interface design by allowing observation of whether users had difficulty locating the discussion forum area and/or entering information into it.

Task 5 – When done in the discussion forum, return to Assignment 1. This task will identify if users have difficulty returning to the designated page using the Web interface.

Task 6 – Click on Assignment 2.

Task 7 – Read the content on Apthous Ulcers. This task requires users to read content on a Web page on the site.

Task 8 – Click on the images on this page to enlarge them. This task determines the ease of click navigation to enlarge thumbnail images.

Task 9 – Return to Assignment 2. This task determined if users could navigate the Web interface via a link taking them back to a designated page in the Web site.

Task 10 – Answer the study questions in Assignment 2. The study questions direct the user through a series of multiple-choice items in a linear fashion. Correct responses allow the user to continue to the next question, while incorrect responses require the user to go back to the question and make another attempt to answer. Users cannot go to the next question until the previous question is answered correctly. This task requires navigating through a series of questions with the potential for going back and forth if an answer is wrong. This task determined if users could successfully navigate the Web interface to the study questions. (A sketch of this question-gating behavior follows the task list.)

Task 11 – When the study questions are all answered, return to Assignment 2. This task again measures their ability to use the Web interface to return to a designated page in the Web site.

Task 12 – Click on Assignment 3.

Task 13 – Read the entire Case 2, Herpes Simplex. Again, users are required to read content on the Web site, but they must use the Web interface design to do it successfully.

Task 14 – When you have finished reading Case 2, return to Assignment 3. This task determined if users could navigate the Web interface to a case study contained within the instructional unit and return to a designated page in the Web site.

Task 15 – Click on Assignment 4.

Task 16 – Go to Case 1.

Task 17 – Fill in the diagnosis form. This task required students to locate a case and fill in case study information obtained from previous exercises. This task measures the Web interface's ease of navigation when using forms to complete information.

Task 18 – Submit the form. This task demonstrates if the Web interface allows for easy form submission upon completion.

Task 19 – Return to Assignment 4. The user must complete a form by diagnosing the case study patient in this assignment. This task determined if users could easily navigate the case study, fill in the appropriate form fields, submit the form, and return to the designated page in the Web site.

Task 20 – Click on Assignment 5.

Task 21 – Go to the reflection form. This task demonstrates if the Web interface allows users to navigate to the reflection form.

Task 22 – Write your reflections on the unit on the form. A form to reflect on the instructional unit is required for Assignment 5. This task demonstrates if users will be able, through this Web interface, to make text entries in the appropriate form fields in the reflection form.

Task 23 – Submit the form. The task determined if users could navigate the Web interface to send the completed reflection form electronically.

Task 24 – Return to Assignment 5. Determines if users, via the Web interface, can easily return to a designated page in the Web site.

Task 25 – Go to the course evaluation survey. An evaluation form is included in this instructional unit to determine student attitudes and satisfaction levels with the instructional unit. This task measures if the Web interface allows the user to easily locate a survey on the site.

Task 26 – Complete the course evaluation survey. This task determined if users, using the Web interface, could easily navigate a form to reply to the questions.

Task 27 – Submit the survey. This task measures whether the Web interface allows users to easily submit form information electronically.

Task 28 – Return to Assignment 5. This task measures the Web interface as it allows users to return to designated pages in the Web site with ease.

Task 29 – Go to the discussion forum. This task determined if the users could open the forum and is a test of the Web interface design and its ease of promoting discussion.

Task 30 – Make a forum entry indicating that you have finished the usability evaluation. This task measures the Web interface design's success with entering comments into a discussion forum.

Task 31 – Return to Assignment 5. This task measures the Web interface design's success with returning users to designated pages in the Web site. (N = 31 navigational tasks)
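To make the gating behavior described in Task 10 concrete, the following is a minimal sketch of linear question gating in which a learner cannot advance until the current question is answered correctly. It is an illustration only: the original unit implemented this behavior through hyperlinked Web pages, and the function and data below are hypothetical.

```typescript
// Minimal sketch of linear study-question gating (hypothetical; the actual
// unit used hyperlinked pages rather than script).
interface StudyQuestion {
  prompt: string;
  choices: string[];
  correctIndex: number;
}

// Returns the index of the next question to show: stay on the current
// question after a wrong answer, advance only after a correct one.
// An index equal to questions.length signals that all questions are done.
function nextQuestionIndex(
  questions: StudyQuestion[],
  current: number,
  chosenIndex: number
): number {
  const isCorrect = questions[current].correctIndex === chosenIndex;
  return isCorrect ? Math.min(current + 1, questions.length) : current;
}

// Example with two placeholder questions.
const questions: StudyQuestion[] = [
  { prompt: "Question 1", choices: ["a", "b", "c"], correctIndex: 1 },
  { prompt: "Question 2", choices: ["a", "b", "c"], correctIndex: 0 },
];
console.log(nextQuestionIndex(questions, 0, 2)); // wrong answer: stays at question 0
console.log(nextQuestionIndex(questions, 0, 1)); // correct answer: advances to question 1
```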
Non-task performance measures. Following Nielsen (2000a), subject/users were asked to use the 'think aloud method' in conjunction with performance measures. The quantitative measures to be evaluated included: 1) the amount of time to complete the task list, 2) the number of errors per task, 3) the number of users completing the task list successfully in the allotted time, and 4) the attitude of users toward the Web interface. In addition to the quantitative measures, the administrator collected qualitative data during and after the usability evaluation for each user.

This UE was designed to measure the ease of undergraduate dental hygiene student users' navigation through the Oral Herpes Simplex and Apthous Ulcers instructional unit prototype Web interface. Although all task completion or non-completion allowed for tests of the interface, the following three questions focus more directly on navigation of the prototype Web interface: Do all the navigational links in this Web site work correctly? Is the organization of this Web site consistent? Is there any confusing terminology regarding navigation and organization on this Web site?

Results

The success or failure on each task performed, as well as the qualitative data collected from the post-test interview and the post-test questionnaire, are reported. Because the tasks in the UE were short, the quantitative data collected was based on the entire task list and not on each task independently. Users had adequate time to complete the entire task list. There were 31 Web site interface navigation tasks completed by three subject/users, with a total of seven navigation errors.

1. User #1 took 20 minutes to successfully complete the task list with one Web interface navigation error.
2. User #2 took 18 minutes to successfully complete the task list with three Web interface navigation errors.
3. User #3 took 19 minutes to successfully complete the task list with three Web interface navigation errors.

Task 1 – Users #1 and #2 completed Task 1 easily and were able to successfully open the designated Web site without a Web interface navigation error. User #3 entered the Web site address in the search line of the Web browser, an error message was returned by the browser, and then the user entered the Web site address in the address line of the browser and was able to successfully access the home page of the instructional unit via the Web site interface. In this case the navigational error relates to a lack of knowledge about where to type a Web address in a Web browser.

Task 2 – Read the instructions on the first page of the Web site and click on the link that you are directed to go to first. User #1 asked, "Do I make the decision myself to go to Assignment 1 or to the course syllabus?" The administrator did not answer this question, as the instructions on the Web page indicated the first link. This error, while not significant since both links take the student to the appropriate Web page to begin the instructional unit as well as the usability evaluation, could add user frustration to the mix. Users #2 and #3 used the Web site interface on the first page of the Web site to readily access the needed location.

Tasks 3, 4, and 5 – Click on Assignment 1 and enter your name and email address in the discussion forum. When this task is complete, return to Assignment 1. User #1 was unable to readily use the Web interface to access the discussion forum. This user consistently scrolled to the bottom of any page before making any choices about where to go next. This scrolling is not considered an error in the prototype but could indicate that the Web interface design needs revision to stop this behavior. Once the discussion forum was accessed, this user asked, "Is this where I go to post my name?" The administrator did answer in the affirmative and the user continued with the task. Upon completion of the discussion forum entry, user #1 could not navigate back to the designated page. The administrator finally intervened and instructed the user to use the "back" button on the browser. The user then looked for the "back key" on the keyboard. Further instruction from the administrator got the user back on task.

When user #2 realized that the task involved a discussion forum, the user indicated no previous experience with discussion forums of any type. Her response was, "Am I being timed, because here is the first problem?" The administrator reassured the user that there was as much time as needed to perform the task. Upon submission of the discussion forum entry, user #2 chose the "back" button on the browser quickly. User #3 got to the discussion forum easily, but then asked, "Am I the subject?" The administrator informed the user that the responses in the form fields did not matter and that any information could be entered in any field. Upon submission of the form entries, user #3 used the "back" button on the browser but indicated that she thought only one click of the "back" button was sufficient. All users successfully completed the task. The Web interface design was not the culprit in these task struggles.

Tasks 6, 7, 8, and 9 – Click on Assignment 2. Read the content on Apthous Ulcers. Click on the images to enlarge the view. Return to Assignment 2. Users #1 and #3 did not click on the images to view a larger version of the image. Both disregarded this portion of the task completely. Perhaps the images were large enough for them. User #2 opened the larger view of the images and returned to the designated page in the Web site, indicating no problem with the Web interface design in the area of enlarging images. All users returned to the designated page in the Web site, but only one user completed the entire task successfully.

Tasks 10 and 11 – Answer the study questions in Assignment 2. When the study questions are all answered, return to Assignment 2. All users navigated through the study questions easily. User #1 expressed embarrassment, because the administrator of the UE is also a faculty member in the Department of Dental Hygiene, and the user did not want the administrator to know if the answers to the study questions were incorrect. The administrator reminded user #1 that the answers to the questions were not the purpose of this evaluation; the Web site was being evaluated, not the knowledge of the user. User #1 continued to navigate through the study questions, but indicated distress any time she chose an incorrect response to a study question. It is assumed this frustration related to having to go back and continue answering until the answer was correct before going on. User #2 quickly realized that the links chosen by user #1 were a different color. Since all users participated in the UE on the same computer, the visited hyperlinks were apparent. User #2 easily navigated the questions with much less distress about incorrect responses, because she realized that her peers had chosen incorrectly as well. User #3 also noticed the visited hyperlinks and navigated the questions without incident. However, user #3 had a problem choosing answers: because the hyperlink was on only one letter, the user had trouble positioning the mouse pointer exactly over the single-letter link. The user clicked several times before realizing that the link area was very small. This indicates an area of the Web interface design that needs improvement. All users successfully completed these tasks.

Tasks 12, 13, and 14 – Click on Assignment 3. Read Case 2. When you have finished, return to Assignment 3. Users #1 and #3 quickly read the case and returned to the designated Web page. User #2 appeared to have accidentally clicked the wrong link and could not locate Case 2. The administrator provided instruction because the user seemed frustrated. After the user located the correct page, there was no problem completing the task. Here it is hard to distinguish whether this is a Web interface design error or not.

Tasks 15, 16, 17, 18, and 19 – Click on Assignment 4. Fill in the form. Submit the form. Return to Assignment 4. This was the first form in the Web site. User #1 began with, "OK, what is this?" The user had never filled in a form and submitted it through a Web site. Users #2 and #3 both accessed and filled in the required information in the form fields and returned to the designated Web page easily. User #1 took more time, but successfully completed the task.

Tasks 20, 21, 22, 23, and 24 – Click on Assignment 5. Go to the reflection form. Fill in the form. Submit the form. Return to Assignment 5. This was the second experience with the Web interface using a form. All three users accessed, filled in, and submitted the form without a problem.

Tasks 25, 26, 27, and 28 – Go to the course evaluation survey. Complete the course evaluation survey. Submit the survey. Return to Assignment 5. Users #1 and #2 had difficulty locating the survey link on the page. Once the survey evaluation link was located, no user had any difficulty completing the task. User #3 completed the task easily, but after submission of the form the user clicked on the "back" button to return to the designated Web page in the site. As user #3 clicked on the "back" button she said, "Is it erasing the form information if I am going back with the back button?" The moderator assured her the action of the "back" button would not erase form input after submission.

Tasks 29, 30, and 31 – Go to the discussion forum. Make a forum entry indicating that you have finished the UE. Return to Assignment 5.
By Task 31, all users were familiar with the site and had no trouble navigating the discussion forum and returning to the designated page in the Web site.

Upon completion of the task list, each user had the opportunity to comment on the Web site and offer suggestions and opinions. The following were offered: User #1 indicated that she would be more comfortable if the administrator had not been watching her progress. She indicated being watched so closely made her very nervous, and she thought the site would have been much easier to navigate on her own. She indicated that she liked the set-up of the Web interface and asked if there were going to be other sites like this for her use in the dental hygiene curriculum. User #2 indicated that she liked the site and thought it was easy to use. User #3 liked the site and would like similar sites for other topics in the dental hygiene curriculum. She indicated that she did not like using the 'back button' after all the forms. All three users expressed nervousness about being watched by the administrator.

Discussion

A sample of three users completed this UE. Nielsen (2000b) indicates that three to five participants in a UE are adequate. Usability problems were identified in some part of nine of the ten primary tasks on the task list. In addition, some of the problems, as told by the users, related to: 1) the administrator present during the UE was also a professor in the Department of Dental Hygiene in which the user is a student, 2) the evaluation was conducted during the summer school session, and 3) all users were also students in the administrator's class. Users reported being more nervous about the site content in the presence of this administrator. In future UE studies the usability administrator should be a neutral observer.

The questionnaire completed by the users following the usability evaluation demonstrated user satisfaction with the site. Shneiderman (1998) suggests users should give their subjective impressions of the Web interface. All but one of the responses indicated that the users were satisfied with the site's navigation and organization. The users indicated that the terminology used in the site was clear, they were able to complete the assigned tasks easily, the site met their needs, and they liked the appearance of the site. The only responses not scored as satisfactory were related to using the "back" button. Overall, all three users indicated the ease of navigating the Web site interface was satisfactory.

Recommendations

It is evident from the results of this UE that Web-based interfaces for instructional delivery should be evaluated for usability problems. Corrections, suggested by the results, to the Oral Herpes Simplex and Apthous Ulcers instructional unit prototype and Web interface should be made and the site re-tested before continuing development of the remaining nine courses in the oral pathology Web-based instructional unit series.

The usability evaluation of the Oral Herpes Simplex and Apthous Ulcers Web-based instructional unit prototype resulted in the following recommendations for improvement to the Web site navigation and organization (a short illustrative sketch follows the list).

1. This may be one time when the use of standard link colors should be violated. Students using the same computer to complete an instructional unit would be able to discern the answers chosen by the student previously using the computer. Changes to the Web interface design for tests, so that the link color does not change when a user chooses a particular response, are recommended.

2. When assigning form submissions, provide a link to take the user back to the designated page in the Web site. The users in this UE did not like using the browser's "back" button after completing the forms or following entries to the discussion forum. The Web site interface design will be changed so that the confirmation pages following discussion forum postings and submission of forms will take the user back to the page accessed immediately prior to the form or discussion forum.

3. The hyperlinks for the answers to study questions were not large enough. Clicking on a one-letter link made it hard for users to identify the link. This Web site interface design will be corrected so that the entire cell in which the letter choices are located will be the hyperlink.
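The following browser-side sketch illustrates one way the three recommended changes could be expressed. It is an assumption-laden illustration rather than the site's actual implementation: the selectors, class names, color value, and page name are hypothetical.

```typescript
// Minimal browser-side sketch of the three recommended changes (illustrative
// only; selectors, class names, and URLs are hypothetical).

// 1. Keep visited study-question links the same color as unvisited ones, so a
//    later student on the same computer cannot see which answers were chosen.
const style = document.createElement("style");
style.textContent = `
  .study-question a:link,
  .study-question a:visited { color: #0000ee; }

  /* 3. Make the whole answer cell clickable, not just the single letter. */
  .answer-cell a { display: block; width: 100%; height: 100%; }
`;
document.head.appendChild(style);

// 2. On a form or forum confirmation page, offer an explicit link back to the
//    assignment page instead of relying on the browser's "back" button.
function addReturnLink(assignmentUrl: string): void {
  const link = document.createElement("a");
  link.href = assignmentUrl; // e.g., the page accessed immediately before the form
  link.textContent = "Return to the assignment";
  document.body.appendChild(link);
}

addReturnLink("assignment4.htm"); // hypothetical page name
```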
Conclusions

The UE conducted on the Oral Herpes Simplex and Apthous Ulcers instructional unit prototype Web design interface proved to be a successful method for the determination of usability problems in a Web-based instructional delivery method. The users identified usability problems with the Web interface as well as with their own skill, or lack of skill, in using any browser. Recommendations for revision have been identified by the researcher and will be implemented.

Author's Note

Should anyone wish to examine the Web site and review the Herpes Simplex and Apthous Ulcer instructional unit prototype, it can be accessed at http://www.etsu.edu/cpah/dental/dcte760/.

References

Dumas, J., & Redish, J. (1999). A Practical Guide to Usability Testing. Portland: Intellect Books.

Fichter, D. (2000). Usability Testing Up Front. Online, 24(1), 79-84.

Graham, J. (2000). Usability Testing Basics. INT Media Group. Retrieved June 30, 2001, from the World Wide Web: http://clickz.com/print.jsp?article=2053.

Hom, J. (1999). The Usability Testing Toolbox. Retrieved June 10, 2001, from the World Wide Web: http://www.best.com/~jthom/usability.

Instone, I. (1999). User Test Your Web Site: An Introduction to Usability Testing. Retrieved July 1, 2001, from the World Wide Web: http://instone.org/keith/howtotest/introduction.html.

Nielsen, J. (2000a). Designing Web Usability. Indianapolis: New Riders Publishing.

Nielsen, J. (2000b). Why You Only Need to Test with 5 Users. Jakob Nielsen's Alertbox. Retrieved June 12, 2001, from the World Wide Web: http://www.useit.com/alertbox/20000319.html.

Rajani, R., & Rosenberg, D. (1999). Usable? Or Not? Factors Affecting the Usability of Web Sites. CMC Magazine. Retrieved June 23, 2001, from the World Wide Web: http://www.december.com/cmc/mag/1999/jan/rakros.html.

Rubin, J. (1994). Handbook of Usability Testing. New York: Wiley.

Shneiderman, B. (1998). Designing the User Interface: Strategies for Effective Human-Computer Interaction (3rd ed.). Reading: Addison-Wesley.

Spool, J., Scanlon, T., Schroeder, W., Snyder, C., & DeAngelo, T. (1999). Web Site Usability: A Designer's Guide. San Francisco: Morgan Kaufmann Publishers, Inc.