Pauline Mosley Research Seminar
DCS891C Dr. Grossman
The article “User Interface Directions for the Web” by Dr. Nielsen discusses the inefficiencies and problems of Web user interfaces (UIs). Dr. Nielsen states that unless the vast majority of Web sites improve considerably, we will suffer a usability meltdown of the Web no later than the year 2000. He offers a number of solutions and argues the need for further advancement in browsers, navigation, and information management, as well as in content authoring.
In 1994, Dr. Nielsen conducted a Web usability study, and several problems were noted in the findings. The study is of some historical interest because it includes screen captures of several famous early Web sites and is one of the first formal usability studies of the Web. It is remarkable how well the findings and conclusions of this study hold up in light of the current state of the Web; usability issues change much more slowly than technology because they stem from human capabilities and interests.
The inefficiencies and problems identified were:
· getting Web sites to actually obey any usability rules
· non-existent standards for design conventions and user interaction
· poor optimization for users accessing online information
· inability to project a single design onto a wide variety of platforms
· poor Web navigation
· poor handling of huge amounts of information
· inadequate support for applications over the Web
Enforcing usability rules is definitely a challenge, but it could be done if the Web embraced the notion of quality as a pervasive attribute of objects. The author suggests implementing a reputation manager for the Internet. The reputation manager would judge the quality of pages (a page’s adherence to usability rules), and the value of one’s Web site would then depend on one’s own standing in the reputation manager. He mentions the PHOAKS project at AT&T, which provides users with the Web documents and Web sites that have been deemed most valuable for the various topics discussed on Usenet. Note, however, that quality here is determined by human judgment.
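The article does not specify how such a reputation manager would compute its scores. As a purely hypothetical sketch (the function, the endorsement data, and the reputation weights are all invented for illustration, not taken from PHOAKS or any real system), one simple scheme would weight each user's endorsement of a page by that user's own reputation:

```python
# Hypothetical sketch of a reputation manager: a page's quality score is
# the sum of endorsements it receives, each weighted by the endorsing
# user's own reputation. All names and weights below are illustrative.

def page_scores(endorsements, user_reputation):
    """endorsements: list of (user, page) pairs; user_reputation: dict of user -> weight."""
    scores = {}
    for user, page in endorsements:
        scores[page] = scores.get(page, 0.0) + user_reputation.get(user, 0.0)
    return scores

endorsements = [("alice", "site-a"), ("bob", "site-a"), ("bob", "site-b")]
reputation = {"alice": 0.9, "bob": 0.5}

print(page_scores(endorsements, reputation))  # site-a outranks site-b
```

A site endorsed by high-reputation users rises in the ranking, which captures the article's point that a site's value would depend on the status of those who vouch for it.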
One approach Dr. Nielsen mentions regarding poor design is the availability of Web authoring tools that provide templates for the most common types of pages. However, templates can never cover all design needs, and there is a risk that templates from different vendors will differ and thus result in proprietary UI standards. Therefore, the Web community at large needs to make further efforts to establish design conventions.
Optimization can be improved if content is rewritten. Studies have revealed that most users do not read online but rather scan the text. Thus a new writing style is needed: multiple short segments interlinked with hypertext, designed for skimming and structured according to the inverted-pyramid style.
One way of dealing with the cross-platform design environment would be to separate presentation and content and encode the presentation-specific instructions in style sheets that can be optimized for each platform. This concept may be difficult to put into practice, since current authoring tools are weak at structured editing. These tools need to be flexible enough to design for a multiplicity of display devices and bandwidths.
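The separation principle can be illustrated with a minimal sketch (the content fields, the two "style sheets", and the `render` function are all invented for this example): the same platform-neutral content is rendered through different presentation rules for different devices.

```python
# Illustrative sketch: content carries no presentation markup of its own;
# each platform supplies its own "style sheet" of formatting rules.

content = {"title": "Web Usability", "body": "Users scan rather than read."}

# Two hypothetical style sheets, one per target platform.
desktop_style = {"title": "<h1>{}</h1>", "body": "<p>{}</p>"}
text_style = {"title": "{}\n===", "body": "{}"}

def render(content, style):
    """Apply a platform-specific style sheet to platform-neutral content."""
    return "\n".join(style[key].format(value) for key, value in content.items())

print(render(content, desktop_style))  # rich markup for a desktop browser
print(render(content, text_style))     # plain text for a low-bandwidth device
```

Because the content never hard-codes its appearance, adding a new display device means writing one new style sheet rather than rewriting every page, which is exactly the optimization-per-platform benefit the article describes.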
Poor Web navigation is an ongoing challenge, especially as the number of pages grows exponentially. It is estimated that before the end of the decade there will probably be 10 billion pages online, reachable from any Internet-connected device. The pull-down menu is an exceptionally weak way of organizing a user’s bookmarks. How to design better bookmark support is an open research problem, one whose solution could address the poor-navigation problem.
The technical background of these problems is based upon the 1994 Web Usability Study conducted by the author, a 1992 study (“Characteristics of Usability Problems Found by Heuristic Evaluation”), and other tests, interviews, and questionnaires the author has conducted since then.
One significant aspect of his work is its impact on e-commerce. If customers can’t find a product, they can’t buy it. It is cheaper to increase the design budget than the advertising budget, and attention to usability can increase the percentage of Web-site visitors who complete a purchase. Hence, usability is becoming a means of survival on the Web rather than a luxury.
The research methodology employed in the 1994 Web Usability Study was the “discount usability engineering” approach. Statistical data analysis is not appropriate when this empirical methodology is used; hence the findings reported in the study are qualitative in nature.
This method is based on three techniques: scenarios, simplified thinking aloud, and heuristic evaluation. Scenarios are a way of obtaining quick and frequent feedback from users; they are a special type of prototype that reduces the complexity of a system by eliminating parts of the full system. Simplified thinking aloud involves real users thinking out loud as they perform tasks; Dr. Nielsen recommends using 3-5 test users per test. Heuristic evaluation is a method for finding both major and minor problems in a user interface; here it relied on the 10 basic usability heuristics.
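The 3-5 user recommendation has a quantitative basis in Nielsen and Landauer's published model (from their separate work, not this article): if a single user exposes a given problem with probability λ (about 0.31 in their data), then n users are expected to find roughly 1 − (1 − λ)^n of the problems. A short sketch of that arithmetic:

```python
# Nielsen and Landauer's model of usability problems found versus number
# of test users. lambda_ = 0.31 is the average value they reported; the
# function itself is just the published formula 1 - (1 - lambda)^n.

def problems_found(n, lambda_=0.31):
    """Expected fraction of usability problems found by n test users."""
    return 1 - (1 - lambda_) ** n

for n in range(1, 6):
    print(f"{n} users: {problems_found(n):.0%} of problems found")
```

With λ = 0.31, five users already uncover roughly 84% of the problems, while additional users add progressively less, which is why discount usability engineering stops at a handful of participants.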
Three external participants were tested: an MIS director, a programmer, and a systems administrator. All had extensive Unix experience, were employed by technology companies, and were highly technically competent, so the test investigated a best-case situation. Each participant was tested for 60-90 minutes. The test consisted of visiting 2-4 WWW sites; participants gave their initial impression of each home page and were then told to explore the site. After the exploratory browsing, the participants were given a directed task that asked them to find specific information on that site.
The three articles I chose were: “New Browser From Microsoft Gets Criticism”, “GVU’s 9th WWW User Survey”, and “Stuck With Old Browsers Until 2003”.
In the article entitled “New Browser From Microsoft Gets Criticism”, the Web Standards Project leader, George Olsen, blasts Microsoft’s new version of its Internet Explorer browser for failing to support Web standards previously agreed upon by Microsoft and others in the World Wide Web Consortium (W3C). Microsoft’s failure to support Web standards means that Web developers will be forced to continue extensive and expensive workarounds and debugging. The standards-related problem areas are: cascading style sheets, DOM 1.0 (Document Object Model), XML 1.0 (Extensible Markup Language), XSL (Extensible Stylesheet Language), and HTML 4.0 (Hypertext Markup Language). Personally, I believe that IE5 is the best browser yet, and it does support almost everything in the basic Web standards. As for the weaknesses in IE5, they may not matter much, since Web sites will not be able to rely on advanced features for several years to come. This article confirms the findings of Dr. Nielsen’s 1994 study, which clearly states the need for further research on browsers and the problem of getting software developers to adhere to standards.
In the article entitled “GVU’s 9th WWW User Survey”, the findings confirm those of Dr. Nielsen’s 1994 study in the areas of browsers, poor optimization, and Web navigation. This user survey was conducted from April 10, 1998 to May 15, 1998, and over 10,000 Web users participated. The study revealed:
· 31% of the respondents reported having problems with their browsers
· 53% of the respondents left a Web site while searching for a product because the site was too slow
· 64.8% stated that the speed at which items downloaded was too slow
· 60% of the respondents complained about broken links
The methodology used in this study was non-probabilistic sampling. Note that non-probabilistic sampling does not ensure that elements are selected in a random manner. Hence, it is difficult to guarantee that certain portions of the population were not excluded from the sample, since elements do not have an equal chance of being selected.
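The exclusion risk can be made concrete with a small simulation (the population proportions and group names below are invented purely for this sketch, not taken from the GVU survey): a probability sample gives every element an equal chance of selection, while a convenience sample drawn from only one subgroup silently excludes the rest.

```python
import random

# Invented illustration of sampling bias: 70% "office" users, 30% "home"
# users in the population. Numbers are made up for this sketch only.
random.seed(0)
population = ["office"] * 700 + ["home"] * 300

# Probability sample: every element has an equal chance of selection.
prob_sample = random.sample(population, 100)

# Convenience (non-probabilistic) sample: only users who happened to see
# the survey link at work respond, so home users are excluded entirely.
conv_sample = [p for p in population if p == "office"][:100]

print(prob_sample.count("home"))  # typically near 30, matching the population
print(conv_sample.count("home"))  # 0 -- home users never had a chance
```

Any statistic computed from the convenience sample describes only office users, which is precisely why the survey's findings cannot be generalized to the whole Web population.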
The last article, “Stuck With Old Browsers Until 2003”, is another piece by Dr. Nielsen, and it reveals his accurate assessment of Netscape. He predicted that Netscape 5 would be a “lost generation” and would never gain more users than Netscape 4. Recognizing that we are stuck with old technology for some time frees sites from being consumed by technology considerations and lets them focus on content, customer service, and usability.
Three possible research problems arising from this study are:
· The implementation of standards among browsers
· The implementation of Web standards on new devices (Web, WebTV, mobile phones, etc.)
· Bookmark Support
(1) “User Interface Directions for the Web”, Jakob Nielsen, Communications of the ACM, January 1999
(2) “Report From a 1994 Web Usability Study”, Jakob Nielsen
(3) “New Browser From Microsoft Gets Criticism”, Wall Street Journal, April 11, 2000
(4) “Stuck With Old Browsers Until 2003”, Jakob Nielsen, April 18, 1999 http://www.useit.com/alertbox/990418.html
(5) Kehoe, C., and Pitkow, J.E. GVU’s WWW user surveys, (1998); www.cc.gatech.edu/gvu/user_surveys/papers/
(6) Terveen, L., Hill, W., Amento, B., McDonald, D., and Creter, J. PHOAKS: A system for sharing recommendations. Communications of the ACM, March 1997