Very few web-based applications are designed to match the
web metaphor.
As a result they are often irritating, counterproductive,
or simply unusable.
During the last two months I've been working on an
IEEE Software theme issue titled "Developing with
Open Source Software".
Most of my work is performed through the
IEEE Computer Society's
Manuscript Central
web application.
The application is an almost perfect example of everything that
can go wrong with such interfaces:
- it requires a specific browser
- it requires the use of a mouse
- many elements are not readable without a bit-mapped graphics display
- it does not allow navigation using the browser's features
- individual screens are not associated with unique URLs
Yesterday I wanted to print out the reviews we had received.
A great advantage of the web is that URLs allow you to navigate
effortlessly to specific locations.
So it should in theory be relatively easy to write a simple script
to navigate to each review page and print the review.
Unfortunately, it wasn't in practice: all pages appear under the same URL
(review pages in fact appear on a pop-up page completely lacking a URL).
As a result I would have to navigate by mouse to
every paper and to every review and separately
issue a print command.
This operation would entail more than 600 mouse clicks.
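Had each review been reachable through its own URL, the whole print run could have been scripted instead. A minimal sketch, assuming a purely hypothetical URL scheme of the form review?paper=N&review=M (Manuscript Central offers nothing of the kind):

```python
# Sketch: batch-fetch review pages, assuming each one had its own URL.
# Both the host and the URL scheme below are hypothetical; they are not
# what Manuscript Central actually exposes.
from urllib.parse import urlencode

BASE = "https://mc.example.org/review"  # imaginary endpoint

def review_url(paper, review):
    """Build the (imagined) URL of one review page."""
    return BASE + "?" + urlencode({"paper": paper, "review": review})

if __name__ == "__main__":
    # Emit one wget command per review; pipe the output to a shell
    # to fetch everything for printing in a single batch.
    for paper in range(1, 21):
        for review in range(1, 4):
            print("wget -O paper%02d-review%d.html '%s'"
                  % (paper, review, review_url(paper, review)))
```

Sixty fetches, zero mouse clicks; the entire difference lies in whether each page has an address.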
It would be unfair to single out this application.
One e-banking site I use suffers from similar problems, as
do many other sites I try to avoid.
So what should the ideal web application look like?
Browser Agnostic
First and foremost the application should be able to work with any
browser.
By any I do not mean Internet Explorer and the latest version of Mozilla;
I mean
any.
This includes
text-based browsers like
lynx, old versions of
Netscape and
Mosaic, and
tools used for non-interactive applications and scripting like
wget.
A browser agnostic application guarantees interoperability not only with
all current browsers, but also with future ones.
A browser agnostic application will also interoperate with search
engine crawl robots, Perl, Python, and Ruby scripts,
C, C++, and Java programs, and any other code that follows the
HTTP protocol.
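Because the whole exchange is plain HTTP, "any other code" really does mean any: the few lines below are all that a crawler, a Perl one-liner, or a C program has to send to fetch a page. A minimal sketch (the host name is illustrative):

```python
# Sketch: the raw HTTP/1.1 request that any HTTP-speaking client,
# from lynx to a search-engine robot, sends to retrieve a page.
def http_get_request(host, path):
    """Compose a minimal HTTP/1.1 GET request as a string."""
    return ("GET %s HTTP/1.1\r\n"
            "Host: %s\r\n"          # required by HTTP/1.1
            "Connection: close\r\n"  # one request, then hang up
            "\r\n") % (path, host)

print(http_get_request("www.example.org", "/index.html"))
```

An application that answers such requests sensibly is, by construction, browser agnostic.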
Navigable
The browser's navigation buttons (back, forward) should work
correctly, tracing the user's browsing path backward and forward.
This is what users have come to expect from their web interactions and
this is what all web applications should deliver.
Some may argue that operations performed through the browser's navigation
buttons may interfere with transactions producing counterintuitive
results.
This is however a moot point.
Users understand that if they visit their grocer twice to buy a
bottle of milk they will end up with two bottles of milk.
Transparent and Bookmarkable URLs
Every screen of the application should be uniquely identified by its
URL.
Thus users will be able to bookmark specific pages
using their browser's bookmarking mechanism,
revisit older pages by browsing their browser's history list,
and even email interesting URLs to friends and colleagues.
In addition, the URLs should be decipherable by humans, so that
more sophisticated users can tailor them to their needs and use
scripting languages to automate their work or perform batch operations.
If the application requires some sort of authentication and this
is delivered over an insecure channel (so-called advisory authentication),
it makes sense to use the HTTP authentication headers for this purpose.
This allows users to include the authentication information in the URL.
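HTTP basic authentication is simple enough to generate from any script: the user name and password are joined with a colon and base64-encoded into an Authorization header, which is exactly what a browser derives from a user:password@host URL. A sketch using the example credentials from RFC 2617:

```python
# Sketch: build the HTTP basic-authentication header that a browser
# derives from a URL of the form http://user:password@host/path.
import base64

def basic_auth_header(user, password):
    """Encode user:password as an Authorization header value."""
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    return "Authorization: Basic " + token

# The example credentials from RFC 2617:
print(basic_auth_header("Aladdin", "open sesame"))
```

Because the header is trivially reproducible, scripts and batch jobs can authenticate exactly as a browser does, with no form-filling involved.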
Some may argue that these moves may render the application insecure and
vulnerable to a number of attacks.
This is untrue; applications with opaque URLs and a form-based
authentication mechanism provide only a false sense of security through obscurity.
Device Independent
Finally, the application should be usable with a simple text-based
display and keyboard.
Many people may prefer the eye-candy of cute graphics and enjoy navigating
with a mouse,
but there are people with physical disabilities that prevent them from
viewing graphics or handling a mouse.
Keeping them out of your application is cruel.
Getting There
In a sense, a well behaved web application should,
to the greatest extent possible,
provide the illusion that the pages it serves come from a permanent
static collection of files.
Note that many of the requirements I outlined above also apply to static
web sites.
The objectives I outlined are not difficult to achieve;
a number of sophisticated, high-traffic web sites like
Google and
Slashdot play by many of the rules
I described above.
If the tools you are using for building web applications fail to support
this paradigm,
consider them broken and look for alternatives.
Your efforts will result in web sites that are easy to use, responsive,
accessible, and good citizens of the web.