one thing that always bugged me when designing web-based products was the (seemingly) irrevocable law of html-based user-interfaces: if you want to change the data in a page, you need to reload it with its entire interface. a clear disadvantage compared to PC-based applications. (or could you imagine MS word reloading entirely every time you open a menu?)
- google suggest: a google interface giving you keyword suggestions in realtime as you type. read “google suggest dissected” for a technical insight.
- the innovative photo-sharing community flickr lets you edit meta-data for a picture without refreshing the page
- weboogle is a little interactive word-guessing-game that looks up words in a server-based dictionary as you type.
- map.search.ch is an incredibly nice address-search that lets you zoom in from a map of switzerland down to street level, using satellite images, without reloading the page (via simon wilson’s blog)
- the objectgraph dictionary works like google suggest, but instead of search keywords it looks up results from an online dictionary.
A9.com, amazon’s search engine, phones home to display search results when you press one of their buttons on the right side. update: simon wilson has just pointed out to me that A9 uses iframes to achieve that effect.
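the examples above (iframe tricks aside) all rely on the same basic pattern: fire an XMLHttpRequest in the background and patch the result into the page. a minimal sketch of that pattern might look like this — the “/suggest” endpoint url and the query parameter name are hypothetical:

```javascript
// create the request object: standard in mozilla/safari/opera,
// activex in internet explorer
function createRequest() {
  if (window.XMLHttpRequest) return new XMLHttpRequest();
  return new ActiveXObject("Microsoft.XMLHTTP");
}

// build the url for a (hypothetical) server-side suggestion script
function buildSuggestUrl(query) {
  return "/suggest?q=" + encodeURIComponent(query);
}

// fetch suggestions asynchronously and hand the raw response
// to a callback -- no page reload involved
function fetchSuggestions(query, callback) {
  var req = createRequest();
  req.open("GET", buildSuggestUrl(query), true); // true = asynchronous
  req.onreadystatechange = function () {
    if (req.readyState == 4 && req.status == 200) {
      callback(req.responseText); // update just part of the page
    }
  };
  req.send(null);
}
```

wired to a text field’s onkeyup handler, this is essentially what google suggest does on every keystroke.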
the downside of web pages phoning home is that dynamically requested content normally won’t be indexed by search engines and cannot be linked to directly (permalinks), unless the developers take specific care of that (as the makers of map.search.ch do). macromedia flash suffers from these disadvantages too, as most designers abuse flash rather than using it in a sensible way. let’s hope the same doesn’t happen to XMLHttpRequest.
what will happen is that web-applications will finally be able to compete with PC-based applications, and they will resemble them visually. for example, yahoo has announced a webmail-application based on oddpost-technology (screenshot of oddpost) that is meant to compete with outlook rather than with hotmail.
many web-based applications will soon resemble PC-based ones so much that users might be astonished the first time they try to use them offline on a plane. and they will expect them to act just like their desktop programs, which will pose new challenges to webdesign-usability, as usage conventions of desktop and web merge. just think double-click vs. single-click.
the benefits, on the other hand, are clear:
- user-contributed content (in forms) can be validated or corrected in real time. google’s “did you mean …?” could be shown instantly, for example.
- the latency of a page refresh on each interaction vanishes, and the usability goal of instantly loading pages comes closer.
- load on servers will be reduced (just look at the “did you mean …?” example above), at least by eliminating the need to transfer huge amounts of interface code on every interaction.
- auto-completion, a feature most PC-based applications have, can also be implemented in web-based ones such as webmail-clients, retrieving data from a user’s address book.
- online configuration of a complex product, or processes such as user-registration or information-search that usually involve multiple reloads, can now be done near-instantly. we’ll surely see this on many e-commerce- and community-sites soon.
- service providers will appear that offer XML data you can dynamically pull into your web pages. i’m sure amazon will start to provide such data soon, if they aren’t already. look at this example from apple for some sets of iTunes data.
- etc, i’m sure we’ll see more innovative uses in 2005.
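to make the auto-completion point concrete: once a webmail client has fetched the address book in the background (via XMLHttpRequest, say), matching addresses against what the user has typed so far is plain client-side code. a sketch, with hypothetical addresses and element wiring:

```javascript
// return all addresses starting with the typed prefix
// (case-insensitive), the way a "to:" field would complete them
function matchAddresses(prefix, addresses) {
  var matches = [];
  prefix = prefix.toLowerCase();
  for (var i = 0; i < addresses.length; i++) {
    if (addresses[i].toLowerCase().indexOf(prefix) == 0) {
      matches.push(addresses[i]);
    }
  }
  return matches;
}

// hypothetical keystroke handler: on every keyup in the "to:" field,
// show the matching addresses in a suggestion list -- no page refresh
function onToFieldKeyUp(field, addresses, listElement) {
  var matches = matchAddresses(field.value, addresses);
  listElement.innerHTML = matches.join("<br>");
}
```

the same filter-as-you-type idea covers the form-validation case too: the keystroke handler just calls the server instead of a local array.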
i can’t wait to use XMLHttpRequest in a project for the first time!