In 1986 the International Organization for Standardization released ISO 8879, entitled ‘Information Processing – Text and Office Systems – Standard Generalized Markup Language (SGML)’. Although this defined a standard for the first time, nothing much happened after that, mainly because no-one knew what to do with it. Then Tim Berners-Lee invented the World Wide Web and suddenly the demand was there.
Everybody wanted hypertext documents. More than that, everybody wanted to produce the software that would create hypertext documents, if only so they could control the standard.
In the software industry a standard (the industry-accepted way of doing something, to which all other software manufacturers must conform) is just about the ultimate prize. Should one company create a standard, known as a de facto standard in that every other company must follow it despite its never being ratified by a standards committee, then the financial rewards can be truly spectacular. Not only does it bring the prestige of being the market leader, which can easily be translated into sales, but, if the technology can be patented or copyrighted in some way, every other company in the business will have to buy a license to use it before it can even begin to compete. A lucrative affair in its own right, made even more so by the fact that whoever controls the standard sets the standard. Practically speaking, that means they can change it whenever they like, forcing the consumer to buy new software and the rest of the industry to buy new licenses.
Standards, and their control, are therefore an important issue, which helps to explain why the W3 Consortium was set up in 1994 to oversee the process, at least as far as the Internet was concerned. Theoretically, any web page written to conform to the W3 standard can be read by any Browser, no matter where it comes from.
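As a brief illustration of what conforming to the W3 standard looks like in practice (the particular standard declared here, HTML 4.01 Strict, is just one example; the exact choice would depend on when the page was written), a web page announces the standard it claims to follow in its opening DOCTYPE line, and an automated validator can then check the rest of the page against that standard:

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head><title>A page claiming HTML 4.01 conformance</title></head>
<body>
  <p>Only features defined in the declared standard appear in the page.</p>
</body>
</html>
```

Any Browser honouring the W3 standards should render such a page, which is precisely the guarantee the Consortium was set up to provide.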
If only life was that simple.
With such massive profits at stake, the companies concerned are all pushing ahead with advancements to their own Browsers which, they hope, will become so popular that the W3 Consortium will have no choice but to include them in its next official standard release. Good business practice it might be, but along the way all it really produces is confusion, the outcome of which is that some websites which include these new features are unreadable by other Browsers.
On a practical basis, then, before anyone is given the go-ahead to design a web page they should be made to state explicitly, and in writing, either that it will incorporate only those features officially approved by the W3 Consortium or, if not, to provide a list of all Browsers that will support it. In that case they should also give the version number of each Browser and, just as importantly, the date it was released. That way the company who expects the website to work for them, and is paying for it, can make its own decisions about the size of the market being reached, or being excluded. For example, if a feature is supported only by the latest version of Microsoft Internet Explorer, the question is how many people are likely still to be using an earlier version, which can only be judged once the date of its release is known. Not everyone downloads the latest version the moment it becomes available. A frequently recommended practice is to wait until the early adopters find the bugs and the software company fixes them. Then, when the package is known to be stable and safe, the upgrade can be performed without undue hazard.
Also, remember how the software industry works. What is leading edge today is standard technology tomorrow and almost obsolete the day after (and practically on that time scale). For the web design agencies this creates tremendous pressure to work constantly at the leading edge so that, when it becomes standard technology, they already have the experience and expertise in its use, which they can then sell to other clients.
Of course, for those first few clients using that leading-edge technology this has several implications:
• Delivery could be delayed. If a system has to be learnt before it can be implemented, no-one can be sure how long it will take.
• The final product could be flawed. If a system is new, no-one can be certain it will work flawlessly, not without extensive testing leading to further delays. Alternatively, the website could go live with its faults still undiscovered and still in place, making the whole exercise useless.
• The site could be invisible to some users. If a Browser cannot support the features of a particular website, it will be unable to display some or all of that website.
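The third point is the one a careful page author can guard against directly. A common defensive technique of the period, sometimes called object detection, is to ask the Browser whether it actually provides a feature before using it, and to fall back to plain content if it does not. The sketch below is illustrative only: the stand-in Browser objects and the chosen feature name are assumptions for the example, not taken from the text.

```javascript
// Illustrative sketch of object detection: test whether the Browser
// provides a feature before relying on it, instead of assuming it does.
function supports(browserObject, featureName) {
  return typeof browserObject[featureName] !== "undefined";
}

// Stand-in objects representing an older and a newer Browser; in a real
// page the test would be run against `document` or `window` directly.
var olderBrowser = {};
var newerBrowser = { getElementById: function (id) { return null; } };

if (supports(newerBrowser, "getElementById")) {
  // Safe to use the newer feature for this visitor.
} else {
  // Serve the plain, universally readable version of the page instead.
}
```

A site built this way degrades gracefully: visitors with older Browsers still see something readable rather than a broken page.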
While a reputable design agency will make its client aware of all this, not every agency is reputable, and so the questions should be asked. At the very least the agency should be able to explain the potential shortcomings of each and every feature included in a proposed website.
Better yet, should there be even the slightest hint of a problem, the onus should be on the design agency to justify that particular feature. Remember, the only people who score points for having fancy web pages are the web design agencies. Everyone else can make do with bog-standard technology. And if the agency cannot create a good-looking site with the tremendous variety of features which already exist as standard, then find an agency that can.
Next, as if the entire business were not over-complicated enough already, there are yet more ways in which the waters can be muddied. In this case the culprit is a particular piece of software generally referred to as a plug-in.