

16 October 2002
[noon]
Where have we been? We’ve been digging. Brand development tasks. Final fixes on a site that’s about to launch. Design and production on a new site. Proposals. Book chapters. Client meetings. Preparation for upcoming appearances. Last-minute debugging. We owe phone calls. We owe email replies. We owe our barber a visit.

Now @ Webmonkey: The Secret Life of Markup.

Dori Smith and Tom Negrino’s JavaScript for the World Wide Web Visual Quickstart Guide, 4th Edition, is a handy book to have around, particularly for non-JavaScript experts who want to add interactivity to their work. And the book’s companion site includes plenty of well-targeted JavaScript examples you can grab and use.

CSS layout—and hybrid design that combines CSS with tables—both require trickery to avoid potholes in browser compliance. Eric Meyer has compiled several strategies in one convenient location.
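One widely circulated example of such trickery (we are not claiming it is among the specific strategies Meyer compiles) is Tantek Çelik’s box model hack, which feeds IE5.x/Windows a different width than compliant browsers receive. The selector and measurements below are made up:

        #sidebar {
          padding: 10px;
          border: 5px solid #999;
          width: 130px;           /* IE5.x/Windows treats width as the whole box */
          voice-family: "\"}\"";  /* convinces IE5.x the rule has already closed */
          voice-family: inherit;
          width: 100px;           /* compliant browsers get the true content width */
        }
        html>body #sidebar { width: 100px; }  /* the companion "be nice to Opera" rule */

IE5.x/Windows trips over the escaped quotes and ignores the rest of the rule, keeping the first width; browsers that implement the CSS box model correctly read on and use the second.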

Not content with masterminding the CSS/XHTML Wired Redesign, Douglas Bowman took a deep breath and reformulated his personal site in those technologies. Looks pretty, works great.

GIF animation on rollovers remains troublesome in Mozilla 1.1. The technique works as expected in Netscape 6.2, IE5, and other common household browsers old and new. But in Mozilla 1.1, when you mouse over an image whose “on” state consists of a multi-frame animation, you see only the last frame. Mozilla supports ECMAScript, CSS, XML, and the DOM. So why the difficulty with GIF animations? Dunno.
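For reference, a generic sketch of the image-swap rollover technique in question (hypothetical file names; the "on" image is the multi-frame animated GIF that Mozilla 1.1 reduces to its final frame):

        // Preload both states so the swap is instant (hypothetical file names).
        var offState = new Image();
        var onState  = new Image();
        offState.src = "/images/button-off.gif";  // single-frame "off" state
        onState.src  = "/images/button-on.gif";   // multi-frame animated "on" state

        function swapImage(name, newSrc) {
          if (document.images) {                  // guard for very old browsers
            document.images[name].src = newSrc;
          }
        }

A link’s onmouseover handler would call swapImage('btn', onState.src) and its onmouseout handler swapImage('btn', offState.src), where btn is the name of the img inside the link. :::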

13 October 2002
[2 pm]
In this week’s double issue of A List Apart, for people who make websites:
        Build A PHP Switcher, by Chris Clark. ALA’s open source style sheet switchers are swell so long as your visitors use DOM-compliant browsers and have JavaScript turned on. But what if they don’t? New ALA author Clark tells how to build a cross-browser, backward-compatible, forward-compatible, standards-compliant style sheet switcher in just five lines of code (a rough sketch of the general approach appears after this list). Plus:
        This Web Business Part 4: Business Entity Options, by Scott Kramer. You’ve mastered Photoshop, Flash, PHP, CSS, XHTML and JavaScript; studied usability, accessibility, and information architecture; and can fake your way through XML. But there’s more to running a web business than that. Part 4 of a continuing series.
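For the curious, here is a rough sketch of how a server-side switcher of this kind typically works. This is not Clark’s code (read the article for that), and the file names and style sheet names are hypothetical:

        <?php
        // switch.php: visited as switch.php?style=highcontrast to change styles.
        // A real script should check $_GET['style'] against a list of known styles.
        setcookie('style', $_GET['style'], time() + 31536000, '/');
        header('Location: ' . $_SERVER['HTTP_REFERER']);
        ?>

Each page then reads the cookie when writing out its style sheet link:

        <?php
        $style = isset($_COOKIE['style']) ? $_COOKIE['style'] : 'default';
        echo '<link rel="stylesheet" type="text/css" href="/css/' . $style . '.css" />';
        ?>

Because the switching happens on the server, it works whether or not the visitor’s browser supports the DOM or has JavaScript turned on. :::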

12 October 2002
[noon]
The Real Challenge
In yesterday’s quite lengthy Report (prompted by the release of the Wired redesign and the many issues it raised) we touched on purity of validation and the reasons it is rarely achieved on large-scale, commercial sites, even when the initial templates validate and the client and builders are fully committed to supporting W3C specs. Admittedly, clients and builders who care are still too rare a breed. But even those who do often find themselves flummoxed by outdated middleware, compromised databases, and other such roadblocks.
        In a 10 October editorial, Joe Clark points out that independent designers and developers have no problem authoring to W3C specs (and many even enjoy fixing commercial sites with which they are unaffiliated, bringing such sites up to spec with a few hours’ work).
        Our point is not that XHTML 1, CSS, and Priority 1 accessibility are hard. They’re easy for any halfway sophisticated designer or developer to achieve. The problem lies in the large-scale systems and third-party content that must be integrated into most big sites. We need CMS systems that encourage compliant, accessible authoring practices instead of breaking them. (Translation: We often build our own if the budget permits.) We need backend folks to recognize that compromising the front end “to force a display issue” is no longer an acceptable option. We need clients who are willing to invest in fixing systems that “work” at the expense of forward compatibility. It will take persuasion and it will take money.
        There is no shortage of persuasion. Money is another story.
        In a healthy economy, companies invest in R&D, training, and long-range planning. In a sick economy, companies focus on cutting costs, eliminating personnel and processes, and keeping their doors open for the next 24 hours. Encouraging designers and developers to learn compliant authoring methods is easy: show them the benefits. Coaxing struggling companies to invest in the long-term health of their web presence is the real challenge.

The Vignette-generated validation errors that afflicted Wired.com have now been fixed and the site validates. Not every company has the technological chops to fix errors introduced by Vignette and its brethren. And as we’ve just said, few companies have the dough. The systems themselves must become standards-compliant for the web to progress. Site managers must tell CMS vendors compliance matters, just as designers and developers once told browser makers. If enough customers do this, the vendors may be willing to upgrade their tools.

This screenshot of zeldman.com in NCSA Mosaic, submitted by Usman Farman, shows how backward-compatible transitional web layouts can be. Mosaic was the first graphical browser, and was last updated in January 1997, a month after the W3C introduced CSS1 (which Mosaic naturally did not support). Zeldman.com is built to current W3C specs yet looks acceptable and is legible and usable in the long-outdated Mosaic browser. :::

11 October 2002
[7 pm | 6 pm | noon | 11 am]
At Netscape DevEdge: An interview with Douglas Bowman of Wired. The interviewer is Eric Meyer. As you’d expect from two such minds, the interview covers the redesign’s benefits in bandwidth, design control, and overall usability, and gets into some of the nuts and bolts of the Wired redesign (see below).

The big news today is the Wired redesign, whose launch we’ve been anticipating for months. Team leader Douglas Bowman, Network Design Manager at Lycos, aimed for pure standards compliance: XHTML for data, CSS for presentation. Bowman and his team worked hard and achieved their goals. Not only that, the site looks good and is easy to use. But it does not validate.
        Third-party (mainly advertising) content, over which Bowman and his team had no control, uses invalid methods and improper URL handling. When that content is added to Wired’s pages, the site stops validating. Add poop to your soup and it’s no longer a healthy meal.
        This often happens in the real world. It happens on projects far less ambitious (from a web standards perspective) than the Wired redesign. We can’t control the markup and code ad servers deliver. We can’t always control (or afford to rebuild) outdated backend and middleware systems. What happens to the rest of us happened to the Wired team. This in no way invalidates their efforts or lessens their achievement.
        As a highly visible site with a long (and well-earned) reputation for deploying web technology well, Wired serves as a beacon to all developers. Its XHTML/CSS redesign will inspire other commercial sites to take the plunge, and its example will eventually trickle down to the ad servers.
        That’s how progress works. Indies take risks. Large, commercial sites take risks. Sooner or later, the market follows these leaders. The bigger the leader, the greater the impact, and the sooner what were once risks become norms. When your client, boss, or manager says, “We can’t do this,” you can now reply, “Wired did it.”
        We congratulate the Wired team. Their efforts will make things easier for the rest of us. Expect more coverage of the Wired redesign here and elsewhere soon.

Four additional points about the Wired redesign are worth making:
        1. The site’s builders have written a rationale explaining why they did what they did and positioning the new standards-compliant Wired design vis-a-vis W3C specs and browser history.
        2. Behind the scenes, the builders are addressing many of the technological issues that stood in the way of validation when the site launched late last night.
        3. Following in the footsteps of A List Apart and many other indie sites last year, Wired’s CSS/XHTML redesign makes content accessible to all browsers and devices (including screen readers) but hides its layout from old browsers that weren’t built to support the CSS spec (a sketch of one common hiding technique appears after this list).
        4. Users of these old browsers are informed about newer, more compliant ones (screenshot) and encouraged to give them a trial download. Wired supports the WaSP’s Browser Upgrade Campaign.
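For readers unfamiliar with the hiding technique mentioned in point 3: one common method at the time (the style sheet URL here is made up, and we are not claiming this is Wired’s exact markup) is the @import rule, which pre-standards browsers such as Netscape 4 ignore:

        <style type="text/css" media="all">
          @import "/css/layout.css";  /* old browsers never fetch this file */
        </style>

Those old browsers therefore render the structured markup with their default styles, fully readable, while CSS-capable browsers get the full layout.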

A secondary point on the Wired redesign. Hardcore standards geeks sometimes judge the intentions of a site’s owners and creators based on one criterion only: does it validate? When a site fails to validate, some geeks infer that the people behind the site made no effort. As the Wired redesign makes clear, validation alone does not tell the full story.
        If you’re going to take the trouble to run someone else’s site by the W3C validators, take additional time to view source.
        If the source shows valid structural markup but internal URLs generate errors, the middleware may be outdated and there may be no budget to cover the cost of an upgrade. (Budgets this year are not what they once were, as anyone who’s been squeezed out of a job can tell you.) If most of the site is correctly authored but deprecated junk surrounds certain components, that junk may be stored in the database along with those components. The company that built the tool that grabs the data may have gone out of business. Money to rebuild may not be available.
        A site’s markup may be perfectly kosher except for two invalid attributes to the body element. Those attributes may be there for the sake of 4.0 browsers. The designer (and even the site’s owner) may have argued against including those attributes, but the company that owns the company that owns the company that owns the site may have insisted. An otherwise compliant and accessible site may include embedded media that requires certain tags and attributes the W3C never saw fit to add to any HTML or XHTML standard.
        In short, do not confuse fundamentalism with evangelism, or necessarily equate lack of compliance with lack of care. On the other hand, if the source is bad throughout, it’s probably safe to assume that the site’s creators are stuck in the 90s and in need of a nudge.

In yesterday’s Report we examined the automated reflective links used on an increasing number of personal sites, and pointed out an unintended side effect: as handled on some sites, these links can become a popularity contest. Dive Into Mark, a site we mentioned in that Report, agrees, and has changed its reflective links feature to emphasize network-wide conversation instead of audience size. :::

10 October 2002
[noon]

The hits keep coming

More and more personal sites have become daily annotated linkfests, a.k.a. weblogs, and more and more weblogs have begun incorporating automated referrer links showing which third-party sites linked to them. At the bottom of its left-side column, Anita Rowland’s home page includes a list of recent referrers. Tanya Rabourn’s Field Notes shows who linked in the past 24 hours and how many visitors each link produced. Dive Into Mark, a fine site in spite of the unpleasant visual its name conjures, includes “Further Reading on Today’s Posts,” showing whose sites responded to its full day of posts and how many visitors followed each link. Todd Dominey’s What Do I Know uses Movable Type’s Trackbacks feature to show who linked to any given entry (but not how many visitors followed those links).
        If the web is a hyperlinked information network, these reflective links are true to its spirit and in some cases may amplify comments on a given site by turning them into a network-wide conversation of sorts. Secondarily, reflective links also suggest that the site you’re reading is worth your time: after all, other sites have commented, so the text must possess value.
        On that level reflective links serve as this year’s version of the Hit Counter, which, by declaring somewhat accurately how many people have visited a site, implies merit or at least popularity.
        In 1996, Jeffrey Veen sagely observed that such counters add no value to user experience and only betray the producer’s vanity. Hit Counters tell approximately how many people have seen a page, but not who, or what they thought about it, or how long they stayed, or how much (if any) of it they read. Hit Counters are also at best semi-accurate. (A Hit Counter may record 500,000 AOL users as a single visitor.) Even their name is a bust. Hit Counters record page views, not hits. For these and other reasons, almost no modern site includes a Hit Counter.
        The Daily Report sports a Hit Counter mainly to annoy Mr Veen. It's been restarted three times since 1995 and is about as accurate as anything else on the web.
        Reflective links can add value but may also discourage the very practice they record. If your site is shown to have sent two or three visitors to someone else’s site, your vanity might prompt you not to link to that site again. After all, who wants to suggest that no more than two or three people are reading their site? For a personal site, the implication is embarrassing; for a commercial site, it could have financial repercussions.
        Proxy issues aside, the number of visitors listed in a reflective link may not accurately reflect the number of readers, since roughly five percent of readers will follow a given link. We’re not basing that percentage on hard research but rather on patterns we’ve observed when visiting sites whose reflective links point back to zeldman.com. If the numbers are accurate, about 95% of our visitors will not follow a given link.
        Our efficacy as a traffic generator is of no interest, but it’s worth noting the 5% click-through rate. By inference, your click-throughs may also be far lower than your readership. You can tell yourself as much the next time a site you’ve linked to declares that two readers followed your link.
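For the technically curious, here is a bare-bones sketch of how such a referrer list is typically generated, assuming PHP and a flat log file. The file names are hypothetical, and a real implementation would filter out search engines, proxies, and referrer spam:

        <?php
        // On each page view, append the referring URL (if any) to a log file.
        $ref = isset($_SERVER['HTTP_REFERER']) ? trim($_SERVER['HTTP_REFERER']) : '';
        if ($ref != '') {
            error_log($ref . "\n", 3, '/path/to/referrers.log');
        }

        // Elsewhere, tally the log and list the top referrers with visit counts.
        $counts = array_count_values(array_map('trim', file('/path/to/referrers.log')));
        arsort($counts);
        foreach (array_slice($counts, 0, 10) as $url => $visits) {
            echo '<li><a href="' . htmlspecialchars($url) . '">' .
                 htmlspecialchars($url) . '</a> (' . $visits . ')</li>';
        }
        ?>

The counts such a script displays are page views from each referring site, with all the caveats about accuracy discussed above. :::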

9 October 2002
[8 pm | 10 am]
More conference. Less money. See us at Web Design World Boston and save $200 on the standard conference rate.

One down, one to go. Fox Searchlight Pictures has left the building and will launch in the next few days. :::

8 October 2002
[5 pm | 4 pm | 2 pm | 1 pm]
And here, in this obscure corner of a little-known mailing list, the W3C tells us it has rejected the ill-conceived and controversial RAND patent policy that would have turned open web technologies into a cash cow for a few large companies. Score one for the good guys.

We keep saying we’ll quit travelling, stay home, and churn our own butter, but our World Tour has a mind of its own. We’re now booked through May 2003 with gigs from Boston to Austin to Budapest—and a few more not yet listed that will soon be confirmed.

Indie web publishers ask when we’ll offer an RSS feed. We hand code The Daily Report, hence no middleware, hence no feeds. Fortunately a cottage industry has arisen to fill the void. Here, for instance, is a zeldman.com RSS feed by James Huston. Some may surmise that we’re opposed to content management systems. On the contrary, we often find them useful and one of our partners develops great ones for some of our clients. Nonetheless, we like authoring The Daily Report by hand and will continue to do so. We also churn our own butter.

Please note third-party RSS feeds are a labor of love. They may or may not offer features you desire. They might not even link to the sites whose data they pull.

Speaking of RSS, kindly read today’s Dive Into Mark on that subject.

The Danger HipTop (a.k.a. T-Mobile Sidekick) stupidly breaks properly authored sites. Rant at Dashes.com, slightly more technical explanation on The Web Standards Project’s Buzz blog. :::

7 October 2002
[11 am | 10 am]
Meet the Makers San Francisco will include interface design legend Jeffrey Veen; a Web Standards panel featuring Microsoft’s Tantek Çelik and Netscape’s Arun K. Ranganathan; and key minds behind E*Trade, MapBlast, and quite probably a site sophisticated web users choose as their default start page. Free VIP tickets are available by request, first come, first served. The event takes place 21 October in the Grand Hyatt near Union Square. Meet the Makers is a series of one-day events for creative people in a technical world. Zeldman sits on its advisory board and finds it quite comfortable.

For your pleasure and convenience, our Resize widget now resides in the right-side Subnav.

A blind web user is suing Southwest Airlines because its website is incompatible with his screen reader. Details at Law.com. We keep saying that even though many sites are not (yet) legally obliged to comply with accessibility guidelines, compliance is right, smart, and may protect a site’s owners from costly litigation and negative P.R. Accessibility isn’t free, but the cost of basic accessibility (Priority 1, U.S. Section 508) is negligible if you incorporate the guidelines into your workflow. Compared to the cost of lawsuits and bad P.R., basic accessibility is a bargain. If you’re unfamiliar with accessibility or confused about what it all means for designers and site owners, try Dive into Accessibility (“30 days to a more accessible website”), Anitra Pavka’s Accessibility Weblog, and Joe Clark’s AccessiBlog.
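To make “negligible” concrete: two of the most common and least expensive fixes are text equivalents for images and explicit labels on form controls. The markup below is hypothetical but representative:

        <img src="/images/route-map.gif" alt="Map of cities served, with hub airports highlighted." />

        <label for="email">Email address</label>
        <input type="text" id="email" name="email" />

Fixes like these cost a few seconds per image or form field when they are part of the workflow from day one.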

ICANN (Internet Corporation for Assigned Names and Numbers), the group that assigns control of domain registries to fine companies like VeriSign, has evaluated proposals to reassign the .Org registry and recommended control be assigned to The Internet Society (ISOC) of Reston, Virginia. IMS/ISC, whose non-commercial bid ICANN rejected, has evaluated the evaluation. IMS/ISC concludes ICANN’s decision could cost consumers $268 million over five years. ICANN has shut down its public bulletin board, but if you disagree with its recommendation, you can send email to org-eval@icann.org. Any mail sent to that address will be publicly posted by ICANN. If you’re concerned that your comments may get lost in cyberspace, include a cc: to bot@invisible.net. And if you participated in the “spread the dot” campaign in support of the non-commercial IMS/ISC bid, please note that the campaign has been updated and your dot should now point to the ISC’s OpenReg open source software for registries. :::
