
Web typography: a refresher and history

Many designers still think in px first when creating baseline styles. But we know intellectually that various relative typography approaches are better suited to our medium in all its complexity. Better for accessibility. Better for avoiding bizarre typographic disasters linked to user preference settings, device limitations, and the unforeseen ways our overwrought styles can interact with one another.

As I contemplate a long-overdue redesign of my own site, it’s worth taking a refreshing dip into what we’ve learned about web typography over the past 20+ years. From the pages of (where else?) A List Apart:

Bojan Mihelac: “Power to the People: Relative Font Sizes” (2004)

An early and simple creative solution for text resizing that respects users’ choices and also gives them an additional option for resizing despite the limitations of some of the most popular browsers of the day. Presented for its historical importance, and not as a how-to for today. https://alistapart.com/article/relafont/

Lawrence Carvalho & Christian Heilmann: “Text-Resize Detection” (2006)

Detect your visitors’ initial font size setting, and find out when they increase or decrease the font size. With this knowledge, you can create a set of stylesheets that adapt your pages to the users’ chosen font sizes, preventing overlapping elements and other usability and design disasters. Presented for its historical importance as an insight into the complex dancing we’ve done in the past to ensure readability. https://alistapart.com/article/fontresizing/

Richard Rutter: “How to Size Text in CSS” (2007)

Sizing text and line-height in ems, with a percentage specified on the body (and an optional caveat for Safari 2), provides accurate, resizable text across all browsers in common use today. An early move toward more responsive type and away from the accessibility problems created by setting text sizes in px in some browsers and devices. https://alistapart.com/article/howtosizetextincss/
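If you never lived through that era, a minimal sketch of the em-based pattern looks something like this (the particular sizes are my illustration, not Rutter’s exact recipe):

```css
/* Em-based sizing sketch: a percentage on the body, ems everywhere else. */
body {
  font-size: 100%;   /* respects the reader's browser setting, typically 16px */
  line-height: 1.5;
}

h1    { font-size: 2em; }      /* about 32px at the default setting */
p     { font-size: 1em; }      /* scales with the reader's preference */
small { font-size: 0.875em; }
```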

Wilson Miner: “Setting Type on the Web to a Baseline Grid” (2007)

The main principle of the baseline grid is that the bottom of every line of text (the baseline) falls on a vertical grid set in even increments all the way down the page. The magical end result is that all the text on your page lines up across all the columns, creating a harmonious vertical rhythm. https://alistapart.com/article/settingtypeontheweb/
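A rough sketch of the principle in present-day CSS, assuming a 24px rhythm (the numbers are illustrative, not Miner’s):

```css
/* Baseline grid sketch: every line-height and margin is a multiple of one unit. */
:root { --baseline: 1.5rem; }                    /* 24px at a 16px root size */

body { font-size: 1rem; line-height: var(--baseline); }
h2   { font-size: 2rem; line-height: calc(var(--baseline) * 2); }

p, ul, h2 { margin-block: 0 var(--baseline); }   /* spacing stays on the grid */
```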

Tim Brown: “More Meaningful Typography” (2011)

Introduces modular scales, the golden ratio of readable typography. Delivers accessibility plus aesthetic beauty derived from the math underlying all of creation. https://alistapart.com/article/more-meaningful-typography/
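To make the idea concrete, here’s a tiny modular-scale sketch using custom properties; the base size and ratio are placeholders for whatever your content suggests:

```css
/* Modular scale sketch: each step is the previous step times a chosen ratio. */
:root {
  --base:  1rem;                               /* starting size */
  --ratio: 1.618;                              /* the golden ratio; try 1.333 or 1.25, too */
  --ms-1:  calc(var(--base) * var(--ratio));   /* ~1.618rem */
  --ms-2:  calc(var(--ms-1) * var(--ratio));   /* ~2.618rem */
  --ms-3:  calc(var(--ms-2) * var(--ratio));   /* ~4.236rem */
}

body { font-size: var(--base); }
h3   { font-size: var(--ms-1); }
h2   { font-size: var(--ms-2); }
h1   { font-size: var(--ms-3); }
```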

Tim Brown: “What is Typesetting?” (2018)

“We must now practice a universal typography that strives to work for everyone. To start, we need to acknowledge that typography is multidimensional, relative to each reader, and unequivocally optional.” https://alistapart.com/article/flexible-typesetting/

Keep going…

For more web design community wisdom and web typography history, see Typography & Web Fonts in A List Apart, for people who make websites.

And in the Comments below, please share your favorite resources for creating websites that look great and read beautifully, no matter what technical and human capabilities get thrown at them.


Valediction.

I started using Twitter before the dawn of the iPhone. Back then, in 2006, it was a fun, funky, fully functional (if barebones) beta messaging service used mainly by The People of the Web™—the kind of folks who attended the SXSW Interactive conference and probably spoke on the panels. 

You know. You were there. You were one of us: Designers. Developers. Pioneers. Writers of blog posts, trade books, and all the little guide texts that websites depended on to attract and serve their users. People who, in casual conversation, might use words like “digerati” unironically and without intending to be pretentious. 

We believed in the power of the web to highlight unheard voices and evolve a more just society. If we were naive, and we surely were, at least we were on the side of the angels. Turns out, not everybody was.

A new skill

Years before Slack, the early 140-character Twitter served as a kind of private pre-Slack for the digitally awake and aware.

Back in those days, if you’d asked me or my conference-going fellow bloggers and designers who that first, rudimentary Twitter was for, we’d have said it was for us. For people like us, who’d spent years mastering all manner of skills and technologies simply to communicate online. Who saw value in the act of putting words together, so long as there were people to read and react to those words.

(After expressing our feelings of pride and ownership in the Twitter community, of course, the more Ted-talk-y among us early users would have waxed rhapsodic about microblogging and its potential to improve the world. More about that in a moment.)

With the birth of Twitter, when we wanted to pin down something that was twitching about in our heads and transmit it to other heads, the skill we needed wasn’t CSS or HTML or art direction or back-end wrangling. It was the ability to edit our thoughts down to a glittering trophy built with 140 characters or less. A new skill to master!

How much do people like us love showing the world what we’ve learned! This much: even after Twitter stopped relying on wireless carriers’ text-messaging services and doubled the permitted character count, many of us would-be Oscar Wildes continued to whittle away at our tweets, limiting them to 140 characters or fewer on principle.

After all, if we could deliver a fully functioning website in 10K or less, we could surely craft deathless sentences from a tightly constrained character count. Right? Of course right!

Only connect

Years later, with a huge international user base, the idea persisted that a globally connected free and open messaging network like Twitter could help humanity do less evil and more good.

If you wanted proof, you could look to the first Arab Spring, to Me Too, to Occupy Wall Street and Black Lives Matter—movements that were greatly abetted by the busy, worldwide network. 

Of course, while many cheered and participated in these activist-driven movements, others saw them as threatening. Some felt the world was changing too fast, and that their views on social issues, like their once-good jobs, had no champion among the ruling classes. We all know how that turned out. 

And now a brief digression about power and megaphones:

How I got over

Nearly two decades before Bluesky and its sweet starter packs, Twitter hired creatives to recommend selected users to newcomers. Some of the coolest people I know did that work.

Web design was at its peak, so quite naturally the in-house team put together a list of influential designers, developers, and writers for new users to follow. And for a variety of reasons, I was among those early recommended follows. (I may still be listed there, if the current X still welcomes newcomers with follow recommendations.) Which is how, at my Twitter peak, I ended up with a blue checkmark and 355,000 followers.

Even now, on wretched “X,” where I no longer post, I still retain 305,000 followers. At least, that’s what the stats told me when I popped in just now to find out. But are there really that many folks following me there?

How many of my current Twitter/X “followers” used to participate but have since quit quietly, without bothering to close their accounts? Lots, I reckon.

Some may avoid the site but keep their accounts open for strategic reasons, such as preventing someone else from hijacking their name (not that the owner can’t take over your account whenever he feels like it—but I digress).

Mainly, I’m guessing a lot of folks lost interest in the site but forgot to close their accounts. In other words, the data says 305K, but the number of genuinely active followers is probably less than half that, and few of them would even see my tweets if I still posted there, since the algorithm throttles texts from folks like me.

Who cares, besides me? Nobody. Nor should they. And, besides, except as a temptation to stay, my follower count is beside the point.

Come play with us, Danny

The point is that the former Twitter has become a hateful cesspool, not simply mirroring but amplifying its owner’s profound insecurities, god-awful beliefs, and self-serving lies, and forcing that insanity into the public consciousness, whether we avoid X or not.

Thus, millions of Americans who don’t use Twitter/X nevertheless believe conspiracies that the owner and his favorite acolytes use the site to broadcast.

And there’s no doubt that, in consequence of the above, X helped determine the results of the last US presidential election. (I use the phrase “last election” here to mean “most recent election,” although I fear it may come to mean more than that.)

So, in the interest of not supporting fascism, do I abandon these readers? Thanks for asking! Pretty much, yeah.

If you like my longer-form writing, you can find it here on zeldman.com, at A List Apart, and in my books.

If you like my chatty posts, news bytes, and occasional brief confessions, join me on Bluesky.

Good luck to us all in the coming year.


How to Join Blue Beanie Day: Wear and Share!

Saturday, 30 November 2024, marks the 17th annual Blue Beanie Day celebration. It’s hard to believe, but web standards fan Douglas Vos conceived of this holiday way back in ’07:

The origin of the name of the holiday is the image of Jeffrey Zeldman on the cover of his book wearing a blue knit cap. Over the years, the Blue Beanie Day also became an action day for web accessibility, for which the correct use of web standards is a basic requirement. —Wikipedia

How can you join this year’s fun? That’s easy! Snap a self-portrait wearing a blue beanie and post your Blue Beanie Day photo to Bluesky, Threads, Instagram, Tumblr, LinkedIn, Facebook, your blog (you’ve still got one, right?), and whatevs. Hashtag: #BlueBeanieDay.

No blue toque to call your own? Kevin Cornell’s venerable illustration to the rescue! Download the zipped Photoshop file here. If you like, you can ping this web page with a link to your post’s URL. See below for details.


Understanding MARTI: A New Metadata Framework for AI

At its core, MARTI is a bridge. It harmonizes with existing metadata standards like the Content Authenticity Initiative, Anthropic’s Responsible Scaling Policy, and the W3C’s PROV. It anticipates the needs of future standards, laws, and practices, such as those proposed by the Coalition for Networked Information (CNI), the EU Artificial Intelligence Act, and Making Data FAIR. —Carrie Bickner

As I study Carrie Bickner’s initial posts on the MARTI Framework she’s developing to manage AI metadata across various disciplines, a familiar feeling steals over me.

It’s similar to how I felt during the early days of The Web Standards Project (WaSP), when a handful of us took on the quarreling browser makers in what seemed a Quixotic attempt to bring consistency, predictability, usability, and accessibility to an already Balkanized web.

Fortunately, at that time, we had two aces up our sleeves: first, the standards already existed, thanks to the W3C; and second, the EU and the Clinton Administration were suing Microsoft, which meant that the tech press was interested in hearing what we had to say—even if evangelizing web standards had little to do with accusations that Microsoft was abusing its monopoly power.

Once more with feeling: standards from the community

Years after The WaSP declared victory, and browser stagnation had begun to set in, I felt that same thrill vicariously when Eric Meyer, Tantek Çelik, and Matt Mullenweg invented XFN (XHTML Friends Network), inverting the standards creation pyramid so that great ideas were empowered to bubble up from small groups to the wider community, Open Source style, rather than always coming from the top (W3C) down.

I’ve no doubt that microformats were the spark that lit the HTML5 fuse, and we all remember how Steve Jobs used the new markup language to power the first iPhone, initiating the mobile era we now live in.

More about microformats history is available, and you can read Jeremy Keith’s HTML5 For Web Designers online for free—or buy the 2nd Edition, coauthored with Rachel Andrew, directly from Jeremy.

And now I feel those same stirrings, that same excitement about possibilities, as I study Carrie’s first posts about MARTI, an emerging object-oriented metadata framework that can be used to articulate rights-permissions, preservation metadata, provenance, relationships between objects, levels of AI involvement, and contextual information such as usage history and ethical considerations. 

Here’s why I’m excited (and you may be, too).

What do you wanna do tonight, MARTI?

For better or worse, our ideas create our reality. For better or worse, we have atomic power, the web, and social media. There’s no putting these genies back into their bottles. And there’s certainly no shutting down AI, however you may feel about it. Nor need we, as long as we have smart guardrails in place. 

I believe that MARTI—particularly as it promotes responsibility, transparency, and integrity in documenting AI’s role in content creation and curation—has the potential to be one of those guardrails.

Drafted by a career digital librarian, this provisional metadata framework for human/generative AI output won’t stop bad actors from scraping content without permission. But if it is extended by our community and embraced by the companies and organizations building AI businesses, MARTI has the potential to bring rigor, logic, and connectedness to the field. In Carrie’s words:

The emergence of generative AI marks a transformative moment in human creativity, problem-solving, and knowledge-sharing. MARTI (Metadata for AI Responsibility, Transparency, and Integrity) is a provisional metadata framework designed to navigate this new landscape, offering a standardized yet adaptable approach to understanding, describing, and guiding the outputs of human-AI collaboration—and even those generated autonomously by AI.

At the heart of MARTI lies a robust object model—a modular structure that organizes metadata into reusable, interoperable components. This model ensures transparency, traceability, and ethical integrity, making it the cornerstone of the MARTI framework.

MARTI is not just an architecture for describing AI output, but it offers a way of structuring policy and a possible foundation for a new literacy. This is not about teaching every individual to code or engineer prompts. It’s about empowering humanity to collectively understand, describe, and guide everything we make with AI, ensuring accountability, transparency, and ethical integrity at every step.

MARTI is a framework for creating structured, standardized documentation that is attached to or embedded in AI-generated content. This documentation, or metadata, can be created by people collaborating with AI tools to produce content. Additionally, AI processes themselves can generate and embed metadata into their outputs, ensuring transparency, traceability, and accountability at every stage of content creation.

MARTI also offers a variety of potentially transformative business applications.

Disclaimer: the author is a friend of mine. But then again, so is every other thought leader mentioned in this article (with the exception of the late Steve Jobs, although our lives did touch when he fired me from a project—but that’s another story).

For more MARTI magic, check these posts:

And if you’ve a mind to do so, please pitch in!


What happened to the Share button in Zoom?

Where did the button go? Jeffrey Zeldman can no longer find it.

Zoom has always included a clickable button/badge at the top left of its primary meeting interface window. Click the badge to copy the URL of that meeting. You can then, with just one more click in any messaging system, send that URL to the other meeting participants. Fast. Simple. Drop-dead easy. Elegant.

It comes in especially handy when people don’t get (or don’t see, or for some reason can’t click on) the meeting link in their invite. Or when the meeting link is hidden behind a tab behind a tab behind a tab in their browser. Or for any of a dozen other reasons you might want to grab the URL of a meeting you’re in and zap it to a colleague.

How wise are the designers of Zoom to have solved this problem!

And talk about usable! The button’s placement at the top left of the meeting window, with plenty of free open space around it, means that any user (regardless of software experience level) can quickly find the button when they need it. It’s placed right where your eyes know to look for it.

Good design! Smartly focused on what’s most important to the user.

So, anyway, Zoom seems to have removed the button.

—As I discovered during a Zoom meeting with a colleague 30 minutes ago. (Or, more accurately, a Zoom meeting without that colleague.)

—Who texted me to request the Zoom URL. But I couldn’t send it to them. I couldn’t send it, because I couldn’t see it, because the interface was hiding it.

—Because Zoom has decided to remove that affordance, replacing it with… well, nothing, actually.

It is possible that the affordance still exists somewhere within the Zoom interface, in some gloomily cobwebbed, rarely visited subscreen or other. Possibly with a rewritten label, so that any Zoom customers lucky enough to find it will fail to recognize it, even if staring directly at it with the fixed gaze of an astronomer.

I don’t say Zoom has definitely removed one of the nicest (and possibly, in its humble way, most important) tools their product offered. I don’t say that because I can’t be sure. I merely say, if they haven’t removed this function, they might as well go ahead and do so, for all the good its hidden presence does for Zoom’s millions of users. If the tool is hidden somewhere in the deep background layers of Zoom, I sure couldn’t find it.

So, after wasting time hunting for and texting about the missing Zoom link affordance (here comes the punchline), my colleague and I ended up holding our Zoom call…

… in Google Meet.

If I were a Zoom executive or investor, this might worry me.


Offered with love: UX is hard, and not all decisions are in our hands.


This Web of Ours, Revisited

ONE MONTH and 24 years ago, in “Where Have All the Designers Gone?” (my HTMHell design column for Adobe of March 20, 2000), I discussed the deepening rift between aesthetically focused web designers and those primarily concerned with creating good experiences online:

More and more web designers seem less and less interested in web design.

Over the past 18 months or so, many of the best practitioners in the industry seem to have given up on the notion that a low-bandwidth, less than cutting-edge site is worth making. Much of the stuff they’ve been making instead has been beautiful and inspiring. But if top designers wash their hands of the rest of the Web, whose hands will build it, and whose minds will guide it? The possibilities are frightening.

An Imperfect Medium for Perfectionists

Why were many of the leading graphic designers and studios at the time uninterested in web design? For one thing, designers trained to strive for visual perfection found the web’s unpredictability depressing. The article provided clues to the frustrations of the time:

Good designers spend hours tweaking typography in Illustrator and Photoshop. Then visitors with slow connections turn off images.

Of course, where professionals trained in graphic design saw a distressing lack of control, others glimpsed in the infant technology a tremendous potential to help people, pixel-perfection be damned. To reduce the conflict to a cartoon, you might characterize it as David Carson versus Jakob Nielsen—though doing so would trivialize the concerns of both men. Designers already charged with creating websites found themselves somewhere in the middle—barking themselves hoarse reminding clients and managers that pixel-perfect rendering was not a thing on the web, while arguing with developers who told designers the exact same thing.

Visually inspiring websites like K10k showed that the web could, if approached carefully and joyfully, provide aesthetic delight. But many designers (along with organizations like AIGA) were unaware of those sites at the time.

Us and Them

Another source of tension in the medium in 2000 sprang from the discrepancy between the privileged access designers enjoyed—fast connections, up-to-date browsers and operating systems, high-res monitors (at least for the time) offering thousands of colors—versus the slow modems, aging and underpowered computers, outdated browsers, and limited-color monitors through which most people at the time experienced the web.

Which was the real design? The widescreen, multicolor, grid-based experience? Or the 216-color job with pixelated Windows type, a shallow “fold,” and pictures of headline text that took forever to be seen?

To view your masterpiece the way most users experienced it, and at the syrup-slow speed with which they experienced it, was to have an awakening or a nightmare—depending on your empathy quotient. Some designers began to take usability, accessibility, and performance seriously as part of their jobs; others fled for the predictability of more settled media (such as print).

A New (Old) Hope

My March 2000 article ended on an upbeat note—and a gentle call to action:

For content sites to attain the credibility and usefulness of print magazines; for entertainment sites to truly entertain; for commerce sites and Web-based applications to function aesthetically as well as technically, the gifts of talented people are needed. We hope to see you among them.

That was my hope in 2000, and, all these years later, it remains my vision for this web of ours. For though the browsers, connections, and hardware have changed substantially over the past 24 years, and though the medium and its practitioners have, to a significant extent, grown the Hell up, beneath the surface, in 2024, many of these same attitudes and conflicts persist. We can do better.

Minus the framesets that formerly contained it, you may read the original text (complete with archaic instructions about 4.0 browsers and JavaScript that broke my heart, but which Adobe’s editors and producers insisted on posting) courtesy of the Wayback Machine.

☞  Hat tip to Andrey Taritsyn for digging up the article, which I had long forgotten.


Both Sides, No

There’s no situation so awful our news media can’t make it worse. In a cowardly, doomed, and deeply misguided effort to appear “balanced” during an emergency that requires plain speaking, our news editors tie headlines into fantastic pretzels of spurious equivalence. In today’s edition of her subscriber-only newsletter, Washington Post columnist Jennifer Rubin tears into an especially egregious atrocity by the copy wizards of The New York Times:

Journalism 101

People on social media and other critics justifiably mocked, derided and denounced the New York Times for the headline, “Two Imperfect Messengers Take On Abortion.” The sub-headline was nearly as bad: “Neither side of the abortion divide would probably design the exact candidate they have in 2024.” This could be the crown jewel of “both-sidesism,” accomplishing that feat in multiple ways.

For starters, it blurs the distinction between Biden’s clear and unwavering position (to write Roe v. Wade into a federal statute) and Trump’s well-documented inconsistencies, deflections, and contradictions. These two men simply are not equally deficient communicators. That imbalance in clarity and sincerity actually might determine the campaign’s outcome.

In addition to mischaracterizing the candidates’ relative abilities, this quintessential “process story” diminishes the issue’s moral gravity. You could not imagine a 1942 headline: “Two imperfect messengers take on world war.” Awarding style points, as the story does, trivializes the abortion issue.

Finally, the Times headline amounts to a self-parody of gamified political coverage: “Neither side of the abortion divide would probably design the exact candidate they have in 2024.” (Well, neither team in the World Series would design the exact lineup they have.) In essence, the Times tells us, “No one’s perfect!” — an empty platitude. Journalists owe readers an accurate depiction of the candidates’ vast differences in consistency, clarity and moral seriousness on abortion. Alas, such precision would demand truth-telling in lieu of feigned “balance.”

Washington Post subscribers can view the complete text of today’s newsletter on the paper’s website. You may also sign up to get it in your inbox free of charge.


The More Things Change… (or: What’s in a Job Title?)

I’m not a “[full-stack] developer,” regardless of what my last job title says.

I’m not even a front-end developer, thanks to the JavaScript–industrial complex.

I’m a front-of-the-front-end developer, but that’s too long.

So, I’m a web designer. And I also specialise in accessibility, design systems, and design.

…Why do I think that this is the best title? Here’s why.

I’m designing for the web. The infinitely flexible web. The web that doesn’t have one screen size, one browser, one operating system, or one device. The web that can be used by anyone, anywhere, on any internet connection, on any device, on any operating system, on any browser, with any screen size. I’m designing with the web. Using the web platform (HTML, CSS, JS, ARIA, etc.), not a bloated harmful abstraction. I have a deep understanding of HTML and its semantics. I love CSS, I know how and when to utilise its many features, and I keep up-to-date as more are added. I have a strong understanding of modern JavaScript and most importantly I know when not to use it.

Front-end development’s identity crisis by Elly Loel

See also:

The Wax and the Wane of the Web (2024): Forget death and taxes. The only certainty on the web is change. Ste Grainer takes a brief look at the history of the web and how it has been constantly reinvented. Then he explores where we are now, and how we can shape the future of the web for the better. – A List Apart

The Cult of the Complex (2018): If we wish to get back to the business of quietly improving people’s lives, one thoughtful interaction at a time, we must rid ourselves of the cult of the complex. Admitting the problem is the first step in solving it. – A List Apart

Dear AIGA, where are the web designers? (2007): For all the brand directors, creative directors, Jungian analysts, and print designers, one rather significant specimen of the profession is missing. – zeldman.com

Standardization and the Open Web (2015): How do web standards become, well, standard? Although they’re often formalized through official standards-making organizations, they can also emerge through popular practice among the developer community. If both sides don’t work together, we risk delaying implementation, stifling creativity, and losing ground to politics and paralysis. Jory Burson sheds light on the historical underpinnings of web standardization processes—and what that means for the future of the open web. – A List Apart

The profession that dare not speak its name (2007): “No one has tried to measure web design because web design has been a hidden profession.” – zeldman.com


CAPTCHA excludes disabled web users

What’s widely used, no longer particularly effective, and makes web content inaccessible to many people with disabilities? It’s our old friend CAPTCHA! In a group note dated 16 December 2021, the W3C explains how CAPTCHA excludes disabled users, and suggests alternatives which may be kinder and more reliable:

Various approaches have been employed over many years to distinguish human users of web sites from robots. The traditional CAPTCHA approach asking users to identify obscured text in an image remains common, but other approaches have emerged. All interactive approaches require users to perform a task believed to be relatively easy for humans but difficult for robots. Unfortunately the very nature of the interactive task inherently excludes many people with disabilities, resulting in a denial of service to these users. Research findings also indicate that many popular CAPTCHA techniques are no longer particularly effective or secure, further complicating the challenge of providing services secured from robotic intrusion yet accessible to people with disabilities. This document examines a number of approaches that allow systems to test for human users and the extent to which these approaches adequately accommodate people with disabilities, including recent non-interactive and tokenized approaches. We have grouped these approaches by two category classifications: Stand-Alone Approaches that can be deployed on a web host without engaging the services of unrelated third parties and Multi-Party Approaches that engage the services of an unrelated third party.

W3C: Inaccessibility of CAPTCHA: Alternatives to Visual Turing Tests on the Web

We can do better!

Tell your friends. Tell your boss. Tell your clients.

Tip o’ the blue beanie to Adrian Roselli.


“Where the people are”

It’s nearly twenty years ago, now, children. Facebook had only recently burst the bounds of Harvard Yard. Twitter had just slipped the bonds of the digital underground. But web geeks like me still saw “social media” as a continuation of the older digital networks, protocols, listservs, and discussion forums we’d come up using, and not as the profound disruption that, partnered with smartphones and faster cellular networks, they would soon turn out to be. 

So when world-renowned CSS genius Eric Meyer and I, his plodding Dr Watson, envisioned adding a digital discussion component to our live front-end web design conference events, our first thought had been to create a bespoke one. We had already worked with a partner to adapt a framework he’d built for another client, and were considering whether to continue along that path or forge a new one.

And then, one day, I was talking to Louis Rosenfeld—the Prometheus of information architecture and founder of Rosenfeld Media. I told Lou about the quest Eric and I were on, to enhance An Event Apart with a private social network, and shared a roadblock we’d hit. And Lou said something brilliant that day. Something that would never have occurred to me. He said: “Why not use Facebook? It already exists, and that’s where the people are.”

The habit of building

Reader, in all my previous years as a web designer, I had always built from scratch or worked with partners who did so. Perhaps, because I ran a small design agency and my mental framework was client services, the habit of building was ingrained. 

After all, a chief reason clients came to us was because they needed something we could create and they could not. I had a preference for bespoke because it was designed to solve specific problems, which was (and is) the design business model as well as the justification for the profession. 

Our community web design conference had a brand that tied into the brand of our community web design magazine (and soon-to-emerge community web design book publishing house). All my assumptions and biases were primed for discovery, design, development, and endless ongoing experiments and improvements.

Use something that was already out there? And not just something, but a clunky walled garden with an embarrassing origin story as a hot-or-not variant cobbled together by an angry, virginal undergraduate? The very idea set off all my self-protective alarms.

A lesson in humility

Fortunately, on that day, I allowed a strong, simple idea to penetrate my big, beautiful wall of assumptions.

Fortunately, I listened to Lou. And brought the idea to Eric, who agreed.

The story is a bit more complicated than what I’ve just shared. More voices and inputs contributed to the thinking; some development work was done, and a prototype bespoke community was rolled out for our attendees’ pleasure. But ultimately, we followed Lou’s advice, creating a Facebook group because that’s where the people were. 

We also used Twitter, during its glory days (which coincided with our conference’s). And Flickr. Because those places are where the people were. 

And when you think about it, if people already know how to use one platform, and have demonstrated a preference for doing so, it can be wasteful of their time (not to mention arrogant) to expect them to learn another platform, simply because that one bears your logo.

Intersecting planes of simple yet powerful ideas

Of course, there are valid reasons not to use corporate social networks. Just as there are valid reasons to only use open source or free software. Or to not eat animals. But those real issues are not the drivers of this particular story. 

This particular story is about a smart friend slicing through a Gordian Knot (aka my convoluted mental model, constructed as a result of, and justification for, how I earned a living), and providing me with a life lesson whose wisdom I continue to hold close.

It’s a lesson that intersects with other moments of enlightenment, such as “Don’t tell people who they are or how they should feel; listen and believe when they tell you.” Meet people where they are. It’s a fundamental principle of good UX design. Like pave the cowpaths. Which is really the same thing. We take these ideas for granted, now.

But once, and not so long ago, there was a time. Not one brief shining moment that was known as Camelot. But a time when media was no longer one-to-many, and not yet many-to-many. A time when it was still possible for designers like me to think we knew best. 

I’m glad a friend knew better.

Afterword

I started telling this story to explain why I find myself posting, sometimes redundantly, to multiple social networks—including one that feels increasingly like Mordor. 

I go to them—even the one that breaks my heart—because, in this moment, they are where the people are. 

Of course, as often happens, when I begin to tell a story that I think is about one thing, I discover that it’s about something else entirely.


Get it right.

“Led” is the past tense of “lead.”

L.E.D. Not L.E.A.D.

Example: “Fran, who leads the group, led the meeting.”

When professional publications get the small stuff wrong, it makes us less trusting about the big stuff. Trust in media is already at an all-time low. Don’t alienate liberal arts majors and obsessive compulsives. We may be the last readers standing.


Algorithm & Blues

Examining last week’s Verge-vs-Sullivan “Google ruined the web” debate, author Elizabeth Tai writes:

I don’t know any class of user more abused by SEO and Google search than the writer. Whether they’re working for their bread [and] butter or are just writing for fun, writers have to write the way Google wants them to just to get seen.

I wrote extensively about this in “Google’s Helpful Content Update isn’t kind to nicheless blogs” and “How I’m Healing from Algorithms,” where I said: “Algorithms are forcing us to create art that fits into a neat little box — their neat little box.”

So, despite Sullivan’s claims to the contrary, the Internet has sucked for me in the last 10 years. Not only because I was forced to create content in a way that pleases their many rules, but because I have to compete with SEO-optimized garbage fuelled by people with deep pockets and desires for deep pockets.

Is the Internet really broken?

For digital creators who prefer to contain multitudes, Tai finds hope in abandoning the algorithm game, and accepting a loss of clout, followers, and discoverability as the price of remaining true to your actual voice and interests:

However, this year, I regained more joy as a writer when I gave up on SEO and decided to become an imperfect gardener of my digital garden. So there’s hope for us yet.

As for folks who don’t spend their time macro-blogging—“ordinary people” who use the web rather than spend significant chunks of their day creating content for it—Tai points out that their problem (statistically, at least, a more important issue than the fate and choices of the artists formerly known as digerati) remains unsolved, though she sees glimmers of a partial solution in a re-emerging indieweb impulse:

Still, as much as I agree with The Verge’s conclusions, I feel that pointing fingers is useless. The bigger question is, How do we fix the Internet for the ordinary person?

The big wigs don’t seem to want to answer that question thoroughly, perhaps because there’s no big money in this, so people have been trying to find solutions on their own.

We have the Indieweb movement, the Fediverse like Mastodon and Substack rising to fill the gap. It’s a ragtag ecosystem humming beneath Google’s layer on the Internet. And I welcome its growth.

For more depth and fuller flavor, I encourage you to read the entirety of “Is the internet really broken?” on elizabethtai.com. (Then read her other writings, and follow her on our fractured social web.)


“The independent content creator refuses to die.” – this website, ca. 1996, and again in 2001, paraphrasing Frank Zappa paraphrasing Edgar Varese, obviously.

Hat tip: Simon Cox.


The Next Generation of Web Layouts

Who will design the next generation of readable, writerly web layouts?

Layouts for sites that are mostly writing. Designed by people who love writing. Where text can be engaging even if it isn’t offset by art or photography. Where text is the point.

With well-considered flexible typesetting, modular scaling, and readable measures across a full range of proportions and devices. With optional small details that make reading screens of text a pleasure instead of a chore. With type sizes that are easy to read without needing to zoom in. And with a range of interesting sans and serif fonts (including variable fonts) that support reading and encourage creative exploration where headlines are concerned.
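To make that wish list a little more concrete, here is one possible sketch of fluid sizing and a readable measure in today’s CSS; the numbers are illustrative, not a prescription:

```css
/* Fluid body text that still honors user zoom (the rem term keeps it relative). */
body {
  font-size: clamp(1rem, 0.9rem + 0.5vw, 1.25rem);
  line-height: 1.6;
}

/* A readable measure: roughly 45–75 characters per line. */
article > * {
  max-inline-size: 65ch;
  margin-inline: auto;
}
```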

Well? How did we get here?

The web has come a long way since design meant crafting UIs in Photoshop and exporting them as sliced GIFs. Flash. sIFR. Table layout. Rebellion and rethinking. Liquid layout. Semantic HTML and CSS layout. Adaptive layout. Responsive layout. Intrinsic layout. Web fonts. Big type and super lightweight UX emphasizing readability was new (and controversial!) in 2012. We’ve long since accepted and improved upon it. Today’s news, magazine, and blog pages are more flexible, readable, and refined than ever before.

So what comes next? For writers, one hopes that what’s next is a fresh crop of small, innovative advancements. Improvements that are felt by readers, even when they aren’t always consciously noticed. Layouts that are not merely legible, but actually feel inevitable, at all sizes and in all contexts.

Beyond outside the box

Services like Typetura may point the way. A marriage of type and tech, Typetura is different from other typesetting methods. An intrinsic typography technology, it “enables you to design with more flexibility, while dramatically reducing code.” Disclaimer: I’m friends with, and have long admired the work of, Typetura founder Scott Kellum. Designing With Web Standards readers will recognize his name from the Kellum Image Replacement days of the early 2000s, but that ain’t the half of what he has done for web design, e.g. inventing dynamic typographic systems, high-impact ad formats, new parallax techniques, and fluid typesetting technology. Scott was also the coder, along with Filipe Fortes, of Roger Black’s late, lamented Treesaver technology. But I digress.

The tech is not the point, except insofar as it improves our ability to think beyond our current understanding of what design and layout mean. Just as Gutenberg’s printing press was not the point, but it was the point of departure. Initially, the invention of movable type reproduced the writing we already knew (i.e., the Bible). But ultimately, by freeing writing and reading from narrow elite circles and bringing it to more (and more diverse) minds, Gutenberg’s invention transformed what writing was and could be—from the invention of newspapers to the fiction of Virginia Woolf to multimedia experiences, and perhaps even to the web.

Let us all play with Jen Simmons’s intrinsic web layout ideas and Scott Kellum and partners’ Typetura. While we also sketch in pencil and spend time looking at well-designed books—printed, bound ones as well as digital publications on various devices. And specifically, not just fabulous coffee table books, but books that you’ve reread over and over, to understand what, beyond the text itself, encourages that reading response. So that, together, we may take the experiences of both reading and writing to the next level.
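And if you want something to sketch with while you study those books, here’s a tiny taste of the intrinsic-layout thinking Jen Simmons advocates (my illustration, not hers):

```css
/* Intrinsic layout sketch: columns size themselves to content and available space,
   with no media queries (values illustrative). */
.essays {
  display: grid;
  grid-template-columns: repeat(auto-fit, minmax(min(30ch, 100%), 1fr));
  gap: 2rem;
}
```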

Appendix: Resources

If you’re new to the interplay between design and code on the open web, or if you just want a refresher, here are some evergreen links for your further learning and pleasure:



He Built This City: The Return of Glenn Davis

You may not know his name, but he played a huge part in creating the web you take for granted today. 

As the first person to realize, way back in 1994, that the emerging web could be a playground, he created Cool Site of the Day as a single-focused blog dedicated to surfacing interesting sites, thereby demonstrating the web’s potential while creating its first viral content. (As an example, traffic from his followers, or, as we called them back then, readers, brought NASA’s web server to its knees.)

He co-founded The Web Standards Project, which succeeded in bringing standards to our browsers at a time when browser makers saw the web as a software market to be dominated, and not a precious commons to be nurtured.

He anticipated responsive web design by more than 20 years with his formulation of Liquid, Ice, and Jello as the three possible ways a designer could negotiate the need for meaningful layout vis-a-vis the unknowns of the user’s browsing environment.

He taught the web DHTML through his educational Project Cool Site. 

And then, like a handful of other vital contributors to the early web (e.g. Todd Fahrner and Dean Allen), he vanished from the scene he’d played so large a role in creating.

He’s ba-ack

Glenn Davis wasn’t always missed. Like many other creators of culture, he is autistic and can be abrasive and socially unclueful without realizing it. Before he was diagnosed, some people said Glenn was an a**hole—and some no doubt still will say that. I think of him as too big for any room that would have him. And I’m talking about him here because he is talking about himself (and the history of the early web) on his new website, Verevolf.

If you go there, start with the introduction, and, if it speaks to you, read his stories and consider sharing your own. That’s how we did it in the early days, and it’s still a fine way to do it—maybe even the best way.

I knew Glenn, I worked with him and a lot of other talented people on The Web Standards Project (you’re welcome!), and it’s my opinion that—if you’re interested in how the web got to be the web, or if you were around at the time and are curious about a fellow survivor—you might enjoy yourself.


Enabling Folks to Express Themselves on the Web: State of the Word 2021

Screenshot of slide highlighting the four phases of WordPress Gutenberg.

Not only are we enabling folks to express themselves uniquely on the web, unlike the cookie cutter looks that all the social sites try to put you into. We’re doing it in a way which is standards-based, interoperable, based on open source, and increases the amount of freedom on the web.

—Matt Mullenweg, State of the Word 

In a live address, Automattic’s Matt Mullenweg

  • Introduces Openverse (an opt-in content commons);
  • Announces that WordPress’s beginner-friendly Learn.Wordpress.org is now available in 21 languages;
  • Philosophizes about Web3 and the “decentralized web”—which, despite big company colonization attempts, is really what the web has always been;
  • Extols the virtues of Open Source;
  • And more. 

Watch the 2021 #StateoftheWord annual keynote address on YouTube. It’s two hours long, so bring popcorn.

Selected Additional Reactions & Commentary

Hat tips to Chenda Ngak, Reyes Martínez, and Josepha Haden.