Aaron Marcus, Berkeley's Bard of User-Centered Design, Battles "High-Order Crap"

Aaron Marcus, watching from his perch on Euclid Street in the Berkeley hills above San Francisco Bay, has seen the business world’s infatuation with design rise and recede, rise and recede.

Ten or 20 years ago, if you’d traveled to San Francisco or Silicon Valley in search of help designing a consumer product or a software interface, you’d have been directed to one of two marquee firms: Ideo or Frog Design.

Both companies offered industrial design and user-interface design services; both had histories that intertwined with Apple Computer and the revolution in personal computing wrought by Steve Jobs back in the 1980s; both employed a panoply of hypertalented artists and creative types; and both had showcases full of the famous products they’d helped to create, from the Apple IIc computer (Frog) to one of the first smartphones, the Handspring Treo (Ideo).

Ideo and Frog are still around today, each employing hundreds of people at their offices in the Bay Area and around the world. But these days, Marcus notes, they have a host of smaller competitors, like Carbon Design Group, Essential, Lunar Design, Smart Design, and Whipsaw. On top of that, there’s a new vogue at many Web and mobile companies for shipping “minimum viable products” that are hardly designed at all. In response, Ideo and Frog have scaled back their design practices and moved up-market into organizational consulting and “innovation consulting,” competing with larger firms like McKinsey and Deloitte.

Marcus, meanwhile, keeps doing what he’s been doing for the last 32 years in his consultancy, Aaron Marcus and Associates: the hard work of helping clients visualize and manage information effectively, using an approach that, in his words, “combines reason and emotion, with design as a middle ground between art and science.”

Like bits flowing through a network, Marcus is a little hard to pin down. He’s part physicist, part graphic designer, part ethnographer, part programmer. You could label him a user experience designer, an interaction designer, an information designer, or a proponent of user-centered design, and all of those would be right. But whatever you call it, Marcus has been doing it, and making money at it, longer than just about anyone else on the planet.

“I am the first graphic designer ever to use computers, that I know of,” Marcus says. His fascination with code and interfaces dates back to 1967, when he was a summer intern at Bell Labs in New Jersey and worked on computer-generated art, alongside the programmers who would soon invent Unix. Fifteen years later, using a big grant from the Defense Advanced Research Projects Agency, he started his firm and developed one of the first systems for making computer code more readable; thanks to Marcus, software developers’ comprehension of their own once-cryptic scribbles went way up.

That success set the stage for some 500 subsequent jobs for more than 300 clients, across government, industry, education, and consumer markets. The firm contributed to the design of the first version of AOL and the first versions of Travelocity and Orbitz. It helped BMW improve iDrive, the first successful in-car infotainment system. These days, Marcus travels extensively in Asia, and he’s become known for his studies of localization—that is, finding ways to adapt designs so that they remain effective even when transported across cultural lines.

Through it all, Marcus, now 70, has remained the firm’s only full-time employee.

“Ideo has grown to be, what, 500 people? We are 10 to 15 people,” counting interns, Marcus says. “So, in a sense, we are a mosquito compared to them.” But while the big firms are good at marketing themselves through slogans and frameworks, Marcus thinks smaller firms like his are probably better at handling complex design problems. And behind the friendly (or not-so-friendly) rivalry, what preoccupies him is the conviction that there’s a common set of principles underlying good product design, good interaction design, and effective visual communication. Today’s high-tech entrepreneurs all too often overlook these principles, only to be forced into rediscovering them, he argues.

Even the biggest design-related technology story of the past year, the introduction of Apple’s stripped-down iOS 7 mobile operating system, was really about the company confronting the fact that its designers had built a lot of “high-order crap” and deciding to get back to basics, Marcus says.

At the humblest startups and the world’s biggest technology companies, what’s old is new again. There’s a realization that interfaces need to be simple and consistent. In an era of touchscreen tablets and smartphones, when the form of a device says nothing about its function, designers must carefully shepherd users through interactions, providing them with obvious navigation cues and mental models that are internally consistent. (Tasks, by the way, that probably can’t be crowdsourced to users themselves, the way many “agile” startups try to do.)

“There is now a fad of emphasizing design thinking, and I find that a little amusing, because for us professional designers with some history, that’s what we learned in our first semester” of art school, Marcus says. “Now it’s being peddled as the secret sauce for the creation of successful products and services. But some of these techniques are no more or less than what user-centered design has been promoting for decades.”

A Jumble of Icons

Interaction design wasn’t even a recognized field until the mid-1980s. Many of its precepts are borrowed from the much older field of graphic design, which is where Marcus got his start. Problems crop up, he argues, when software builders overlook this legacy. In their rush to build systems of buttons, menus, and gestures for navigating graphical user interfaces, they have forgotten how people actually learn and comprehend.

A pair of examples underscore Marcus’s point. One is about Computervision, a Massachusetts company that, back in the 1980s, was an early pioneer in computer-aided design and computer-aided manufacturing, or CAD-CAM (Parametric acquired it in 1998). It was the golden era of icons—if you are old enough, you probably remember your delight the first time you dragged a file into the Apple Macintosh’s trash can and heard the little crunch sound. The issue was that early builders of CAD software thought they could create order simply by reducing every function to a pictograph. “You had all these engineers creating sign systems with no background at all in visual communications,” Marcus says. “I encountered one of these [Computervision] guys who said with great excitement, ‘We are up to 15,000 icons now!’”

There’s nothing wrong with order, Marcus emphasizes, or with the impulse to impose it through visual metaphors. The trick is to make sure there’s a system behind these signs. For a more modern example of how the process can go wrong, “You have only to open a Mac or Windows operating system, with their jumble of icons,” Marcus says. “It’s the result of market forces that have allowed every individual provider of a product to design their own icon, so that we stumble when we try to find them. It’s as if every letter of the alphabet was sponsored by a different company.”

What’s puzzling, Marcus says, is that many designers of digital interactions today don’t seem to have absorbed lessons learned long ago by their colleagues in the larger world of information design and graphic design.

“If you look at the international standards for mass transportation design, there are some very good systems that have been created to help guide people through a complex airport or subway system,” he says. “Having learned a little bit, you are able to understand the rest, because it’s a rational system approach.”

Humankind Inventing Its Future Self

Marcus, a native of Omaha, NE, began to develop his own design approach in the 1960s. As a physics undergraduate at Princeton, he wanted to study quantum mechanics and gravitational theory, but kept getting steered into more practical areas like laser research. After college he decided to break with science and apply to art schools; he got into the graphic design department at Yale’s School of Art and Architecture.

There, he says, “My brain snapped. I didn’t know what anyone was talking about when they said things like ‘That works!’ or ‘Let the progression of color be more systematic.’”

It took Marcus about six months to learn the language. “I began to understand that it was as if I were still in a physics lab, doing experiments and relating those to first principles and systems of thought and noting down paradigms,” he says. He also learned Fortran and, as a summer researcher at AT&T’s Bell Laboratories in 1967, began experimenting with ASCII art and other forms of computer graphics.

That summer brought Marcus his first experiences with a big computer—a GE 635 mainframe. “I would walk into the room with the raised floor, the eternal quiet hum of the CPU fans, and realize I could see nothing around me but the computer,” he says. “I knew this was humankind inventing its own future self.”

Armed with a new respect for computers, Marcus returned to Princeton, where he taught visual communication, information design, and computer graphics in the School of Architecture and Urban Planning from 1968 to 1977. He also took up consulting work. Between 1969 and 1971, he worked on a prototype page-layout system that would allow AT&T to display the Yellow Pages on its Picturephone. “I was already doing user-centered design, because I was studying myself as a graphic designer, figuring out how I worked and what people like me would need,” he says.

In 1979 Marcus moved west to become a lecturer in the College of Environmental Design at the University of California, Berkeley, and then a staff scientist at Lawrence Berkeley Laboratory. He founded Aaron Marcus and Associates in 1982 and soon won a three-year DARPA grant to come up with ways to improve the usability of the C programming language, first developed at Bell Labs. Simply by reorganizing the code and its presentation on paper and on screen, he was able to help C programmers increase their comprehension by 20 percent, as measured by an independent human-factors testing group.
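
Marcus’s exact conventions aren’t described here, but as a loose illustration of the kind of change involved (my example, not his), consider the same C function written as a dense one-liner and then laid out so that its visual structure mirrors its logic:

    /* Dense but legal C: the compiler doesn't care, the reader does. */
    int f(int*a,int n){int i,s=0;for(i=0;i<n;i++)s+=a[i];return s;}

    /* The same logic, renamed, spaced, and commented for human readers. */
    /* sum_array: add up the first n elements of a[]. */
    int sum_array(const int a[], int n)
    {
        int total = 0;

        for (int i = 0; i < n; i++) {
            total += a[i];
        }

        return total;
    }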

Still, before Apple, Microsoft, and other companies ushered in an era of truly personal computing, Marcus’s early work was reaching only the specialized few who had access to expensive machines. “As a visual designer, I had to spend $100,000 just to draw a line,” he says. It’s no joke: the special 300-dot-per-inch black-and-white vertical display and associated hardware and software that Marcus needed for the DARPA project cost $76,000, and the matching, refrigerator-sized laser printer cost $24,000. The Mac and the LaserWriter made the equipment obsolete within a couple of years. As Marcus puts it, “When Apple came along in 1985 and we tried to get rid of the equipment, I couldn’t even give it away.”

Slowing Down to Think

It was a key moment: As computers started to show up on every desk, companies realized they needed to find better ways to communicate through software. Marcus had already been pondering the problem for a while, and was ready to help explain the interaction-design process to potential clients in the business and consumer worlds. To understand what Marcus means by user-centered design, it’s worth stepping through his approach.

“The standard levels with which technology deals are data, information, knowledge, and wisdom,” he says. “Data are organizations of significant perceptions—say, all the temperatures in the United States. Information is the organization of significant data.” Those same temperatures displayed on a map, for example.

“Knowledge is the organization of significant information together with action plans. I can have all sorts of weather reports, but do I need an umbrella or not?” he says. “Wisdom, the highest form, is significant patterns of knowledge, combined with either internalized knowledge or real-world experience.” There’s more rain than usual in Bangladesh, say, and less than usual in Nevada—maybe the global climate is changing.
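
To make that ladder a bit more concrete, here is a tiny illustrative C program (the readings and the umbrella rule are invented, not drawn from Marcus’s work) in which the raw numbers are the data, the computed total is the information, and the decision rule is the knowledge; wisdom, the pattern across many such days and seasons, stays outside the program.

    #include <stdio.h>

    /* Data: raw perceptions -- invented hourly rainfall readings, in millimeters. */
    static const double readings_mm[] = { 0.0, 0.2, 1.5, 3.1, 0.8, 0.0 };
    static const int n_readings = sizeof readings_mm / sizeof readings_mm[0];

    /* Information: the data organized into something significant -- a daily total. */
    static double daily_total(void)
    {
        double total = 0.0;
        for (int i = 0; i < n_readings; i++)
            total += readings_mm[i];
        return total;
    }

    /* Knowledge: information joined to an action plan -- umbrella or no umbrella? */
    int main(void)
    {
        double total = daily_total();
        printf("Rainfall so far today: %.1f mm\n", total);
        printf(total > 2.0 ? "Take an umbrella.\n" : "Leave the umbrella at home.\n");
        return 0;
    }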

When it comes time to build something real, Marcus says, designers have to start by figuring out which stakeholders will care about the data, the information, the knowledge, or the wisdom. They could be engineers, marketers, business managers, or regular old office workers or consumers. In a true user-centered design project, each step is built around understanding those stakeholders and their needs.

In Marcus’s view of good UX development, there are at least nine steps:

  • Plan
  • Research
  • Analyze
  • Design
  • Evaluate
  • Implement
  • Document
  • Train
  • Market

The content of these steps—the actual things designers are researching, analyzing, and designing—breaks down into five components, more or less (a rough sketch of how a team might record them follows the list):

  • Metaphors: basic ideas communicated through images, words, sounds, textures, etc.
  • Mental models: an organization of functions and data, or content and actions, or users and tasks.
  • Navigation: how users move through the mental models.
  • Interactions: all of the input and output behaviors of the system, including text, speech, and graphics.
  • Appearance: color, typography, language, gestures, and the like.
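
As a crude illustration of how a team might keep those five components in front of them during a review (a hypothetical sketch, not a tool Marcus prescribes), each screen or feature could be forced to fill in all five fields before it ships:

    #include <stdio.h>

    /* A hypothetical design-review checklist built on the five components above. */
    struct ux_spec {
        const char *metaphor;     /* the basic idea conveyed through images, words, sounds */
        const char *mental_model; /* how functions and data are organized for the user */
        const char *navigation;   /* how the user moves through that mental model */
        const char *interaction;  /* input and output behaviors: text, speech, graphics */
        const char *appearance;   /* color, typography, language, gestures */
    };

    int main(void)
    {
        /* An invented entry for an imaginary weather screen. */
        struct ux_spec weather = {
            .metaphor     = "a window onto today's sky",
            .mental_model = "one location, one day, one forecast",
            .navigation   = "swipe left or right to change the day",
            .interaction  = "tap any reading to see the underlying data",
            .appearance   = "large type, muted palette, no ornament",
        };
        printf("Metaphor under review: %s\n", weather.metaphor);
        return 0;
    }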

If you hire Marcus to help design a new software product, that’s the way of thinking he’ll bring to bear. Of course, every design firm has its own toolbox of models. Marcus’s own marketing pitch boils down to this: more brains, less branding, more personal attention. “I believe our level of education and intensity and business focus is much deeper” than what Ideo and Frog offer, he says.

And if you hire him, know this: he doesn’t believe in shortcuts. There’s a whole school of product development these days that is basically anti-design (at least by Marcus’s definition), and unsurprisingly, he’s a skeptic. I’m talking about companies built around “agile” software engineering and the lean startup methodology, which calls for rapid cycles of iteration: developers build, ship, gather customer feedback, rebuild, ship again, and so forth. In that environment, Marcus says, there’s little time for planning, research, or analysis.

“A true UX designer would say we need to study the users, and sometimes we want to study them for one to six weeks,” Marcus says. But in an agile shop, “The other team members are saying, ‘What are you talking about? We’re in two-week sprints and by the time you deliver something we are three cycles beyond that.’”

Granted, many companies devoted to agile development and lean-startup principles would say they iterate precisely in order to gather and analyze feedback from users, the better to refine their ideas. But in Marcus’s view, you can’t sidestep the work of thinking about the metaphors, models, and interactions embodied in your software; otherwise, you’re just throwing features at the wall to see what sticks.

“It’s outsourcing the innovation process to everybody, so in a sense, everybody suffers along as you incrementally improve,” he says. “It’s certainly a cheap model for startups. They don’t have to pay for a lot of testing. But it does mean that a lot of crap is produced that we all have to live with.”

Good user-centered design, in other words, is more like a marathon than a sprint. Design thinking is hard, Marcus says, and only a handful of companies have a long heritage of doing it well; he points to names like Herman Miller and Nike. “Even with Apple, we live with crap. It’s just very high-order crap,” he says.

A Brontosaurus of Horrendously Inappropriate Form

Apple’s recent housecleaning with iOS 7 pointed to an interesting tension in the design world, between two camps that Marcus calls the “Apollonians” and the “Dionysians.” The Apollonian philosophy is restricted, limited, and controlled, he says. The Dionysians are expressive, exuberant, complex, zany.

In industrial and graphic design, the Apollonians are probably best represented by the Swiss-German school—people like Dieter Rams, the designer behind Braun, and Hartmut Esslinger, the founder of Frog and the originator of the spare, beige-box “Snow White” design that Apple applied to most of its products in the 1980s. The Dionysians have people like Saul Bass, the iconic graphic designer, and Milton Glaser, creator of the sentimental “I Heart New York” logo and the famous rainbow-haired Dylan poster packaged with Bob Dylan’s Greatest Hits album in 1967, as their heroes.

Usually, you want people from both camps around. And Marcus confesses to having a zany side: In one recent side project, he designed a fake currency called Facebucks, with Mark Zuckerberg on the $100 bill. But a mobile touchscreen device can instantly morph into a phone, a camera, a compass, or a game board, which puts an enormous burden of clarity on software designers. Under those conditions, zaniness can get old pretty quickly, Marcus says. “It’s one thing to do funny, wild, and crazy things, and another to create large systems of signs that people need to learn and use.”

He cites Google’s ever-changing home page logo as one (relatively benign) example of excessively Dionysian design. He thinks the skeuomorphism that accumulated in Apple’s OS X and iOS, prior to the overhaul initiated with iOS 7, was another. The imitation dials, buttons, shadows, and textures that used to turn up throughout iOS may have helped to make early users of touchscreen interfaces feel comfortable. But as the wood veneer, green felt, and torn paper piled up over time, they came to feel unnecessary, even inconsistent. And they never had very much to do with the actual content being conveyed.

Apple’s old Address Book app, which looked like a real leatherbound address book, was one of the worst offenders, in Marcus’s eyes. He describes it as “a brontosaurus of horrendously inappropriate form that someone thought was going to be cute.” What made the app especially irritating, in his designer’s eyes, was that it used mental models and interactions that were inconsistent with those in other Apple programs like iTunes or Mail, even for simple actions like pasting text or moving between pages.

“At a certain point, people say ‘Wow, that’s cute, but how do I use this?’” Marcus says. “If I interpret this as meaning X, Y, or Z, are you going to pull out the rug and have it mean something else when I go to another screen? That’s a recipe for driving people insane.”

In 2012 and 2013, as it happened, both Apple and Microsoft went through Dionysian-to-Apollonian conversions, adopting slimmer typefaces, flatter designs, and simpler interactions for Windows 8 and iOS 7. Some hailed the changes as radical and shocking, but in fact they represented a resurrection of good old Swiss-German graphic design aesthetics from the 1950s and 1960s. Marcus says it was a smart way to bring some sanity back to the jumble.

A Universe with Many Centers

Like all big transitions, the latest one could come with its own minor backlash. Some users have had trouble adapting to the minimal new designs, and neither Microsoft nor Apple has gone out of its way to offer education or assistance (they’ve been lax about the “document” and “train” parts of Marcus’s UX development process). Also, not every culture adheres to the same set of meanings. Minimalism might imply luxury in some Western cultures, but in others it implies the opposite.

“There are studies that show that for consumers in India, if a phone interface doesn’t look rich and crowded enough, they will think it’s inferior,” Marcus says. “It should be bursting with icons, and by the way, a little gold wouldn’t hurt.” From an Indian point of view, “Swiss-German emptiness is the look of inferior design.”

Understanding, predicting, and sidestepping those kinds of cross-cultural friction points is one of Marcus’s current specialties. Marcus says he’s traveled to 37 countries, and one of his main observations is that “so many groups think they are the center of the universe.” It’s difficult enough to come up with designs that make sense to both men and women, or to both young people and old people. Creating a single interface that will work across all cultures is virtually impossible, he says.

Microsoft did it with Windows XP, which is still running on 500 million PCs around the world and is, by far, history’s most popular operating system. But most companies are probably better off developing processes to localize and customize their interfaces for each group of users, Marcus says.

Sometimes companies call on his firm to do what he calls “cultural audits” of their software prior to translation. One company wanted to sell its English-language library management program in Saudi Arabia. Marcus and his interns interviewed Saudis and tried to identify terms, images, and concepts that might throw them, he says. “You might have a very good translation of something which should never have been translated in the first place,” he explains. “They had a calendaring function that featured Jewish holidays, for example. That probably will not be necessary in Saudi Arabia. But there were other things too, like magic wands. To Saudis, that can look like wizardry and devil’s work.”

Lately, Marcus has been spending a lot of time flying back and forth to China, where he offers user-experience design training to companies eager to compete internationally. In 2012 he was named a Master at the De Tao Academy in Shanghai, which seeks to bring industrial expertise from around the world to China, and he’s an advisor to the Beijing-based Dragon Design Foundation, which runs design conferences in China and Europe.

Five years ago, Marcus notes, “no one had heard of Huawei,” the Shenzhen-based telecom equipment giant. Now it sells more smartphones than anyone except Samsung and Apple. “I can tell you, there are all kinds of companies like that. They are trying to create ‘Designed in China’ as a much stronger brand. I feel like I’ve been offered the chance to recreate AM+A in China.”

And as a free service to designers here in the U.S., his firm has published details about a series of “machines”—concept designs for applications intended to support various kinds of behavior change, borrowing ideas about persuasive technology from Stanford researcher BJ Fogg. The first, back in 2009, was the Green Machine, which showed how mobile-app builders might in theory help consumers use less energy by displaying data from smart-grid utility meters. It was followed by machines focused on health, money management, family storytelling, travel, innovation, driving, education, and happiness.

Today, plenty of commercial apps incorporate similar ideas. But helping to get these ideas into circulation was Marcus’s whole goal. “There are many startups creating stuff, but a lot of them create stuff which is not well researched,” he says.

The irony in trends like agile development is that today’s technology platforms actually call for more careful design thinking than ever.

“In the old days, chairs were one thing and books were another. They were rather simple and you knew how to operate them,” Marcus observes. “Now you can have chairs that tell stories and coffee cups that connect to the Internet. So computer-based products are challenging our expectations.”

But that doesn’t mean software developers can afford to throw out everything that’s been learned about the need for clarity and consistency. In fact, if they bothered to consult with veterans like Marcus, they might just avoid repeating some old mistakes.

“With all due respect to the brilliant young people creating all sorts of things, they often don’t have aged, wizened mentors who can provide a perspective—‘Oh yes, I see what you’re doing, it reminds me of what happened 20 years ago and it was a success or a failure for this reason,’” he says.

Marcus is hardly wizened; that’s a bit of playful self-caricature. And judging from his jetsetting lifestyle, 70 really is the new 50. But he does take the long view: all the way back to the beginning of computers in design, in fact. And that’s a perspective that’s completely missing at many of the Silicon Valley startups I talk with. Operating systems change. The way humans operate doesn’t.

Here’s a 2012 video of a Marcus lecture on mobile persuasion design and the firm’s “machines” project.

Wade Roush is the producer and host of the podcast Soonish and a contributing editor at Xconomy. Follow @soonishpodcast

2 responses to “Aaron Marcus, Bard of User-Centered Design, Battles ‘High-Order Crap’”

  1. I got such a kick out of this comment: “There is now a fad of emphasizing design thinking, and I find that a little amusing, because for us professional designers with some history, that’s what we learned in our first semester” of art school.

    As an industrial designer (and graphic designer too) I have really struggled to understand the exuberance over “design thinking.” I’ve even gone to lectures about it to figure out if I am missing something…and then I finally realized for a designer it is so basic it is like telling someone they need to breathe.

    I also loved the way Marcus vomits on another trendy sacred cow: agile development and MVPs. I worry about a developer or designer who thinks exclusively in the equivalent of 30-second commercials instead of short films and feature-length movies. Projects come in all sizes.

    • Wade Roush says:

      Thanks so much Jules. I wanted to put the spotlight on Aaron because I thought it might serve as a bit of an antidote to overhyped trends like agile development, lean startup thinking, and “design thinking” as a buzzword. I suppose everyone who reaches middle age looks at the youngsters and says “Oh, we were doing that 20 or 30 years ago,” but Silicon Valley’s memory seems especially short. Sometimes “these kids” talk as if they really do think they’re inventing the wheel.