The Future of Design History

June 30, 2015


The future is so now. Everywhere you turn, there’s some book, blog post, or conference about “THE FUTURE OF something-or-other” these days. The faster tech progress moves, the more impatient people become for the next new thing. There’s certainly nothing wrong with envisioning the future and imagining possibilities for what could be. Meaningful progress depends on it. But there is something deeply wrong when we forget about everything that got us here or, worse yet, have no clue it ever even existed.

Recognizing the value of design history and truly learning from it is a loaded topic. It would take more than a lowly blog post to address topics such as the gaps in design history, the underuse of primary source material in education, and the false idolization of influential figures in design history. What interests me right now is the perception and role of design history in an internet- and social-media-driven world. Two interlinked forces are at work shaping design history — for better and worse:

  1. The design explosion: As the popularity of design increases (never mind which design we’re talking about) and the demand for designers and design skills grows (never mind specifics, again), what design is evolves in response. More diverse voices outside of design are shaping and “curating” conversations about design, new players are redefining the landscapes of design education and practice, and new applications of design are gaining recognition alongside more traditional products and services. However, with so much activity around design, it becomes more challenging to make sense of it all and to find information about design history that is accurate and useful.
  2. The digital revolution: There’s no denying the speed and scale of change still happening from the analog-to-digital transition. Design continues to explode because more people know about it, interact with it, and practice it thanks to the unprecedented reach of the Web, the ways we can access it, and the number of places we can share and find information. The whole of design may seem no more than a Google search away on whatever device we want, but much design history still hasn’t migrated from boxes and shelves to digital files on servers and in databases. And what has migrated may be unreliable or incomplete.

Both forces combined have certainly elevated, broadened, and enriched design over the past twenty years or so, but they have also given rise to a growing trend toward “new” and “now.” In the Internet Age, design history is neither of those things. Instead, it is associated with words like “old,” “outdated,” “archaic,” among other unfavorable descriptors. It’s about people from, like, a hundred years ago doing lots of long, hard manual work without computers. And let’s not forget the stuffy old instructors who deliver bone-dry design history lectures and assign dreadfully long readings from super-heavy textbooks. For these reasons, it must be repackaged in the context of quick-fix social media and grabby blog posts and linkbait to become “new” and “now.”

One of the biggest problems hurting design history today is the use of historical design imagery, particularly the visual displays of information commonly labelled “infographics” culled from online library, museum, and archive collections, without correct citation: the citation is missing, incomplete, or points to a source that lacks the actual reference. Countless Tumblrs, Pinterest boards, Facebook pages, tweets, and other forms of social sharing feature these images with little more than a trite “cool historic infoviz” or “[INFOGRAPHIC] Awesome Victorian data display” to accompany the posts. It may seem harmless, but the compounded reposting creates ever more distance between the original source and the place an image was found (reverse image search is occasionally helpful, but no substitute for correct citation).

The widespread use of historical design imagery as visual fodder for personal mood boards and attention-seeking reduces the value of design history to clip art that can be freely redistributed across the web without regard for provenance or historical context. Works that are out of copyright or in the public domain aren’t necessarily fair game just because there are no legal repercussions (as there are with music videos on YouTube). They are still someone’s creation, and it’s worth stating who made them, when, where, and why, not just in the interest of good scholarship but in the interest of helping others learn more about that “cool” discovery.

An event I attended recently at the New York Public Library, Peripheral Landscapes: The Art of Maps, perfectly encapsulated the problem of design history stripped of meaning. The event featured three digital collages constructed entirely from cut-outs of “decorative and non-informational elements that reside along the edges of maps” that are part of the library’s Lionel Pincus and Princess Firyal Map Division. The project was part of the library’s Net Artist in Residence program, intended to make the most of its digitized historical collections and infuse them with a little more “new” and “now.” For most of the talk, the artist, Jenny Odell, spoke about her interests in online imagery, particularly Google Maps and Google Street View, and to be fair, most of the work she showed was conceptually interesting. But when asked during Q&A if she documented the image sources she used, her response was a plain and simple “no.” Granted, much art alludes to its influences without stating them directly. In this case, though, a library-sponsored art project drawing on the library’s own historical map collection to boost public awareness and use, attribution was surprisingly neither a requirement nor an obligation the artist felt, if only for the sake of helping viewers identify where each “non-informational element” came from and encouraging them to dig deeper into the map collection.

Making the effort to preserve links to the past across the analog-digital divide is critical to design history’s survival. In the absence of web citation police or a definitive, comprehensive online source of design history across all design fields, the responsibility belongs to everyone to correctly cite historical design imagery they post with the original source and not some derivative source or a linkrot-vulnerable URL. Even if it means doing a bit more homework beyond a Google search.

Living for the (Future) City

May 31, 2015


New York is a constantly changing, evolving city. Shops come and go, buildings rise and fall, fashion trends run their course. For years, the city has celebrated its own ability to reinvent itself and yet stay true to its core, the “real” New York — the Frank Sinatra New York and the Jay-Z and Alicia Keys New York. The city is a place of desire and drive, nowhere better captured than in the towering ad-screens and street-level spectacles of Times Square. The same appetites have always come to be fed in NYC — status, beauty, power, importance. It’s just the packaging that has kept changing to suit the tastes of newer generations.

But as years have passed, the values of the city have gradually shifted. There’s more to New York now than just sustaining the wealth that put it on the map in the first place or claiming one’s own piece of the pie. There’s an awareness of something bigger than individual pursuits, deeper than the instant gratifications that city conveniences afford, and longer term than one’s apartment lease. The future of the city is on people’s minds — a cleaner, safer, more efficient, and more inclusive city. A city where people can actually thrive and build communities while preserving their environment and strengthening infrastructure.

Such was the spirit of the inaugural IDEAS CITY festival in 2011 (which then went by the much longer name “Festival of Ideas for the New City”). Back then, optimism was low as many were still struggling through the economic recession. The event provided a much-needed boost of positivity and hope, if only as a show of stubborn resilience against a bleak future, a proud reassertion that NYC arts and culture were here to stay. It certainly got me excited, so much so that I wrote about my experience.

Fast forward to 2015, just four years into the future. I’m back at the same event with a shorter name, ready for more good stuff. This time was different though. The bright picture of the future was still there, but what was brought into sharper focus was the stark reality of the present (for more than just NYC dwellers). As stated on the event homepage:

The theme of this year’s IDEAS CITY Festival is The Invisible City, an homage to Italo Calvino’s literary masterpiece of 1972. This theme is rooted in civic action, with each of the Festival’s platforms serving as an invitation to explore questions of transparency and surveillance, citizenship and representation, expression and suppression, participation and dissent, and the enduring quest for visibility in the city.

The full-day conference on Thursday, May 28, zeroed in on each of these topics, as speakers shared their perspectives on how they tackled problems within each or proposed solutions. By far, the most galvanizing speaker was Lawrence Lessig, whose dynamic opening keynote on inequality, networks, and democracy shone a spotlight on the brokenness of American democracy and Internet regulation (find a way to watch it here).

Also eye-opening was the panel discussion Full Disclosure and the Morality of Information. As the panelists openly pointed out, government surveillance of American citizens and personal data collection through various social networks happen every day and have serious consequences, but little is being done, or can be done, to put a complete stop to them. Such unseen activities can be made visible through art projects and awareness campaigns, as the panelists explained, but the real change that is needed must come from individuals taking action to protect themselves.

The Saturday street fair seemed a bit more chaotic and haphazard than the 2011 version, and maybe a bit emptier than I recall, but that didn’t take much away from the enjoyment. In one booth, a pianist performed his musical interpretation of whatever someone drew or wrote on a piece of paper, no matter how abstract it was. In another booth, the NYC Department of Design and Construction presented the city’s plans for inclusive design for the visually impaired, which included a prototype for a special concrete ramp with raised bumps and ridges plus braille on the railing to indicate a construction site.

What struck me most about the event was the actual absence of the “invisible” people of the city who were mentioned by some of the Thursday panelists: new immigrants, disabled individuals, and others who tend to go unnoticed or flat-out ignored by everyday people and public services alike. I think it would have been fitting (necessary!) to not just talk about certain hidden populations receiving help from some group’s efforts but to invite them to participate in the conversation about what their present is like and what they want their future to look like.

As a series, IDEAS CITY has come a long way and is helping to redefine what cities can do to improve the well-being of their citizens in the present and in the years to come. Rather than just showcase inspiring and imaginative art, architecture, and technology concepts aimed at a privileged few, IDEAS CITY 2015 dug into real issues that need much more than stylistic innovation to solve.

(The title is a reference to Stevie Wonder’s Living for the City)

Embrace the Wait

March 31, 2015


Have you ever stopped to think about waiting? We spend so much time just waiting for some things to happen, other things to pass. Consider the moments of waiting that happen over the course of a typical day: on line at a register, in traffic, in a doctor’s office, on a plane, etc. The very concept of waiting — a momentary suspension of activity or delayed action until a specific event happens — seems to suggest that we have to be in some mode of action at all times during our waking hours. Being idle has come to be a bad thing, and waiting itself has a generally negative connotation because the thing you want is not in the present but locked away in some defined or undefined point in the future. Time needs to pass before that desired state can come to be. And oh how we pass our time!

To me, commuting by mass transit is the perfect encapsulation of waiting, next to air travel. Commuting can be considered a form of routine waiting. It’s finite, scheduled, expected, and for that reason, we find ways to pour our lives and lifestyles into that span of time. Rituals, habits, indulgences all make their way into our commutes. Over the years as a regular commuter to and from New York City, I’ve observed a growing but unsurprising trend: devices have come to dominate the commuting experience. I regularly scan my subway car to see how many people are using a smartphone, tablet, e-reader, or other gadget. Most of the time, the majority of passengers are immersed in some tech-enabled activity: reading, listening to music, playing video games, watching movies. Signal permitting, people are chatting away, texting, snapping selfies, scrolling through streams of social media updates, or typing in updates of their own. It’s bad enough that we’re capable of doing all of those things in a shared public space like a subway car or bus, and that these have become the default options for when there’s nothing else to do. The real problem is twofold. First, we haven’t yet learned how to distinguish what we should do from what we shouldn’t do in those situations. Second, and more the focus of this post, we don’t really know how to deal with moments of pause in an activity- and device-free way — to simply embrace the wait.

Reframing the meaning of those quiet, interstitial spaces in our lives requires an understanding of what we’re doing with the busy, booked-up blocks of time on our daily agenda and why. Most of us measure our personal worth or success by our productivity and accomplishment. We feel good about ourselves when we get more stuff done, so we work very hard to do more and more. In school and at work, we often get rewarded not just for scoring higher on a performance scale (like an A+ or a 100%) but for occupying our time with as many extracurricular activities as possible. We try our hardest to exclaim to the world “I am NOT idle! I am highly driven and motivated to succeed!” but what we achieve in the end is little more than exhaustion.

Filling empty space to capacity is a common habit, whether it’s our calendars, our closets, or our stomachs. We do no different to our brains. I would argue that the concept of information overload has less to do with us being bombarded by information from lots of external sources than with us deliberately saturating our own attention with more information than we need. If we’re drinking from the firehose of information, as they say, we’re also the ones holding the hose to our own mouths and controlling the valve. Devices that connect us to the plethora of information in the world and in our lives are not necessarily at fault. They just make it far too easy to indulge in our existing impulses, especially when we think there’s nothing better we should be doing than funneling our attention into a small glowing rectangle that’s conveniently within arm’s reach.

So what else should we be doing if we can’t play with our devices? My simplistic answer: Do as little as possible or nothing at all.

If sleep is the chance we have to rest our bodies and sort out all the memories we’ve collected during the day, what chance do we have during our waking hours to reset our minds and make sense of our experiences and our lives? Sure, there’s meditation, yoga, running, hiking, and many other mind-easing pursuits to help us gain perspective and focus, but what about those in-between waiting moments sprinkled throughout the day? I like lists, so here are five techniques I find useful:

  1. Create and capture rather than consume: Carry a notepad or small journal and write out what’s on your mind (yep, with a real pad and a real pen, no apps). It doesn’t matter whether you write or draw or both, and it doesn’t matter if it looks/sounds good or not, so long as you allow yourself to express whatever has accumulated in your head. Even sitting somewhere and describing what you see can be interesting.
  2. Do some creative visualization: I wrote about this technique in an earlier post, but it’s worth resurrecting. Whether you believe it works or not, it’s a remarkably calming exercise to envision a goal or end result you want to achieve in as crisp and vivid detail in your mind as possible. Give it form, color, texture, smell — whatever will bring it to life. With repeated practice, you might be surprised with the outcome.
  3. Work out unresolved issues: Unpack a problem that’s on your mind. Don’t simply replay what went wrong over and over. Look at it from different angles, step outside your shoes, identify the things you didn’t know or understand clearly.
  4. Move around: Lots of waiting involves sitting for prolonged periods. Movement, even in small ways, can be beneficial. There are many kinds of simple, low-impact exercises that can be done while seated or that require little space.
  5. Daydream: Really, why not? Let yourself stare out the window of a bus or train, watch the clouds go by, observe people bustling about, or just take in your surroundings.

Waiting can be about much more than waiting, and it can certainly involve more than the digital pacifiers we carry around with us. We need to reframe waiting as an opportunity to disconnect from the task-driven part of ourselves that craves stimulation and reconnect with the other, quieter part that longs for stillness, peace, and reflection. Maybe we can start to think anew about waiting as the space between notes of music, a deep breath after a steep climb, a blank page dividing chapters of a book, or a welcoming patch of green space in a towering, grey city.

Your Better Self

February 26, 2015


If you ask me what the hardest part of being an information designer is, only one thing comes to mind. It’s not developing an eye for layout or mastering all graphic forms. It’s not generating brilliant ideas or perfectly executing a concept on time and on budget. Nothing to do with finding the right content or data or discovering the optimal presentation. And actually no, it’s not even finding great projects or keeping a business afloat.

The hardest part of being an information designer is managing people (self included).

By “managing people,” I don’t mean supervision or control or anything remotely suggesting a power dynamic. It’s making sense of what makes people who they are — their very best, their absolute worst, and everything in between — and navigating different situations with grace, respect, and wisdom. It’s about peeling back the layers of what we see, hear, think, say, and do in order to come to terms with the psychological, emotional, and philosophical factors behind how we relate to one another and to ourselves.

Deep stuff. So what does this have to do with information design?

First off, let’s establish what an information designer’s main job is. Regardless of specialization, an information designer fundamentally brings order to disorder, creates structure where it lacks, and weaves systems out of seemingly disparate parts. To do that, an information designer must bring to bear a host of skills and proficiencies, like sound reasoning, knowledge of design principles and methods, and technical expertise to enable a particular outcome. An information designer must (re)define a problem, gather content/data, and follow some process to arrive at a solution (gross oversimplification here). It would be easy enough if we stopped there, but the one element that makes it all interesting is the wonderfully diverse people involved.

Assuming the role of the information designer, you typically have four key players to deal with: the client, the user/audience, your team members, and of course, yourself. Sounds a lot like other professions, but there’s one difference: you’re constantly trying to impose order and organization on people and situations that are often unruly, complex, and even irrational. Your job is not just to “shape” information about/for/with them but, in many cases, to orchestrate interactions and shape people’s thinking, attitudes, and behaviors as well. Information design often involves change of some sort to be planned for, communicated to, or enacted on people. In the process of doing the work, as the champion of logic and clarity, you’re trying damn hard to keep your own biases, preferences, feelings, and primitive urges in check. In the end, the “information design project” is just there to move the story along.* The real work is understanding people in order to find ways to help them be better and do better. The deeper you dig into a problem, the higher you climb from facts and figures into organizational strategy and change, the harder this challenge becomes.

Some of the more memorable moments in my career, the very same moments when I felt I grew as an information designer, were when I confronted head-on the stark realities of dealing with people. They were disorienting, sleep-depriving, and even nauseating. I view these as developmental lessons, in the same way that a child learns not to touch an open flame after getting burned. The goal is to know better next time and not get burned again.

Here’s a little walk-through of common people problems I’ve experienced on the job and the nuggets of wisdom I extracted from them:

Everyone has their favorite way of doing something, or special thing they made that they want others to value as much as they do. When you’re called in to rethink or redo it, or worse, when you go ahead and change it without their involvement or full buy-in, unpleasantness happens. Pride and ownership are extremely deep feelings and require extra special care to ensure any change is inclusive, thoughtful, and constructive, rather than destructive. Having been on both sides of that equation, I’ve come to realize that focusing on achieving a shared end goal and finding the best way there together should be the driving factor, not ego or fixation on one’s own solution.

Richard Saul Wurman calls it the disease of familiarity: someone is so deeply entrenched in their field of expertise that they can’t explain it simply to an outsider. It can also blind them to inherent gaps in their own reasoning and make alternative points of view unimaginable. Such closeness can breed resistance to change because the problems others may see aren’t readily visible to them — everything already works just fine, thank you. Drawing a picture of a situation informed by different perspectives including their own can help, but it’s just a static image. You need to use the picture to prompt open discussion and allow time for someone to actually absorb and accept the picture in order to understand what possibilities exist outside of their comfort zone.

Impatience is the source of much stress, frustration, and nasty behavior, no matter what industry you’re in. Most information design work requires unhurried, focused concentration within a dedicated span of time to be truly effective: a meeting full of fast-talking executives calls for slowing down and deliberate idea capture, just as a data visualization project might call for weeks or months of data collection, analysis, and synthesis. Confronting clients who balk at necessary process steps and timelines can be tough when they don’t see the value or impact on the end result. Unpacking the urgency and figuring out what’s really driving the deadline — in a calm, collected way — can potentially defuse an apparent rush. But sometimes, when there’s a real fire drill, you need to muster every ounce of good nature to get the job done, rather than vent about how unfair it is. Once the dust settles, you can work with your client or boss to find ways to manage situations better in the future.

Sometimes people don’t want to do something because it seems beyond their duties, takes too much effort, or they just don’t care. Sometimes that’s you. Negative feelings generated over the course of a bumpy project can kill motivation and lead to indirect sabotage: a passive-aggressive e-mail here, a half-baked deliverable there. Rather than face the problem directly and risk an argument with a client, boss, or team member, you might choose to put up with it or just bail out. Conditions usually deteriorate soon after the start of a project if there’s a lack of understanding of purpose, goals, roles, responsibilities, timelines, etc. A project can grind to a halt or fall apart if underlying personal issues bubble up to the surface and interfere with day-to-day activities. One approach is to focus on the plus side of doing good work, like the benefit to the end user/audience or improvement of one’s own skills, while working to pinpoint the source of the problem. The idea is to actively map your way through the situation rather than sink into a rut of indifference that can drag down the whole project with it.

Habits and patterns are hard to break. Familiarity and comfort in the known are partly responsible, but the constant repetition of a routine can snowball over months and years and amass a weight of its own. Inertia can take the form of a company policy (that’s just the way we do things) or a frame of thought (I’m just not creative/visual/etc.). The problem is not so much the habit as the deeply ingrained, almost immovable set of behaviors associated with it — also known as stubbornness (I won’t change). Clients are often fearful of any kind of change because it might disrupt the status quo and create more problems for the organization, whether it’s a new weekly meeting template, a revamped design and workflow for creating research reports, or a strategic map presenting a company-wide reorganization. When you face a brick wall of opposition, a good strategy is to first dig deeper and learn more about the nature of the stubbornness. A little time spent listening to real concerns and showing genuine interest can open dialogue and inform how a new way of working can be introduced.

Of course, there’s no magic secret to fixing people problems. People smarts come with age and experience, and there is no mastery because, after all, you’re only human. You do the best you can, try to listen to the angel on your shoulder, give people a chance, and always, always seek understanding. I’d say the best moments happen not just when a project is a success, but when you and the people you’re working with run into a problem, like a disagreement or complete misunderstanding, and find a way to work it out together. That wave of relief after making the right decision to untangle a problem and breaking through a tense situation — you know, that pile-of-bricks-lifted-off-your-chest feeling — is priceless. The sense of knowing someone better than before because you resolved a conflict is a special type of understanding. It can move mountains.


*This is Alfred Hitchcock’s MacGuffin concept, which I discovered in Dan Hill’s excellent book, Dark Matter & Trojan Horses: A Strategic Design Vocabulary.

Unpacking Understanding

January 25, 2015


How do we better understand understanding in order to advance understanding-related work?

There’s plenty of talk these days about making sense of messes, making the complex clear, bringing order to chaos, turning data into insights, creating understanding, et cetera. Who does that work — whether it’s an information designer, information architect, user experience designer, or any other hot professional of the moment — is not the focus of this post (although it is a largely neglected topic of discussion). My concern is that understanding-related work of any kind happens without much understanding of understanding itself. How can we get away with talking the talk, selling services that promise to make confusing things comprehensible when we know so little about what understanding is and how it works?

Such is the folly of many similar pursuits today: we want to innovate, we want to unlock creativity, we want to harness the power of design and design thinking. But these words, just like “understanding,” are victims of marketing spin, media hype, and fluffy visual rhetoric. The romanticized idea of these things, commonly represented by the colorful mosaic of post-its on an office wall, the intriguingly messy marker scribbles on a whiteboard, the twenty- or thirty-something team cheerfully engaged in group brainstorming around a flipchart, makes it seem so accessible, so easy to partake in the fun designers have enjoyed all along — and more deceptively, to wield tools and methods that will deliver the same results (better products and services, happier customers, higher revenue, etc). But what is the fundamental basis and origin of these activities? Why are people doing these specific things, and what do they actually get out of them?

Understanding is too often associated with visual outputs: those “insightful” data displays, “engaging” infographics, or “compelling” visualizations that attempt to speed us along the data > information > knowledge > wisdom continuum. We are bombarded with visual explanations, illustrations of factoids, pictures of numbers, maps of ideas, in every possible channel or medium. The reasoning is that because vision is believed to be the dominant sense, the primary input for information*, and pictures are so much faster to read, visual presentation of information ought to be more understandable. The more visual the better, right? Of course, this is flawed, superficial reasoning (not to mention heavily skewed by some toward purposes far from enabling understanding). Visuals are just one delivery mechanism for content and data, and they are nowhere near universal. Certain visuals designed a certain way work for certain people more than others in certain contexts when conveying certain types of things — but why?

We don’t really know what innovation, creativity, design, design thinking, or understanding are because nobody stops for a minute to peel away the glossy veneer and really think about them, to question them, to research their history and foundations, to synthesize meaning and actually make sense of them. In this age of thought snippets and tl;dr, hardly anyone seems to have the time (or interest) to invest in any deep level of study and reflection beyond a Google search or skim of a blog post or magazine article or bother to follow a paper trail of references and source material that is, in fact, made of real paper and not bits and bytes. Instead, there is a willingness to accept “minimum viable knowledge” — just enough to gain familiarity and be “in the know.” For many practitioners in the design space, a “whatever sticks” approach may be good enough in the products and services they create, which includes applying received knowledge about formal design principles like type, color, scale, contrast, etc. The real science and scholarship behind these concepts and principles ends up being neglected and ultimately forgotten.

In the world of understanding-related work, we need to dig deeper into the “what,” “why,” and “how” of understanding:

  • How does understanding work? What are the processes involved in visual and non-visual information processing? How does the brain translate sensory input into something meaningful and usable? What does available research offer, and what have we yet to learn?
  • What do you need to enable understanding? How do you tap into the mechanisms of perception and cognition in order to maximize understanding in the creation of information artifacts and experiences? What are the ingredients/preconditions/requirements for understanding? And what lessons do different professions offer one another (teaching/education, graphic design, psychology, etc.)?
  • How do you really know if you’ve achieved understanding? What is the “aha!” moment, exactly? How do you effectively evaluate or measure understanding? Do any methods already exist? What are the pros and cons?

I’ll readily admit my own understanding of understanding is limited. I’ve framed plenty of great questions in the course of my career, and if I could embark on a full-time (generously funded) research project to answer these questions, I’d happily dive into it. However, for the time being, as a full-time practitioner, I think it’s important to wade gradually into the realm of research and probe one line of inquiry at a time, rather than tackle everything all at once. For me, it involves carving out time to catch up on literature about neuroscience and psychology and piece together my own understanding, writing or diagramming thoughts and connections, so that I can start informing my day-to-day work and gauge how well I’m doing what I think I’m doing.

My hope is that other professionals focused on enabling understanding recognize the need to think more deeply about the work they do and try to replace assumptions, best guesses, and conventional “wisdom” with sound research, evidence, and self-driven learning. Whether you frame this as bridging theory and practice or academia and industry, there is a clear need for more thoughtful practice as well as more practical thought if we want to holistically advance understanding work of all kinds.

*The claim that 90% of information transmitted to the brain is visual apparently lacks an authoritative source.

Copyright © 2015 Michael Babwahsingh