The “Busy” Trap

September 24, 2014


I used to think being busy was a good thing, a necessary thing. “Busy” meant your mind was occupied, gears turning, neurons firing, things getting done. A mind at work was a healthy mind, an efficient mind that saw no challenge too great, no work pile too daunting. As I’ve cycled through different phases of “busy” in my still evolving career, I’ve come to realize what a Faustian bargain working hard and overachieving really is. Lots of work may mean more billable hours, flow-like waves of productive output, and seemingly blissful distraction from other less desirable aspects of life, like loneliness or an unpleasant home environment. But the real detriment of overwork, aside from the stress and nasty health problems, is the emptiness it creates — the lack of intellectual stimulation, the creative deprivation, and the psychological alienation. At its worst, work becomes meaning when meaning cannot be salvaged from anywhere else.

This is starting to sound melodramatic, I know. The reality is that life gives us chances (or we create the chances) to stop and reflect. Sometimes a sudden event like an illness or job loss jolts our routine and challenges us to either lament the setback and curse our misfortune or seize the opportunity to re-evaluate our lives. At other times, like the gaps between big projects or right after major deadlines, we can take a deep breath, look back on how we handled ourselves in the midst of our daily grind, and think deeply about what kind of person we became in the process:

  • When and why did values flip in favor of doing that “one more thing” and staying later than planned?
  • How many recreational events and activities had to be passed up for work? Did it become a pattern?
  • How many personal relationships were affected by late night or weekend work?
  • Were mornings greeted with joy or dread?
  • Did the outcome measure up to the sacrifices?

Sadly, I can picture a reverse-Feltron annual report of my former work-life experience: bars not visited, bands not heard, NYC locations not explored, restaurants not frequented, etc. The real data would be considerably less appealing: hours spent waiting on bus/train platforms after midnight, number of mistakes noticed the next morning (after sending off “final” files), most-eaten meal substitute for dinner, etc.

Everybody’s situation and experiences are different, so I won’t rattle off advice like I’m a fully rehabilitated workaholic who’s figured it all out (which I’m not, and I haven’t). I work for myself, which comes with its own demands as well as an even greater need for discipline and boundaries. I occasionally find myself walking into the same old traps, but slowly my foresight is improving. I’m getting better at framing projects with more reasonable expectations and timeframes, yet without compromising the quality of the end result or my quality of life. Most importantly, I’m actively trying to instill better patterns, like a clear start and end to the work day, unplugged weekends, more frequent visits with family and friends, and some completely unproductive — but immensely gratifying — daydreaming.


The Game of Knowledge

September 1, 2014


Knowledge is power, but it all depends on how you play the game.

As the “Information Age” continues to unfold, gaining knowledge has become the central activity in many of our interactions.* Sure, we’re busy shuttling information around, constantly optimizing how we shape and transmit it, but no matter how thought-through, well-designed, and engaging it is, information must still be translated to something meaningful and useful for the situation we’re facing right now. Information informs our problems but it does not solve our problems. Knowledge — readily-applicable “intel” that helps us at the point of need — is what gets us somewhere.

But, as with anything valuable, there are often challenges to gaining the knowledge we need. There are gatekeepers to confront, mountains to climb, mazes to wind through, battles to be fought, and prices to be paid before we can earn even the smallest insight. If it sounds a lot like a game, that’s because it is. To those of us who consider ourselves “seekers” of knowledge, it can be a real quest to locate the right source, dig around, and finally extract knowledge that will help us. And for some — the “knowers” — the focus is on achieving credibility or popularity, asserting authority, and/or controlling seekers’ access to their knowledge.

The dynamics between both kinds of players, the seekers and the knowers, are present everywhere, from online platforms to everyday interactions. Across these different contexts, knowledge can be treated three ways:

Knowledge as Commodity

Once upon a time, we relied on those static information repositories called libraries to find out whatever we needed. A library card granted easy access, but the hard work of “search” lay ahead, from riffling through the card catalog to scanning the shelves to poring over piles of books. Hours later, after careful study, we had an answer. We gained some knowledge. Maybe. And yet, with the dawn of the Web and all our advances with search engines and algorithms, getting what we need still isn’t a perfect process. Simply searching Google for an answer offers no guarantee of finding one. If we don’t spot exactly what we’re looking for on the first page or two of results (out of an impossibly long list of millions), we’re forced to navigate a maze of dubious, redundant/repackaged, or completely irrelevant content. Frustrating.

There’s so much out there that’s dutifully served up to us in giant batches by search engines like Google — with supposedly the “best intention” of putting the most helpful results first — but try as it might, a search algorithm won’t really know what we need to know and how to deliver it (beyond weather, news, and stock prices). We may be able to sort, classify, and rank online information, but for the time being, we can’t cut to the chase and zero in on what matters to us.

Knowledge as Influence

When Google falls short and there’s no one in our immediate circle to ask, our next option is to find discussion forums and other question-and-answer platforms where real people can offer help. The online knowledge marketplace welcomes all kinds of queries from seekers (via Wikipedia, Yahoo! Answers, etc.) and provides the knowers with platforms for sharing their knowledge (via Quora, Answers.com, Stack Overflow, etc.). In a way, these platforms help us cope with the whole problem of information overload by bypassing it completely: you just go to a knowledge platform, write your question, then wait for an “expert” to answer. Depending on the platform, you can decide which answer best addresses your question or let the crowd decide for you. The answer with the most votes ends up being the “best” one, but whether or not that’s the right answer is another question.

Many knowers may have the best intentions to contribute to greater understanding by sharing their knowledge, but a good deal seek out recognition for their knowledge and aspire to become “thought leaders” in their realm of expertise. The game for them becomes accumulation of likes, followers, tweets, retweets, upvotes, blog comments, or whatever other social sharing gestures build their reputation. For professionals trying to carve out a place for themselves in a crowded, competitive marketplace, that recognition is more than just an ego booster — it builds credibility and potentially attracts more clients, more business, more money, new recruits, and so on. Influence through knowledge can certainly help open many doors to new opportunities (e.g., projects, partnerships, book deals, speaking engagements, etc.), but a sense of responsibility should come along with it as well. Is it more important to be known for knowing and to capitalize on that, or to help others do better with the hope that they’ll pay it forward when others need help?

Knowledge as Advantage

The value of knowledge can be so great that its ownership creates a power dynamic between knowers and seekers. In finance, law, health care, and other professions, seekers hire knowers with substantial expertise in those areas to help solve a problem or take advantage of an opportunity. Of course, with more knowledge and experience comes a higher price tag for obtaining or using that knowledge to get a specific result. When there’s a value-for-value exchange, and the knowledge received/used and outcome agreed upon more or less match the amount paid for them, all is well. But when knowledge is withheld for personal gain and others’ loss, or provided at a higher cost than what is reasonable, problems arise. Information asymmetry, when someone possesses more or better information than another in a given situation, is often used to gain advantage over those with little to no information (or even awareness that information exists). Buying a home or a car, deciding on health care tests and treatments, making investments, and many other everyday situations put seekers at a disadvantage when they place trust in a professional who has the power to disclose or withhold critical knowledge about a transaction that could tip the scales one way or the other (there are varying degrees of severity to this, of course). Every day, people fall prey to “contrived ignorance” (for lack of a better term), until the fraud is exposed, the asymmetry is corrected, and the game of deception is over.

Changing the Game

There’s been plenty of attention focused on the sciences, arts, and crafts surrounding data and information, and for good reason; in the time since understanding-focused disciplines emerged and began formalizing, much valuable knowledge has been created to guide the process of turning raw data and content into not just a comprehensible, usable form, but a dynamic, adaptable, and personally meaningful one. At the same time, progress in optimizing information for the eye and brain still needs to account for the ways people — both seekers and knowers — think, feel, and behave.

Cognitive neuroscience is shedding more and more light on perception, memory, and other mechanisms of understanding, but the psychological dimensions of communication and the role of information design in human behavior remain under-explored. We’ll always need insights from the sciences to inform the “why,” “what,” and “how” of information, and we’ll hopefully answer more questions about the “who,” but each of us defines our own role and rules of engagement with others in knowledge exchange situations: when faced with the pressure of finding a solution from an outside source or when presented with the choice of how much knowledge we’re willing to share and how to share it.

Despite the reality (that is, intractability and stubbornness) of human nature, life would be a lot easier in many respects if we did away with the game of knowledge. No winners or losers. No wasted time and energy. No status-seeking. No profit-seeking. Just cooperation and collective engagement in advancing understanding.

*The role of data is important to acknowledge, but it’s just a raw material, an input that takes effort and skill to collect, analyze, and make usable. Information, next in the DIKW continuum, still requires some work to bridge what is presented to what is needed. Knowledge, as it is used here, focuses on that which has immediate use, without filtering, processing, or any extra work needed on an individual’s part.

Understanding, Fast and Slow

July 31, 2014


Have you ever felt like the only person in the room who didn’t get something? And you felt too embarrassed to ask for an explanation? Maybe it was in a classroom or business meeting or a social gathering where everyone was vigorously nodding in agreement, chuckling at an inside joke, or jumping to the next topic of discussion before you could make heads or tails of what just happened?

We’ve all been there — not understanding something as quickly as others (or so it seems) and experiencing a wave of negative feelings because of it. It starts in school: there are “bright” students who are praised for learning quickly and performing well and “dull” students who are frowned upon for being “slow,” not “applying” themselves, and getting poor grades. Rather than question the education system and the one-size-fits-most approach imposed upon us, many of us readily blame ourselves for our own perceived shortcomings: If I don’t get something, there must be something wrong with me.

The problem continues well into adulthood. Often, when we explain something to co-workers or others, we expect them to follow along at our pace: If I get it, why shouldn’t they? Worse yet, we may rid ourselves of any responsibility: If they don’t get it, too bad — that’s not my problem. We even label those who don’t match our accepted speed of comprehension — slow on the uptake, not on the ball, dim, not the sharpest knife in the drawer, etc. Sadly, the notion of explaining concepts to presumed “slow” people has spawned its own industry. Idiot’s Guides and (Fill-in-the-blank) for Dummies books provide generally useful instruction on a variety of topics, but the marketing wrapper for that content reinforces the stigma of presumed stupidity. Despite the light-hearted tone and humorous illustrations, the message behind such books is that anyone who needs a little extra help to get by in life is somehow inferior. Why does the thoughtful, clear explanation of anything have to be targeted to “idiots” and “dummies”? And when did intelligence become associated with how fast someone learns something?

Attitudes towards learning and rates of comprehension need to evolve to accommodate the diversity of thinking styles different people possess. To start, we need to accept the fact that slow isn’t necessarily bad and fast isn’t necessarily good. We also need to move away from the default solution of just making things more visual because we process more information more quickly through our eyes (as it stands, we’re still not doing a very good job of maximizing visual thinking to accelerate understanding). Effective communication that “clicks” for everyone relies on having a firm grasp of what you’re communicating and a knowledge of principles for structuring and presenting your content, whatever the content and format may be. I find these guidelines particularly useful:

  1. Show the whole picture, then focus on the parts. Just starting with detail or component pieces makes it hard to see how everything fits together and may alienate those who are unfamiliar with the larger system. A bird’s-eye view of content helps establish boundaries and relationships, so that learning is cumulative and associative from one part to the next.
  2. Provide persistent navigation and orientation. The longer the presentation or the greater the amount of content, the easier it is for someone to lose track of where they are and get confused. Much like a physical space, guiding someone through new or difficult content requires markers and signposts to let them know how far they’ve gone, how much is left, and of course, where the end is. A mini table of contents on every page of a presentation can help mark the journey: each section can be “lit up” when it’s active and greyed out when it’s not. Even a simple “three things” or “five things” construct can help make information memorable.
  3. Set checkpoints to confirm understanding. It’s easy to march right through an explanation or presentation of something we’re familiar with. It’s also easy to forget what it’s like not to be familiar with that same material, which is why it’s essential to regularly confirm understanding — genuine understanding — with an audience in-person. Slow down, scan people’s body language, look for frowns or squints, and even if the telltale signs aren’t visible, proactively ask “did that make sense?” or “should I repeat that?” to see where further explanation is needed. Often, requests for clarification don’t come on their own, so encourage questions — just don’t call them “stupid” questions.
  4. Prepare multiple explanations. A single, literal explanation of a technical subject may work perfectly well… for a technical audience. Multiple metaphorical explanations, in which concrete, tangible examples represent abstract or complex concepts, can be devised for almost anything and for almost every audience. You can usually tell when someone knows their stuff when they can easily generate compelling illustrations of the same thing using rich, memorable metaphors in order to bridge an understanding gap.
  5. Promote patience. This is probably the toughest of all. Not only is it important for the explainer/presenter to be patient with an audience and do whatever it takes to help them get something, but it is vital that group members (when dealing with a team setting) manage their behaviors and not intimidate those who need more time or effort to process. Collaborative work suffers when team members possess different levels of understanding about their project, so it benefits the entire team to bring everyone up to speed and leave no one behind.

For some, making sense of the world is a race down a highway. For others, it’s a winding, rambling road. Whichever road our audience travels, we need to help them move toward understanding at the pace that suits them best — whether we’re information designers or not.


The Things You Don’t See

May 31, 2014


Information design, by nature, is concerned with making sense of the known: data, facts, observations, ideas. It involves taking existing content and putting it in a format that makes it more readily know-able. But rarely does information design acknowledge the missing, the unknown.

A recent talk by Andy Kirk on representing the absence of data in visualizations rekindled my thinking on how much we underestimate “known unknowns” or completely miss the “unknown unknowns” in a given situation. While Andy highlights the challenges of graphically depicting a zero quantity as well as the visual and narrative impact of emptiness in data display, I see a parallel set of challenges in the conceptual, non-quantitative side of information design: how do you map a comprehensive understanding of a situation beyond what’s readily given and outside of one’s own frame of reference? (To clarify, the “conceptual, non-quantitative” work I’m referring to here isn’t about making infographics but about solving complex problems in organizations.)

Solid fact-finding and content gathering provide the raw material for information design. Asking the right questions and getting the necessary answers, however, can mean different things to different people. In conceptual information design, who, what, when, where, why, and how span a broad range of observable phenomena to be captured and analyzed (people, places, ideas, connections, processes, contexts, comparisons, etc). But the person framing the questions can only do so from their point of view and with the skills and expertise they possess at the time. Biases certainly affect the quality of fact-finding, but a lack of contextualization and systems thinking, among other gaps in awareness, can severely hinder information design work. Below are just a few blind spots in the early stages of information design work that can constrain understanding and potentially lead a project down the wrong path:

1. We don’t see what’s right in front of us.

We can easily ignore important pieces of information because they may seem too obvious or “common sense” (which isn’t so common at all). Our familiarity blinds us to knowledge that we take for granted but that may be completely foreign to someone else, so when we visualize a current reality and load it with facts, we may omit crucial details that an “outsider” needs in order to build understanding. Stepping outside ourselves and seeing the world from a newcomer’s perspective allows for more inclusion and inroads to unfamiliar territory.

2. We don’t see what’s just outside our field of view.

It’s easy for us to cling to all the facts we know for the sake of advancing a project towards completion, but the resulting picture we create may be far from complete and lack the clarity and accuracy that a situation may require. Gathering input from different perspectives, whether it’s a range of stakeholders, diverse sources of information, or literally a different vantage point, can help ensure a 360º view of a situation (much like we use space probes to see the far side of the moon, which never faces Earth). And when there’s still a missing part of the picture, we should be able to denote it as “missing” or “other” in a visualization.

3. We don’t see what’s changing.

Change and flux are challenging to reconcile in information design. When something new is about to happen, the very fact that the change will happen may slip through the cracks and go unacknowledged in the shuffle of everyday work… until the change is announced and compliance is expected. Leadership transitions, large-scale reorganizations, technology updates/launches, and other organizational changes can significantly derail an information design project if they’re not recognized and addressed early enough. Anticipating and incorporating discussion of change and synchronizing communications (visualizations, training, etc) to the timeline of change can help alleviate the shock of the new.

4. We don’t see ahead into the future.

Facts and data rooted in past or present observations are the conventional inputs for information design projects. However, certain types of information design work require looking beyond what is towards what could be when exploring a problem. Information design for strategy development, for instance, depends heavily on both a firm understanding of the present and a clear vision of a desired future in order to bridge the gap between them. Thoughts and ideas about what tomorrow might look like, even the most far-flung and imaginative, are critical inputs into the process. Even if some elements of the future vision are fuzzy or undefined, saying so in a visualization enables a team of stakeholders to have a conversation on those elements and define an approach around them.

5. We don’t see what’s unfavorable or inconvenient.

As much as we want to impose Spock-like objectivity on information design work, the simple truth is that humans are subjective creatures strongly inclined to do what they want to do (or what they’re told to do). We may avoid delving into areas that are too difficult to deal with, too personally aggravating, or just not interesting enough, and in the process we sacrifice valuable content for comfort. Sometimes, we may not be aware we’re closing doors on things we don’t like. On the client/stakeholder side, they may evade certain questions or flag some sensitive topics as off-limits (even though they may still be important parts of the picture) or there may be content that is masked off by bureaucratic red tape. In both personal and organizational contexts, the key is to emphasize the value of acquiring the knowledge, however difficult it may seem: aside from deepening understanding of a situation, it may spawn new thinking and improve the quality of the outcome.

These are just initial (choppy) thoughts on the subject, and there is much more to say on the topic of biases, but the main point of this post is simple: There is no such thing as perfect knowledge or perfect information design. Even the best efforts to map a complex space or fill every bucket of content can fall short. Being aware of our blind spots makes us better equipped to handle new and more complex situations by forcing us to see more of the world with different eyes, rather than simply accepting the view from within our cozy little comfort zone.


Rational Thinking Made Tangible

March 31, 2014


Over the years, I’ve come to understand that information design work is as much a process of reasoning and investigation as it is an activity of pure design decision-making and production. In the earliest stage of my career, I thought information design was only about making graphics that put facts and figures in a clear, understandable format. I assumed that “clear” and “understandable” meant employing graphic techniques like bold color coding, ample white space, good typography, and descriptive illustration. My design education introduced me to the formal principles and standard guidelines for doing design work, like color theory and grid systems, along with the time-honored maxims like “less is more” and “if you can’t make it bigger, make it red.” I believed I had all the ingredients and the tools to do proper information design work, with a generous dose of ego thrown in.

But the more I actually did the work and the more exposure I had to different challenges, the more gaps I uncovered in my own “expertise.” If something made sense or triggered an “aha!”, the graphic designer in me attributed the success primarily to design as I understood it — the skillful arrangement of elements on a page — and little else. What I didn’t fully acknowledge was the “why” behind information design — why does that particular arrangement of elements work. And what do “clear” and “understandable” really mean?

What makes information design work?

The answer to that question stems from a widely-circulated quote attributed to Edward Tufte, from his book Visual Explanations:

Good information design is clear thinking made visible.*

What immediately strikes me about this definition is the order of ideas: clear thinking precedes visualization. It’s a simple point, but a critical one when discussing foundational aspects of information design. The ability to reason and apply rigorous logic to understanding-related challenges is what enables the effective design of information, in any form. In practice, I think “clear thinking made visible” could broadly refer to a continuum of activities:

  • applying a knowledge of principles and rules behind systems to making sense of situations
  • creating and using frameworks for organizing content (thoughts, ideas, data, text, etc)
  • designing interfaces for those frameworks using a variety of methods, tools, and techniques (To clarify, I’m using the word “interface” loosely to refer to visual, aural, spatial, tactile, and maybe even gustatory and olfactory means of accessing and interacting with information, not just technology-based interfaces.)

There’s a lot to unpack in those three bullets, perhaps in future posts. For now, I’m mainly interested in reframing information design to account for the bigger, invisible picture that happens in the “pre-visual” or “pre-artifact” stages. What I hope to see, sooner rather than later, is a shift away from the narrow graphic design-centric perspective that has hindered understanding and growth of the field and towards a cognition-centric perspective that embraces the full scope and potential of what information design is and does below the surface.

With that in mind, I propose a revision of Tufte’s quote to something like this:

Effective information design is rational thinking made tangible.

There are three key words here:

Effective: Given the ever-expanding range of stuff passing as information design these days, it would seem necessary to distinguish works as effective or ineffective at enabling understanding, rather than simply “good,” which has its own subjective meaning. The word “effective” may also promote a greater focus on how information design functions holistically, rather than just how it looks or how it works in isolation.

Rational: Saying “rational” instead of “clear” thinking helps put a finer point on the type of thinking involved in information design — thinking that subscribes to reason; “clarity” alone may only suggest that thoughts are distinct and well-defined, complete statements, but they may lack any basis in logic.

Tangible: Information design can take many forms once it has passed through the conceptual, “figuring out” stages. The word “tangible” need not only refer to objects or artifacts but to those things that can be experienced directly.

Information design has a long way to go before it will break free from conventional notions of what it is and can — or can’t — be. Greater awareness of the upstream information design process is necessary, as are required studies in cognitive science and logic. Understanding the brain and how it works, from the theoretical to the practical levels, should be the next wave in information design education and practice, not more overemphasis on filling our design toolkit and producing dazzling outputs.

——–

*The full quote is “Good information design is clear thinking made visible, while bad design is stupidity in action.” After briefly Googling the quote for other instances of its use, I came across what could be its inspiration, by Bill Wheeler: “Good writing is clear thinking made visible.” It isn’t surprising that both writing and information design can be described in the same way.


Copyright © 2014 Michael Babwahsingh | powered by WordPress with Barecity