Embrace the Wait

March 31, 2015


Have you ever stopped to think about waiting? We spend so much time just waiting for some things to happen, other things to pass. Consider the moments of waiting that happen over the course of a typical day: on line at a register, in traffic, in a doctor’s office, on a plane, etc. The very concept of waiting — a momentary suspension of activity or delayed action until a specific event happens — seems to suggest that we have to be in some mode of action at all times during our waking hours. Being idle has come to be a bad thing, and waiting itself has a generally negative connotation because the thing you want is not in the present but locked away in some defined or undefined point in the future. Time needs to pass before that desired state can come to be. And oh how we pass our time!

To me, commuting by mass transit is the perfect encapsulation of waiting, next to air travel. Commuting can be considered a form of routine waiting. It’s finite, scheduled, expected, and for that reason, we find ways to pour our lives and lifestyles into that span of time. Rituals, habits, indulgences all make their way into our commutes. Over the years as a regular commuter to and from New York City, I’ve observed a growing but unsurprising trend: devices have come to dominate the commuting experience. I regularly scan my subway car to see how many people are using a smartphone, tablet, e-reader, or other gadget. Most of the time, the majority of passengers are immersed in some tech-enabled activity: reading, listening to music, playing video games, watching movies. Signal permitting, people are chatting away, texting, snapping selfies, scrolling through streams of social media updates, or typing in updates of their own. It’s bad enough that we actually are capable of doing all of those things in a shared public space like a subway car or bus, that these are all options for things you can do when you have nothing else to do. The real problem is twofold. First, we haven’t yet learned how to distinguish what we should do from what we shouldn’t do in those situations. Second, and more the focus of this post, we don’t really know how to deal with moments of pause in an activity- and device-free way — to simply embrace the wait.

Reframing the meaning of those quiet, interstitial spaces in our lives requires an understanding of what we’re doing with the busy, booked-up blocks of time on our daily agenda and why. Most of us measure our personal worth or success by our productivity and accomplishment. We feel good about ourselves when we get more stuff done, so we work very hard to do more and more. In school and at work, we often get rewarded not just for scoring higher on a performance scale (like an A+ or a 100%) but for occupying our time with as many extracurricular activities as possible. We try our hardest to exclaim to the world “I am NOT idle! I am highly driven and motivated to succeed!” but what we achieve in the end is little more than exhaustion.

Filling empty space to capacity is a common habit, whether it’s our calendars, our closets, or our stomachs. We do no differently with our brains. I would argue that the concept of information overload has less to do with us being bombarded by information from lots of external sources than with us deliberately saturating our own attention with more information than we need. If we’re drinking from the firehose of information, as they say, we’re also the ones holding the hose to our own mouths and controlling the valve. Devices that connect us to the plethora of information in the world and in our lives are not necessarily at fault. They just make it far too easy to indulge in our existing impulses, especially when we think there’s nothing better we should be doing than funneling our attention into a small glowing rectangle that’s conveniently within arm’s reach.

So what else should we be doing if we can’t play with our devices? My simplistic answer: Do as little as possible or nothing at all.

If sleep is the chance we have to rest our bodies and sort out all the memories we’ve collected during the day, what chance do we have during our waking hours to reset our minds and make sense of our experiences and our lives? Sure, there’s meditation, yoga, running, hiking, and many other mind-easing pursuits to help us gain perspective and focus, but what about those in-between waiting moments sprinkled throughout the day? I like lists, so here are five techniques I find useful:

  1. Create and capture rather than consume: Carry a notepad or small journal and write out what’s on your mind (yep, with a real pad and a real pen, no apps). It doesn’t matter whether you write or draw or both, and it doesn’t matter if it looks/sounds good or not, so long as you allow yourself to express whatever has accumulated in your head. Even sitting somewhere and describing what you see can be interesting.
  2. Do some creative visualization: I wrote about this technique in an earlier post, but it’s worth resurrecting. Whether you believe it works or not, it’s a remarkably calming exercise to envision a goal or end result you want to achieve in as crisp and vivid detail in your mind as possible. Give it form, color, texture, smell — whatever will bring it to life. With repeated practice, you might be surprised with the outcome.
  3. Work out unresolved issues: Unpack a problem that’s on your mind. Don’t simply replay what went wrong over and over. Look at it from different angles, step outside your shoes, identify the things you didn’t know or understand clearly.
  4. Move around: Lots of waiting involves sitting for prolonged periods. Movement, even in small ways, can be beneficial. There are many kinds of simple, low-impact exercises that can be done while seated or that require little space.
  5. Daydream: Really, why not? Let yourself stare out the window of a bus or train, watch the clouds go by, observe people bustling about, or just take in your surroundings.

Waiting can be about much more than waiting, and it can certainly involve more than the digital pacifiers we carry around with us. We need to reframe waiting as an opportunity to disconnect from the task-driven part of ourselves that craves stimulation and reconnect with the other, quieter part that longs for stillness, peace, and reflection. Maybe we can start to think anew about waiting as the space between notes of music, a deep breath after a steep climb, a blank page dividing chapters of a book, or a welcoming patch of green space in a towering, grey city.

Your Better Self

February 26, 2015


If you ask me what the hardest part of being an information designer is, only one thing comes to mind. It’s not developing an eye for layout or mastering all graphic forms. It’s not generating brilliant ideas or perfectly executing a concept on time and on budget. Nothing to do with finding the right content or data or discovering the optimal presentation. And actually no, it’s not even finding great projects or keeping a business afloat.

The hardest part of being an information designer is managing people (self included).

By “managing people,” I don’t mean supervision or control or anything remotely suggesting a power dynamic. It’s making sense of what makes people who they are — their very best, their absolute worst, and everything in between — and navigating different situations with grace, respect, and wisdom. It’s about peeling back the layers of what we see, hear, think, say, and do in order to come to terms with the psychological, emotional, and philosophical factors behind how we relate to one another and to ourselves.

Deep stuff. So what does this have to do with information design?

First off, let’s establish what an information designer’s main job is. Regardless of specialization, an information designer fundamentally brings order to disorder, creates structure where it is lacking, and weaves systems out of seemingly disparate parts. To do that, an information designer must bring to bear a host of skills and proficiencies, like sound reasoning, knowledge of design principles and methods, and technical expertise to enable a particular outcome. An information designer must (re)define a problem, gather content/data, and follow some process to arrive at a solution (gross oversimplification here). It would be easy enough if we stopped there, but the one element that makes it all interesting is the wonderfully diverse people involved.

Assuming the role of the information designer, you typically have four key players to deal with: the client, the user/audience, your team members, and of course, yourself. Sounds a lot like other professions, but there’s one difference: you’re constantly trying to impose order and organization on people and situations that are often unruly, complex, and even irrational. Your job is not just to “shape” information about/for/with them but, in many cases, to orchestrate interactions and shape people’s thinking, attitudes, and behaviors as well. Information design often involves change of some sort to be planned for, communicated to, or enacted on people. In the process of doing the work, as the champion of logic and clarity, you’re trying damn hard to keep your own biases, preferences, feelings, and primitive urges in check. In the end, the “information design project” is just there to move the story along.* The real work is understanding people in order to find ways to help them be better and do better. The deeper you dig into a problem, the higher you climb from facts and figures into organizational strategy and change, the harder this challenge becomes.

Some of the more memorable moments in my career, the very same moments when I felt I grew as an information designer, were when I confronted head-on the stark realities of dealing with people. They were disorienting, sleep-depriving, and even nauseating. I view these as developmental lessons, in the same way that a child learns not to touch an open flame after getting burned. The goal is to know better next time and not get burned again.

Here’s a little walk-through of common people problems I’ve experienced on the job and the nuggets of wisdom I extracted from them:

Everyone has their favorite way of doing something, or special thing they made that they want others to value as much as they do. When you’re called in to rethink or redo it, or worse, when you go ahead and change it without their involvement or full buy-in, unpleasantness happens. Pride and ownership are extremely deep feelings and require extra special care to ensure any change is inclusive, thoughtful, and constructive, rather than destructive. Having been on both sides of that equation, I’ve come to realize that focusing on achieving a shared end goal and finding the best way there together should be the driving factor, not ego or fixation on one’s own solution.

Wurman calls it the disease of familiarity, when someone is so deeply entrenched in their field of expertise that they can’t explain it simply to an outsider. It can also blind them to inherent gaps in their own reasoning and make alternative points of view unimaginable. Such closeness can breed resistance to change because the problems others may see aren’t readily visible to them — everything already works just fine, thank you. Drawing a picture of a situation informed by different perspectives including their own can help, but it’s just a static image. You need to use the picture to prompt open discussion and allow time for someone to actually absorb and accept the picture in order to understand what possibilities exist outside of their comfort zone.

Impatience is the source of much stress, frustration, and nasty behavior, no matter what industry you’re in. Most information design work requires unhurried, focused concentration within a dedicated span of time to be truly effective. A meeting full of fast-talking executives calls for just as much slowing down and deliberate idea capture as a data visualization project might call for weeks or months of data collection, analysis, and synthesis. Confronting clients who balk at necessary process steps and timelines can be tough when they don’t see the value or impact on the end result. Unpacking the urgency and figuring out what’s really driving the deadline — in a calm, collected way — can potentially defuse an apparent rush. But sometimes, when there’s a real fire drill, you need to muster every ounce of good nature to get the job done, rather than vent about how unfair it is. Once the dust settles, you can work with your client or boss to find ways to manage situations better in the future.

Sometimes people don’t want to do something because it seems beyond their duties, takes too much effort, or they just don’t care. Sometimes that’s you. Negative feelings generated over the course of a bumpy project can kill motivation and lead to indirect sabotage: a passive-aggressive e-mail here, a half-baked deliverable there. Rather than face the problem directly and risk an argument with a client, boss, or team member, you might choose to put up with it or just bail out. Conditions usually deteriorate soon after the start of a project if there’s a lack of understanding of purpose, goals, roles, responsibilities, timelines, etc. A project can grind to a halt or fall apart if underlying personal issues bubble up to the surface and interfere with day-to-day activities. One approach is to focus on the plus side of doing good work, like the benefit to the end user/audience or improvement of one’s own skills, while working to pinpoint the source of the problem. The idea is to actively map your way through the situation rather than sink into a rut of indifference that can drag down the whole project with it.

Habits and patterns are hard to break. Familiarity and comfort in the known are partly responsible, but the constant repetition of a routine can snowball over months and years and amass a weight of its own. Inertia can take the form of a company policy (that’s just the way we do things) or a frame of thought (I’m just not creative/visual/etc.). The problem is not so much the habit as is the deeply ingrained, almost immovable set of behaviors associated with it — also known as stubbornness (I won’t change). Clients are often fearful of any kind of change because it might disrupt the status quo and create more problems for the organization, whether it’s a new weekly meeting template, a revamped design and workflow for creating research reports, or a strategic map presenting a company-wide reorganization. When you face a brick wall of opposition, a good strategy is to first dig deeper and learn more about the nature of the stubbornness. A little time spent listening to real concerns and showing genuine interest can open dialogue and inform how a new way of working can be introduced.

Of course, there’s no magic secret to fixing people problems. People smarts come with age and experience, and there is no mastery because, after all, you’re only human. You do the best you can, try to listen to the angel on your shoulder, give people a chance, and always, always seek understanding. I’d say the best moments happen not just when a project is a success, but when you and the people you’re working with run into a problem, like a disagreement or complete misunderstanding, and find a way to work it out together. That wave of relief after making the right decision to untangle a problem and breaking through a tense situation — you know, that pile-of-bricks-lifted-off-your-chest feeling — is priceless. The sense of knowing someone better than before because you resolved a conflict is a special type of understanding. It can move mountains.


*This is Alfred Hitchcock’s MacGuffin concept, which I discovered in Dan Hill’s excellent book, Dark Matter & Trojan Horses: A Strategic Design Vocabulary.

Unpacking Understanding

January 25, 2015


How do we better understand understanding in order to advance understanding-related work?

There’s plenty of talk these days about making sense of messes, making the complex clear, bringing order to chaos, turning data into insights, creating understanding, et cetera. Who does that work — whether it’s an information designer, information architect, user experience designer, or any other hot professional of the moment — is not the focus of this post (although it is a largely neglected topic of discussion). My concern is that understanding-related work of any kind happens without much understanding of understanding itself. How can we get away with talking the talk, selling services that promise to make confusing things comprehensible when we know so little about what understanding is and how it works?

Such is the folly of many similar pursuits today: we want to innovate, we want to unlock creativity, we want to harness the power of design and design thinking. But these words, just like “understanding,” are victims of marketing spin, media hype, and fluffy visual rhetoric. The romanticized idea of these things, commonly represented by the colorful mosaic of post-its on an office wall, the intriguingly messy marker scribbles on a whiteboard, the twenty- or thirty-something team cheerfully engaged in group brainstorming around a flipchart, makes it seem so accessible, so easy to partake in the fun designers have enjoyed all along — and more deceptively, to wield tools and methods that will deliver the same results (better products and services, happier customers, higher revenue, etc.). But what is the fundamental basis and origin of these activities? Why are people doing these specific things, and what do they actually get out of them?

Understanding is too often associated with visual outputs: those “insightful” data displays, “engaging” infographics, or “compelling” visualizations that attempt to speed us along the data > information > knowledge > wisdom continuum. We are bombarded with visual explanations, illustrations of factoids, pictures of numbers, maps of ideas, in every possible channel or medium. The reasoning is that because vision is believed to be the dominant sense, the primary input for information*, and pictures are so much faster to read, visual presentation of information ought to be more understandable. The more visual the better, right? Of course, this is flawed, superficial reasoning (not to mention heavily skewed by some toward purposes far from enabling understanding). Visuals are just one delivery mechanism for content and data, and they are nowhere near universal. Certain visuals designed a certain way work for certain people more than others in certain contexts when conveying certain types of things — but why?

We don’t really know what innovation, creativity, design, design thinking, or understanding are because nobody stops for a minute to peel away the glossy veneer and really think about them, to question them, to research their history and foundations, to synthesize meaning and actually make sense of them. In this age of thought snippets and tl;dr, hardly anyone seems to have the time (or interest) to invest in any deep level of study and reflection beyond a Google search or skim of a blog post or magazine article, or to bother following a paper trail of references and source material that is, in fact, made of real paper and not bits and bytes. Instead, there is a willingness to accept “minimum viable knowledge” — just enough to gain familiarity and be “in the know.” For many practitioners in the design space, a “whatever sticks” approach may be good enough in the products and services they create, which includes applying received knowledge about formal design principles like type, color, scale, contrast, etc. The real science and scholarship behind these concepts and principles end up being neglected and ultimately forgotten.

In the world of understanding-related work, we need to dig deeper into the “what,” “why,” and “how” of understanding:

  • How does understanding work? What are the processes involved in visual and non-visual information processing? How does the brain translate sensory input into something meaningful and usable? What does available research offer, and what have we yet to learn?
  • What do you need to enable understanding? How do you tap into the mechanisms of perception and cognition in order to maximize understanding in the creation of information artifacts and experiences? What are the ingredients/preconditions/requirements for understanding? And what lessons do different professions offer one another (teaching/education, graphic design, psychology, etc.)?
  • How do you really know if you’ve achieved understanding? What is the “aha!” moment, exactly? How do you effectively evaluate or measure understanding? Do any methods already exist? What are the pros and cons?

I’ll readily admit my own understanding of understanding is limited. I’ve framed plenty of great questions in the course of my career, and if I could embark on a full-time (generously funded) research project to answer these questions, I’d happily dive into it. However, for the time being as a full-time practitioner, I think it’s important to wade gradually into the realm of research and probe specific lines of inquiry at a time, rather than tackle everything all at once. For me, it involves carving out time to catch up on literature about neuroscience and psychology and piece together my own understanding, writing or diagramming thoughts and connections, so that I can start informing my day-to-day work and gauge how well I’m doing what I think I’m doing.

My hope is that other professionals focused on enabling understanding recognize the need to think more deeply about the work they do and try to replace assumptions, best guesses, and conventional “wisdom” with sound research, evidence, and self-driven learning. Whether you frame this as bridging theory and practice or academia and industry, there is a clear need for more thoughtful practice as well as more practical thought if we want to holistically advance understanding work of all kinds.

*The claim that 90% of information transmitted to the brain is visual apparently lacks an authoritative source.

The “Busy” Trap

September 24, 2014


I used to think being busy was a good thing, a necessary thing. “Busy” meant your mind was occupied, gears turning, neurons firing, things getting done. A mind at work was a healthy mind, an efficient mind that saw no challenge too great, no work pile too daunting. As I’ve cycled through different phases of “busy” in my still evolving career, I’ve come to realize what a Faustian bargain working hard and overachieving really is. Lots of work may mean more billable hours, flow-like waves of productive output, and seemingly blissful distraction from other less desirable aspects of life, like loneliness or an unpleasant home environment. But the real detriment of overwork, aside from the stress and nasty health problems, is the emptiness it creates — the lack of intellectual stimulation, the creative deprivation, and the psychological alienation. At its worst, work becomes meaning when meaning cannot be salvaged from anywhere else.

This is starting to sound melodramatic, I know. The reality is that life gives us chances (or we create the chances) to stop and reflect. Sometimes a sudden event like an illness or job loss jolts our routine and challenges us to either lament the setback and curse our misfortune or seize the opportunity to re-evaluate our lives. At other times, like the gaps between big projects or right after major deadlines, we can take a deep breath, look back on how we handled ourselves in the midst of our daily grind, and think deeply about what kind of person we became in the process:

  • When and why did values flip in favor of doing that “one more thing” and staying later than planned?
  • How many recreational events and activities had to be passed up for work? Did it become a pattern?
  • How many personal relationships were affected by late night or weekend work?
  • Were mornings greeted with joy or dread?
  • Did the outcome measure up to the sacrifices?

Sadly, I can picture a reverse-Feltron annual report of my former work-life experience: bars not visited, bands not heard, number of locations in NYC not visited, restaurants not frequented, etc. The real data would be considerably less appealing: hours waiting on bus/train platforms after midnight, number of mistakes noticed the next morning (after sending off “final” files), most eaten meal substitute for dinner, etc.

Everybody’s situation and experiences are different, so I won’t rattle off advice like I’m a fully rehabilitated workaholic who’s figured it all out (which I’m not, and I haven’t). I work for myself, which comes with its own demands as well as an even greater need for discipline and boundaries. I occasionally find myself walking into the same old traps, but slowly my foresight is improving. I’m getting better at framing projects with more reasonable expectations and timeframes, yet without compromising the quality of the end result or my quality of life. Most importantly, I’m actively trying to instill better patterns, like a clear start and end to the work day, unplugged weekends, more frequent visits with family and friends, and some completely unproductive — but immensely gratifying — daydreaming.

The Game of Knowledge

September 1, 2014


Knowledge is power, but it all depends on how you play the game.

As the “Information Age” continues to unfold, gaining knowledge has become the central activity in many of our interactions.* Sure, we’re busy shuttling information around, constantly optimizing how we shape and transmit it, but no matter how thought-through, well-designed, and engaging it is, information must still be translated to something meaningful and useful for the situation we’re facing right now. Information informs our problems but it does not solve our problems. Knowledge — readily applicable “intel” that helps us at the point of need — is what gets us somewhere.

But, as with anything valuable, there are often challenges to gaining the knowledge we need. There are gatekeepers to confront, mountains to climb, mazes to wind through, battles to be fought, and prices to be paid before we can earn even the smallest insight. If it sounds a lot like a game, that’s because it is. To those of us who consider ourselves “seekers” of knowledge, it can be a real quest to locate the right source, dig around, and finally extract knowledge that will help us. And for some — the “knowers” — the focus is on achieving credibility or popularity, asserting authority, and/or controlling seekers’ access to their knowledge.

The dynamics between both kinds of players, the seekers and the knowers, are present everywhere, from online platforms to everyday interactions. Across these different contexts, knowledge can be treated three ways:

Knowledge as Commodity

Once upon a time, we relied on those static information repositories called libraries to find out whatever we needed. A library card granted easy access, but the hard work of “search” lay ahead, from riffling through the card catalog to scanning the shelves to poring over piles of books. Hours later, after careful study, we had an answer. We gained some knowledge. Maybe. And yet, with the dawn of the Web and all our advances with search engines and algorithms, getting what we need still isn’t a perfect process. Simply searching Google for an answer offers no guarantee of finding one. If we don’t spot exactly what we’re looking for on the first page or two of results (out of impossibly many millions), we’re forced to navigate a maze of dubious, redundant/repackaged, or completely irrelevant content. Frustrating.

There’s so much out there that’s dutifully served up to us in giant batches by search engines like Google — with supposedly the “best intention” of putting the most helpful results first — but for the life of it, a search algorithm won’t really know what we need to know and how to deliver it (beyond weather, news, and stock prices). We may be able to sort, classify, and rank online information, but for the time being, we can’t cut to the chase and zero in on what matters to us.

Knowledge as Influence

When Google falls short and there’s no one in our immediate circle to ask, our next option is to find discussion forums and other question-and-answer platforms where real people can offer help. The online knowledge marketplace welcomes all kinds of queries from seekers (via Wikipedia, Yahoo! Answers, etc.) and provides the knowers with platforms for sharing their knowledge (via Quora, Answers.com, Stack Overflow, etc.). In a way, these platforms help us cope with the whole problem of information overload by bypassing it completely: you just go to a knowledge platform, write your question, then wait for an “expert” to answer. Depending on the platform, you can decide which answer best addresses your question or let the crowd decide for you. The answer with the most votes ends up being the “best” one, but whether or not that’s the right answer is another question.

Many knowers may have the best intentions to contribute to greater understanding by sharing their knowledge, but a good many seek out recognition for their knowledge and aspire to become “thought leaders” in their realm of expertise. The game for them becomes accumulation of likes, followers, tweets, retweets, upvotes, blog comments, or whatever other social sharing gestures build their reputation. For professionals trying to carve out a place for themselves in a crowded competitive marketplace, that recognition is more than just an ego booster — it builds credibility and potentially attracts more clients, more business, more money, new recruits, and so on. Influence through knowledge can certainly help open many doors to new opportunities (e.g., projects, partnerships, book deals, speaking engagements), but a sense of responsibility should come along with it as well. Is it more important to be known for knowing and to capitalize on that, or to help others do better with the hope that they’ll pay it forward when others need help?

Knowledge as Advantage

The value of knowledge can be so great that its ownership creates a power dynamic between knowers and seekers. In finance, law, health care, and other professions, seekers hire knowers with substantial expertise in those areas to help solve a problem or take advantage of an opportunity. Of course, with more knowledge and experience comes a higher price tag for obtaining or using that knowledge to get a specific result. When there’s a value-for-value exchange, and the knowledge received/used and the outcome agreed upon more or less match the amount paid, all is well. But when knowledge is withheld for personal gain and others’ loss or provided at a higher cost than what is reasonable, problems arise. Information asymmetry, when someone possesses more or better information than another in a given situation, is often used to gain advantage over those with little to no information (or even awareness that information exists). Buying a home or a car, deciding on health care tests and treatments, making investments, and many other everyday situations put seekers at a disadvantage when they place trust in a professional who has the power to disclose or withhold critical knowledge about a transaction that could tip the scales one way or the other (there are varying degrees of severity to this, of course). Every day, people fall prey to “contrived ignorance” (for lack of a better term), until the fraud is exposed, the asymmetry is corrected, and the game of deception is over.

Changing the Game

There’s been plenty of attention focused on the sciences, arts, and crafts surrounding data and information, and for good reason; since understanding-focused disciplines emerged and began formalizing, much valuable knowledge has been created to guide the process of turning raw data and content into not just a comprehensible/usable form, but a dynamic, adaptable, and personally meaningful form. At the same time, progress in optimizing information for the eye and brain still needs to account for the ways people — both seekers and knowers — think, feel, and behave.

Cognitive neuroscience is shedding more and more light on perception, memory, and other mechanisms of understanding, but the psychological dimensions of communication and the role of information design in human behavior remain under-explored. We’ll always need insights from the sciences to inform the “why,” “what,” and “how” of information, and we’ll hopefully answer more questions about the “who,” but each of us defines our own role and rules of engagement with others in knowledge exchange situations: when faced with the pressure of finding a solution from an outside source or when presented with the choice of how much knowledge we’re willing to share and how to share it.

Despite the reality (that is, intractability and stubbornness) of human nature, life would be a lot easier in many respects if we did away with the game of knowledge. No winners or losers. No wasted time and energy. No status-seeking. No profit-seeking. Just cooperation and collective engagement in advancing understanding.

*The role of data is important to acknowledge, but it’s just a raw material, an input that takes effort and skill to collect, analyze, and make usable. Information, next in the DIKW continuum, still requires some work to bridge what is presented to what is needed. Knowledge, as it is used here, focuses on that which has immediate use, without filtering, processing, or any extra work needed on an individual’s part.

Copyright © 2015 Michael Babwahsingh