Showing posts with label psychology. Show all posts

Tuesday, January 31

Writing Stories: Stranger Than Fiction

There is a childhood punishment that the protagonist of my forthcoming debut novel describes. One of the beta readers didn't like it. She called it silly, disbelieving it would ever happen.


The irony is that I borrowed it from real life. No, the novel isn’t real life. It’s a work of fiction. But as most writers will tell you, we all draw on real people or events, especially those that leave physical, emotional, or psychological imprints on our lives. 


How else does one write straight, honest prose about human beings? Some of us look backward while writing forward, weaving the past into the present — even if we're grafting the event onto someone we made up, asking ourselves the whole time how the character might respond to it differently than we did. Sometimes they do. Sometimes they don't.


This is where it can sometimes be tricky as a writer. We borrow bits of this and that from our lives and reshape them into something else for other people to experience until it isn’t our experience anymore, but someone else’s entirely, someone we made up. And this is why I sometimes offer a cautionary whisper to those who might remember actual events before they read my work. It’s not them or me or you or that or what happened, I tell them. Because, well, it isn’t any of that. Except in this case, maybe. 


The childhood punishment I’m talking about really happened, and it happened to me. It was so real, in fact, I spent the better part of my twenties believing I deserved it, coping with it and other psychological abuses as a sort of joke. How bad of a kid was I? I was so bad …


When I finally had my own kids, I stopped telling the joke. It was no longer funny once I realized it was a punishment I could never inflict on my kids or any kids, for that matter. There wasn't any infraction worthy of such a punishment or even the threat of it — which four more children endured while growing up until it became a thing of legend.


The punishment I’m talking about sounds familiar to most people. It was a room restriction, common enough that The Atlantic wrote about it like a rite of passage among previous generations. Some still argue that “grounding” can be effective. Maybe so. Except for mine, maybe. 


My grounding wasn’t a weekend or week, as some might have experienced. It was a month, with the real caveat being that everything interesting was removed from my room — books, games, papers, pencils, etc. The circumstances didn’t make sense either, as it had very little to do with anything I did but a demonstration of unchecked authority. She had told me there would be consequences, so she had to follow up. 


The consequence for putting one dish out of the dishwasher away dirty was a month-long restriction. I didn't doubt her. Past experience had always convinced me she meant business. So I did what any preteen would do. I slowly, carefully, and meticulously inspected every dish while putting them away. And I felt true terror when she came in to inspect the work, slowly and randomly looking over glasses and plates and silverware.


I was so very careful, but it was there anyway. There was a water spot on one of the knives. The declaration of its finding was so fierce that it alone would have taught me a lesson, assuming there was one to teach. But it didn’t stop there, couldn’t stop there. 


The consequence had already been outlined. I would be placed on room restriction for a summer month, only allowed out to use the bathroom and for meals.


I was so angry that it never occurred to me that I couldn’t see the offending water spot, nor could I discern whether it was the knife I had put away or some other that she had plucked from the drawer. What did occur, I learned later in life, is that she had triggered a fight or flight response, and I always tended to be a fighter. 


I made a cavalier proclamation that I didn’t care about her punishment or authority. I would take my punishment like a champ, shut myself off from her wickedness, and read, draw, and play games until my vacation from her ended. Tut tut. Lay it on me. 


That’s when she delivered what amounted to a left hook I never saw coming. She told me I was too smart for my own good, so all those things would be taken out of my room too. I would be left in there with nothing except my bed, clothes, and a window to look out of from the elevated first story of our apartment. It overlooked a pond. 


Sometimes my son and daughter ask me what I did for that month. They are especially perplexed because, nowadays, a cell phone restriction can be more impactful than banishment to a room ever seemed to be. From what I remember, and I blanked out a good part of it, I imagined things.

The protagonist in the novel, on the other hand, never says. He only mentions it as an illustration of circumstance, given the book isn’t about abuse. Any psychological abuse is only a subplot, a mechanism to help people understand the boy in relation to other events in the story. 


Even so, I sometimes hope its presence in the story sparks conversation about it as it did with one of my beta readers. When people hear or talk about abuse, the word conjures images of physical or sexual abuse before emotional abuse or neglect, but those things exist too. And the wholesale destruction of someone’s self-worth carries consequences that take even longer to heal. 


If you know of someone who needs help or if you need help yourself, Childhelp can put you in touch with resources in your area. Aside from that, let's have a conversation. Stories help people learn they are not alone, even when they sound stranger than fiction.

Tuesday, July 12

Exploring Imagination: The Creativity Equation

This one study continues to surface in articles, and it always stops me in my tracks. It claims a creativity crisis in America, largely attributed to the pursuit of winning formulas over future breakthroughs.

The crisis began, they say, as American education put creative thinking on the back burner in favor of measurable rote memorization in the 1990s. Americans wanted to test better than other people at the expense of innovation — ergo, finding answers that don't exist on an answer key. The outcome has been a continual decline in education and creativity. And now we see it in other places, too — everything from business automation to book and film reboots. Almost everything seems stuck on rinse and repeat. 

Ironically, decades ago, it was our creativity and not our test scores that used to set America apart. In fact, Nobel Prize winner Richard Feynman warned us away from the rote memorization route. He suggested thinking things through instead of rehashing the past. But that's why he won a Nobel Prize. Thinking.

So what's the real problem? We're so confused about creativity nowadays that more and more people harbor an aversion to creativity. Even people who are attempting to ignite creativity in the classrooms are accidentally trying to make it an off-the-shelf commodity by denying the concept of creative genius.

The creativity equation. 

Like most things, answers to complex problems are somewhere in the middle, let's say between Project Zero, which operates on the belief that anybody can be creative, and the polar opposite, which believes relatively few individuals are born with it. More than likely, we're all born with a capacity to be creative (although some with a greater capacity than others) until society crushes our natural instinct for it.

But let's set that debate aside and look at what's new in neuroscience. Some researchers studying our brains have found that creativity is linked to two different semantic memory processes: clustering, which is related to divergent thinking; and switching, which is related to combining distant associations between concepts.

In short, aside from magic, our capacity for creativity is tied to how we develop associations between things and our ability to draw upon the broadest possible network of associations to connect and combine those ideas and concepts that are distant (unrelated stuff that fits together without being forced).
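The clustering-and-switching idea can be made concrete with a toy sketch. Treat concepts as nodes in an association network (the network below is entirely invented for illustration; real studies infer these links from word-association data) and measure how "distant" two concepts are by counting the associative hops between them — the remote connections that "switching" is thought to exploit:

```python
from collections import deque

# A toy semantic network: each concept maps to its directly associated
# concepts. Hand-built purely for illustration.
ASSOCIATIONS = {
    "ocean": ["water", "wave", "salt"],
    "water": ["ocean", "rain", "glass"],
    "wave":  ["ocean", "sound"],
    "sound": ["wave", "music"],
    "music": ["sound", "note"],
    "salt":  ["ocean"],
    "rain":  ["water"],
    "glass": ["water", "note"],
    "note":  ["music", "glass"],
}

def association_distance(start, goal):
    """Shortest number of associative hops between two concepts (BFS)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        concept, hops = queue.popleft()
        if concept == goal:
            return hops
        for neighbor in ASSOCIATIONS.get(concept, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None  # no associative path at all

# A close association takes one hop; a distant one takes several.
print(association_distance("ocean", "wave"))  # 1
print(association_distance("salt", "music"))  # 4
```

The broader your network of associations (the "expansion" step below), the more of these long, unforced paths exist to be found.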

There is more to it, no doubt, but let's go with this idea to build a real blueprint. There are five steps to supercharging creativity. And while anybody can follow them, those who start young have even more time to get it right (and with less concern for society's insistence on conformity).

Expansion. The more we learn and experience, the greater our capacity for creativity. The only way to expand our conscious and subconscious database of distant ideas and concepts is to look continually beyond our comfort zones. Dreams count too.

Immersion. More uninterrupted time invested in creative pursuits through meditation, reflection, or experience will provide more time to explore the furthest reaches of our conscious and subconscious. We need to take the road less traveled and engage our alpha frequency.

Evaluation. Great ideas seldom follow pre-existing models, so it's better to measure them based on their feasibility, flexibility, and originality. Very often, original ideas are not as compelling as their next iteration — something that's been mulled over for more than a minute.

Execution. Creative ideas have to be actionable. So, in addition to placing creativity at the forefront, we have to develop a secondary skill set to share it — writing, drawing, painting, producing, choreographing, or even implementing a system inside an organization. This is why creativity is hard work.

Vacation. Sometimes our minds need a break. The best breaks tend to be spending time in nature, improving our creative spaces, taking in some new entertainment, or otherwise busting up a routine so we can come back to the project with fresh eyes. And if you are one of those people who feel guilty doing it, just remember that vacations still stimulate the expansion of our database.

That's all there is to it, sort of. There is magic that David Lynch likes to talk about (and I love to listen to him talk about it). There is the conversation about whether ideas come from one, some, or many. And there is plenty more we can learn about the mind. That's all fine. We'll save it for another time.

Thursday, July 8

Telling Stories: The Fiction - Nonfiction Dichotomy

His question almost stumped me. While taping my appearance on Ira's Everything Bagel, award-winning broadcaster Ira David Sternberg asked me to talk about the dichotomy of fiction writers/readers and nonfiction writers/readers.

Is there a difference? I don't really think so.

Sure, some people think so. Many people feel like Beasts of No Nation author Uzodinma Iweala as captured by Miwa Messer's column on Barnes & Noble Review. Despite his first book being fiction, he always felt that historical accounts of war and biographies seemed more relevant. There were more facts to hold onto and tangible lessons on how to live a life grounded in historical relevance. 

Other people say the opposite. Chris Elder, for example, wrote a post on Bookstr a few years ago. It's titled "Why Reading Fiction is Better For Your Brain Than Nonfiction." He quotes Mark Twain to serve his point. "Truth is stranger than fiction, but it is because Fiction is obliged to stick to possibilities; Truth isn't," said Twain. He might have been on to something. Even the Harvard Business Review has made a case for reading fiction.

Then again, how much of a dichotomy is there, really? I line up a little closer (but not exactly) with writer Mark Grant, which is why the question almost stumped me. "In general, fiction refers to plot, settings, and characters created from the imagination, while nonfiction refers to factual stories focused on actual events and people," Grant writes on Book Riot, right before delving into the fact that the two often intersect.

Maybe the dichotomy is made up or, if it exists, is razor thin. 

This might be a bit too philosophical for some, but the entirety of reality is made up of stories. In fact, Homo sapiens arguably evolved from an unexceptional savannah-dwelling primate to become the dominant force on the planet because of them.

It didn't really matter if those stories were fiction or fact. All that mattered was that we learned to tell them, sharing past experiences or imagining a different experience altogether. In fact, this is the very premise of Sapiens: A Brief History of Humankind by Yuval Noah Harari and a concept I hard-baked into many communications classes and presentations over the years, including the one I delivered to the National Recreation and Park Association in 2015.

People are the only creatures on this planet that exist with a dual reality — one formed by our perception of the world around us (the desk, the chair, the window) and one that is formed entirely by abstract concepts, societal constructs, and public opinions. In short, most of our facts are not facts at all.

Maybe our brains can't tell the difference between fact and fiction either. 

I don't mean this rigidly. The brain processes sensory information and imagined information differently. We can tell a character in a novel apart from a living person. And yet, we can't deny that fiction acts as a sort of surrogate life. It can even influence how we see the world, doubly so if that fiction is packaged as real (as more and more documentaries pretend to be).

There are reasons for all this. The capacity for cognitive empathy is why we feel for fictional characters in stories, novels, and movies. So while our brains might process sensory information and imagined information differently, the sensory input of a book or film can still be mirrored by specific neurons, causing us to feel what all those characters feel or believe what those characters believe. 

One might even say it's the embellishment of fact that has made David McCullough, Adam Makos, and Brad Meltzer some of the most successful historians and biographers out there. Their storytelling ability helps bridge the gap between fact and what we fantasize about, triggering cognitive empathy and creating stronger, longer-lasting memories. 

Who knows? In some sections, they might have embellished plenty, much in the opposite way some fiction writers rip off reality on a regular basis. Or maybe I just made that up. Good night and good luck.

Postscript. Great news. The Kindle edition of 50 States will be available on July 21. Preorder today.

Saturday, October 19

Rekindling Creativity: Live, Learn, Leap

When automation drives marketing, creativity can take a back seat. There is only one problem with that. A world run by algorithms is impossibly predictable. You look up product support, and you're subjected to a series of advertisements for a product you already own; only it's broken.

Predictably isn’t only inherent in computer programming. It becomes part of our daily routines. We wake up, get ready, exercise, have coffee, take breakfast, commute to work, check email, work on priorities, have a meeting, eat lunch, take another meeting, wrap up deadlines, transport kids, have dinner, watch television, go to bed, and then do the whole thing all over. 

Sure, everybody’s routine is probably a little different, but you get the point. You have a routine, and the better it goes, the more likely you feel content. The price you pay is not being present. 

The less present we are, the more predictable our reactions become when exposed to programming. The busier we are reacting to stimuli and situations or policies and politics, the less likely we are to take actions that move our lives forward. Sure, routines can be useful, but they can also cause paralysis — in both marketing and our daily lives. The only problem is that some people grow so accustomed to contentment, they forget how to rewrite an increasingly scripted world.

Live. 

The first step toward rekindling creativity is to live with intention. Much like animals, people are hardwired to filter out unimportant details. Since we are bombarded by neural input, our brains tend to ignore the expected and notice the unexpected. This is the very reason even fitness trainers tell people to keep their fitness routines fresh.

Life is exactly like that. You have to keep changing the stimulus so your brain doesn't slip into and become stuck in sameness. Make time for weekend retreats, walk somewhere new, drive a different route, skip part of your daily routine once a week (e.g., don't open email until noon, or try a no-meeting Monday), have lunch with an old friend, perform a random act of kindness, or flip a coin to make some choices. You get the point. Do something different.

Learn. 

I have always been a lifelong learner. I read books. I go to events. I listen to speakers. I take online courses. My lists for inspiration are endless. You don't have to start with any of them. But I did want to share that it was through one of the venues that I discovered the genius of David Lynch. 


He ties living and learning together perfectly. His concepts of capturing ideas literally changed my life. The two-and-a-half minutes I'm sharing here will introduce you to a sliver of his understanding of consciousness. I'm calling out the time for a reason. Most people tell me that time famine is the number one reason they avoid learning. You have to find the time. I listen to audiobooks when I drive anywhere. Most TED Talks are only 18 minutes long. The very notion that you cannot afford to invest five or 20 minutes to improve yourself should be an indication that you probably need it more than anyone.

Leap. 

Creativity isn't only about input. It's about output. In fact, the root meaning of the word “creativity” is “to grow.” To truly benefit from creativity, you have to turn new and imaginative ideas into reality. The idea doesn't only apply to arts or marketing. It applies to education. It applies to science. It applies to IT. It applies to business. It applies to finding a sense of purpose in our lives. 

One of the recent changes I've made in my life is to finally set time aside to work on writing fiction. I originally set a goal of writing one short-short (a story of 50 to 1,500 words) once a week and a short story (3,500 words or more) once a month. The leap to do so came from author Joyce Carol Oates, whose class reminded me that feedback helps fuel writers. Right now, I share these stories at byRichBecker on Facebook.

More importantly, the infusion of creativity in my life has awakened a passion to produce great things. While I've always enjoyed being on the leading edge in my field, writing fiction has elevated my work in advertising and marketing. It's made me more open in my observations and in making connections with the world. It's increased my sense of purpose and added excitement to everything I do.

And the reason I want to share this has very little to do with me and everything to do with providing some evidence for you. If you really are looking to rekindle your creativity, start by turning off those distractions and making small changes in your life, learning more about those things that interest you, and then transforming the ideas that start to come your way into action. Give it a try. Try it for two weeks (or a month). And if you wouldn't mind, drop me a note and tell me how it worked out for you. I'd really love to know.

Friday, July 14

Writing Across Communication: Writing For Tomorrow

The writing you read today won't be the communication you need tomorrow. In a world where content can appear on any surface or no surface at all, providing consumers with real-time, intuitive assistance to find the right product, improve performance, or manufacture reality will require a different kind of thinking, planning, and promoting. The boundaries and barriers are gone.

Content will need to be versatile, portable, multimodal, and improve the consumer experience. Storytelling alone won't be good enough. Many stories will have to be told by the consumer, drawing upon a non-linear array of data capable of delivering visual, aural, written, or kinesthetic content based on the platform they are using and their preference for learning, experiencing, and making purchases.

Logical or emotional, solitary or social, the words we write tomorrow will be blueprints that appreciate no one person is really the same — even if there are a few things that never change.

A sneak peek into the future with a predictive deck. 

When I needed a new deck to wrap up my final Writing Across Communication class last spring, I set an objective to help my writing students appreciate the future as well as a few constants that have always been part of human communication. It made for a worthwhile exercise in bringing consumer psychology and strategic communication together. They really do belong together.



While many writers spin their wheels trying to find the right way to spin their story, relatively few remember that most communication aims to motivate people. So if you don't know what motivates them, you are only operating with one-half of a two-part equation between the sender and receiver.

Three primary drivers for motivation. 

• Intensity of need or desire
• Perceived value of goal or reward
• Expectations of individual or peers

In order to start reconciling these drivers, it's generally a good idea to remember that humans are the only creatures on the planet that form perceptions based on objective and conceptual realities. We're also the only creatures who possess a capacity for cooperation that is both flexible and scalable.

As technology continues to blur the lines between these two realities, people will likely become increasingly responsive to conceptual influences, making the communication of tomorrow especially potent if the message is sent across multiple delivery methods, repeated across a multimodal spectrum, and delivered as non-linear content that allows the user to self-select the experience.

When done right, it will provide even more opportunities to change behavior, change perception, and change attitudes toward just about anything that doesn't oppose individual or cultural core values. Although even those are subject to change when communication is created from precise objectives.

Writing Across Communication will be available again at the University of Nevada, Las Vegas this fall. The class includes eight sessions from Sept. 21 through Nov. 9. I am currently developing an online version of the course for people outside Southern Nevada, independent of the university.

Wednesday, August 26

Psychology And Neuromarketing Can Be Fallible. So what?

There has been plenty of buzz about the Reproducibility Project, which aimed to validate about 100 psychology studies by attempting to reproduce them. Marketers should take note.

For those who place their faith in scientific-like testing (and big data), the findings of the Reproducibility Project ought to be astonishing. Two-thirds of the original studies tested proved fallible, and even those that could be replicated demonstrated irregular statistical variations. Specifically, the magnitude of the replicated effect was frequently half that of the original finding.

Never place too much faith in any marketing formula. 

Sure, there is plenty to be gained by running A/B tests in an attempt to convert your business thinking from "we think" to "we know." There are many successful examples. But just because the results of testing turn out one way or another doesn't ensure success. A/B testing isn't a sure thing in marketing.
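To see why A/B test results deserve skepticism, it helps to look at the arithmetic behind them. Here is a minimal sketch of a two-proportion z-test, the standard way to compare conversion rates between two variants (the traffic and conversion numbers are invented for illustration):

```python
from math import sqrt, erf

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B conversion experiment.

    Returns (z, two_sided_p) using the pooled normal approximation.
    Pure stdlib; no external dependencies.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical numbers: variant B converts at 12% vs. 10% for A,
# with 1,000 visitors each.
z, p = ab_z_test(conv_a=100, n_a=1000, conv_b=120, n_b=1000)
print(round(z, 2), round(p, 3))  # z ≈ 1.43, p ≈ 0.153
```

A two-point lift that looks decisive on a dashboard comes out nowhere near the conventional 0.05 threshold at this sample size, and even a "significant" result can be an overestimate that shrinks on replication, exactly as the Reproducibility Project found.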

The truth is that we must stop treating single studies as unassailable truths, especially when other variables could be influencing the outcome of any finding, outcome, or assumption. True scientific thinking, after all, comes with a critical mindset rather than a yielding one. And we need to be more critical now than ever before, especially as people attempt to manipulate our thinking daily.

You can find evidence everywhere. Journalists are more likely to write attention-grabbing narratives first and then find examples to fill in the blanks than ever before. Scientists are more likely to build studies based upon biased theories than rely on objective observations. And marketers, whether they admit it or not, generally attempt to validate their work more than they produce better outcomes.

And no, it isn't always intentional. Anyone who has ever gone to an eye doctor only to be prescribed an inferior prescription knows how easy it is for mistakes to happen. No matter how meticulous the doctor or technician might be in the office, you eventually have to try it in the real world.

It recently happened to me. In the office, it seemed monovision — wearing a distance vision contact in one eye and a near vision correction in the other — was a suitable option for my slight presbyopia. In the real world, it didn't work at all. Too much of my interaction with the world relies on intermediate vision for monovision to be effective. The same thing can happen in scientific studies.

There are reasons humans are mostly unpredictable. 

If you truly want to understand psychology and sociology as they apply to marketing, you have to make a real effort to understand humans. First and foremost, you have to understand that humans are the only creatures on this planet that form flexible and scalable cooperatives based on abstract concepts.

Yuval Noah Harari, author of the international bestseller Sapiens: A Brief History of Humankind, is especially intuitive on this point. As he explains it, bees and ants can form scalable cooperatives but aren't flexible in their ability to change their social structure. Whereas chimps and dolphins are flexible in how they cooperate, they are only able to do so in relatively small numbers.

The reason, it seems to me, is that humans are also the only creatures on this planet to operate with a dual reality, a perceptual concept studied in depth by Donald Hoffman, professor of cognitive science at the University of California, Irvine. In sum, humans perceive an objective reality (what is true) and a conceptual reality (what we accept as truth) at the same time.

For example, the money in your wallet is a piece of paper. The concept that it has value is a fiction that we have collectively agreed to accept as truth. And, to be clear, it is this dual reality often discussed by Hoffman that provides us our unique ability to form flexible and scalable cooperatives.

Marketing and communication, at their core, have only one purpose: to change behavior. As such, marketers usually try to change behavior by drawing attention to an objective reality or by attempting to elevate (or diminish) a conceptual reality. What makes this especially interesting is that over the last decade, with the advent of social media (and likely to become more prevalent with the rise of augmented and virtual reality), marketers have spent more and more of their time targeting the conceptual reality.

So what? The greater the emphasis on conceptual reality, the greater the unpredictability of testing because humans, throughout history, have proven to be consistently inconsistent. And in knowing this, maybe it is time to treat your approach to the science side of marketing as an exercise in adjustment and not in the collection of unassailable truths that will one day be proven false. Good luck.

Wednesday, July 1

What Marketers Really Need To Know About Silly Cat Videos

When describing the state of the Internet today, it's all too easy for marketers to see silly cat videos as the polar opposite of mental stimulus (myself included). And in doing so, marketers miss the point.

The popularity of silly cat videos has nothing to do with the type of content people want to consume. Their popularity has everything to do with how people want content to make them feel.

New research supports this supposition. After surveying nearly 7,000 Internet users about their Internet cat consumption, researcher Jessica Gall Myrick uncovered the motivations behind it and the emotional benefit it delivers. People mostly watch cat videos as a means of mood management. In fact, even those who use them as an excuse to procrastinate tend to temper any post-viewing guilt with feel-good fuzziness, describing their post-viewing mood as hopeful, happy, and content even if they felt anxious, annoyed, or sad before watching.

Marketers need to pay better attention to how they make people feel. 

There is no shortage of causes that deserve consideration, topics primed to produce social outrage, or advertising that aims at creating feelings of scarcity (ads that aim to create feelings of fear, inadequacy, or make people feel unknowledgeable). Most of it, not unlike media coverage, is commonly negative or neutral. The net outcome is not surprising — it makes people feel bad or, more commonly, nothing at all.

Sooner or later, you have to wonder: Is the marketing content your organization produces adding to the anxiety or helping make people hopeful? Are you aligned with brands that promote happiness like Apple (innovation), Coke (happiness), Lowe's (empowerment), and Amazon (simplicity), or struggling with ads that aim to demean, disparage, or attack others? Do you leave people wondering why they need your product, or do you leave them with the sense that your product or service somehow makes things better?


Sure, there are cases where negative advertising can work, especially if it is designed to capitalize on contempt for a perceived adversary. But such tactics are time sensitive, tied to cultural perceptions such as the decades-long run of "dumb dad" ads. And social media makes for several splendid fails every year.

Don't get me wrong. The point here isn't to scrub away every rough edge if one fits. The point is to ask yourself what emotions your content is or isn't tapping into and to make the appropriate adjustments, in much the same way Charles Revson once did as the pioneering cosmetics executive behind Revlon.

"In the factory we make cosmetics. In the store we sell hope," Revson once said. 

Hope and happiness are powerful promises, ones that underscore many successful brands. They also cut to the quick of what motivates people in B2B and B2C spaces. Consumers want to make their lives and the world around them a little better. So do business owners. All of them might have a different take on which objectives best accomplish those overarching goals (comfort or exhilaration, opportunity or security), but almost all of them are rooted in hope and happiness.

When companies and content creators can't deliver on either, people turn to more than two million silly cat videos (2014) that have chalked up more than 26 billion views. Why? Not because marketers need to load their stream with silly cat videos but because these cats can deliver what most content misses — a few moments of mood managing happiness (even when these heroes look a bit grumpy).

Wednesday, April 22

How Much Marketing Has Become Psychological Trickery?

One of the first lessons learned in advertising is that most purchasing decisions are made based on emotional impulses and irrational conclusions driven by our dreams, hopes, fears, and outrages. But for all marketers knew about advertising, it was social media that capitalized on the immediacy of it.

Instant gratification and chronic impatience have shortened not only attention spans but also the ability to make educated decisions. As a result, the fundamental market has changed, with consumers who are generally more anxious and angry as the world feels less controllable and harder to understand. They are more prone to react with instinct over intelligence, favoring the short term over the long term.

Five quick examples of psychological impulses shaping perception right now. 

• Vani Hari, a.k.a. the "Food Babe," has risen to become a popular food blogger for her denunciation of chemicals in food, but chemistry professor Michelle Francl has received an equal amount of attention for denouncing the decrier. Right or wrong, the initial attraction capitalized on our fear of the unknown, while post-debate believability largely centers not on the facts but rather on whether people feel they are "like" the Food Babe.

• Poor performance in schools is frequently attributed to socio-economic disadvantages. While there is some truth to that, new studies suggest the labels meant to "save" these students can also be counterproductive. Students perform lower on tests when they are overpraised, underchallenged, or merely reminded that they are disadvantaged. So the old wisdom holds true: we are what we think we are.

• The Guardian recently asked why people keep electing the least desirable politicians. The answer was psychology. People tend to vote for whoever simplifies the choice, demonstrates the confidence to deliver on a promise, and remains someone with whom they can relate. And what happens when nobody does? People stay home, which may explain chronically low voter turnout.

• Most people have formed opinions about the Baltimore riots based on visual content more than on their understanding of the circumstances behind them. Which visuals they were exposed to (rioters vandalizing stores or the peaceful side of the protests) and when in the timeline of events they were introduced to the story largely dictate their opinion of it.

• Affirmation and frequency illusion work hand in hand in the subconscious. Not only do people see what they expect to see, whether or not it really happened, but they often believe what they see based on increasing frequency — even when an improbable increase in frequency is simply the result of noticing something for the first time. The sense of validity is compounded when the frequency comes from varied sources.

The packaging has become the product, for better or worse, in marketing. 

Content Marketing Stats from HubSpot
With trust in experts failing and the appetite for visual content increasing, people want to become more self-reliant simply by processing a mile of information to the depth of about one inch. In other words, they want someone else to study one inch of information a mile deep and distill any rationale into a soundbite that can be voted on, quickly and efficiently, based on little more than gut instinct.

The only problem with hardwiring the brain to work this way in tandem with modern technology is its reliance on the assumption that the source has our best interests at heart. Mostly, they don't. The majority of content produced today comes from marketers and affirmation journalists, who exhibit varying degrees of bias.

That's not to say marketers are necessarily tricksters. It might be more accurate to say they've become more savvy at meeting these decision-making needs by distilling information into bite-sized, simple comparisons that elicit an immediate emotional response. Right. "You won't believe what happened next" headlines work for a reason. So do easily digestible graphics that look authoritative and possibly objective.

Never mind that the content was compiled by an intern on the go. People are too busy rewiring their brains, with potentially disastrous results, to dig deeper into the issues or even the sources. As long as the marketer touches an emotion, narrows the choices, expresses confidence in the data, and delivers on any promise to somehow improve the purchaser's experience, people will buy the product, thought, or ideology. Sometimes, they even buy two.

Wednesday, February 11

The Psychology Of Facebook Can Get A Little Bit Crazy

As much as marketers hold on to hope for the promised land of big data — one algorithm to rule them all and in the darkness bind them — the information they covet remains convoluted. Big data can't crack what consumers don't share because algorithms play by the program rules and people never do.

One such study making the rounds even proves the point in its attempt to demonstrate the opposite. Despite some headline-capturing claims that Facebook "likes" can assess your personality just as accurately as your spouse (and better than your friends), most people were miffed when they accepted an open invitation to take the algorithm for a test run. It seems that results vary.

The algorithm developed by Michal Kosinski at the Stanford University Computer Science Department, for example, pinned me down as a 25-year-old single female who is unsatisfied with life (among other things). Mine wasn't the only data fail among friends who tried it. The model missed and missed and missed. There are reasons why, with mine being the easiest to decipher.

My personal usage of Facebook is best described as treating a few minutes out of every day as casual Friday. My connections are mostly limited to friends, family, and long-time online acquaintances. My principal activities include catching up with what they are doing, sharing stories about my children, and posting the occasional baked-goods picture. Why? Because I don't really do that anywhere else.

I also make a conscious effort to avoid controversy, not because I'm "agreeable" but because that social network isn't a place I want to invite deep discussion, debate, or any drama. And what that means is, in sum, that only a sliver of my personality comes across on Facebook. For others, I'm told, the assessments are wrong for a different reason. Not everyone is completely honest on Facebook, not all profiles are complete, and people "like" different pages and things for reasons you never expect.

Why big data models miss the mark with psychological stereotypes. 

Beyond the most obvious — that any algorithm is only as good as the input it is allowed to compile — there is always unexpected trouble when stereotypes are introduced into a psychological test. According to the aforementioned model, the algorithm assumes people who like "Snooki" or "Beer Pong" are outgoing and people who like "Doctor Who" and "Wikipedia" are not. Men who like "Wicked, The Musical" were defined as more likely to be homosexual and those liking "WWE" or "Bruce Lee" were not. Those who like "the Bible" are said to be more cooperative while those who like "Atheism" are competitive. And so on, and so forth.
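To make the bucket idea concrete, here is a minimal toy sketch of how a likes-to-traits model might score a profile. The page names come from the examples above, but the weights and the scoring logic are invented for illustration; the actual model was trained by regression over millions of observed likes and personality surveys.

```python
# Hypothetical likes-to-traits scorer. Weights are invented for
# illustration, not taken from the Kosinski/Stillwell model.
TRAIT_WEIGHTS = {
    "Snooki": {"extraversion": 0.4},
    "Beer Pong": {"extraversion": 0.3},
    "Doctor Who": {"extraversion": -0.3},
    "Wikipedia": {"extraversion": -0.2},
    "the Bible": {"agreeableness": 0.2},
    "Atheism": {"agreeableness": -0.2},
}

def score_profile(likes):
    """Sum the per-trait weights of every liked page.

    Pages the model has never seen contribute nothing, which is one
    reason a sparse or atypical profile produces a bad prediction.
    """
    scores = {}
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] = scores.get(trait, 0.0) + weight
    return scores
```

Even this toy version shows the limits the post describes: the scorer can only stack stereotyped weights into buckets; it has no way to represent intent, context, or the reason a page was liked.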

Says who? Says some of the data that came from the myPersonality project designed by David Stillwell, deputy director of the Psychometrics Centre at the University of Cambridge. Between 2008 and 2012, myPersonality users agreed to take a survey, which asked participants about their personal details and personality traits. Their answers were then assigned to buckets such as openness, conscientiousness, extroversion, agreeableness and emotional stability. (The new test delivers those results too.)

But no matter how those results are derived, the best an algorithm can do is capture a data point and put it in a bucket. It has a much harder time recognizing intent, realizing something a human might notice like, let's say, that Joey isn't pregnant but his cousin June might be. The baby shower is coming up and he has been searching for and liking pages that might give him an idea of what to buy.

Stripped of any overreach that would paint Joey as an expectant mother, there is one area where analytics sometimes succeed. It might recognize that Joey is in the market for some baby gifts (assuming this deduction is made before and not after he finds a gift). Or perhaps, if Joey has also liked certain television shows, one might deduce that he would be interested in similar shows. Or perhaps some data might be employed to fine-tune the tone of a message, much like direct mail writers once did using PRIZM research data.

But even then, minor research advantages were tempered by what used to be Rule No. 7 in advertising. It's the rule that reminded commercial writers that people tend to lie. They are predisposed to "like" things (or even "list" things in a Nielsen ratings diary) that make themselves look a little brighter, better, smarter, or savvier regardless of what they really watch, like, or do. They also tend to share more positive life events than negative ones, connect and disconnect with people more easily, and like pages that friends recommend because they think they are doing their friends a favor. Maybe.

The irony in that? Some studies suggest that social networks can unintentionally contribute to depression, indicate anxiety related to relationship insecurity, and become as addictive as cocaine.

And while all three studies might provide an interesting read, marketers could probably learn more about their markets from the SizeUp tool provided by the Small Business Administration; from any number of other affordable data sources like SEC filings, BizStats, or even the United States Census; or from proven research methods such as consumer interviews, focus groups, and tests with a control group.

Does that sound too time consuming, cumbersome, or expensive? Then just wait until you see how expensive a product or service launch can be based on social network data alone. It's a little bit crazy.

Wednesday, November 5

Yes, Virginia, There Are Impassioned Objectivists

Anytime I mention "objective journalism," someone contests the concept. They consider it an idealistic pipe dream. They claim that all journalists are biased. And they say it lacks the passion of advocacy journalism. But more than all that, they say objective journalism is dead. Get over it.

Sure, there is some truth to the statement that objective journalism is dead, but we mustn't mistake its current condition for evidence that the idea is boorish, flawed, or impossible. As defined, objective describes a person or a judgment not influenced by personal feelings or opinions when considering and representing facts. And it's a quality communicators ought not run from.

Objectivity comes with honesty and maturity. Grow up already.

The real problem, it seems, is that objective journalism allowed itself to be saddled with ideas that have nothing to do with objectivity — traits like fairness, indifference, and perfection. Specifically, people expect that journalists (especially those who strive to be objective) must listen to both sides, transcend human frailty in hearing them, and then deadpan the facts for the public. But that's not it.

A working definition of objective journalism is more akin to how Iowa State journalism professor Michael Bugeja defined it: “Objectivity is seeing the world as it is, not how you wish it were.” The idea is that the communicator is willing to commit to the pursuit of truth, not what they hope is true.

People strive to be objective every day. A manager might like one employee better than another but promote the one with stronger skill sets. A coach might play the more talented player over their own child for the good of the team. A scientist might prove his theory wrong after reviewing empirical evidence. A judge might make a ruling that is right but weighs heavily on his or her heart.

So why would journalists somehow be incapable of striving to be objective (unless they don't want to be) where others have demonstrated the ability to succeed? It seems to me that all it would take is someone becoming impassioned to find the truth rather than promoting their own agenda or whatever agenda they have subscribed to. And it's this passion for truth, rather than propping up fragile brands or frail ideologies, that deserves our admiration.

Forget balanced. A journalist might glean insight from different perspectives but truth doesn't take sides. Forget deadpan deliveries. Objectivity doesn't require anyone to feign disinterest in the face of outrage. Forget unconscious bias. The goal was never to transcend being human but merely to develop a consistent method of testing information, considering the evidence, and being self-aware of any personal and cultural bias. And all of these ideas were born out of a need for objectivity.

As much as I have a fondness for Hunter S. Thompson, who had plenty to say about the objective journalism of his day, the lack of it enslaves us: the only "truth" that prevails is the one uttered with more frequency, more volume, and a more passionate will. And eventually, when truth is no longer valued in favor of that "truth," it seems to me we will finally find affirmation media to be an insult to our intellect and our own sense of evidence.

Objective communication isn't limited to journalism. Stop saying yes. 

The Pew Research Journalism Project identified nine core principles of journalism, but I've always been partial to the idea that objectivity adheres to empirical standards, coherence standards, and rational debate. Empirical standards consider the evidence. Coherence standards consider how it fits within the greater context. Rational debate includes a diversity of views, but only gives merit to those views capable of meeting empirical and coherence standards.

In much the same way objective journalists strive to look out for the public interest, professional communicators — marketers and public relations practitioners — better serve organizations (and the public) by applying objectivity to their situational analyses and measurements of outcomes. The stronger communicator is always the one who is objective, as opposed to those who only aim to validate their actions or affirm a client's, executive's, or decision maker's perceptions by saying yes.

Can we ever be certain? The answer is mostly no. While we can tear apart a baby's rattle and see what makes noise inside, we cannot see into the hearts of men and women to guess at their intent before there is any evidence of action. The best we can hope for is that those who have no intention of being objective wear the proclamation on their sleeves while others are given the benefit of the doubt until they prove otherwise. Let the truth lead for a while and see what happens.

Wednesday, October 22

What If The Only Hurdle Is What You Think?

A few nights ago at her practice, my daughter (age 8) and her softball team (8U, ages 8 and under) were challenged to a base-running relay race by their sister team (10U, ages 10 and under) in an older division. They readily accepted despite the odds.

Two years makes a big difference. Most of the girls on the 10U team had a 12- to 18-inch height advantage and the stride to go along with it. Even with a few "accidental obstructions" by coaches to even out mismatched segments of the relay, it was pretty clear which girls would come out on top.

Or maybe not. The race was relatively close in the end, with the team effort being only part of the story. While several 8U girls held their own, one of them gained ground during her segment without any coaching assistance or any easing off by the older girls. She was determined to win her heat.

And then she won it. The size difference didn't matter. The age difference didn't matter. The difference in life circumstances — having been born three months early and enduring juvenile rheumatoid arthritis for going on 6 years — didn't matter either. She won her heat from the inside out.

About 10,000 people a month Google the phrase "am I ugly."

Meaghan Ramsey of the Dove Self-Esteem Project wasn't the first to bring this disturbing trend to light, but she has been one of several voices who have helped raise awareness about self-esteem. Specifically, Ramsey has found a correlation between low body-image confidence and lower grade point averages and at-risk behaviors (drugs, alcohol, sex), and these correlations are heightened by the baked-in pressure of social networks to earn friends, likes, and opinions via frequent feedback.


Ramsey contends that our increasingly image-obsessed culture is training our kids to spend more time and mental effort on their appearance at the expense of the other values that make up one's self-concept. It's a good point, especially when you consider the depth and damage of crowd-sourced confidence beyond physical appearances.

Just as low body confidence is undermining academic achievement among students, low social confidence is undermining people well into adulthood. It's increasingly problematic because our society is adding layers of subjective, superficial qualifiers that are determined by crowd-sourced opinions and visible connections. Specifically, superficial counts like followers, likes, retweets, and shares, which have nothing to do with our value as human beings, are being used to validate people's perceptions of others as well as their own concepts of self.

The key to more meaningful outcomes transcends image. 

The overemphasis on image, popularity, and crowdsourcing in social media has a long history of undermining good ideas, worthwhile efforts, and individual actions. And the reason it undermines our potential as human beings is related to how we inexplicably convince ourselves that we are not pretty enough or smart enough or popular enough to be valued or liked or loved.

If appearances and opinions held true, then my daughter would be the least likely girl on the 8U team to become the fastest runner. But fortunately, no one ever told her that superficial appearances or history should somehow hold her back. So when I think about her, I always want her to be able to apply this same limitless attitude to her potential aptitude whether it is academics, athletics, or attractiveness (to the one and only partner who will ever really matter).

Wouldn't you if it were your daughter, sister, girlfriend, wife, or mother? Wouldn't you if it were your son, brother, boyfriend, husband, or father? Then maybe it's time we all took the effort to let potential not perception prove our realities, online or off. Good night and good luck.

Wednesday, September 17

Does Your Content Marketing Consider Customer Complexity?

As much as marketers are working to understand their customers as data points, many of them still need to understand their customers as real people. That is the fundamental challenge with big data — retaining the ability to see the unique individual within the throng of the crowd that it tends to track.

When you separate out one individual from the crowd, even as a thought exercise, it's easier to ask relevant questions. Who is this person? What do they want or need to know? How will they make their decision? What content would they be most interested in receiving? How will they use it? 

With the exception of this space (which is driven by a different purpose), I ask myself these questions every day. And when the opportunity presents itself, I spend time with the people we want to reach. 

People are infinitely complex and you're fooling yourself to think otherwise. 

If I have learned anything in advertising and marketing over the last 25 years, it's that consumer profiling just isn't good enough. While it can be helpful in capturing a snapshot of behavior and communicating it to other marketers or executives, it tends to dismiss the complexity of people.

Understanding people with any sense of depth requires a culmination of layered analysis that considers a dozen different aspects at once. For the purposes of illustration, pretend there are three.

Personality (Core). When you work with so many diverse marketers, you become familiar with all sorts of profiling tools that are designed to better understand people. One of the most useful was considering the four personality types (or nine if you prefer) that identify common foundations people operate from. 

For content creators, knowing that controllers need to know the bottom line, analyzers want all the details, promoters are looking one step ahead, and supporters want to know how it benefits everyone else can have a profound impact on content structure.

Learning (Input). As recently included in a guest post published by long-time friend and marketer Danny Brown, people consume information differently. In education, for example, learning styles include: visual (see), auditory (told), kinesthetic (touch), and language (read/write). 

Marketers who know it are much more likely to consider a multimedia approach to their digital marketing efforts. Multimodal communication tends to resonate better and benefit from longer recall.

Behavior (Output). While not everyone appreciates it today as much as they did when the content was fresh, Forrester Research did an excellent job of mapping out its Social Technographics model (or what many people have come to know as the social media ladder). The ladder largely breaks down participants by the activities they are most likely to engage in online.

These include content creators, conversationalists, critics, joiners, spectators, and inactives (or passive consumers). How these different groups stack up in the data is interesting, but what is more interesting (from my perspective) is how these communication pools choose to consume, adapt, share, and build upon the content they are exposed to (if at all).

Considering such dynamic individualities makes marketing invaluable. 

Creating content is one thing, but creating it (and embedding it within a context of diverse communication) so that it appeals to various personalities who consume information differently and respond to it differently is something else altogether. If you want maximum attraction, retention, and action, then the real challenge becomes one of content agility (covered in an upcoming post) delivered at the right time.

Naturally, this isn't exclusive to online marketing and content. Real communication is much more immersive and seeks to reach people at the right time in the right environment. And considering how challenging that can be, it only makes sense to make sure the content sent makes sense for everyone.

How about you? Do you have any layers or filters that you have found useful over the years? If you do, I would love to know. The comments are yours.

Wednesday, July 23

There Is No Such Thing As An Easy A/B Lunch

"It is perhaps an all-too-human frailty to suppose that a favorable wind will blow forever." — Richard Bode

In the context of his book, First You Have To Row A Little Boat, Bode was writing about how almost impossible it is to imagine what it might be like to be caught in a dead calm while there is a breeze blowing hard against your sail or in your face or on your back. It's almost impossible to imagine it because our brains are mostly predisposed to see the most fleeting moments as infinitely constant.

When things are good, we think the honeymoon will never end. When things are bad, we readily embrace the pain as permanent. Never mind that most of us have lived long enough to know that the evidence doesn't bear either infinity out. We're generally inclined to indulge ourselves in deception.

Social media is not a science. It only feels like one.

Sure, some applications of social media seem to fall under the banner of science. Marketers are indeed in the business of observation and experimentation. They do attempt to study the structure of online communities and the behavior of people on a one-to-one, one-to-some, and one-to-many scale.

Some applications even attempt to apply scientific method to the mix, with A/B testing among the most prominent manifestations. There is only one problem with it. While A/B testing sometimes leads to a product development or marketing breakthrough, the operative word is sometimes.

The wind doesn't always blow in a favorable direction, and sometimes it doesn't blow at all. More and more data scientists are attempting to decipher public manipulation, but they frequently fail to appreciate that data has the propensity to manipulate its handlers too.

The biggest problem today, it seems, is that many data scientists have studied statistics but relatively few are practiced at applying scientific method in the physical or natural world (or psychological and sociological worlds for that matter). If they were, they might better appreciate the incongruity of choice — six studies of which were recently shared in an Econsultancy article by Ben Davis.

While some studies are stronger than others, a fair encapsulation of the research concludes that the choices offered, the number of choices offered, the order of the choices offered, and the order of emotional triggers all influence A/B testing. Or, in other words, if A and B both suck, you prove nothing at all.
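For what it's worth, the arithmetic behind a basic A/B test is simple enough to sketch. The following two-proportion z-test uses invented numbers to show how two equally weak variants can fail to produce a significant winner, a result that says nothing about whether either option was any good:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    Uses the pooled normal approximation, so it assumes
    reasonably large samples.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical test: variant A converts at 1.8%, variant B at 1.2%.
# Neither rate is good, and the difference isn't significant either.
z, p_value = two_proportion_z(18, 1000, 12, 1000)
```

And even if the p-value had dropped below 0.05, the test would only have crowned the better of two bad options, which is the point: statistical rigor can't rescue weak choices.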

If you ask people whether they like big keys or little keys on a cellular phone, no one innovates touch-screen technology. If you ask people which cola they like better during an A/B experiment, someone will eventually rediscover the recipe for New Coke. If you always listen to test screenings, every movie will have a happy ending.

But those examples are only the most straightforward research failures. Some hiccups are caused by the most subtle changes. The order information is presented (shoes before or after a new dress). The timing of an interruption (when most people are online or when they are more receptive to share). The influence of the last destination visited (did they leave feeling elated or aggravated?).

There is no such thing as an easy lunch in marketing.

There are plenty of people who will tell you otherwise, but it's simply not true. Marketing is not a science, even if marketers love to sell science. It can be an asset but only if you think and think deep.

A few years ago, I had the privilege of working on franchise collateral for Capriotti's Sandwich Shop. I can't really speak to what they are doing now in terms of marketing, but I still love their sandwiches.

The challenge they had, and probably still have, had a lot to do with psychology. Specifically, one of the questions that needed to be asked was how they could become part of the lunchtime decision-making process. The answer isn't as easy as you think.

When most people at the office make decisions about what to have for lunch, the first A/B choice they create is fast food or sit-down. The primary influencer at this stage is time, but it quickly turns toward taste. If fast food wins the consensus, then most people will run down the big-brand list (McDonald's, Burger King, Wendy's, etc.) and make a decision based on preferences, experiences, and proximity.

Interestingly enough, KFC only gets a shot if someone says they don't want a burger. And other alternatives, like Subway, are added to the mix if someone insists on no fast food (a position thanks mostly to their Eat Fresh campaign). So where does Capriotti's fit?
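The decision flow described above can be sketched as a toy branching model. The branch order and brand names come from the anecdote, not from any research data:

```python
# Toy sketch of the lunchtime decision flow described above.
# Branch order and brand names are illustrative only.
def lunch_choice(time_pressed, wants_burger, insists_no_fast_food):
    if insists_no_fast_food:
        return "Subway"  # alternatives only enter on an objection
    if time_pressed:  # the first-round choice: fast food vs. sit-down
        if wants_burger:
            return "McDonald's"  # big-brand list: preference, proximity
        return "KFC"  # non-burger fast food only gets its shot here
    return "sit-down restaurant"
```

Notice that a shop that is neither fast food nor sit-down never appears in any branch, no matter how the inputs vary. That is the structural problem no amount of A/B testing on taste can fix.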

A/B testing convinced some people that it fits everywhere because it consistently wins on taste, but that really wasn't true. Sure, it won with loyalists, with catering, or as a wild card, but not where it needed to. To capture the average lunchtime customer, it comes down to the first-round choice: fast food or sit-down? This sandwich shop is neither.

My solution was a bit different from that of the marketing firm that had contracted me onto their team. While they wanted to push award-winning sandwiches, I wanted to reframe the front-end choice as "there is lunch, or there is Capriotti's," thereby pre-empting the fast food or sit-down decision-making process.

But we didn't then, and no one has since. So despite being voted the greatest sandwich in America, it's still niche and not mainstream, no matter how many A/B tests they run. Why? As I said, there is no such thing as an easy lunch. Just because the winds of research keep blowing your organization in different directions doesn't mean they will always be there or push you to the destination you want. Someone has to aim for it.

Wednesday, July 2

Welcome To The Petri Dish. A Great Big Thumbs Up.

Don't expect the fervor over what some people are calling a breach of trust by the social network Facebook to last very long. Despite the growing distaste that most people have for it, big data is here to stay and the abuse of it will always be a few clicks away. The Internet is a petri dish.

If you missed the story, Facebook (in cooperation with Cornell and the University of California) conducted an experiment involving almost 700,000 unknowing and potentially unwilling subjects. The study was originally designed to debunk the idea that positive social media updates somehow make people feel like losers. Instead, it affirmed something most sociologists, many psychologists, and a few marketers already know.

"Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks," concluded the study. Negative and positive emotional content can influence our moods.

The significance of the study from the socio-psychological viewpoint. 

The summary of the study is clear cut. The researchers showed via a massive experiment on Facebook that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. They also provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient) and in the complete absence of nonverbal cues.

The experiment itself consisted of manipulating the amount of positive and negative content people received from their friends and relatives throughout the day and over long periods of time. Sometimes the test reduced users' exposure to their friends' "positive emotional content," resulting in fewer positive posts of their own. Other times, it reduced exposure to "negative emotional content."

The study confirmed that the changes to a person's newsfeed had the potential to alter their mood. While interesting, it's not surprising. Everything we let into our heads influences us.

The books we read. The television programs we watch. The news we subscribe to. The advertising we see. The people we hang around. It's human nature. We are prone to adapt to our social settings and seek out affirmation for acceptance or validation. And the only remedy is awareness — either the truth or sometimes the constant recognition that someone is attempting to influence you.

The ethical lines of emotional manipulation and big data have blurred. 

It is naive for anyone to think that affirmation media doesn't have an agenda much in the same way it is naive to think that marketers don't have a brand agenda (which can be much more powerful than direct sales). They do, much in the same way Facebook has an agenda. The more the social network understands where our new ethical lines are drawn, the more it taps any amount of data for anyone.

The only reason this experiment has touched a nerve is because people were forced to look at what they don't want to believe, much in the same way people who track down an online catfish are often disappointed. The truth isn't something people necessarily want. They want their truth.

As privacy issues have waxed and waned over the years, so has public tolerance. People are all too willing to opt in (or neglect to opt out) for the most marginal of benefits. And as they do, online and offline privacy will continue to erode. The only changes since some of the earliest online privacy debates have been around semantics. Consumer profiling has morphed into big data. Shaping public opinion has drifted toward mass manipulation. And all of it is covered in TOS.

At least, that is what some people think about privacy. What do you think? Is manipulation in the eye of the beholder? Is an apology enough? Would it be all right to promote one hair color over another without product identification just before introducing a new hair dye? Or maybe it is fine to dedicate more airtime to isolated tragedies in an effort to change public policy. The comments are yours. 

Wednesday, June 4

The Written Word Is More Visual Than You Think

A recent article in The Guardian recalled a 1974 study conducted by Elizabeth Loftus and John Palmer. The psychologists asked students to watch a video clip that involved a multi-car pileup.

After watching the video, two-thirds of the students were asked one of two questions: “How fast were the cars going when they smashed into each other?” or “How fast were the cars going when they hit each other?” The other third, the control group, wasn't asked a question after watching the video.

When the students returned to the lab a week later, they were asked if there was any broken glass around the accident. Surprisingly, the students' answers didn't reflect what they had actually seen in the video. More students recalled seeing broken glass if the word "smashed" had been used in the question. Fewer students recalled seeing broken glass if the word "hit" was used or if they were not asked a question at all.

"If you can't draw, you can't think."

The first time I heard the quote it was part of a presentation delivered by Josh Ulm, director of product design at Adobe, at a leadership retreat for AIGA. It wasn't until later that I discovered an earlier manifestation of the quote in an article written by Michael Gough, who also works at Adobe.

Where the quote originated doesn't matter, but it was one of several that really stuck with me. There is some truth in it, given that drawing acts as a bridge between the inner world of imagination and reason and the outer world of communication and sharing. But it's not the only bridge for our brains.

There is increasing evidence that writing helps us think too, but not always the kind we're used to in a post-penmanship world. Although Common Core is notorious for wanting kids to rely on keyboards earlier, some studies suggest handwriting is extremely important to the learning process.

It makes sense. When we stop trying to divorce writing and drawing, we quickly remember that they are akin to each other. They are akin through handwriting, which opens the same cognitive thought process that drawing does. They are akin in graphic arts through typography. And they are akin in communication because they can both provide context or change our perception.

We don't even need to rely on a study to know it. If you ever had a friend call for a caption contest, you already know that whatever photograph is shown will adapt to whatever line of copy we give it.

The future of communication is a mixed medium

While there is an increasingly persistent conversation that attempts to separate language from art and art from language, the opposite holds true. The best artists know the title can be just as important as the painting. The best writers remember that vivid words leap off the page.

Just as it is impossible to choose Sunflowers by Van Gogh over The Great Gatsby by F. Scott Fitzgerald (or Pythagoras over Beethoven), it is impossible to separate art and language when they are so often the same. And in knowing this, we might work harder to teach our children the importance of writing and handwriting and drawing and painting and music and photography in the greater context not only of communication, but also of our ability to think and then share our thoughts.

It's only when writers recognize that the structure of their content matters, and artists recognize that their work is a language (or perhaps several languages), that either can elevate the experience, expression, or object of their communication. John Dewey once wrote about art (Art as Experience, 1934): "Because objects of art are expressive, they are a language. Rather they are many languages. For each art has its own medium and that medium is fitted for one kind of communication. Each medium says something that cannot be uttered as well or as completely in any other tongue."

And the takeaway? Merely flipping the medium for more attention is not an answer. Sooner or later you have to pledge yourself to stop making boring art, whether that art is a vibrant painting or a handful of words scrawled across a page. Sooner or later, we have to recognize that every skill set (typing and handwriting and drawing and coding) can be an important part of the experience, both our own and the one we invite others to share as an experience or expression. Good night and good luck.

by Richard R Becker Copyright and Trademark, Copywrite, Ink. © 2021