Wednesday, October 30

Where Would We Be Without Words? I Can Imagine.

Years ago, when John Corcoran told me that almost half of his students were not able to read beyond a third grade level, I didn't want to believe him. And yet, I believed him.

I believed him because third grade was a pivotal year in my education too. It was the same year that my grandmother made the decision to have me repeat the third grade outside of the public school system. Had nothing changed, I would have landed on the wrong side of a statistical division.

According to the John Corcoran Foundation, two-thirds of students who cannot read proficiently by the end of fourth grade will end up in jail or on welfare. Not all of them do, of course. Corcoran became a teacher.

He learned to cheat, but only cheated himself. 

As Corcoran progressed through school, he became more and more resourceful in hiding his illiteracy behind his natural aptitude for math, athletic prowess and deep friendships. He hid it so well, in fact, that he taught bookkeeping, social studies, and physical education for several years.

Living this lie wasn't easy for him, he told me, but it was not nearly as painful as not being able to help students who faced a similar problem. They could not read and he could not teach them.

Corcoran eventually did learn to read, but not until long after he left teaching and entered real estate. He was 48 years old at the time and an exception to the rule. Most adults who cannot read never learn.

A brief look at the growing literacy problem in the United States.

There is a growing literacy problem in the United States, and our self-confidence, much like Corcoran's self-esteem, makes us blind to it. According to the U.S. Department of Education, 14 percent of adults in the United States cannot read (about the same share of people who do not use the Internet). And according to the National Assessment of Adult Literacy (NAAL), this number swells to 40 percent when counting those who possess only level one reading skills (marginally functional).

High school graduation is not an indicator of literacy. As many as one in five students graduate without being able to read. About one in four graduate without being proficiently literate. One recent study, OECD Skills Outlook 2013, placed the United States 16th in literacy proficiency (among 23 countries).

The same organization warned that the U.S. was the only country among 20 OECD free-market countries where the current generation is less well educated than the previous one, a finding published as part of the National Commission on Adult Literacy's 2008 report. It's not any better today.

Individual career paths aside, literacy is a family matter. 

Any time I step on stage or in front of a classroom, most people cannot imagine me as anything but a writer. Even with other occupational titles, writing has provided my career with a strong foundation. I write approximately 10,000 to 15,000 words a week (excluding email and social networks), which is the equivalent of a novel every other month (and the reason I don't write a novel every other month).

Ironically, I can imagine my career path without ever becoming a writer. From the outset, I wasn't very good at it because strong writing is indicative of being a strong reader. I wasn't a strong reader.

Reading came much later for me. I didn't learn to appreciate it until seventh grade. Writing came even later. My skills were only passable up until my freshman year. Both are stories for another time.

Teaching To Read
The point is that I can imagine it because I had to imagine it. What I could not imagine is being unable to help my daughter when she needed it most. She reads with confidence now.

While it has been an amazing journey transforming my daughter into a strong reader during the past six months, I can't help but wonder what might have happened without intervention. What if I didn't know how to read well, let alone teach? How long could she have hung on as a struggling reader?

Three days this week are tied to literacy. Maybe you can connect with one.

Thursday is All Hallows Read. Most people pass out candy, but Neil Gaiman continues to make the case that people could pass out books instead. His campaign inspires more stories and fewer sweets for Halloween.

I wrote about the program last year, including five titles that have always conjured up an appropriate spirit for the season. Feel free to add The Ocean At The End Of The Lane, written by Gaiman. Coraline is another family favorite. The film is part of my family's Halloween lineup.

Friday is National Family Literacy Day. The National Center For Family Literacy (NCFL) is hosting a fundraising challenge for literacy. Proceeds from the campaign will help the center continue its work, which has helped more than one million families make educational and economic progress.

The reason family literacy is so important is that children's reading scores improve dramatically when their parents become involved and help them learn to read. This isn't possible without literate parents, so the program goes a long way toward improving the whole household. The NCFL is my friend Geoff Livingston's account, and he is raising funds along with hundreds of others. They have a "thunderclap" scheduled.

Saturday is the Vegas Valley Book Festival, the largest literary event in Las Vegas, bringing together hundreds of writers, authors, artists, and illustrators to celebrate literacy and creativity. All programs and events are open to the public. Admission is free.

As social media director for AIGA Las Vegas, I have been overseeing elements of the social media campaign, including an event schedule on Facebook. If you are in Las Vegas this Saturday, there isn't a better way to promote family literacy and art appreciation. There is also a kickoff event tonight with Catherine Coulter as this year's keynote.

One last thing for my own curiosity: What are you reading and why? I really would like to know. You never know who it might inspire next, because words inspire lives. They inspired mine.

Wednesday, October 23

Content May Be King, But People Want Experiences

If you have invested any time as a communicator working in or with social media, there is a pretty good chance that you've heard the declaration that content is king at one time or another. There is some truth to the concept too, which was originally proposed by Bill Gates within a context that might surprise you.

Sure, we can all argue the finer points well enough or be cute and crown the audience, but the truth is that content will reign in one form or another. It's the crux of how we communicate our concepts, ideas, and observations. It's how we educate, inform, and entertain others in the world in which we live.

It doesn't even matter how that content is presented, as long as it is presented well. Write a post or white paper. Shoot a video or record a podcast. Share a picture or create a television series. It's all content.

Content appeals to the immediate but experiences set a plate of permanence.

While teaching Social Media For Strategic Communication at the University of Nevada, Las Vegas, last Saturday, we spent a considerable amount of time talking about content and the constant pressure to produce more and more easily digestible content. Almost everybody does, right?

If you believe — like most people — that influence and conversions can be quantified by counting actions, then you could make the case that more posts, more tweets, more stuff that people can act upon somehow counts. In fact, this was the thinking that many direct mail houses adopted ten years ago.

If your direct mail campaign has a two percent return from some list, then all you have to do is increase the frequency (or the list) to generate more revenue. Right? Well, maybe not, even if this thinking does explain why online lead generation is overpriced.
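To see why, here's a toy model with invented numbers (none of them from any real campaign): assume each repeat mailing fatigues the list and trims the response rate while the cost per piece stays fixed. A minimal sketch in Python:

```python
# Toy direct mail model. All figures are illustrative assumptions, not
# data from this post: a 2 percent return on the first send and a
# response rate that decays 30 percent with each repeat send.
list_size, base_rate = 10_000, 0.02
revenue_per_sale, cost_per_piece = 50.0, 0.60

for sends in (1, 2, 4, 8):
    # Each successive send is assumed to pull 30 percent fewer responses.
    sales = sum(list_size * base_rate * 0.7 ** i for i in range(sends))
    net = sales * revenue_per_sale - list_size * sends * cost_per_piece
    print(f"{sends} sends: {sales:,.0f} sales, net ${net:,.0f}")
```

Run it and the gross responses keep climbing while the net peaks early and then goes negative, which is the "maybe not" in a nutshell.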

If you ask me, all of these tactical formulas are detracting from something stronger. And this something is tied to a question I asked in class: if more content more often produces results, then why does one sentence from a book read one time, or one scene from a movie screened once, or one comment made by a teacher stay with someone their entire life? This something can best be summed up by experiences.

Novels work because our brains see them as experiences. Movies work because our heads are hardwired to attach emotion to sensory perception. Social networks climb to the top of traffic charts not because of the content they provide but because of the sensation of experiences we feel. It's not my social media deck that holds anyone's interest in my class but rather the way I present it and how we experience it.


This communication blog (or journal) is no exception. A few months after writing about radio host Bob Fass and my attempt to make this space more indicative of an open format, more people have visited. In fact, more people have visited despite my efforts to undo blog "rules."

I stopped concerning myself with frequency, making this a weekly as opposed to a daily. And, at the same time, I tossed out short content in favor of writing something more substantive. The result has been eye-opening in that topics that used to have a one-day shelf life now have a one-week shelf life or more.

While I am not proposing this would be the case for every content vehicle, it does provide an explanation tied in part to the question I asked in class. Even when I wrote daily, the most successful posts or series of posts had nothing to do with "Five Sure-Fire Ways To Get More Traffic!" They had to do with the cancellation of a television show or two, the coverage of a crisis communication study or two or twelve, the involvement and participation of people in things that matter (even when it feels personal), and experiments that involved thousands of people besides myself. Good content? Maybe.

Good experiences? Absolutely. As fun as it has been to write satire at times, the gags rely less on writing and more on experiences. People remember because they were part of something.

The future of the Internet doesn't rely on mobile as much as experiences. 

I used to tell students that technologies in social media mean software. With the advent of Google Glass, increasingly immersive projection displays, and the encroachment of the online world into the offline world, I can no longer offer up any such disclaimer. All of it — the hardware, software, and people driving the content and devices — helps create the experience and even alter it.

Consider, for example, short stories being published at a pace of 140 characters at a time, characters who suddenly open their own accounts, and one project that included a dialogue and storytelling exchange between four or more accounts (each character speaking from a unique point of view). There are more examples, well beyond Twitter, but the point remains the same. Storytelling creates experiences. Technology creates experiences. Person-to-person interaction on a one-to-one, one-to-some, and one-to-many scale creates experiences.

And the rest? That content without experiences? It still has a place in the world, sure. But the future of social media isn't in producing ever-growing reams of information to get people's attention. It will be to elevate the content into a form of communication that creates a shared experience, online or off.

Can you see this future? And if you can, how might it change your own marketing strategy, away from tricking readers into sampling content and toward creating a compelling experience that they want to become part of and participate in? The comments are yours: share your experiences or, perhaps, propose an entirely new conversation. I look forward to it.

Wednesday, October 16

Do Hardships Make Us Human Or Is The Air Of Success Better?

The first time I saw the crowd funding video for Yorganic Chef, I was pleased with the finished product. The run time felt long, but Nick Diakanonis made up for it with his authenticity. He's telling his own story. It only made sense that he would drift off script and elaborate.

I noticed something else the second time I watched the video. The script, along with the campaign page, left out one story segment, and it left me wondering whether the omission made a difference. It's the hardship part.

Another side to the Yorganic Chef crowd funding story. 

There is a good chance that you won't see this segment of the story elsewhere. It was one of the elements left behind after several team members thought the hardship part was a negative. They want to be upbeat and bright. And maybe they're right. Or maybe they're not.

So yesterday, after receiving permission from my friend Nick, I considered the contrast. You see, Yorganic Chef was scheduled to open last year and Nick had already achieved his dream.

That is, he had achieved his dream until something unexpected happened. Two weeks before opening in Los Angeles, the person who owned the facility and packaging equipment gave him an ultimatum. Either Nick would sign over the business and become an employee or there would be no launch. 

Imagine. For the better part of three years, you invest your entire life in one solitary idea — to create a line of non-frozen, ready-made gourmet meals from the ground up, including a direct-to-customer delivery system that required the invention of a patented state-of-the-art thermal bag. You're two weeks from opening. Your dream is about to come true. And then, suddenly, everything is swept away.

What do you do? Do you sell out and become the manager of your own concept (and leave everyone who has supported you behind)? Or, do you undo the last six months of progress and try to start all over? Many people would have been tempted to sell out, but Nick isn't like that. 


As I mentioned before, the video doesn't include anything about the crisis that Nick had to weather. And there is a good chance most of the stories about the campaign won't ever touch upon it. 

But it makes me wonder. Do we have to be perfect to succeed?

Perhaps more than any other kind of person, business professionals and politicians tend to be most concerned about their image. They want to convey an image of perpetual success. They never lose. 

My life has never been like that. Most people have a mixed bag. Sometimes there are great runs when everything seems easy. Sometimes it feels like standing in sludge, with every inch of forward motion requiring the greatest amount of effort possible. It's a given. Some of us share it. Some of us don't.

Sure, some hardships can become victim stories, saddling some people with excuses to never succeed again. But I'm not talking about those. I'm talking about the hardships people face and find a way to overcome, much like Nick is trying to do. He isn't just a successful culinary entrepreneur. He's a culinary entrepreneur who is hoping to rebound from a rotten turn after doing everything right. 

Does that little bit of detail make a difference? Some people seem to think so, believing that the story is more upbeat without mentioning the hardship. Others might disagree, not seeing anything negative in the full story. If anything, they see it as the clarifying detail in starting the Yorganic Chef crowd funding campaign.

The money being raised by Yorganic Chef has a purpose. It's Nick's chance to replace what was lost — the facility and equipment — when someone he trusted revealed a different agenda. If that hadn't happened, Yorganic Chef would already be serving Los Angeles and looking to open in a second market.

So how about you? When you see crowd funding stories like the one launched by Tinu Abayomi-Paul, does it make a difference that she has a need? Or do you prefer a different kind of back story, one that scrubs away the blemishes no matter how relevant they might be? Or maybe it all ties back into those topics we've explored before — perception matters (but not really). Either way, I'd love to know what you think. The comments are yours. 

Wednesday, October 9

How Simple Decisions In Social Media Make Big Differences.

Social media can be a mean sport in some arenas. It can be so mean that sometimes the media overreacts, like Popular Science did. The publication is abandoning comments, claiming that a politically motivated war on expertise has eroded the popular consensus on "scientifically validated topics."

They don't want to be part of that, even if they still will be (whether they have comments or not). They might even have it wrong. The whole point of science is not to continually reinforce "scientifically validated topics" but to investigate the known and unknown. After all, more than one scientifically validated topic has been turned on its head. There are things we don't know. But that's a topic for another time.

Do comment sections really make a difference? 

My interest in this topic was inspired in part by Mitch Joel, who suggested websites could turn comments off, at least until someone develops better technology to keep them free and clean. His point was that online conversations have evolved. Comments are anywhere and everywhere nowadays.

Specifically, people are more likely to share a link and/or add their thoughts elsewhere — Facebook, Twitter, LinkedIn, Medium, or some other platform — than they ever will be to leave a comment at the source. Let's face it. Websites and blogs haven't been the center of the social universe for some time.

Today, social media requires significantly more elasticity and adaptability, and the conversations that revolve around content are much more hyper-extended. They are smaller, shorter, less formal, and more fragmented discussions about articles and posts. It's as if all of social traded substance for sharing.

This is vastly different from the days when bloggers used to covet comments as a measurement (despite never being able to explain why Seth Godin could succeed without them). Years ago, there were primarily three ways to respond to an article or post — you left a comment, wrote a rebuttal (on your own blog), or shared it as a thread in a niche forum. It made things orderly but also exclusionary.

That is not the case anymore. Now, some articles can sport a dozen mini-conversations within the same platform, initiated by people who might have little or no connection to each other. It's fascinating and fragmented stuff, which is why some pros like Danny Brown look to close the loop on fragmentation.

Livefyre sounds like a decent solution, but not everyone cares for it, despite it going a bit beyond what Disqus "reactions" used to offer before they were discontinued. Other emergent comment solutions worth exploring include Google+ comments and Facebook comments. They draw mixed reactions too.

For me, I think the issue is something else beyond nuts and bolts. Errant comments, like those that Popular Science complained about, are manageable. Moderating comments by setting permissions isn't as hard as some people make it sound. And if fragmentation is a concern, Livefyre might mitigate it.

All that sounds fine, but it never gets to the root issue. You see, there is only one fundamental difference between comments at the source and comments away from the source.

Do you want comments to be part of the article or about the article?

Comments made at the source become part of the article. Comments made away from the source, even if they are ported in by a program, might relate to but are largely independent of the article. The difference is that simple, and this simplicity is deceiving.

It's deceiving because when someone comments, where someone comments, and to whom they comment all have a bearing on the content, context, and character of that comment. It's deceiving because people tend to write to the author (or other commenters) at the source, while they tend to write about the author or source material (sometimes slanting the intent to serve their purpose) away from the source. And it's deceiving because comments away from the source will never have the same kind of physical attachment or quasi-permanence that comments closer to the source seem to achieve.

Right. Most people do not search for reactions when an article is older than a week. Few have the appetite to scroll long lists of link shares that aren't really comments, whether they are ported in or not. And, unless there is historical or outlandish content, even fewer read comments bumped to page 2.

So when Popular Science made the decision to abandon comments, they didn't just make a decision to suspend spammers and people they fundamentally disagree with on topics like climate change and evolution. They made a decision to disallow different viewpoints from becoming part of an article. And they more or less told readers to write about the content but not to the authors of that content.

In a few weeks' time, their decision will likely be sized up for its pros and cons. But make no mistake, it was still the wrong decision. Silence is no friend of science.

You see, neither science nor faith needs to shrink from a politically motivated war on their mutual expertise. The truth is that they are not nearly as polarizing as some would have you believe. Science and faith are like brothers in attempting to understand the unknown, often inspiring each other to stop and think.

What Popular Science could have done instead was create a white list of commenters better suited to scientific discussion, perhaps with differing but conscientious viewpoints. Such an approach might have moved their content forward, leading to breakthroughs or a better understanding of science.
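The mechanics would have been trivial too. Here is a minimal sketch of the whitelist idea in Python, with every name in it (the Comment class, the APPROVED_AUTHORS set, the triage function) invented for illustration rather than taken from any real commenting platform:

```python
from dataclasses import dataclass

# Hypothetical set of vetted commenters. A real system would grow this
# list through moderator review instead of hard-coding it.
APPROVED_AUTHORS = {"curious_chemist", "dr_okafor"}

@dataclass
class Comment:
    author: str
    text: str

def triage(comment: Comment) -> str:
    """Publish whitelisted authors immediately; hold everyone else for review."""
    return "publish" if comment.author in APPROVED_AUTHORS else "hold"

print(triage(Comment("dr_okafor", "The sample size seems small.")))  # publish
print(triage(Comment("anon42", "This is all a hoax.")))              # hold
```

Held comments still get read by a person, which keeps conscientious dissent eligible to become part of the article instead of shutting the door on everyone.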

But what do I know? I've adopted a different outlook altogether. Comments, I think, work best when they are treated like someone who calls into a radio talk show. If you could talk about anything you want, what do you want to talk about today? The comments are yours or we can chat in person at the University of Nevada, Las Vegas on October 19 during a 3-hour social media session.

Wednesday, October 2

Teaching People To Write Requires A Contradictory Approach.

"I have known writers who paid no damned attention whatever to the rules of grammar and rhetoric and somehow made the language behave for them." — Red Smith 

Red Smith was one of the finest sportswriters in history. Not only did he receive the J.G. Taylor Spink Award from the Baseball Writers Association Of America, but he was also the first sportswriter to win the Pulitzer Prize for commentary. Even more notable, he is the Red Smith for whom the Red Smith Award from the Associated Press is named. Ernest Hemingway even immortalized him in a novel.

"And he noticed how the wind was blowing, looked at the portrait, poured another glass of Valpolicela and then started to read the Paris edition of the New York Herald Tribune. 

I ought to take the pills, he thought. But the hell with the pills.

Then he took them just the same and went on reading the New York Herald. He was reading Red Smith, and he liked him very much." — Ernest Hemingway, Across the River and Into the Trees 

But that's not why I quote him in every writing, editing and proofreading class I teach. I quote him because he is right. And I quote him as a reminder to myself to never become a pompous ass about the trade and craft. There are too many writers who do, claiming they know this and that about writing.

Editing And Proofreading Your Work at the University Of Nevada, Las Vegas. 

It's always a challenging prospect — standing in front of varied students who range in age, interest and experience — for three hours on some random morning or afternoon when the subject of the day is editing and proofreading. (It will be a morning session this Saturday.)

What makes this class especially daunting is that I have one chance to help people become better writers, editors or proofreaders. It's not like Writing For Public Relations at all, with writing assignments (and the rewrites of those assignments) being passed back and forth for ten weeks.

No, this class is a one-time shot, taught only once in the spring and once in the fall (occasionally once in the summer). And while I always present myself in a suit out of respect to those who attend, the class itself remains informal. I invite students to stop me cold, ask me questions on the fly, and otherwise test my respectable but finite knowledge about the written language.

Sometimes it takes a while to visualize their hypotheticals, and every now and again I have to research their questions after class because they stump me on the spot, but otherwise I manage well enough. Even the few times that I didn't think I managed well enough worked out for the best. I love to learn too.

Some of my lessons from previous classes even become part of my future classes. One of my favorite stories includes how I used to use "website" as an example of English being a living language until one student pointed out the Associated Press insisted it be spelled "Web site." So, I changed my class (providing the Associated Press explanation) only to be schooled by a different student when the Associated Press changed its ruling a week prior to my class (and without my knowledge). Figures.

The day someone thinks they've mastered writing is the day they aren't worth reading.

When I was younger, I used to sweat the outlandishly difficult questions or insistent but mostly wrong students. Nowadays, I'm more inclined to laugh about it, regardless of who is proven ignorant.

I attribute that to Smith. He knew better than most: telling someone how to write is futile. You can only show them how to write better. He was not alone in believing it.

Beyond the more obvious industry hacks like David Ogilvy, William Bernbach, Leo Burnett, and Shirley Polykoff, I've learned a few things from authors like Ernest Hemingway, Allen Ginsberg, Ray Bradbury, Truman Capote, Norman Mailer, Kurt Vonnegut and Joseph Wambaugh (to name a few). Except for Vonnegut (sort of), none of them believed a formula could make anyone a great writer.

Instead, you have to see writers as people in various stages of aptitude, ranging from the novice who doesn't realize there are "rules" to the experienced "blowhard" who lives for the rules (or is delusional enough to think he/she is better than any rules). Once you do, you guide them from one layer of aptitude to the next until they understand that being a better writer isn't very complicated, even if it is contradictory.

You want to write straight, honest prose that can touch a human being. Nothing more or less. 

That's the easy part. The hard part is that there are a thousand different ways to do it. There are a million different things that can get in the way of doing it well, which is why every word, sentence, paragraph, chapter, plot and story needs to be tested against whatever the writer already knows.

You dust off those "rules," filters, and suggestions and then ask an honest question: Would this concept make it better, worse, or about the same? Oversimplified, you might ask: Does starting this sentence with something unconventional like "And" make it better, worse, or about the same?

As long as you have a good reason to do or not do something (and not as a defensive justification or cop out), you can break with standard, style, format or the so-called rules any time you want. Of course, this also assumes you know the rules you want to break and understand what's behind them.

This is probably why people who teach writing sometimes seem like they burn the candle at both ends. We want, or at least I want, to lay out some rules for people to try on and then encourage them to wear those that fit and dismiss those that don't fit. It's how you become a better writer, and you will never stop doing it (ever). Anyone who tells you differently is either too busy trying to imitate or perhaps too busy trying to justify why they can't imitate.

What about you? Do you have any rules, techniques, or tips that you've found useful? I'd love to read them or check them out. Just drop them in the comments. Or, if you would rather talk about something else altogether, please do. The comment section is an open forum around here.

Wednesday, September 25

You Can Make The Internet Meaningful By Doing Stuff Offline.

I had never heard of neuralgia until a few days ago. It is pain in one or more nerves caused by a change in neurological structure of the nerves rather than by the excitation of healthy pain receptors. In other words, the nerves tell your brain to feel intense stimulus even when there isn't any.

It is painful. It is debilitating. And it afflicts someone I've come to know over the past few months. She has suffered with it for the better part of a decade but most people didn't even know it.

Most of the time, Tinu Abayomi-Paul's condition manifests itself as chronic back pain. This time is different. It is severe enough that she will be undergoing surgery and decommissioned for a month, maybe longer.

This is especially challenging for her because, like me, she has a small business. In her case, she has two micro-businesses with an emphasis on search engine optimization and social media. And because both businesses rely extensively on providing services, she will not generate any income while out.

What do small business owners do when there is no safety net? 

Sure, some business owners are like me. You set something aside to weather the storm and hope it's enough. This time around, I'm cutting it close after recovery. But that's a story for a different time.

I'm only mentioning it now because Abayomi-Paul is facing something similar but different. She didn't have the luxury of being ready to weather an unexpected surgery this time around. She needs help.

She isn't asking for charity. All she wants is to work through recovery. So Abayomi-Paul had the novel idea to run an Indiegogo campaign to raise the money she needs to make ends meet while she recovers. Here's her story, along with some discounted packages that she put together for her campaign.


This is a short 10-day campaign. It ends next Tuesday, and you can find out more information about Abayomi-Paul on her website Free Traffic Tips. For campaign details and packages, visit Indiegogo.

Do keep in mind that I'm taking a leap of faith as this isn't a pure endorsement. I haven't worked with Abayomi-Paul before, but I do know plenty of people who have. Mostly, I've enjoyed some banter with her as part of a social network group. I've also read her content and watched a few instructional videos that she has produced. She knows her stuff without all the bull that other people like to spread.

Who knows? It might make a great case study or best practice as one of those stories for the other Internet — the one that people sometimes forget about in favor of big data, big numbers, and big distractions.

We can make a meaningful online experience by doing things offline.

Yes, there really is an Internet with deeper purpose. It's the one that many pros abandoned so they could write business card books about social. So you don't hear about this stuff as much anymore because it doesn't draw traffic. If you want visitors, you need to write about landing on this page instead.

But hey, that's mostly okay. I don't begrudge anyone an opportunity to enjoy a silly cat video or hawk some ROI (oddly) to companies that will never appreciate why Dawn Saves Animals without the benefit of coupon codes, junk mail, or mountains of content.

You see, it's all very simple really. They do something instead. And then what they did lands online. It's something I hope my kids learn. Legacies can be written about online, but we make them offline. I think Abayomi-Paul deserves that chance. Many people do. I'll write about a few more soon enough.

But today, given all the changes coming down on search engine optimization, maybe this will be a great opportunity to talk to someone who knows about it. And all she is really asking for in return is a little time offline so she can come back and deliver something meaningful online. So what do you think?

Is this a worthwhile case study for business practitioners who have the misfortune of a medical emergency? Or maybe you might like to hear from someone else about Abayomi-Paul? Kami Huyse, Anne Weiskopf, Jennifer Windrum, and Ann Handley were among the first funders. Or maybe you would like to talk about something else altogether? I'm fine with that too. The comments are yours.

Wednesday, September 18

A Leadership Lesson From A Place Few Experts Tread

Last August, U.S. President Barack Obama compared Russian President Vladimir Putin to a tiresome schoolboy. But less than 30 days after he made the offhanded comment, it was President Putin who would school President Obama in foreign affairs. Russia is celebrating a diplomatic victory this week.

Somehow, President Obama and his administration allowed the Syria crisis to get away from them. Instead of the United States leading a coalition of countries to bring Syria to justice for using chemical weapons, Russia is being celebrated for stopping the escalation of aggression in the Middle East at the hands of unexceptional Americans. Syria will also surrender its chemical weapons, or so they say, and the world will be a better place.

The turnabout of this narrative was about as masterful as any propaganda since the end of the Cold War. One might even praise the audacity of the move, if not for the considerable consequences.

How recent events have changed the geo-political landscape for now.

Russia temporarily gains world prestige and more influence in the Middle East while protecting its Syrian ally, a country run by a leader who used chemical weapons against his own people. Syria also works in lockstep with Iran, smuggling arms to Hezbollah in Lebanon. And Iran has said all along that the U.S. was behind the uprising, a charge that may not have been initially accurate but has become accurate in the last two years. The arms sent into the conflict are limited, with the U.S. fearing these weapons could all too easily be turned on us as suppliers because some rebels are tied to the same terrorists the U.S. has fought for years. To say Syria is a mess is an understatement.

But most Americans don't even know that the U.S. has already picked a side. It wants to topple the government in Syria, but obviously less than Russia wants to keep Bashar al-Assad.

Those seem to be some of the facts (but not nearly all of them). Just don't mistake them as a call for action or involvement on my part. To me, Syria is another accumulation of events that convinces Americans to choose between two bad choices — act as the global police even when the world doesn't want you to, while supporting rebels that may (or may not) include your enemies, or do nothing, which is de facto support for a dictator who has long despised you and is happy to operate against your interests.

This is why so many advisors frame U.S. foreign policy in Syria as a choice between which we like better: the enemy you know or the enemy you do not. It would take a fool to hazard a guess.

Lesson learned: Leadership does not talk big with a little stick. 

Many people seem enamored with Teddy Roosevelt's foreign policy, which is often summed up by his quip to "speak softly and carry a big stick." And yet, few seem to realize that this is akin to negotiating peacefully while simultaneously threatening people with a "big stick." It was coined at a time when the division between American isolationists and internationalists had boiled over, again.

This division is one of the more interesting ones in politics because it does not follow party lines. Although current public perception is that the Republicans are hawks and Democrats are doves, it's not really true. On the contrary, it was progressives who led the country into conflict and war more often than their counterparts who prefer to live and let live. Americans only think the opposite because neoconservatives joined progressives as being internationalists.

Sometimes this internationalist concept works. Sometimes it does not. And this time, it obviously has not worked for President Obama, partly because of his own words and actions for the better part of seven years. He has campaigned under the auspices of being against what the world saw as American imperialism, but has secretly and stealthily supported various programs that reinforce the idea anyway.

The primary difference between this administration and the last mostly has to do with the size of the talk and the size of the stick. Bush favored speaking big and carrying a big stick. Obama favors speaking big and carrying a little stick. And, unfortunately, this has made Americans largely unsupportive of any action abroad while making their detractors much more emboldened to push new agendas.

Who cares? Well, that is a subject open for debate. There are those who believe the U.S. can exist without being a major player in the world and there are those who believe we have to lead the world. The thinnest majority of Republicans and Democrats believe we ought to lead because history has proven that trouble will knock on the door of the U.S. whether it goes looking or not.

Foreign policy isn't what this post is about. It's about leadership. 

There are plenty of people who have long criticized the foreign policy of the Obama administration, among other things. The reason it invites criticism is because it lacks coherency, primarily because the original vision that he brought to the presidency runs counter to the way the world works.

President Obama told the American people that retracting the reach of the United States while simultaneously making nice-nice with the world would place us in a position where our diplomatic prowess alone could influence world affairs. It's not really true, but that was the vision he forwarded to the American people and the world (despite trying to keep a finger on specific interests anyway).

There are dozens of places where that was never going to work. Syria is one of them. Instead, it is one of those places where you have to make the decision, announce the decision, and act on the decision.

The Obama administration didn't do that, mostly, because too much could go wrong. They also didn't want to be responsible if it did. So, in effect, they pushed it off for a few years and then attempted to assemble a middle-of-the-road approach that wouldn't make it look like Obama was rolling back on his posture to be a polite player in the world. When that didn't work, he punted to Congress for a vote while simultaneously withholding any accountability to that vote in case it didn't go his way.

On the domestic front, it all comes across as being considerate, depending largely on how well you like his administration. All the while, everyone forgot that the U.S. doesn't exist in a vacuum. Other world leaders saw the vote-and-pony show as indecisiveness at best and weakness at worst. And no matter how you see it, other countries have since seized on the moment.

Contrast this with what Prime Minister David Cameron did. He said the United Kingdom ought to become involved and he made a very strong case to Parliament. When Parliament voted against intervention, he stated it was a mistake but would accept the will of the people. It was a done deal and he didn't look too passive, too pompous or too weak after the outcome.

What's the difference? The difference is that Cameron understands being a leader as opposed to being an expert politician. In this case, a leader transcends their appearance of authority in order to ensure any following is aligned to the organizational goals and not themselves as individuals.

Experts, on the other hand, tend to be different all together. They derive their appearance of authority from their reputation and are not willing to risk it by accepting responsibility. In this case (and possibly many others), President Obama is playing expert in Syria (without the right expertise, perhaps).

The expert fallacy can cost an organization its clarity. 

Right now, almost everyone in the U.S. is looking for experts to solve problems when what we really need are leaders. We see it in politics. We see it in business. But based on the number of people who have added "expert" to their labels (deserved or not), it's safe to say that we have a glut of those instead.

What's the difference? Leaders are those people who figure things out. They are people who have a vision, sometimes asking experts for their opinions on how to make that vision real, and then approve those opinions based on what he or she believes is most likely to make that vision real.

If they're right, history remembers them with reverence. If they are wrong, not so much. The risk is part of the job. Leaders are held accountable. In government, they don't pin blame elsewhere. In business, they don't need golden parachutes. These are the people who make their own way.

Leaders don't cling to and attempt to manipulate the world they know; they look to shape the world into something no one had ever considered before. (Ergo, a push button phone design expert can't see a flat screen phone as being functional.) And this is why they continually find solutions that experts could never fathom. It's one thing to be studied in what is, and another thing to see what could be.

When it comes to world affairs, history has shown that the world will praise whoever is steadfast in their vision and conviction to see it through, despite being wrong on some points. So how about you?

Are you a leader or a follower? Do you know your field or are you ready to re-imagine it? Or maybe you want to talk about something else? One of my friends has already suggested we abandon Syria and start focusing on some of the problems we have right here in this country, like homeless workers. What do you think ... about anything?

Wednesday, September 11

Any Fool Can Do What Another Fool Has Done

When Miley Cyrus finally started talking about her performance on the MTV Video Music Awards, she hit every publicity misnomer in existence. According to the pop star, she and Robin Thicke weren't making fools of themselves. They were "making history."

"Madonna's done it. Britney's done it," she said. "Every VMA performance, that's what you're looking for; you're wanting to make history."

She said she doesn't pay any attention to the negative comments either. No matter what anyone thinks, Cyrus says that this has played out so many times in pop music that it doesn't even matter. She claims to be amused by anyone still talking about it. She said they've thought about it more than she ever did.

Of course, few people are talking about twerking anymore. Her Wrecking Ball video has out-buzzed all that as the pop star stripped down to nothing in order to break video viewership records. Never mind that just as many people are tuning in to see her naked as to hear her sing; she must be a winner.

So is fashion designer Kenneth Cole. He didn't even have to strip down to boots in order to get attention. He only had to make a joke about boots. "'Boots on the ground' or not, let's not forget about sandals, pumps and loafers," wrote the fashion designer in response to the possibility of the United States taking military action in Syria. Count up all the retweets and raves. He must be a winner too.

The public's fascination with spectacle is as cyclical as it is tired.

America isn't becoming a society of spectacle. It has always been a society of spectacle, with the only difference from one decade to the next being our mainstream appetite for it. The 1960s, 1920s, 1880s, 1840s, 1790s all had racy, raunchy, and tasteless elements. The whole world has been part of it too.

It happens so often that one would think we would grow tired of it. But then we all suffer some odd form of public amnesia, forgetting the existence of such things as history tends to tidy itself up when the pendulum swings toward a more buttoned-down decade.

Even when we do remember, we tend to confine our memories to the 1960s because people were really in it for political commentary as opposed to quick profits. And perhaps that alone is why the modern spectacle feels as empty as it is tasteless.

Whereas people like Andy Warhol, Bob Dylan, and Ken Kesey made history, people like Cyrus, Cole, and Ariana Grande will become footnotes of the eventually forgotten. If you don't believe it, take a look at the twerk fail hoax video masterminded by Jimmy Kimmel.

His hoax caught 10 million views, proving that you have to be neither famous nor talented to make a similar impact. But honey badger don't care. Cyrus was happy to up the ante. She not only stripped off her clothes for 30 million views but stripped away her integrity too. The video isn't much different from the time-honored streak, except most people this desperate for attention aren't attempting to rebrand themselves.

Publicity is easy. Reputation is hard. 

Those six words were all I offered up about the subject prior to writing this article. They say it all.

Sure, one can easily subscribe to the notion that negative publicity has a positive impact on sales. When Jonah Berger, Alan Sorensen, and Scott Rasmussen compared Michael Jackson and his run-ins with the law in their 2010 research paper on negative publicity, Jackson's album sales went up.

The crux of the research is not new but it is interesting. It is underpinned by the notion that purchases are tied to the quality of the product and what any publicity triggers you to think about.

Negative publicity for Jackson made people think about his great music. Negative publicity preceded cookbook sales for chef Paula Deen. Negative publicity spurred sales of Mel Gibson's films. And yet, you have to ask yourself about the after-controversy market for new material. In other words, negative publicity might drive short-term sales but cost someone's reputational legacy in the process.

In fact, it might be more accurate to say that negative publicity creates an illusion of positive sales because research cannot quantify the lost sales of material that will never be created or a lost legacy. History holds a different reverence for John Lennon, Elvis Presley, Johnny Cash, and Jackson.

But who cares? Some say millennials don't care.

According to some studies, the generation born between 1981 and 2000 places money, fame, and image ahead of self-acceptance, affiliation, and community. And whether you believe it or not, Cyrus fits the short-term mindset as much as Cole is trying to reach them. They are less likely to ridicule the behavior of someone like Cyrus or Cole and more likely to praise it.

Earlier studies said pretty much the same. They don't care. And maybe they aren't alone. The phenomenon isn't confined to a single generation. Most people think that 15 minutes of fame (or infamy) is worth the reputational cost as long as they can capitalize on the short-term success.

The Onion did a brilliant job in articulating this fact too. On the day after the Cyrus stunt started making waves, CNN didn't lead the news with world affairs, human achievement, or an attempt to be a positive force for change. The leading headline reinforced mainstream rubbernecking.

The commentary is sharply satirical in the telling. The purported explanation from the managing editor of CNN is as simple as it gets. Although making Cyrus the top news story was admittedly a disservice, it ensured more web traffic than any bothersome news like chemical weapons in Syria, civil unrest in Egypt, or even the 50th anniversary of Martin Luther King's "I Have A Dream" speech.

So no, it's not millennials who are guilty of placing the spotlight on one girl's narcissistic booty shaking. That honor belongs to the media serving its viewership. As long as the media believes that popcorn means more advertising dollars than meat, more generations will likely view the working world with disdain in favor of a few fleeting seconds of fame.

But so what? I don't personally care whether Cyrus' actions detract from her own talent. It's up to each of us to carve out our own path in this life. And if that includes selling out for temporary success, I hope it's worth it. Just don't pretend it's original or historic. It's not. History is littered with forgotten fools.

How about you? Do you subscribe to the notion that all publicity is good publicity or that 100,000 Twitter followers will somehow ensure your words will outlast the pyramids of Egypt? What do you think? And by that, I mean anything. The comments are yours. Let's talk.

Wednesday, September 4

Thinking Still Beats Searching When You Need Four Gallons.

My wife had a question the other day, but it wasn't her question. The question belonged to my son and he didn't want to ask me. He thought he knew what I would say. He was wrong, but close enough.

The question was a puzzler of sorts. It was a problem from his math teacher. And any student who turns in the answer Tuesday (today) will receive extra credit. The reason my wife asked me wasn't a puzzler. She wanted him to receive the extra credit. (What parent wouldn't? Besides me, I mean.)

Maybe I should clarify that point. I don't want him to receive extra credit. I want him to learn it. And given that he had the whole weekend to figure it out and it was only the Friday before the long Labor Day weekend, there was no rush on my part. 

How can you make four gallons if you only have a three-gallon bucket and a five-gallon bucket?

I told him to wait until I had finished my part of the shopping list, groceries for the meals I would cook for the week ahead. Even then, I said, expect some help but not the answer. He didn't want that. 

A few minutes later, I looked over at him. He had moved on to another problem. Specifically, he was trying to figure out which route to take as he transported his stolen loot from a bank to an escape vehicle.  Right. He was playing PayDay 2 on the Xbox. 

"Why aren't you working on the problem?" I asked.

"I already spent 20 minutes working on it in class," he said.

"Well, obviously that isn't enough," I suggested. 

"It's all right," he said. "I already looked it up." 

"You did what?"

"I did what you were probably going to tell me to do," he said.

"You did what?" 

"I looked it up. Done."

"You looked it up, where?" 

"Google."

Ah, Google. If there has ever been a company of smart people responsible for the dumbing down of America, it has to be Google. All students have to do is drop in a few key words from their math problems and poof — they can find an answer while unceremoniously learning nothing in the process.

"I didn't tell you to look it up," I said. "I was going to give you a hint."

The reason I wanted to give him a hint was because the puzzler is not the real problem. Although the question suggests you need to measure four gallons of water using a three gallon bucket and a five gallon bucket, the real problem is something else. It's what stops most people after 20 minutes of class.

In order to solve the problem, you really need to establish what X might be. And in this case, X is really whatever it takes to make one gallon of water. I wouldn't have told him that, but intended to point him in that direction by asking what stopped him from answering the question. Except, I couldn't anymore. 

Google beat me to it. And today, all across the country, Google is going to beat other teachers and parents too. It's not the company's fault, but it is creating a problem. Sometimes it pays to look something up. Other times, it is much more rewarding to figure it out. Figuring teaches you to think and rethink. 
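If you would rather figure the buckets out yourself, skip the next bit. For everyone else, the puzzler is small enough to treat as a search over bucket states, which is roughly what thinking it through amounts to. Here is a minimal sketch of my own (not the teacher's method, and not what Google served up):

```python
from collections import deque

def jug_path(cap_a=3, cap_b=5, target=4):
    """Breadth-first search over (a, b) water levels until one bucket holds the target."""
    parents = {(0, 0): None}
    queue = deque([(0, 0)])
    while queue:
        a, b = queue.popleft()
        if target in (a, b):
            path, state = [], (a, b)
            while state is not None:  # walk back to the start
                path.append(state)
                state = parents[state]
            return path[::-1]
        pour_ab = min(a, cap_b - b)   # how much bucket a can pour into b
        pour_ba = min(b, cap_a - a)   # how much bucket b can pour into a
        for nxt in [(cap_a, b), (a, cap_b),       # fill either bucket
                    (0, b), (a, 0),               # empty either bucket
                    (a - pour_ab, b + pour_ab),   # pour a into b
                    (a + pour_ba, b - pour_ba)]:  # pour b into a
            if nxt not in parents:
                parents[nxt] = (a, b)
                queue.append(nxt)

print(jug_path())
# [(0, 0), (0, 5), (3, 2), (0, 2), (2, 0), (2, 5), (3, 4)]
```

In words: fill the five, pour it into the three, empty the three, move the leftover two gallons over, refill the five, and top off the three. Four gallons remain in the big bucket.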

The most creative (and possibly efficient solutions) aren't online. 

One of my favorite authors of all time never wrote any fiction. His name is Richard Feynman. He was a scientist and winner of a Nobel Prize in physics. The reason he won it is punctuated by his penchant for figuring things out as opposed to looking them up. By thinking, he often debunked popular theories. 

It had been that way all his life. Even when he was 11, Feynman started to think his way around radios. Eventually, he moved on to fixing burglar alarms, amplifiers and other gadgets too. It was in his nature. He seldom looked anything up. Reinventing the wheel, for him, often made the wheel better. 

There are dozens of stories that underscore his point in his books and books about him. He said it over and over and over again. Even when the New York Times wrote an article about his legacy in 1992, it recounted how Murray Gell-Mann described The Feynman Algorithm to solve everything. 

What is the algorithm? It's simple enough. You write down the problem. You think very hard. And then you write down an answer. For many years, this phenomenon called thinking is what set American students apart from students in the rest of the world despite those international tests that suggested otherwise.

Most students, he observed when teaching abroad, are taught to memorize the answers. But he preferred to teach students to think through problems rather than always assuming the experts were right. Not only did that inspire new ways to think about things, but it also gave students the ability to apply what they've learned to a completely new set of paradigms and problems. Right. They get good at it.

There are some days that I'm not sure Feynman would feel American students are set apart anymore. Many of our students have been taught to resist the urge to think nowadays. And they are not alone. 

People ask questions online all the time or turn to keyword searches to ask things like "how do I get more traffic to my site?" or "how do I get more Twitter followers?" or "who are the influencers in this field and that field?" as if those people can think better than they can. There is nothing wrong with that, but I wonder if any of them know that one set of solutions doesn't fit a different set of problems.

Sure, seeing how other people solve their problems can be useful at times. But almost every communication problem is patently unique. You have to think very hard. Besides, just as I told my son, you have to try thinking in order to become a great thinker. It requires practice, just like anything. 

How about you? What do you think? And by that, I mean about anything? The comments are yours. Let's talk.

Wednesday, August 28

Does Social Justice Fit Somewhere Between Silly Cat Videos?

Sometimes the hardest thing to reconcile about social networks is how serious they can be. You know what I mean. We've all seen friendships and family members splinter over political and social issues on platforms like Twitter and Facebook. People lose jobs. Companies get embarrassed. Bullies are outed.

Yes, social media can be serious. In fact, it was the seriousness of it that inspired one recent discussion about how labels can trap and condemn us if we aren't careful. They really do. Every day. 

In direct contrast, social networks don't always seem serious. It's the silliness and steady stream of absurdity that can prove bothersome. And this seems especially true when it detracts from social justice.

This is why Amy Tobin was inspired to write Social Justice: Have The Social Networks Failed Us, Or Have We Failed Them?, a column that captures how something silly like Ben Affleck as Batman can trump something serious like chemical weapons in Syria. The effect is always profound. Any time someone draws a contrast between soft news and hard news, someone else will feel petty for talking about superheroes while people die in the streets of Syria. It's us who fiddle while Rome burns.

If you want to change the world, don't blow against the wind. Fan the flame that's waiting.

I know how Amy feels. A few years ago, Tony Berkman, president of BlogCatalog, asked the same thing in a different way. He wanted to know what bloggers would talk about if they weren't talking about then headline stealers Paris Hilton and Britney Spears. So we all sat down and decided to find out. 

The question was especially relevant to me. A couple of years prior, celebrity was the reason one of our best-practice short-term public relations campaigns became a best-practice media kit. The kit was a winner, but the campaign missed when we were scooped by celebrity.

Specifically, a celebrity trumped a media event centered on education in Nevada. So there you go. When a "Who Wants To Marry A Millionaire?" contestant files for divorce, news stations don't stand by to cover the governor and a virtual who's who of state education at the opening of a new private school.

So naturally, when Berkman asked his question about bloggers, I was primed to participate as one of the founders of an initiative called BloggersUnite. It was the first series of social media awareness campaigns that coordinated bloggers (and later social network participants) to change the world by setting a conversational agenda online.

In the months and years that followed, I developed and executed campaigns for DonorsChoose.org, Amnesty International, AIDS.gov, Heifer International, and March Of Dimes (among others). All of my work was contributed as an in-kind effort to change the world. All of the campaigns were successful, with the most visible delivering 1.2 million posts in one day and reaching an audience of 250 million.

The campaign was so loud that it was covered by several dozen media outlets, including CNN. And despite some pushback from social media enthusiasts who prematurely concluded that it was all buzz and no bite, this early awareness campaign eventually changed American policy in Darfur. Right. We changed the world. And we didn't change it just once. We changed it a few dozen times.

The prospect that people were willing to step up was especially inspiring for Berkman. So he eventually spun the initiative into a standalone, do-it-yourself platform called BloggersUnite. It still exists, but as a silent giant.

Why? It's silent for the same reason I warned him against crowd-sourced solutions, which hope that social will be its own steward for good. Most people don't know how to plan campaigns, and most people are too easily distracted to lead. At the same time, if there was ever a time I wanted to be wrong, it was about this observation.

The spontaneity of social media and social networks is unpredictable at best and overrated at worst. In other words, it takes more than people to drive meaningful conversations like the campaigns we managed before the platform. It takes someone to give it shape and fan the flame once it gets started.

Even then, it takes considerable patience and planning to get anything off the ground, no matter how good the cause might be. You also have to be empathic, not only for the people you are trying to help, but also for those who offer up no sign of support. Why? Because you don't know them.

The hardest lesson in the world is finding empathy for those who laugh while we cry.

Developing these campaigns was hard work. But what was even harder was knowing when not to launch one. As Berkman eventually learned, you can only ask a community to promote worthwhile causes a few times a year. Ask too much and you'll burn them out. Ask them to plan it too and most will pass.

And it's on this point that I want to come full circle. When we see society as opposed to people, we tend to think that all these people — the person sitting across the table, reading our post, passing us on the street — are somehow isolated from, inoculated against, or apathetic toward the world. They're not.

Not only are most of them active with their own causes, but they also have their own private battles to fight. This one just survived cancer. That one just lost their wife to it. This one isn't sure how they'll pay the rent next month. That one found out their spouse is having an affair. This one is wondering where their education took a wrong turn. That one is in need of the services someone else is promoting. And the list goes on. And on. And on.

So if any of those people want to laugh at the prospect of Ben Affleck being Batman, it's okay. They've earned it. Maybe tomorrow they can fret over the international crisis in Syria instead. Or maybe they won't.

As I mentioned to one of my friends while discussing this subject, something needs to be done in Syria but when you attempt to prioritize it against something like a cure for cancer, then there is no contest. But even without prioritizing an endless list of heartbreak in the world, we might remember that even Shakespeare saw a need to insert comedy into his tragedies. Life is heavy enough. It takes considerable effort to lighten it.

Applied to causes, the concept comes from the man who inspired the last BloggersUnite campaign I was able to step up for, playing a major role in developing it at the last minute. Patch Adams was among the first in the medical community to defy the dourness of cause marketing and shaping public opinion. He epitomizes the life lesson that angels have wings because they take themselves lightly.

At least that's the way I see it. What do you think? What does Tony Berkman, Simon Mainwaring, or Kate Olsen think? What does anyone think? Are social networks too serious, too silly, or does that old rule apply — social is whatever you make of it?

The comments are yours. Feel free to fiddle with this subject or suggest something else. I would love nothing better than every topic to come from you. Let's talk for a change.

Wednesday, August 21

Will Automation Steal The Soul From Social?

There have been several interesting side discussions sparked by my Bob Fass post about his largely unrecognized precursor contributions to social media. Some of them are still simmering, with the most common thread related to where marketing and public relations intend to take social.

Right. If you work in the field, they are talking about you.

And what they have to say might not be taken kindly. A growing number of people are wary of social networks, not because they don't like to connect but because conversations are being recorded, even hijacked. Some marketers feel they must. Numbers are what get counted.

"Why spend time counting tweets and retweets when I could actually, you know, connect with other people?" asked David Flores, reflecting on the internal struggle he and other marketers and communicators feel.

Why count indeed? For all the talk about social freeing people from the trappings of unearned authority, some of the liberators have worked diligently to erect new ones. Never mind that the scoring is stacked.

As the New York Times recently reported, some researchers think that only 35 percent of Twitter followers are real people. The balance is made up of bots and semi-automated accounts. That means an account boasting 10,000 followers might only reach 3,500. And if you ask me, that estimate is generous in some cases. Bots attract bots, giving accounts the aura of popularity while never reaching a real human being.
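To put those numbers in one place, here is the arithmetic as a short Python sketch. The 35 percent figure is the researchers' estimate cited above; the 10,000-follower account is just the illustration from this paragraph:

```python
followers = 10_000      # the hypothetical boastful account from above
real_fraction = 0.35    # the researchers' estimate of human followers

effective_reach = int(followers * real_fraction)
print(effective_reach)  # 3500 people, at best, before bots attract more bots
```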

Geoff Livingston recently touched on this too, writing Pop Created The Twitter Link Farm. He focused on the increasing number of links, with one of the most interesting comments chalking it up to a platform shift. While that might make sense because Twitter never considered itself a social network, the platform shift from conversation to broadcast is a symptom of what marketers measure.

They measure actions (tweets, retweets, link clicks), which discourages dialogue. It discourages it because conversations are not valued on the action scale. It discourages it because the more organic conversations take place, the more marketers have to drown them out with frequency. And it discourages it because scalable actions require automation, which means the marketer isn't participating.
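Here is a minimal sketch of that measurement bias. The event names and weights are hypothetical, invented for illustration, and no real analytics tool is implied. Notice that replies, the raw material of conversation, score zero on an action-only scale:

```python
# Hypothetical action weights; any real dashboard would differ, but the
# bias is the same: conversation carries no weight on the action scale.
ACTION_WEIGHTS = {"tweet": 1, "retweet": 2, "link_click": 3, "reply": 0}

def action_score(events: list[str]) -> int:
    """Score activity the way an action-only dashboard would."""
    return sum(ACTION_WEIGHTS.get(event, 0) for event in events)

broadcast = ["tweet", "tweet", "retweet", "link_click"]
dialogue = ["reply", "reply", "reply", "reply"]

print(action_score(broadcast))  # 7 -- broadcasting wins the metric
print(action_score(dialogue))   # 0 -- the conversation never registers
```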

The crux of it reminds me of an Internet infancy story. 

Once upon a time there was a company called America Online (now Aol). No, it wasn't the oddly popular but not so relevant multinational mass media giant we know today. It was a pay-based online service that was the precursor to some of the services people rave about today.

It was also, for many people, the only real option to access the Internet. Sure, there were other choices, like the now-defunct Prodigy or eWorld, but not really. Much like they do now, people (and companies) tended to gravitate to where the most people were, and that was America Online.

In more ways than one, Twitter is almost akin to the America Online chat room, except it hosts unlimited people as opposed to 23 people at a time. And, in more ways than one, Facebook is akin to America Online communities (with the advent of streaming over threading), right down to its aspiration to be your total and complete online experience. Sure, other networks have borrowed ideas too. Most aren't so new.

For the era, this service worked remarkably well. Most people couldn't even conceive of an Internet without it. It felt like America Online was practically immortal. And perhaps that is why, in addition to charging people $2.95 per hour for usage, the company decided to allow marketers to post links and program bots to run some conversations.

That generated some extra revenue for the company until something unexpected happened. Since marketers knew that the only way to increase their exposure was to increase their frequency, they literally drowned out all human conversations until no one was left except chat rooms of bots, churning away at their pre-programmed content.

How long before marketers reach critical mass again? It's anybody's guess. 

There are only two outcomes for abused message delivery systems. En masse, marketers will either push messages to the point where they become irrelevant (direct mail and pitch lists) or the platform will eventually raise its rates until it is inaccessible to anyone except those with deep pockets (television and radio). When that happens, people will migrate away to other networks instead.

From my perspective, longevity will favor those marketers that avoid the temptation of the short-term gain because people drive networks, not numbers. After all, as soon as you start thinking about people in terms of numbers, whether how many followers they have or some secret sauce social score, there is a good chance you have already lost them (unless you gamed social to get them in the first place).

At least, that is what I think. What does Brian Solis or Guy Kawasaki or Scott Stratten think? What do you think? Will automation steal the soul from social? Is there something on the horizon that might replace it? Or maybe you would like to strike up some other conversation? The choice belongs to you. The comments are yours. I'll read them too.

Wednesday, August 14

Lions And Labels And Agendas, Oh My. They Made Me Blind.

Agendas
"How we can get people to actually solve problems instead of pushing agendas?" — Amy Vernon 

This is a question that has been rolling around in my head since Amy Vernon asked it in response to an open call for conversation last week. My short answer couched the problem in politics, but the problem is much more hardwired into human beings than we might think. If it weren't political labels that drive the diatribe and prevent problem solving, it would be something else.

It might be religious labels. It might be ethnic labels. It might be occupational labels. Or it might be the books we read. The music we like. The clothes we wear. The activities we pursue. The experiences we've had. The places we live. The places where we were born. The people we know.

We wake up every day with several thousand labels around our necks. We let them shape us and allow them to shape our perception of other people. We make ourselves slaves to them. And there is no end to how many we might make up. It's why we have nice things. And it's why we can't have nice things.

We've been indoctrinated into addiction. It took our entire childhood.

The truth is we spend most of our childhoods being indoctrinated into labels that make life easier and harder, because every label carries an agenda. That's the point. Someone invented them to give life directions, expectations, and excuses. And then our parents and guardians conspire to pass them along, just as most of us will when we have children.

They aren't the only ones. Every peer and role model you ever had did the same thing, for better and worse. Many labels are moving targets, falling in and out of popularity with minorities and majorities.

It doesn't really matter what those labels might be. They blind us by casting bigger shadows than the people who wear them, they bind us to limitations and opportunities, and they consciously and subconsciously tint the lens that we wear when we try to solve problems as individuals and groups.

The only people not overtaken by them have to make a conscious effort to recognize them for what they are, strive to be objective even when it feels impossible, and struggle to retain their sense of self-esteem while not subscribing to stereotypes that the greater society values. It's one of the most difficult things anyone can do in life because people are genuinely afraid that all they can be are their labels.

Who would you be if labels didn't define you? Besides happy, I mean. 

Motherhood
I once had a friend who was struggling with motherhood. She insisted that she wasn't a good mother. The idea was pretty absurd to me because my perception of her abilities vastly eclipsed her own self-perception. So I gave it my best shot. She needed to free herself from the shackles of a "good mom."

After I asked her to write down the definition of everything she considered to be a "good mother," we both took a breath to admire the sheer weight of expectations. Without going into too much detail, suffice to say that the label she had placed on a pedestal was unreachable and unachievable.

Mostly, her list included everything she thought her mom did right, the opposite of everything her mom did wrong, several dozen expectations that are currently popular in society, several dozen values dictated by faith (even though she was agnostic at the time), and so on and so forth. Once she took it all in and could laugh at how grandiose her job description was, I offered an alternative concept.

"Just like your husband married you and not the idea of a 'good wife,' your son wants to be raised by you and not the idea of a 'good mother,'" I said. "If you are you and do everything from a perspective of unconditional love, then you will be better than a 'good mother' because no one can be you better."

You would be surprised how great people can be when they aren't paralyzed by labels. She did fine.

So what does that have to do with solving problems? Almost everything. 

Have you ever noticed that some of the most explosive companies in history have come out of nowhere? There is a reason for that. They are generally started by entrepreneurs solving a specific problem or changing the status quo.

Why can't big companies do the same? Some of them can, but the advantage belongs to startups in that they haven't saddled themselves with labels, policies, and office politics. People focus on the objective at hand, without any other distractions. Their teams aren't always proven as much as they are ready to prove themselves. And whatever idea they've been turning over is all that really matters.

So let's say the problem is more altruistic, like thirsty children. How do we solve it? Charity: Water says the best way to solve it is to build water projects that put clean, drinkable water closer to the source.

All that stands in their way to deliver it is labels. Some people don't like their business model. Some people don't like that the founder is Christian. Some people don't like their partner organizations. Some people don't like that the program helps people abroad as opposed to at home. Some people worry about project sustainability. Some people want to support another charity. And the list goes on.

Water
We add additional angst when we make it a political issue, where political labels complicate the process instead of dealing with the problem. Suddenly, who solves how much of the problem, under what criteria and conditions, and how they go about it all become subject to the agendas of this party or that party and all those special interests, with little concern for the actual outcome. The net result is a thousand-page document that costs one hundred times more to accomplish significantly less than what is required.

Nobody is exempt, and the test is self-evident. Think of an agency that solves our water problem. Now apply different labels to it, one at a time, inserting the label ahead of the word agency. Like this: "_______" agency for water.

Christian. Islamic. Jewish. Satanic. Democrat. Republican. Libertarian. Jeffersonian. Secretive. Communist. Domestic. Conservative. Liberal. International. African. Jamaican. Japanese. Home-Based. Government. And so on and so forth. Which one would you give to?
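If you want to run the test literally, here is a throwaway Python loop that prints every substitution. The labels are the list above, verbatim:

```python
labels = [
    "Christian", "Islamic", "Jewish", "Satanic", "Democrat", "Republican",
    "Libertarian", "Jeffersonian", "Secretive", "Communist", "Domestic",
    "Conservative", "Liberal", "International", "African", "Jamaican",
    "Japanese", "Home-Based", "Government",
]

# Read each line and notice how your reaction shifts while the problem
# itself -- thirsty children -- stays exactly the same.
for label in labels:
    print(f"{label} agency for water")
```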

If you're being honest, certain descriptions might have elicited a positive or negative emotional reaction. It might have been slight, but your prejudices exist, possibly based on your proximity to, or positive and negative experiences with, people who have claimed to represent those things, or on what you have been told to expect from such people. In swapping the labels, you may even forget the problem.

How do you overcome prejudices and agendas to solve problems?

My oversimplified definition of public relations applies here. While I have more academic definitions, I often say that public relations is the art and science of making "we" out of "us and them." If you want to solve problems without being plagued by agendas, the only possibility is to ask people to temporarily check their labels (not their values) at the door.

It's a tall order to be sure, especially because most people don't even know these prejudices exist. They do. They exist on a grand scale, such as those who judge us by the color of our skin. And they exist on a small scale, such as how much we might weigh or the shine of our shoes. So if you find someone who can set those things aside, even for a little while, then hold onto them tight. They are rare individuals.

At least, that is what I think. I would love to know what you think. I'd also love to know what Dr. Steve Nguyen thinks, and Roger Dooley, and Sandeep Gautam, or anyone who makes psychology a primary interest, as opposed to me, who finished about two classes short of that degree (it was my minor).

Of course, we need not stop with psychologists or people with a bent for the human condition. Anyone can chime in, especially Amy Vernon, who opened the box on this relevant topic. And if this topic is too far removed, that's fine too. What would you like to talk about? The comments are open. Let's talk.

Wednesday, August 7

Bob Fass Beats Everyone In Social Media. Good Morning, 1963.

Ask any social media expert what he or she knows about Bob Fass and most will stare at you blankly, heads bobbing but without recognition. They've never heard the name before. He isn't "known" in social. He doesn't have a Klout score.

And yet, he ought to be known in social media. His ground-breaking work in social media using radio as his medium started long before many social media experts were born. And frankly, he did it better than most people do today.

"But wait," you say. "Radio doesn't count. It's broadcast."

While that might be true for some shows and stations, it was never the case for Fass. Beginning in 1963, he became a pioneer of free-form radio. Anyone who called in was given an opportunity to speak about any subject under the sun. There was no plan. There was no format. There was no automation. He didn't fake it.

He didn't even concern himself with a niche. He never worried about his identity. He never once thought of himself as an influencer. He never did anything to chase down listenership. He was merely human, looking to elevate the unsung heroes of New York City from midnight until the break of dawn.

As a result, anybody and everybody was allowed on his show, especially counterculture figures like Paul Krassner, Bob Dylan, Abbie Hoffman, Arlo Guthrie, Timothy Leary, and Allen Ginsberg (to name a few). Listeners were allowed to call in and talk to any of them. One even suggested Dylan sing better, a comment that gave everyone a good laugh. Nobody was hurt by it or needed counseling.

It was a scene where the freedom to think and hash things out made sense. And most of the time, people just called in because they wanted to have a good time. For a few hours every night, they weren't alone.


The scene mostly played out much the same beyond the station too. The so-called virtual community that Fass had created eventually spilled into the streets. He hosted a Fly-In at JFK airport. He organized a Sweep-In to clean up city streets. He had a hand in the Yip-In at Grand Central Station. His listeners marched on the Democratic National Convention in Chicago in 1968. His measured results made history.

But even as they did, Fass never let it go to his head. He wanted to connect with real people. He invited two-way communication. He wanted people to experience life in real time. And he still broke convention at every opportunity. If he liked a song, he might play it once or all night long. His call.

Social media wants to be Radio Unnameable but can't reconcile the business side. 

There is a certain level of inauthenticity in most social media programs because, well, they are programs. At the end of the day, most want you to do something because they are commercial enterprises. There is nothing wrong with that, but sometimes people seen as leaders forget that.

What makes almost all of them fundamentally different from pre-social media mavericks like Fass is that Fass didn't necessarily have an agenda (certainly not a commercial agenda). Social media experts, whether purists or public relations practitioners, don't have that luxury anymore. Most can only pretend to be authentic while serving an agenda to capture more leads, listeners, and exposure.

This isn't a criticism. It's an observation. Many social media enthusiasts who started five or ten years ago had to abandon their hands-on approaches in favor of scalability. So, almost without fail (there are exceptions), the solutions they turned to came from the same media they had once ridiculed — a mass media model built on the number of messages, listeners, clicks, and shares — while rewriting history just to say they thought it up first.

So what is the alternative? And if there is an alternative, just short of open mic night, does encouraging it make any sense from a professional or commercial standpoint? My guess is probably not, because the answer lies somewhere in the balance of those two opposing ideas. But what do you think? And by that, I mean what do you think about anything?

If this space were more like Radio Unnameable, what would we talk about? What would you like to talk about? I'm curious, so feel free to suggest anything at all. I'm listening and I'm not alone. The comments are yours.
 

by Richard R Becker, Copyright and Trademark, Copywrite, Ink. © 2021