Wednesday, October 23

Content May Be King, But People Want Experiences

If you have invested any time as a communicator working in or with social media, there is a pretty good chance that you've heard the declaration that content is king at one time or another. There is some truth to the concept too, which was originally proposed by Bill Gates within a context that might surprise you.

Sure, we can all argue the finer points well enough or be cute and crown the audience, but the truth is that content will reign in one form or another. It's the crux of how we communicate our concepts, ideas, and observations. It's how we educate, inform, and entertain others in the world in which we live.

It doesn't even matter how that content is presented, as long as it is presented well. Write a post or white paper. Shoot a video or record a podcast. Share a picture or create a television series. It's all content.

Content appeals to the immediate but experiences set a plate of permanence.

While teaching Social Media For Strategic Communication at the University of Nevada, Las Vegas, last Saturday, we spent a considerable amount of time talking about content and the constant pressure to produce more and more easily digestible content. Almost everybody feels that pressure, right?

If you believe, like most people, that influence and conversions can be quantified by counting actions, then you could make the case that more posts, more tweets, more stuff that people can act upon somehow counts. In fact, this was the thinking that many direct mail houses adopted ten years ago.

If your direct mail campaign has a two percent return from some list, then all you have to do is increase the frequency (or the list) to generate more revenue. Right? Well, maybe not, even if this thinking does explain why online lead generation is overpriced.
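The flawed arithmetic behind that direct mail thinking can be sketched in a few lines. This is purely a hypothetical illustration (the function name and figures are mine, not from any campaign): a flat response rate implies revenue scales linearly with volume, which is exactly the assumption the post goes on to question.

```python
def projected_revenue(list_size, response_rate, revenue_per_response):
    """Naive projection: assumes every additional mailing converts
    at the same flat rate, forever."""
    return list_size * response_rate * revenue_per_response

# 10,000 pieces at a two percent response, $50 per response
base = projected_revenue(10_000, 0.02, 50)

# Double the list and the model promises double the revenue --
# the real world rarely cooperates (list fatigue, weaker prospects).
doubled = projected_revenue(20_000, 0.02, 50)

print(base)     # 10000.0
print(doubled)  # 20000.0
```

The model only "works" because it treats the response rate as a constant, which is the very premise the paragraph above calls overpriced thinking.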

If you ask me, all these tactical formulas are distracting us from something stronger. And this something is tied to a question I asked in class — if more content more often produces results, then why does one sentence from a book read one time, or one scene from a movie screened once, or one comment made by a teacher stay with someone their entire life? And this something can best be summed up as experiences.

Novels work because our brains see them as experiences. Movies work because our heads are hardwired to attach emotion to sensory perception. Social networks climb to the top of traffic charts not because of the content they provide but because of the experiences we feel there. It's not my social media deck that holds anyone's interest in my class but rather the way I present it and how we experience it.


This communication blog (or journal) is no exception. A few months after writing about radio host Bob Fass and my attempt to make this space more indicative of an open format, more people have visited. In fact, more people have visited despite my efforts to undo blog "rules."

I stopped concerning myself with frequency, making this a weekly as opposed to a daily. And, at the same time, I tossed out short content in favor of writing something more substantive. The result has been eye-opening in that topics that used to have a one-day shelf life now have a one-week shelf life or more.

While I am not proposing this would be the case for every content vehicle, it does provide an explanation tied in part to the question I asked in class. Even when I wrote daily, the most successful posts or series of posts had nothing to do with "Five Sure-Fire Ways To Get More Traffic!" They had to do with the cancellation of a television show or two, the coverage of a crisis communication study or two or twelve, the involvement and participation of people in things that matter (even when it feels personal), and experiments that involved thousands of people besides myself. Good content? Maybe.

Good experiences? Absolutely. As fun as it has been to write satire at times, the gags rely less on writing and more on experiences. People remember because they were part of something.

The future of the Internet doesn't rely on mobile as much as experiences. 

I used to tell students that technologies in social media meant software. With the advent of Google Glass, increasingly immersive projection displays, and the encroachment of the online world into the offline world, I can no longer offer up any such disclaimer. All of it — the hardware, software, and people driving the content and devices — helps create the experience and can even alter it.

Consider, for example, short stories being published at a pace of 140 characters at a time, fictional characters who suddenly open their own accounts, and one project that included a dialogue and storytelling exchange among four or more accounts (each character speaking from a unique point of view). There are more examples, well beyond Twitter, but the point remains the same. Storytelling creates experiences. Technology creates experiences. Person-to-person interaction on a one-to-one, one-to-some, and one-to-many scale creates experiences.

And the rest? That content without experiences? It still has a place in the world, sure. But the future of social media isn't in producing ever-growing reams of information to get people's attention. It will be to elevate the content into a form of communication that creates a shared experience, online or off.

Can you see this future? And if you can, how might it shift your own marketing strategy away from tricking readers into sampling content and toward creating some compelling experience that they want to become part of and participate in? The comments are yours: share your experiences or, perhaps, propose an entirely new conversation. I look forward to it.

Wednesday, October 16

Do Hardships Make Us Human Or Is The Air Of Success Better?

The first time I saw the crowd funding video for Yorganic Chef, I was pleased with the finished product. The run time felt long, but Nick Diakanonis made up for it with his authenticity. He's telling his own story. It only made sense that he would drift off script and elaborate.

I noticed something else the second time I watched the video. One segment was missing, and it left me wondering whether it made a difference. The script, along with the campaign page, leaves out the hardship part of the story.

Another side to the Yorganic Chef crowd funding story. 

There is a good chance that you won't see this segment of the story elsewhere. It was one of the elements left behind after several team members thought the hardship part was a negative. They want to be upbeat and bright. And maybe they're right. Or maybe they're not.

So yesterday, after receiving permission from my friend Nick, I considered the contrast. You see, Yorganic Chef was scheduled to open last year and Nick had already achieved his dream.

That is, he had achieved his dream until something unexpected happened. Two weeks before opening in Los Angeles, the person who owned the facility and packaging equipment gave him an ultimatum. Either Nick would sign over the business and become an employee or there would be no launch. 

Imagine. For the better part of three years, you invest your entire life in one solitary idea — to create a line of non-frozen, ready-made gourmet meals from the ground up, including a direct-to-customer delivery system that required the invention of a patented state-of-the-art thermal bag. You're two weeks from opening. Your dream is about to come true. And then, suddenly, everything is swept away.

What do you do? Do you sell out and become the manager of your own concept (and leave everyone who has supported you behind)? Or, do you undo the last six months of progress and try to start all over? Many people would have been tempted to sell out, but Nick isn't like that. 


As I mentioned before, the video doesn't include anything about the crisis that Nick had to weather. And there is a good chance most of the stories about the campaign won't ever touch upon it. 

But it makes me wonder. Do we have to be perfect to succeed?

Perhaps more than any other kinds of people, business professionals and politicians tend to be the most concerned about their image. They want to convey an image of perpetual success. They never lose.

My life has never been like that. Most people have a mixed bag. Sometimes there are great runs when everything seems easy. Sometimes it feels like standing in sludge, with every inch of forward motion requiring the greatest amount of effort possible. It's a given. Some of us share it. Some of us don't.

Sure, some hardships can become victim stories, saddling some people with excuses to never succeed again. But I'm not talking about those. I'm talking about the hardships people face and find a way to overcome, much like Nick is trying to do. He isn't just a successful culinary entrepreneur. He's a culinary entrepreneur who is hoping to rebound from a rotten turn after doing everything right. 

Does that little bit of detail make a difference? Some people seem to think so, believing that the story is more upbeat without mentioning the hardship. Others might disagree, not seeing anything negative in the full story. If anything, they see it as the clarifying detail in starting the Yorganic Chef crowd funding campaign.

The money being raised by Yorganic Chef has a purpose. It's Nick's chance to replace what was lost — the facility and equipment — when someone he trusted revealed a different agenda. If that hadn't happened, Yorganic Chef would already be serving Los Angeles and looking to open in a second market.

So how about you? When you see crowd funding stories like the one launched by Tinu Abayomi-Paul, does it make a difference that she has a need? Or do you prefer a different kind of back story, one that scrubs away the blemishes no matter how relevant they might be? Or maybe it all ties back into those topics we've explored before — perception matters (but not really). Either way, I'd love to know what you think. The comments are yours. 

Wednesday, October 9

How Simple Decisions In Social Media Make Big Differences.

Social media can be a mean sport in some arenas. It can be so mean that sometimes the media overreacts, as Popular Science did. The publication announced it will abandon comments, claiming that a politically motivated war on expertise has eroded the popular consensus on "scientifically validated topics."

They don't want to be part of that, even if they still will be (whether they have comments or not). They might even have it wrong. The point of science is not to continually reinforce "scientifically validated topics" but to investigate the known and unknown. After all, more than one scientifically validated topic has been turned on its head. There are things we don't know. But that's a topic for another time.

Do comment sections really make a difference? 

My interest in this topic was inspired in part by Mitch Joel, who suggested websites could turn comments off, at least until someone develops better technology to keep them free and clean. His point was that online conversations have evolved. Comments are anywhere and everywhere nowadays.

Specifically, people are more likely to share a link and/or add their thoughts elsewhere — Facebook, Twitter, LinkedIn, Medium, or some other platform — than they ever will be to leave a comment at the source. Let's face it. Websites and blogs haven't been the center of the social universe for some time.

Today, social media requires significantly more elasticity and adaptability, and the conversations that revolve around content are much more hyper-extended. They are smaller, shorter, less formal, and more fragmented discussions about articles and posts. It's as if all of social traded substance for sharing.

This is vastly different from the days when bloggers used to covet comments as a measurement (despite never being able to explain why Seth Godin could succeed without them). Years ago, there were primarily three ways to respond to an article or post — you left a comment, wrote a rebuttal (on your own blog), or shared it as a thread in a niche forum. It made things orderly but also exclusionary.

That is not the case anymore. Now, some articles can sport a dozen mini-conversations within the same platform, initiated by people who might have little or no connection to each other. It's fascinating and fragmented stuff, which is why some pros like Danny Brown look to close the loop on fragmentation.

Livefyre sounds like a decent solution, but not everyone cares for it despite going a bit beyond what Disqus "reactions" used to offer before they discontinued them. Other emergent comment solutions worth exploring include Google+ comments or Facebook comments. They draw mixed reactions too.

For me, I think the issue is something else beyond nuts and bolts. Errant comments, like those that Popular Science complained about, are manageable. Moderating comments by setting permissions isn't as hard as some people make it sound. And if fragmentation is a concern, Livefyre might mitigate it.

All that sounds fine, but it never gets to the root issue. You see, there is only one fundamental difference between comments at the source and comments away from the source.

Do you want comments to be part of the article or about the article?

Comments made at the source become part of the article. Comments made away from the source, even if they are ported in by a program, might relate to but are largely independent of the article. The difference is that simple, and this simplicity is deceiving.

It's deceiving because when someone comments, where someone comments, and to whom they comment all have a bearing on the content, context, and character of that comment. It's deceiving because people tend to write to the author at the source (or to other commenters) while they tend to write about the author or source material (sometimes slanting the intent to serve their purpose) away from the source. And it's deceiving because comments away from the source will never have the same kind of physical attachment or quasi-permanence that comments closer to the source seem to achieve.

Right. Most people do not search for reactions when an article is older than a week. Few have the appetite to scroll long lists of link shares that aren't really comments, whether they are ported in or not. And, unless there is historical or outlandish content, even fewer read comments bumped to page 2.

So when Popular Science made the decision to abandon comments, they didn't just make a decision to suspend spammers and people they fundamentally disagree with on topics like climate change and evolution. They made a decision to disallow different viewpoints from becoming part of an article. And they more or less told readers to write about the content but not to the authors of that content.

In a few weeks' time, their decision will likely be sized up for its pros and cons. But make no mistake, it was still the wrong decision. Silence is no friend of science.

You see, neither science nor faith needs to shrink from a politically motivated war on their mutual expertise. The truth is that they are not nearly as polarizing as some would have you believe. Science and faith are like brothers attempting to understand the unknown, each often inspiring the other to stop and think.

What Popular Science could have done instead was create a white list of commenters better suited to scientific discussion, perhaps with differing but conscientious viewpoints. Such an approach might have moved their content forward, leading to breakthroughs or a better understanding of science.

But what do I know? I've adopted a different outlook altogether. Comments, I think, work best when commenters are treated like callers to a radio talk show. If you could talk about anything you want, what would you want to talk about today? The comments are yours, or we can chat in person at the University of Nevada, Las Vegas on October 19 during a 3-hour social media session.

Wednesday, October 2

Teaching People To Write Requires A Contradictory Approach.

"I have known writers who paid no damned attention whatever to the rules of grammar and rhetoric and somehow made the language behave for them." — Red Smith 

Red Smith was one of the finest sportswriters in history. Not only did he receive the J.G. Taylor Spink Award from the Baseball Writers' Association of America, but he was also the first sportswriter to win the Pulitzer Prize for commentary. Even more notable, he is the Red Smith for whom the Red Smith Award from the Associated Press is named. Ernest Hemingway even immortalized him in a novel.

"And he noticed how the wind was blowing, looked at the portrait, poured another glass of Valpolicella and then started to read the Paris edition of the New York Herald Tribune. 

I ought to take the pills, he thought. But the hell with the pills.

Then he took them just the same and went on reading the New York Herald. He was reading Red Smith, and he liked him very much." — Ernest Hemingway, Across the River and Into the Trees 

But that's not why I quote him in every writing, editing and proofreading class I teach. I quote him because he is right. And I quote him as a reminder to myself to never become a pompous ass about the trade and craft. There are too many writers who do, claiming they know this and that about writing.

Editing And Proofreading Your Work at the University Of Nevada, Las Vegas. 

It's always a challenging prospect — standing in front of varied students who range in age, interest, and experience — for three hours on some random morning or afternoon when the subject of the day is editing and proofreading. (It will be a morning session this Saturday.)

What makes this class especially daunting is that I have one chance to help people become better writers, editors or proofreaders. It's not like Writing For Public Relations at all, with writing assignments (and the rewrites of those assignments) being passed back and forth for ten weeks.

No, this class is a one-time shot, taught only once in the spring and once in the fall (occasionally once in the summer). And while I always present myself in a suit out of respect to those who attend, the class itself remains informal. I invite students to stop me cold, ask me questions on the fly, and otherwise test my respectable but finite knowledge about the written language.

Sometimes it takes a while to visualize their hypotheticals, and every now and again I have to research their questions after class because they stump me on the spot, but otherwise I manage well enough. Even the few times I didn't think I managed well enough worked out for the best. I love to learn too.

Some of my lessons from previous classes even become part of my future classes. One of my favorite stories includes how I used to use "website" as an example of English being a living language until one student pointed out the Associated Press insisted it be spelled "Web site." So, I changed my class (providing the Associated Press explanation) only to be schooled by a different student when the Associated Press changed its ruling a week prior to my class (and without my knowledge). Figures.

The day someone thinks they've mastered writing is the day they aren't worth reading.

When I was younger, I used to sweat the outlandishly difficult questions or insistent but mostly wrong students. Nowadays, I'm more inclined to laugh about it, regardless of who is proven ignorant.

I attribute that to Smith. He knew better than most: telling someone how to write is futile. You can only show them how to write better. He was not alone in believing it.

Beyond the more obvious industry hacks like David Ogilvy, William Bernbach, Leo Burnett, and Shirley Polykoff, I've learned a few things from authors like Ernest Hemingway, Allen Ginsberg, Ray Bradbury, Truman Capote, Norman Mailer, Kurt Vonnegut, and Joseph Wambaugh (to name a few). Except for Vonnegut (sort of), none of them believed a formula could make anyone a great writer.

Instead, you have to see writers as people in various stages of aptitude, ranging from the novice who doesn't realize there are "rules" to the experienced "blowhard" who lives for the rules (or is delusional enough to think he or she is better than any rules). Once you do, you guide them from one layer of aptitude to the next until they understand that becoming a better writer isn't very complicated, even if it is contradictory.

You want to write straight, honest prose that can touch a human being. Nothing more or less. 

That's the easy part. The hard part is that there are a thousand different ways to do it. There are a million different things that can get in the way of doing it well, which is why every word, sentence, paragraph, chapter, plot and story needs to be tested against whatever the writer already knows.

You dust off those "rules," filters, and suggestions and then ask an honest question: Would this concept make it better, worse, or about the same? Oversimplified, you might ask: Does starting this sentence with something unconventional like "And" make it better, worse, or about the same?

As long as you have a good reason to do or not do something (and not as a defensive justification or cop out), you can break with standard, style, format or the so-called rules any time you want. Of course, this also assumes you know the rules you want to break and understand what's behind them.

This is probably why people who teach writing sometimes seem like they burn the candle at both ends. We want, or at least I want, to lay out some rules for people to try on and then encourage them to wear those that fit and dismiss those that don't fit. It's how you become a better writer, and you will never stop doing it (ever). Anyone who tells you differently is either too busy trying to imitate or perhaps too busy trying to justify why they can't imitate.

What about you? Do you have any rules, techniques, or tips that you've found useful? I'd love to read them or check them out. Just drop them in the comments. Or, if you would rather talk about something else altogether, please do. The comment section is an open forum around here.

Wednesday, September 25

You Can Make The Internet Meaningful By Doing Stuff Offline.

I had never heard of neuralgia until a few days ago. It is pain in one or more nerves caused by a change in neurological structure of the nerves rather than by the excitation of healthy pain receptors. In other words, the nerves tell your brain to feel intense stimulus even when there isn't any.

It is painful. It is debilitating. And it afflicts someone I've come to know over the past few months. She has suffered with it for the better part of a decade but most people didn't even know it.

Most of the time, Tinu Abayomi-Paul's condition manifests itself as chronic back pain. This time is different. It is severe enough that she will be undergoing surgery and decommissioned for a month, maybe longer.

This is especially challenging for her because, like me, she has a small business. In her case, she has two micro-businesses with an emphasis on search engine optimization and social media. And because both businesses rely extensively on providing services, she will not generate any income while out.

What do small business owners do when there is no safety net? 

Sure, some business owners are like me. You set something aside to weather the storm and hope it's enough. This time around, I'm cutting it close after recovery. But that's a story for a different time.

I'm only mentioning it now because Abayomi-Paul is facing something similar but different. She didn't have the luxury of being ready to weather an unexpected surgery this time around. She needs help.

She isn't asking for charity. All she wants is to work through recovery. So Abayomi-Paul had the novel idea to run an Indiegogo campaign to raise the money she needs to make ends meet while she recovers. Here's her story, along with some discounted packages that she put together for her campaign.


This is a short, 10-day campaign. It ends next Tuesday, and you can find out more information about Abayomi-Paul on her website, Free Traffic Tips. For campaign details and packages, visit Indiegogo.

Do keep in mind that I'm taking a leap of faith as this isn't a pure endorsement. I haven't worked with Abayomi-Paul before, but I do know plenty of people who have. Mostly, I've enjoyed some banter with her as part of a social network group. I've also read her content and watched a few instructional videos that she has produced. She knows her stuff without all the bull that other people like to spread.

Who knows? It might make a great case study or best practice as one of those stories for the other Internet — the one that people sometimes forget about in favor of big data, big numbers, and big distractions.

We can make a meaningful online experience by doing things offline.

Yes, there really is an Internet with deeper purpose. It's the one that many pros abandoned so they could write business card books about social. So you don't hear about this stuff as much anymore because it doesn't draw traffic. If you want visitors, you need to write about landing on this page instead.

But hey, that's mostly okay. I don't begrudge anyone an opportunity to enjoy a silly cat video or hawk some ROI (oddly) to companies that will never appreciate why Dawn Saves Animals without the benefit of coupon codes, junk mail, or mountains of content.

You see, it's all very simple really. They do something instead. And then what they did lands online. It's something I hope my kids learn. Legacies can be written about online, but we make them offline. I think Abayomi-Paul deserves that chance. Many people do. I'll write about a few more soon enough.

But today, given all the changes coming down on search engine optimization, maybe this will be a great opportunity to talk to someone who knows about it. And all she is really asking for in return is a little time offline so she can come back and deliver something meaningful online. So what do you think?

Is this a worthwhile case study for business practitioners who have the misfortune of a medical emergency? Or maybe you would like to hear from someone else about Abayomi-Paul? Kami Huyse, Anne Weiskopf, Jennifer Windrum, and Ann Handley were among the first funders. Or maybe you would like to talk about something else altogether? I'm fine with that too. The comments are yours.

Wednesday, September 18

A Leadership Lesson From A Place Few Experts Tread

Last August, U.S. President Barack Obama compared Russian President Vladimir Putin to a tiresome schoolboy. But less than 30 days after he made the offhanded comment, it was President Putin who would school President Obama in foreign affairs. Russia is celebrating a diplomatic victory this week.

Somehow, President Obama and his administration allowed the Syria crisis to get away from them. Instead of the United States leading a coalition of countries to bring Syria to justice for using chemical weapons, Russia is being celebrated for stopping the escalation of aggression in the Middle East at the hands of unexceptional Americans. Syria will also surrender its chemical weapons, or so they say, and the world will be a better place.

The turnabout of this narrative was about as masterful as any propaganda since the end of the Cold War. One might even praise the audacity of the move, if not for the considerable consequences.

How recent events have changed the geo-political landscape for now.

Russia temporarily gains world prestige and more influence in the Middle East while protecting its Syrian allies, a country run by a leader who used chemical weapons against his own people. Syria also works in lockstep with Iran, smuggling arms to Hezbollah in Lebanon. And Iran has said all along that the U.S. was behind the uprising, a charge that may not have been initially accurate but has become accurate in the last two years. The arms sent into the conflict are limited, with the U.S. fearing these weapons could all too easily be turned on us as suppliers because some rebels are tied to the same terrorists the U.S. has fought for years. To say Syria is a mess is an understatement.

But most Americans don't even know that the U.S. has already picked a side. It wants to topple the government in Syria, but obviously less than Russia wants to keep Bashar al-Assad.

Those seem to be some of the facts (but not nearly all of them). Just don't mistake them as a call for action or involvement on my part. To me, Syria is another accumulation of events that forces Americans to choose between two bad options — act as the global police even when the world doesn't want you to, while supporting rebels who may (or may not) include your enemies, or do nothing, which is de facto support for a dictator who has long despised you and is happy to operate against your interests.

This is why so many advisors frame U.S. foreign policy in Syria as a choice between which we like better: the enemy you know or the enemy you don't. It would take a fool to hazard a guess.

Lesson learned: Leadership does not talk big with a little stick. 

Many people seem enamored with Teddy Roosevelt's foreign policy, often summed up by his quip to "speak softly and carry a big stick." And yet, few seem to realize that this is akin to negotiating peacefully while simultaneously threatening people with a "big stick." It was coined at a time when the division between American isolationists and internationalists had boiled over, again.

This division is one of the more interesting ones in politics because it does not follow party lines. Although current public perception is that the Republicans are hawks and Democrats are doves, it's not really true. On the contrary, it was progressives who led the country into conflict and war more often than their counterparts who prefer to live and let live. Americans only think the opposite because neoconservatives joined progressives as being internationalists.

Sometimes this internationalist concept works. Sometimes it does not. And this time, it obviously has not worked for President Obama, partly because of his own words and actions for the better part of seven years. He has campaigned under the auspices of being against what the world saw as American imperialism, but has secretly and stealthily supported various programs that reinforce the idea anyway.

The primary difference between this administration and the last mostly has to do with the size of the talk and the size of the stick. Bush favored speaking big and carrying a big stick. Obama favors speaking big and carrying a little stick. And, unfortunately, this has made Americans largely unsupportive of any action abroad while making their detractors much more emboldened to push new agendas.

Who cares? Well, that is a subject open for debate. There are those who believe the U.S. can exist without being a major player in the world and there are those who believe we have to lead the world. The thinnest majority of Republicans and Democrats believe we ought to lead because history has proven that trouble will knock on the door of the U.S. whether it goes looking or not.

Foreign policy isn't what this post is about. It's about leadership. 

There are plenty of people who have long criticized the foreign policy of the Obama administration, among other things. The reason it invites criticism is because it lacks coherency, primarily because the original vision that he brought to the presidency runs counter to the way the world works.

President Obama told the American people that retracting the reach of the United States while simultaneously making nice-nice with the world would place us in a position where our diplomatic prowess alone could influence world affairs. It's not really true, but that was the vision he forwarded to the American people and the world (despite trying to keep a finger on specific interests anyway).

There are dozens of places where that was never going to work. Syria is one of them. Instead, it is one of those places where you have to make the decision, announce the decision, and act on the decision.

The Obama administration didn't do that, mostly because too much could go wrong. They also didn't want to be responsible if it did. So, in effect, they pushed it off for a few years and then attempted to assemble a middle-of-the-road approach that wouldn't make it look like Obama was rolling back on his posture of being a polite player in the world. When that didn't work, he punted to Congress for a vote while simultaneously withholding any accountability to that vote in case it didn't go his way.

On the domestic front, it all comes across as being considerate, depending largely on how well you like his administration. All the while, everyone forgot that the U.S. doesn't exist in a vacuum. Other world leaders saw the vote-and-pony show as indecisiveness at best and weakness at worst. And no matter how you see it, other countries have since seized on the moment.

Contrast this with what Prime Minister David Cameron did. He said the United Kingdom ought to become involved, and he made a very strong case to Parliament. When Parliament voted against intervention, he stated it was a mistake but said he would accept the will of the people. It was a done deal, and he didn't look too passive, too pompous, or too weak after the outcome.

What's the difference? The difference is that Cameron understands being a leader as opposed to being an expert politician. In this case, a leader sets aside the appearance of authority to ensure that any following is aligned with organizational goals rather than with the leader as an individual.

Experts, on the other hand, tend to be different altogether. They derive their appearance of authority from their reputation and are not willing to risk it by accepting responsibility. In this case (and possibly many others), President Obama is playing expert in Syria (without the right expertise, perhaps).

The expert fallacy can cost an organization its clarity. 

Right now, almost everyone in the U.S. is looking for experts to solve problems when what we really need are leaders. We see it in politics. We see it in business. But based on the number of people who have added "expert" to their labels (deserved or not), it's safe to say that we have a glut of those instead.

What's the difference? Leaders are those people who figure things out. They are people who have a vision, sometimes asking experts for their opinions on how to make that vision real, and then approving those opinions based on what they believe is most likely to make that vision real.

If they're right, history remembers them with reverence. If they are wrong, not so much. The risk is part of the job. Leaders are held accountable. In government, they don't pin blame elsewhere. In business, they don't need golden parachutes. These are the people who make their own way.

Leaders don't cling to and attempt to manipulate the world they know; they look to shape the world into something no one has ever considered before. (Ergo, a push-button phone design expert can't see a flat-screen phone as functional.) And this is why they continually find solutions that experts could never fathom. It's one thing to be studied in what is, and another thing to see what could be.

When it comes to world affairs, history has shown that the world will praise whoever is steadfast in their vision and conviction to see it through, despite being wrong on some points. So how about you?

Are you a leader or a follower? Do you know your field, or are you ready to re-imagine it? Or maybe you want to talk about something else? One of my friends has already suggested we abandon Syria and start focusing on some of the problems we have right here in this country, like homeless workers. What do you think ... about anything?
by Richard R Becker Copyright and Trademark, Copywrite, Ink. © 2021