Wednesday, October 9

How Simple Decisions In Social Media Make Big Differences.

Social media can be a mean sport in some arenas. It can be so mean that sometimes the media overreacts, as Popular Science recently did. The publication is abandoning comments, claiming that a politically motivated war on expertise has eroded the popular consensus on "scientifically validated topics."

They don't want to be part of that, even if they still will be (whether they have comments or not). They might even have it wrong. The whole point of science is not to continually reinforce "scientifically validated topics" but to investigate the known and unknown. After all, more than one scientifically validated topic has been turned on its head. There are things we don't know. But that's a topic for another time.

Do comment sections really make a difference? 

My interest in this topic was inspired in part by Mitch Joel, who suggested websites could turn comments off, at least until someone develops better technology to keep them free and clean. His point was that online conversations have evolved. Comments are anywhere and everywhere nowadays.

Specifically, people are more likely to share a link and/or add their thoughts elsewhere — Facebook, Twitter, LinkedIn, Medium, or some other platform — than they ever will be to leave a comment at the source. Let's face it. Websites and blogs haven't been the center of the social universe for some time.

Today, social media requires significantly more elasticity and adaptability, and the conversations that revolve around content are much more hyper-extended. They are smaller, shorter, less formal, and more fragmented discussions about articles and posts. It's as if all of social traded substance for sharing.

This is vastly different from the days when bloggers used to covet comments as a measurement (despite never being able to explain why Seth Godin could succeed without them). Years ago, there were primarily three ways to respond to an article or post — you left a comment, wrote a rebuttal (on your own blog), or shared it as a thread in a niche forum. It made things orderly but also exclusionary.

Fragmentation
That is not the case anymore. Now, some articles can sport a dozen mini-conversations within the same platform, initiated by people who might have little or no connection to each other. It's fascinating and fragmented stuff, which is why some pros like Danny Brown look to close the loop on fragmentation.

Livefyre sounds like a decent solution, but not everyone cares for it, even though it goes a bit beyond what Disqus "reactions" used to offer before they were discontinued. Other emergent comment solutions worth exploring include Google+ comments and Facebook comments. They draw mixed reactions too.

For me, the issue is something beyond nuts and bolts. Errant comments, like those that Popular Science complained about, are manageable. Moderating comments by setting permissions isn't as hard as some people make it sound. And if fragmentation is a concern, Livefyre might mitigate it.

All that sounds fine, but it never gets to the root issue. You see, there is only one fundamental difference between comments at the source and comments away from the source.

Do you want comments to be part of the article or about the article?

Comments made at the source become part of the article. Comments made away from the source, even if they are ported in by a program, might relate to but are largely independent of the article. The difference is that simple, and this simplicity is deceiving.

It's deceiving because when someone comments, where someone comments, and to whom they comment all have a bearing on the content, context, and character of that comment. It's deceiving because people tend to write to the author at the source (or to other commenters) while they tend to write about the author or source material (sometimes slanting the intent to serve their purpose) away from the source. And it's deceiving because comments away from the source will never have the same kind of physical attachment or quasi-permanence that comments closer to the source seem to achieve.

Right. Most people do not search for reactions when an article is older than a week. Few have the appetite to scroll long lists of link shares that aren't really comments, whether they are ported in or not. And, unless there is historical or outlandish content, even fewer read comments bumped to page 2.

So when Popular Science made the decision to abandon comments, they didn't just make a decision to suspend spammers and people they fundamentally disagree with on topics like climate change and evolution. They made a decision to disallow different viewpoints from becoming part of an article. And they more or less told readers to write about the content but not to the authors of that content.

In a few weeks' time, their decision will likely be sized up for its pros and cons. But make no mistake, it was still the wrong decision. Silence is no friend of science.

You see, neither science nor faith needs to shrink from a politically motivated war on their mutual expertise. The truth is that they are not nearly as polarizing as some would have you believe. Science and faith are like brothers in attempting to understand the unknown, each often inspiring the other to stop and think.

What Popular Science could have done instead was create a white list of commenters better suited to scientific discussion, perhaps with differing but conscientious viewpoints. Such an approach might have moved their content forward, leading to breakthroughs or a better understanding of science.

But what do I know? I've adopted a different outlook altogether. Comments, I think, work best when they are treated like callers to a radio talk show. If you could talk about anything you want, what do you want to talk about today? The comments are yours, or we can chat in person at the University of Nevada, Las Vegas, on October 19 during a three-hour social media session.

Wednesday, October 2

Teaching People To Write Requires A Contradictory Approach.

"I have known writers who paid no damned attention whatever to the rules of grammar and rhetoric and somehow made the language behave for them." — Red Smith 

Red Smith was one of the finest sportswriters in history. Not only did he receive the J.G. Taylor Spink Award from the Baseball Writers' Association of America, but he was also the first sportswriter to win the Pulitzer Prize for commentary. Even more notable, he is the Red Smith for whom the Red Smith Award from the Associated Press is named. Ernest Hemingway even immortalized him in a novel.

"And he noticed how the wind was blowing, looked at the portrait, poured another glass of Valpolicela and then started to read the Paris edition of the New York Herald Tribune. 

I ought to take the pills, he thought. But the hell with the pills.

Then he took them just the same and went on reading the New York Herald. He was reading Red Smith, and he liked him very much." — Ernest Hemingway, Across the River and Into the Trees 

But that's not why I quote him in every writing, editing and proofreading class I teach. I quote him because he is right. And I quote him as a reminder to myself to never become a pompous ass about the trade and craft. There are too many writers who do, claiming they know this and that about writing.

Editing And Proofreading Your Work at the University of Nevada, Las Vegas. 

It's always a challenging prospect — standing in front of varied students who range in age, interest and experience — for three hours on some random morning or afternoon when the subject of the day is editing and proofreading. (It will be a morning session this Saturday.)

What makes this class especially daunting is that I have one chance to help people become better writers, editors or proofreaders. It's not like Writing For Public Relations at all, with writing assignments (and the rewrites of those assignments) being passed back and forth for ten weeks.

No, this class is a one-time shot, taught only once in the spring and once in the fall (occasionally once in the summer). And while I always present myself in a suit out of respect to those who attend, the class itself remains informal. I invite students to stop me cold, ask me questions on the fly, and otherwise test my respectable but finite knowledge about the written language.

Sometimes it takes a while to visualize their hypotheticals, and every now and again I have to research their questions after class because they stump me on the spot, but otherwise I manage well enough. Even the few times that I didn't think I managed well enough worked out for the best. I love to learn too.

Some of my lessons from previous classes even become part of my future classes. One of my favorite stories is about how I used to use "website" as an example of English being a living language until one student pointed out the Associated Press insisted it be spelled "Web site." So, I changed my class (providing the Associated Press explanation) only to be schooled by a different student when the Associated Press changed its ruling a week prior to my class (and without my knowledge). Figures.

The day someone thinks they've mastered writing is the day they aren't worth reading.

When I was younger, I used to sweat the outlandishly difficult questions or insistent but mostly wrong students. Nowadays, I'm more inclined to laugh about it, regardless of who is proven ignorant.

I attribute that to Smith. He knew better than most: telling someone how to write is futile. You can only show them how to write better. He was not alone in believing it.

Beyond the more obvious industry hacks like David Ogilvy, William Bernbach, Leo Burnett, and Shirley Polykoff, I've learned a few things from authors like Ernest Hemingway, Allen Ginsberg, Ray Bradbury, Truman Capote, Norman Mailer, Kurt Vonnegut and Joseph Wambaugh (to name a few). Except for Vonnegut (sort of), none of them believed a formula could make anyone a great writer.

Instead, you have to see writers as people in various stages of aptitude, ranging from the novice who doesn't realize there are "rules" to the experienced "blowhard" who lives for the rules (or is delusional enough to think he/she is better than any rules). Once you do, you guide them from one layer of aptitude to the next until they understand that being a better writer isn't very complicated even if it is contradictory.

You want to write straight, honest prose that can touch a human being. Nothing more or less. 

That's the easy part. The hard part is that there are a thousand different ways to do it. There are a million different things that can get in the way of doing it well, which is why every word, sentence, paragraph, chapter, plot and story needs to be tested against whatever the writer already knows.

You dust off those "rules," filters, and suggestions and then ask an honest question: Would this concept make it better, worse, or about the same? Oversimplified, you might ask: Does starting this sentence with something unconventional like "And" make it better, worse, or about the same?

As long as you have a good reason to do or not do something (and not as a defensive justification or cop out), you can break with standard, style, format or the so-called rules any time you want. Of course, this also assumes you know the rules you want to break and understand what's behind them.

This is probably why people who teach writing sometimes seem to contradict themselves. We want, or at least I want, to lay out some rules for people to try on and then encourage them to wear those that fit and dismiss those that don't. It's how you become a better writer, and you will never stop doing it (ever). Anyone who tells you differently is either too busy trying to imitate or perhaps too busy trying to justify why they can't imitate.

What about you? Do you have any rules, techniques, or tips that you've found useful? I'd love to read them or check them out. Just drop them in the comments. Or, if you would rather talk about something else altogether, please do. The comment section is an open forum around here.

Wednesday, September 25

You Can Make The Internet Meaningful By Doing Stuff Offline.

I had never heard of neuralgia until a few days ago. It is pain in one or more nerves caused by a change in the neurological structure of the nerves rather than by the excitation of healthy pain receptors. In other words, the nerves tell your brain to feel intense pain even when there is no stimulus.

It is painful. It is debilitating. And it afflicts someone I've come to know over the past few months. She has suffered with it for the better part of a decade but most people didn't even know it.

Most of the time, Tinu Abayomi-Paul's condition manifests itself as chronic back pain. This time is different. It is severe enough that she will be undergoing surgery and will be out of commission for a month, maybe longer.

This is especially challenging for her because, like me, she has a small business. In her case, she has two micro-businesses with an emphasis on search engine optimization and social media. And because both businesses rely extensively on providing services, she will not generate any income while out.

What do small business owners do when there is no safety net? 

Sure, some business owners are like me. You set something aside to weather the storm and hope it's enough. This time around, I'm cutting it close after recovery. But that's a story for a different time.

I'm only mentioning it now because Abayomi-Paul is facing something similar but different. She didn't have the luxury of being ready to weather an unexpected surgery this time around. She needs help.

She isn't asking for charity. All she wants is to work through recovery. So Abayomi-Paul had the novel idea to run an Indiegogo campaign to raise the money she needs to make ends meet while she recovers. Here's her story, along with some discounted packages that she put together for her campaign.


This is a short campaign with a 10-day run. It ends next Tuesday, and you can find out more information about Abayomi-Paul on her website, Free Traffic Tips. For campaign details and packages, visit Indiegogo.

Do keep in mind that I'm taking a leap of faith as this isn't a pure endorsement. I haven't worked with Abayomi-Paul before, but I do know plenty of people who have. Mostly, I've enjoyed some banter with her as part of a social network group. I've also read her content and watched a few instructional videos that she has produced. She knows her stuff without all the bull that other people like to spread.

Who knows? It might make a great case study or best practice as one of those stories for the other Internet — the one that people sometimes forget about in favor of big data, big numbers, and big distractions.

We can make a meaningful online experience by doing things offline.

Yes, there really is an Internet with deeper purpose. It's the one that many pros abandoned so they could write business card books about social. So you don't hear about this stuff as much anymore because it doesn't draw traffic. If you want visitors, you need to write about landing on this page instead.

But hey, that's mostly okay. I don't begrudge anyone an opportunity to enjoy a silly cat video or hawk some ROI (oddly) to companies that will never appreciate why Dawn Saves Animals works without the benefit of coupon codes, junk mail, or mountains of content.

You see, it's all very simple really. They do something instead. And then what they did lands online. It's something I hope my kids learn. Legacies can be written about online, but we make them offline. I think Abayomi-Paul deserves that chance. Many people do. I'll write about a few more soon enough.

But today, given all the changes coming down on search engine optimization, maybe this will be a great opportunity to talk to someone who knows about it. And all she is really asking for in return is a little time offline so she can come back and deliver something meaningful online. So what do you think?

Is this a worthwhile case study for business practitioners who have the misfortune of a medical emergency? Or maybe you might like to hear from someone else about Abayomi-Paul? Kami Huyse, Anne Weiskopf, Jennifer Windrum, and Ann Handley were among the first funders. Or maybe you would like to talk about something else altogether? I'm fine with that too. The comments are yours.

Wednesday, September 18

A Leadership Lesson From A Place Few Experts Tread

Last August, U.S. President Barack Obama compared Russian President Vladimir Putin to a tiresome schoolboy. But less than 30 days after he made the offhanded comment, it was President Putin who would school President Obama in foreign affairs. Russia is celebrating a diplomatic victory this week.

Somehow, President Obama and his administration allowed the Syria crisis to get away from them. Instead of the United States leading a coalition of countries to bring Syria to justice for using chemical weapons, Russia is being celebrated for stopping the escalation of aggression in the Middle East at the hands of unexceptional Americans. Syria will also surrender its chemical weapons, or so they say, and the world will be a better place.

The turnabout of this narrative was about as masterful as any propaganda since the end of the Cold War. One might even praise the audacity of the move, if not for the considerable consequences.

How recent events have changed the geopolitical landscape for now.

Russia temporarily gains world prestige and more influence in the Middle East while protecting its Syrian ally, a country run by a leader who used chemical weapons against his own people. Syria also works in lockstep with Iran, smuggling arms to Hezbollah in Lebanon. And Iran has said all along that the U.S. was behind the uprising, a charge that may not have been initially accurate but has become accurate in the last two years. The arms sent into the conflict are limited, with the U.S. fearing these weapons could all too easily be turned on us as suppliers because some rebels are tied to the same terrorists the U.S. has fought for years. To say Syria is a mess is an understatement.

But most Americans don't even know that the U.S. has already picked a side. It wants to topple the government in Syria, but obviously less than Russia wants to keep Bashar al-Assad.

Those seem to be some of the facts (but not nearly all of them). Just don't mistake them as a call for action or involvement on my part. To me, Syria is another culmination of events that forces Americans to choose between two bad options — act as the global police even when the world doesn't want you to, while supporting rebels who may (or may not) include your enemies, or do nothing, which is de facto support for a dictator who has long despised you and is happy to operate against your interests.

This is why so many advisors frame U.S. foreign policy in Syria as a choice between which we like better: the enemy you know or the enemy you do not. It would take a fool to hazard a guess.

Lesson learned: Leadership does not talk big with a little stick. 

Many people seem enamored of Teddy Roosevelt's foreign policy, which is often summed up by his quip to "speak softly and carry a big stick." And yet, few seem to realize that this is akin to negotiating peacefully while simultaneously threatening people with a "big stick." It was coined at a time when the division between American isolationists and internationalists had boiled over, again.

This division is one of the more interesting ones in politics because it does not follow party lines. Although current public perception is that the Republicans are hawks and Democrats are doves, it's not really true. On the contrary, it was progressives who led the country into conflict and war more often than their counterparts who prefer to live and let live. Americans only think the opposite because neoconservatives joined progressives as internationalists.

Sometimes this internationalist concept works. Sometimes it does not. And this time, it obviously has not worked for President Obama, partly because of his own words and actions for the better part of seven years. He campaigned on the premise of being against what the world saw as American imperialism, but has secretly and stealthily supported various programs that reinforce the idea anyway.

The primary difference between this administration and the last mostly has to do with the size of the talk and the size of the stick. Bush favored speaking big and carrying a big stick. Obama favors speaking big and carrying a little stick. And, unfortunately, this has made Americans largely unsupportive of any action abroad while making their detractors much more emboldened to push new agendas.

Who cares? Well, that is a subject open for debate. There are those who believe the U.S. can exist without being a major player in the world and there are those who believe we have to lead the world. The thinnest majority of Republicans and Democrats believe we ought to lead because history has proven that trouble will knock on the door of the U.S. whether it goes looking or not.

Foreign policy isn't what this post is about. It's about leadership. 

There are plenty of people who have long criticized the foreign policy of the Obama administration, among other things. The reason it invites criticism is because it lacks coherency, primarily because the original vision that he brought to the presidency runs counter to the way the world works.

President Obama told the American people that retracting the reach of the United States while simultaneously making nice-nice with the world would place us in a position where our diplomatic prowess alone could influence world affairs. It's not really true, but that was the vision he forwarded to the American people and the world (despite trying to keep a finger on specific interests anyway).

There are dozens of places where that was never going to work. Syria is one of them. Instead, it is one of those places where you have to make the decision, announce the decision, and act on the decision.

The Obama administration didn't do that, mostly because too much could go wrong. They also didn't want to be responsible if it did. So, in effect, they pushed it off for a few years and then attempted to assemble a middle-of-the-road approach that wouldn't make it look like Obama was rolling back on his posture to be a polite player in the world. When that didn't work, he punted to Congress for a vote while simultaneously withholding any accountability to that vote in case it didn't go his way.

On the domestic front, it all comes across as being considerate, depending largely on how well you like his administration. All the while, everyone forgot that the U.S. doesn't exist in a vacuum. Other world leaders saw the vote-and-pony show as indecisiveness at best and weakness at worst. And no matter how you see it, other countries have since seized on the moment.

Contrast this with what Prime Minister David Cameron did. He said the United Kingdom ought to become involved and he made a very strong case to Parliament. When Parliament voted against intervention, he stated it was a mistake but would accept the will of the people. It was a done deal and he didn't look too passive, too pompous or too weak after the outcome.

What's the difference? The difference is that Cameron understands being a leader as opposed to being an expert politician. In this case, a leader transcends their appearance of authority in order to ensure any following is aligned to the organizational goals and not themselves as individuals.

Experts, on the other hand, tend to be different altogether. They derive their appearance of authority from their reputation and are not willing to risk it by accepting responsibility. In this case (and possibly many others), President Obama is playing expert in Syria (without the right expertise, perhaps).

The expert fallacy can cost an organization its clarity. 

Right now, almost everyone in the U.S. is looking for experts to solve problems when what we really need are leaders. We see it in politics. We see it in business. But based on the number of people who have added "expert" to their labels (deserved or not), it's safe to say that we have a glut of those instead.

What's the difference? Leaders are those people who figure things out. They are people who have a vision, sometimes asking experts for their opinions on how to make that vision real, and then acting on the opinions they believe are most likely to make that vision real.

If they're right, history remembers them with reverence. If they are wrong, not so much. The risk is part of the job. Leaders are held accountable. In government, they don't pin blame elsewhere. In business, they don't need golden parachutes. These are the people who make their own way.

Leaders don't cling to and attempt to manipulate the world they know; they look to shape the world into something no one had ever considered before. (Ergo, a push button phone design expert can't see a flat screen phone as being functional.) And this is why they continually find solutions that experts could never fathom. It's one thing to be studied in what is, and another thing to see what could be.

When it comes to world affairs, history has shown that the world will praise whoever is steadfast in their vision and conviction to see it through, despite being wrong on some points. So how about you?

Are you a leader or a follower? Do you know your field or are you ready to re-imagine it? Or maybe you want to talk about something else? One of my friends has already suggested we abandon Syria and start focusing on some of the problems we have right here in this country, like homeless workers. What do you think ... about anything?

Wednesday, September 11

Any Fool Can Do What Another Fool Has Done

When Miley Cyrus finally started talking about her performance on the MTV Video Music Awards, she hit every publicity misconception in existence. According to the pop star, she and Robin Thicke weren't making fools of themselves. They were "making history."

"Madonna's done it. Britney's done it," she said. "Every VMA performance, that's what you're looking for; you're wanting to make history."

She said she doesn't pay any attention to the negative comments either. No matter what anyone thinks, Cyrus says that this has played out so many times in pop music that it doesn't even matter. She claims to be amused by anyone still talking about it. She said they've thought about it more than she ever did.

Of course, few people are talking about twerking anymore. Her Wrecking Ball video has out-buzzed all that, as the pop star stripped down to nothing in order to break video viewership records. Never mind that just as many people are tuning in to see her naked as to hear her sing; she must be a winner.

So is fashion designer Kenneth Cole. He didn't even have to strip down to boots in order to get attention. He only had to make a joke about boots. "'Boots on the ground' or not, let's not forget about sandals, pumps and loafers," wrote the fashion designer in response to the possibility of the United States taking military action in Syria. Count up all the retweets and raves. He must be a winner too.

The public's fascination with spectacle is as cyclical as it is tired.

America isn't becoming a society of spectacle. It has always been a society of spectacle, with the only difference from one decade to the next being our mainstream appetite for it. The 1960s, 1920s, 1880s, 1840s, 1790s all had racy, raunchy, and tasteless elements. The whole world has been part of it too.

It happens so often that one would think we would grow tired of it. But then we all suffer some odd form of public amnesia, forgetting the existence of such things, as history tends to tidy itself up when the pendulum swings toward a more buttoned-down decade.

Even when we do remember, we tend to confine our memories to the 1960s because people were really in it for political commentary as opposed to quick profits. And perhaps that alone is why the modern spectacle feels as empty as it is tasteless.

Whereas people like Andy Warhol, Bob Dylan, and Ken Kesey made history, people like Cyrus, Cole, and Ariana Grande will become footnotes of the eventually forgotten. If you don't believe it, take a look at the twerk fail hoax video masterminded by Jimmy Kimmel.

His hoax caught 10 million views, proving that you don't have to be famous or talented to make a similar impact. But honey badger don't care. Cyrus was happy to up the ante. She not only stripped off her clothes for 30 million views but stripped off her integrity too. The video isn't much different from the time-honored streak, except most people this desperate for attention aren't attempting to rebrand themselves.

Publicity is easy. Reputation is hard. 

Those six words were all I offered up about the subject prior to writing this article. They say it all.

Sure, one can easily subscribe to the notion that negative publicity has a positive impact on sales. When you look at Michael Jackson and his run-ins with the law, as Jonah Berger, Alan Sorensen, and Scott Rasmussen did in their 2010 research paper on negative publicity, Jackson's album sales went up.

The crux of the research is not new but it is interesting. It is underpinned by the notion that purchases are tied to the quality of the product and what any publicity triggers you to think about.

Negative publicity for Jackson made people think about his great music. Negative publicity preceded cookbook sales for chef Paula Deen. Negative publicity spurred sales of Mel Gibson's films. And yet, you have to ask yourself about the after-controversy market for new material. In other words, negative publicity might drive short-term sales but cost someone's reputational legacy in the process.

In fact, it might be more accurate to say that negative publicity creates an illusion of positive sales because research cannot quantify the lost sales of material that will never be created or a lost legacy. History holds a different reverence for John Lennon, Elvis Presley, Johnny Cash, and Jackson.

But who cares? Some say millennials don't care.

According to some studies, the generation born between 1981 and 2000 places money, fame and image ahead of self-acceptance, affiliation, and community. And whether you believe it or not, Cyrus fits the short-term mindset as much as Cole is trying to reach them. They are less likely to ridicule the behavior of someone like Cyrus or Cole and more likely to praise it.

Earlier studies said pretty much the same. They don't care. And maybe they aren't alone. The phenomenon isn't confined to a single generation. Most people think that 15 minutes of fame (or infamy) is worth the reputational cost as long as they can capitalize on the short-term success.

The Onion did a brilliant job in articulating this fact too. On the day after the Cyrus stunt started making waves, CNN didn't lead the news with world affairs, human achievement, or an attempt to be a positive force for change. The leading headline reinforced mainstream rubbernecking.

The commentary is sharply satirical in the telling. The purported explanation from the managing editor of CNN is as simple as it gets. Although making Cyrus the top news story was admittedly a disservice, it ensured more web traffic than any bothersome news like chemical weapons in Syria, civil unrest in Egypt, or even the 50th anniversary of Martin Luther King's "I Have A Dream" speech.

So no, it's not millennials who are guilty of placing the spotlight on one girl's narcissistic booty shaking. That honor belongs to the media serving its viewership. As long as they believe that popcorn means more advertising dollars than meat, more generations will likely view the working world with disdain in favor of a few fleeting seconds of fame.

But so what? I don't personally care whether Cyrus' actions detract from her own talent. It's up to each of us to carve out our own path in this life. And if that includes selling out for temporary success, I hope it's worth it. Just don't pretend it's original or historic. It's not. History is littered with forgotten fools.

How about you? Do you subscribe to the notion that all publicity is good publicity or that 100,000 Twitter followers will somehow ensure your words will outlast the pyramids of Egypt? What do you think? And by that, I mean anything. The comments are yours. Let's talk.

Wednesday, September 4

Thinking Still Beats Searching When You Need Four Gallons.

My wife had a question the other day, but it wasn't her question. The question belonged to my son and he didn't want to ask me. He thought he knew what I would say. He was wrong, but close enough.

The question was a puzzler of sorts. It was a problem from his math teacher. And any student who turns in the answer Tuesday (today) will receive extra credit. The reason my wife asked me wasn't a puzzler. She wanted him to receive the extra credit. (What parent wouldn't? Besides me, I mean.)

Maybe I should clarify that point. I don't want him to receive extra credit. I want him to learn it. And given that he had the whole weekend to figure it out and it was only the Friday before the long Labor Day weekend, there was no rush on my part. 

How can you make four gallons if you only have a three-gallon bucket and a five-gallon bucket?

I told him to wait until I had finished my part of the shopping list, groceries for the meals I would cook for the week ahead. Even then, I said, expect some help but not the answer. He didn't want that. 

A few minutes later, I looked over at him. He had moved on to another problem. Specifically, he was trying to figure out which route to take as he transported his stolen loot from a bank to an escape vehicle.  Right. He was playing PayDay 2 on the Xbox. 

"Why aren't you working on the problem?" I asked.

"I already spent 20 minutes working on it in class," he said.

"Well, obviously that isn't enough," I suggested. 

"It's all right," he said. "I already looked it up." 

"You did what?"

"I did what you were probably going to tell me to do," he said.

"You did what?" 

"I looked it up. Done."

"You looked it up, where?" 

"Google."

Ah, Google. If there has ever been a company of smart people responsible for the dumbing down of America, it has to be Google. All students have to do is drop in a few key words from their math problems and poof — they can find an answer while unceremoniously learning nothing in the process.

"I didn't tell you to look it up," I said. "I was going to give you a hint."

The reason I wanted to give him a hint was that the puzzler is not the real problem. Although the question suggests you need to measure four gallons of water using a three-gallon bucket and a five-gallon bucket, the real problem is something else. It's what stops most people after 20 minutes of class.

In order to solve the problem, you really need to establish what X might be. And in this case, X is really whatever it takes to make one gallon of water. I wouldn't have told him that, but intended to point him in that direction by asking what stopped him from answering the question. Except, I couldn't anymore. 

Google beat me to it. And today, all across the country, Google is going to beat other teachers and parents too. It's not the company's fault, but it is creating a problem. Sometimes it pays to look something up. Other times, it is much more rewarding to figure it out. Figuring teaches you to think and rethink. 
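If you are curious what figuring it out can look like once you write the thinking down, here is a minimal sketch of my own (an illustration, not the teacher's method, and certainly not something to hand in). It simply tries every legal move — fill a bucket, empty a bucket, or pour one into the other — and keeps track of the combinations it has already seen until four gallons turn up in either bucket.

```python
from collections import deque

def bucket_steps(small=3, large=5, goal=4):
    """Find a sequence of (small, large) bucket states that reaches the goal.

    Assumes the only moves are: fill a bucket to the brim, empty it
    completely, or pour one bucket into the other until one is empty or full.
    """
    start = (0, 0)
    parents = {start: None}            # remembers how each state was reached
    queue = deque([start])
    while queue:
        s, l = queue.popleft()
        if s == goal or l == goal:     # four gallons in either bucket
            path, state = [], (s, l)
            while state is not None:
                path.append(state)
                state = parents[state]
            return path[::-1]
        pour_sl = min(s, large - l)    # how much the small bucket can pour into the large
        pour_ls = min(l, small - s)    # how much the large bucket can pour into the small
        moves = [
            (small, l), (s, large),            # fill either bucket
            (0, l), (s, 0),                    # empty either bucket
            (s - pour_sl, l + pour_sl),        # pour small into large
            (s + pour_ls, l - pour_ls),        # pour large into small
        ]
        for nxt in moves:
            if nxt not in parents:
                parents[nxt] = (s, l)
                queue.append(nxt)
    return None

print(bucket_steps())
# [(0, 0), (0, 5), (3, 2), (0, 2), (2, 0), (2, 5), (3, 4)]
```

The printed list of (three-gallon, five-gallon) states traces one way to get there, which only reinforces the point: sketching those states out on paper teaches you far more than searching for them ever will.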

The most creative (and possibly most efficient) solutions aren't online. 

One of my favorite authors of all time never wrote any fiction. His name is Richard Feynman. He was a scientist and winner of a Nobel Prize in physics. The reason he won it is punctuated by his penchant for figuring things out as opposed to looking them up. By thinking, he often debunked popular theories. 

It had been that way all his life. Even when he was 11, Feynman started to think his way around radios. Eventually, he moved on to fixing burglar alarms, amplifiers and other gadgets too. It was in his nature. He seldom looked anything up. Reinventing the wheel, for him, often made the wheel better. 

There are dozens of stories that underscore his point, in his own books and in books about him. He said it over and over and over again. Even when the New York Times wrote an article about his legacy in 1992, it recounted how Murray Gell-Mann described the Feynman Algorithm for solving any problem. 

What is the algorithm? It's simple enough. You write down the problem. You think very hard. And then you write down an answer. For many years, this phenomenon called thinking is what set American students apart from students in the rest of the world despite those international tests that suggested otherwise.

Most students, he observed when teaching abroad, are taught to memorize the answers. But he preferred to teach students to think through problems rather than always assuming the experts were right. Not only did that inspire new ways to think about things, but it also gave students the ability to apply what they've learned to a completely new set of paradigms and problems. Right. They get good at it.

There are some days that I'm not sure Feynman would feel American students are set apart anymore. Many of our students have been taught to resist the urge to think nowadays. And they are not alone. 

People ask questions online all the time or turn to keyword searches to ask things like "how do I get more traffic to my site?" or "how do I get more Twitter followers?" or "who are the influencers in this field and that field?" as if those people can think better than they can. There is nothing wrong with that, but I wonder if any of them know that one set of solutions doesn't fit a different set of problems.

Sure, seeing how other people solve their problems can be useful at times. But almost every communication problem is patently unique. You have to think very hard. Besides, just as I told my son, you have to try thinking in order to become a great thinker. It requires practice, just like anything. 

How about you? What do you think? And by that, I mean about anything? The comments are yours. Let's talk.
 

