A financial crisis brought about by foreign wars, fiscal mismanagement, and malfeasance. An administration, desperate to meet the demands of the people and stay solvent, forces through legislation that is opposed by many in the government and by the people. The first lady, when told that the people had no bread, replies, “Then let them eat cake.” (Well, technically brioche, though it turns out the quote itself was probably just made up by a tabloid journalist, in this case, some hack named Rousseau.)

I’ll bet you thought I was talking about the current United States, until the bit about the cake, right?

The point, continuing from my last post, The four R’s: Reading, ‘riting, ‘rithmetic and Revolution!, is that revolutions, be they French, American, or educational, share similar characteristics and causes. And the French Revolution provides the recipe for this week’s post.

How many of you have read the article, “In Florida, Virtual Classrooms With No Teachers” in The New York Times? You’d remember it if you had: it’s the one about the high school students in North Miami Beach who walk into their first day of precalculus class in their senior year to find that their teachers have been replaced by… computers.

No, this is not a scene from my cyberpunk science fiction novel Spirit in Realtime. (Shameless plug — I’m still looking for a publisher! Tweet me: @jlsimons) It’s the sad reality for over 7,000 students in the Miami-Dade County Public School system.

You see, in 2002 Florida passed the Class Size Reduction Amendment, which limits core high school classes like math and English to 25 students per classroom. It also limits 4th-8th grade classes to 22 students and pre-K through 3rd grade classes to 18.

In order to meet these legally mandated limits, Florida has instituted what it calls e-learning labs, which are not legally restricted. In these virtual classrooms, students have no teachers, merely a “facilitator” who takes care of any technical issues that may arise. Supposedly, the facilitator is also present to make sure students “progress,” but I’m betting their primary raison d’être is to keep the kids from going Office Space on the computers… and each other.

Now I’m not against virtual classrooms. Quite the opposite. I think they satisfy a growing need and, when approached properly, can outperform the real ones.

For instance, Mashable cites a US Department of Education report from 2009 based on 50 independent studies: “the agency found that students who studied in online learning environments performed modestly better than peers who were receiving face-to-face instruction.”

The world of online and virtual education is blossoming. I can watch a free lecture on the Special Theory of Relativity by Yale Professor Ramamurti Shankar on Academicearth.org, along with dozens of other lectures and full courses in philosophy, biology, chemistry, literature, physics and more, filmed right in the classrooms at MIT, UC Berkeley, Harvard, Yale, Stanford, Princeton, NYU, Columbia, and other leading colleges and universities.

I can learn anything from basic math to differential calculus, with the French Revolution and “The Role of Phagocytes in Innate or Nonspecific Immunity” thrown in for fun, from Salman Khan of The Khan Academy, a non-profit dedicated to their “mission of providing a world-class education to anyone, anywhere.” They’ve delivered 37,295,405 lessons (according to their website) and count Bill Gates as one of their most vociferous supporters. You can watch Salman and Bill talking about The Khan Academy below, and I promise, I didn’t tell Bill what to say at all. (Thanks for the support, Bill. The check is in the mail.)

The point I’m making here is that I can choose to watch those lectures and lessons, not that I am forced to watch them. (Which is good news, because I can’t tell a phagocyte from a lymphocyte, and, in all honesty, the entire subject makes my brain hurt.) When students have the liberty to choose online education, and the motivation, there are no limits to what they can learn.

The students in Miami had no choice. Their parents had no choice. Some of them didn’t even know about the virtual classrooms until the day they walked in and saw the computers.

To quote the Times article,

Alix Braun, 15, a sophomore at Miami Beach High, takes Advanced Placement macroeconomics in an e-learning lab with 35 to 40 other students. There are 445 students enrolled in the online courses at her school, and while Alix chose to be placed in the lab, she said most of her lab mates did not.

“None of them want to be there,” Alix said, “and for virtual education you have to be really self-motivated. This was not something they chose to do, and it’s a really bad situation to be put in because it is not your choice.”

At 15, Alix already knows something that school administrators do not. Or worse, they know, but they don’t care. Or even worse, they know, they care, but they have no choice based on the new law.

Bingo! Again, quoting the Times article:

School administrators said that they had to find a way to meet class-size limits. Jodi Robins, the assistant principal of curriculum at Miami Beach High, said that even if students struggled in certain subjects, the virtual labs were necessary because “there’s no way to beat the class-size mandate without it.”

So, to sum up, an overwhelmed bureaucracy struggling to do its job comes up with a solution that seems to solve the problem, at the expense of the very people it was supposed to be helping. And the students are forced to eat virtual cake.

And not all of them, just some of them. Where is the equality in that? The fraternity? Will a college looking at these students give special consideration to the differing quality of instruction they received compared to students, some in the same school, who had an actual teacher to explain a difficult concept to them? Will their grades be asterisked? And what will the long-term impact be on a student who repeatedly ends up in virtual classes in, let’s say, English, starting in 7th grade in one of the six middle schools using e-learning labs in Miami and continuing through senior year? Will the “facilitator” be able to awaken within that student a love for the rhythm and rhyme of good writing, the heart and soul of a poem, the nuances of meaning in serious prose? Or will we leave it to HAL 9000, the computer in 2001: A Space Odyssey:

“I know I’ve made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal. I’ve still got the greatest enthusiasm and confidence in the mission. And I want to help you.”

Then again, maybe not.

Can someone please explain to me why an education system that can exile students to virtual classrooms during the time they are most in need of nurturing, guidance and, for want of a better word, teaching, shouldn’t be overthrown? To the barricades, citizens. (More to come…)

Full Disclosure: My client, StraighterLine, is one of the disruptive and revolutionary forces actively engaged in changing education by offering self-paced, online college courses at ridiculously low costs. My relationship with StraighterLine is the reason I have been following developments in the field of education. While I am otherwise compensated for my marketing efforts on behalf of StraighterLine, this series of posts is not one of those efforts. The post is mine and I am in no way being compensated for writing it.

What does the start of a revolution look like from the inside?

Revolutions don’t have a precise starting point. It is easy to say that the American Revolution officially began on July 4, 1776 with the signing of the Declaration of Independence. But was that really the start of the revolution, or merely the official notification of a movement that had been brewing for years? We know now that the Boston Tea Party was a clear step on the road to revolution, perhaps even one of the opening shots, but at the time, for the participants, there was not yet a revolution to lead up to; it was “merely” a principled protest in defense of their rights (or, I guess, just a rowdy Thursday night in Boston).

But I think we can agree on a few of the basic characteristics of the period leading up to a revolution:

  1. The pervasive, powerful and dominating institution about to be revolted against has become unresponsive to the needs of the people whom it supposedly exists to serve.
  2. Forces within the institution that recognize its failure and wish to change it find themselves in conflict with forces opposed to that change.
  3. Voices, both inside and outside of the institution, begin to address shortcomings and suggest solutions to the institution itself and to the public at large.
  4. The people most at the mercy of the institution begin to cry out for their needs to be addressed by the institution.
  5. The institutional bureaucrats and apologists fight back against their accusers, both internal and external, and frequently crack down on dissent, especially by their constituents.

Now here’s where it gets interesting. If we’re talking about governments or religions, then historically, what happens next is almost invariably violent, bloody, and disruptive (with one or two notable exceptions that prove the rule, such as Gandhi’s India).

But if we’re talking about economics, what happens next may be disruptive, but it’s not necessarily bloody or violent. Certainly, people will be displaced, livelihoods will be lost and fortunes will vanish. There may be riots. But any bloodshed connected to the Industrial Revolution pales in comparison to the French Revolution, the American Revolution, the Russian Revolution, the Protestant Reformation, etc. etc. etc.

We live in an era of change and disruption across multiple industries: publishing, journalism, marketing and advertising, media and entertainment, manufacturing, health, finance… well, you get the point, right? Any of these sectors may be on the verge of revolution (and nearly all are impacted by the even bigger global revolution of virtually simultaneous, planet-wide shared awareness, perception and discussion, about which I blogged in October).

But if we want to find a flawed, failing institution that meets the five aforementioned characteristics, there’s one that really stands out: education.

Here’s a nice juicy statistic to get us started:

45% of the 2,300 undergraduates at 24 institutions analyzed for “Academically Adrift: Limited Learning on College Campuses” (University of Chicago Press) demonstrated “no significant improvement in a range of skills—including critical thinking, complex reasoning, and writing—during their first two years of college.” Even worse, 36% didn’t “demonstrate any significant improvement in learning” over four years of college!

According to the publisher, “As troubling as their findings are, Arum and Roksa argue that for many faculty and administrators they will come as no surprise—instead, they are the expected result of a student body distracted by socializing or working and an institutional culture that puts undergraduate learning close to the bottom of the priority list…Higher education faces crises on a number of fronts, but Arum and Roksa’s report that colleges are failing at their most basic mission will demand the attention of us all.”

Reporting yesterday on the book for Inside Higher Ed, Scott Jaschik wrote: “the book acknowledges that many college educators and students don’t yet see a crisis… The culture of college needs to evolve, particularly with regard to ‘perverse institutional incentives’ that reward colleges for enrolling and retaining students rather than for educating them. ‘It’s a problem when higher education is driven by a student client model and institutions are chasing after bodies,’ he (Arum) said.”

Now in case you haven’t noticed, dear reader, my posts tend to run long to begin with, and even I can see that this isn’t a bone I can finish gnawing in a single meal. I’m going to continue to address this issue in upcoming posts.

So for now, I’m going to leave you with a simple question, to which I humbly ask for your answers and opinions: can someone please explain to me how we can, in good conscience, counsel our children to mortgage their futures under a mountain of student loan debt when 45% of them won’t get much out of their first two years, and 36% won’t get much out of their entire four years of college?

Full Disclosure: My client, StraighterLine, is one of the disruptive and revolutionary forces actively engaged in changing education by offering self-paced, online college courses at ridiculously low costs. My relationship with StraighterLine is the reason I have been following developments in the field of education. While I am otherwise compensated for my marketing efforts on behalf of StraighterLine, this post is not one of those efforts. The post is mine and I am in no way being compensated for writing it.

I was writing some inner monologue for Max for “Life In The Whirlwind,” the third book of my teen cyberpunk trilogy, and she was wondering what she’d be like when she got older. Which got me thinking about how we all turn out, compared to what we dreamed we would be. (And my 6-year-old daughter’s plans for her own future had nothing to do with that. Really.)

For me, it’s pretty clear. I’m happy to own my choices, even the ones that didn’t turn out the way I’d have liked. Every once in a while I read an old poem of mine and think, “That guy wouldn’t recognize this guy,” but mostly because that guy didn’t understand all the issues yet. He wouldn’t disown me; he’d just figure I took a different road than I thought I would. (Can you guess which choice I picked in the poll?)

How about you? Can you please explain to me, via this poll, which allows you to enter your own responses too, what the 17-year-old you would think of the you you are now?

Like so many of you out there, I am outraged at the sanitizing of Huckleberry Finn by replacing the “N” word with “slave.” At first, I assumed Alan Gribben and NewSouth Books must be doing it to sell books to schools and libraries that banned the original, riding the wave of political correctness and sensationalism to the best seller list.

But then I read the article in Publishers Weekly and came to the realization that Gribben really thinks he’s doing the right thing:

“After a number of talks, I was sought out by local teachers, and to a person they said we would love to teach this novel, and Huckleberry Finn, but we feel we can’t do it anymore. In the new classroom, it’s really not acceptable.” Gribben became determined to offer an alternative for grade school classrooms and “general readers” that would allow them to appreciate and enjoy all the book has to offer. “For a single word to form a barrier, it seems such an unnecessary state of affairs,” he said.

The article ends with a quote from NewSouth publisher Suzanne La Rosa:

But the heart of the matter is opening up the novels to a much broader, younger, and less experienced reading audience: “Dr. Gribben recognizes that he’s putting his reputation at stake as a Twain scholar,” said La Rosa. “But he’s so compassionate, and so believes in the value of teaching Twain, that he’s committed to this major departure. I almost don’t want to acknowledge this, but it feels like he’s saving the books. His willingness to take this chance—I was very touched.”

Sounds reasonable, right? Even noble: Making Huckleberry Finn accessible to everyone, at the cost of one’s reputation. I mean, after all, the book is considered one of the great American novels, perhaps the greatest. It’s the ultimate indictment of those who judge people by how they look, or the title or position in society they hold, or even their familial relationship, rather than judging them by their actions and their hearts.

And wouldn’t that message be just as strong without the “N” word or “Injun” scattered over 200 times across its pages?

Who cares? That’s not the issue here.

The question is: Who owns Huckleberry Finn? And I don’t mean who owns the right to publish it. I mean, whose book is it?

It’s not Gribben’s book. It’s not our book. Librarians and school teachers and school boards and offended readers don’t own it.

It’s Mark Twain’s book. He wrote it. He could have used the word slave, but he didn’t.

Good intentions don’t justify censorship or the mutilation of art, whether you’re a teacher or the Pope. (Sorry, Pius IX.)  And I don’t think anyone who has ever read Mark Twain would suggest he would approve of Gribben’s actions. This is exactly the kind of misguided sophistry Twain would skewer with his rapier wit. Rather than openly fight the injustice of censorship, our brave hero slinks in shrouded in a cloak of acceptability.

But mostly, it’s just wrong. Twain is powerless to defend his words against Gribben’s literary rape.

Isn’t there a word for depriving someone of their right to self-determination, when you treat them like an object to serve your needs rather than as a human being deserving of respect?

Can someone please tell me who gave Alan Gribben and NewSouth Books the right to treat Mark Twain like a… “slave?”

10 years ago today, probably right around the same time I’m sitting down to write this post, my good friend Bill Railey died alone in his apartment on 8th Avenue above the Molly Wee Pub in NYC.

He lived a hard-drinking, hard-smoking, hard-partying life, and when he found out he had late-stage, inevitably terminal lung cancer, he never whined about it, never cried foul.

I have a bottle of single malt Irish whiskey that I bought for Bill on his last birthday, but he couldn’t drink at that point, so I saved it. Every year, on New Year’s Eve, I have a shot for Bill and repeat the toast I heard him make in his gravelly voice, more times than I care to remember: “To happiness, whatever it takes.”

He was as unsentimental as they come, never suffered fools, and wasn’t afraid to fight for what he believed in. He was an anarchist, a Randian, a biker, a philosopher, an animator and, above all else, an artist.

And he had more faith than almost anyone I’ve ever known.

Not the kind of faith people vest in unseen creators, powerful institutions, the legal system or governments. No, Bill Railey’s faith was in himself, in the abilities of individuals, in the things we do rather than the things we say.

His faith was sorely tested. He took part in more than his share of battles, in courtrooms against better-funded enemies, ex-wives, and even the mighty Disney machine.

But he never gave up. Even while he was losing his last, toughest battle, he never gave up and he never lost faith in himself.

At the time of his death, Bill and I were collaborating on two projects. One was an animated series about a female vigilante serial killer that was in its infancy. The other was called Thinking Meat, which he described as “the world’s only animated program recorded before a live studio audience” and which he had originally conceived with Sally Franz.

We actually posted two episodes, “Cosby on Def Jam” and “The Zoo,” to iFilm. They were a little raunchy, a little offensive, and very funny. In other words, they would have killed on YouTube today. Unfortunately, this was 5 years before YouTube… and just a few months before Bill died.

At a time when many people turn to religion, even people who never believed before, Bill never resorted to mysticism or superstition. He faced his end rationally, with his eyes open wide and no regret for the choices he made.

I was fortunate to know Bill, and perhaps, weird as it sounds, lucky that his inevitable end came on New Year’s Eve. Because each year during that time when the whole world makes resolutions, as I take that shot of whiskey from the dwindling remains of that last bottle, I remember my friend, and repeat his toast, “To happiness, whatever it takes.”

I become inspired to live my next year the way Bill lived his life: rationally, with faith in myself. To do whatever it takes (within my own code of morality, of course), to live my life so that when it ends, I can own my choices, and recognize my steps as my own.

Can someone please explain to me why anyone would want to live life any other way?

Don’t you hate it when people rant about the good old days? I know I do.

“You young whippersnappers may not remember this, but in the good old days, Ma Bell ran the phones and you could hear a pin drop on the other end of the line. At least, that’s what the teleeevision commercials said.”

Well, I was talking with a friend yesterday, and he pulled out his Droid and fumbled with the touch screen, trying to find the phone app. The vitriol that dripped from his voice as he said how much he hated that phone was the kind usually reserved for villains who stole your Bible and shot your dog.

It got me thinking about cell phones. I use a Voyager, an aging pre-smart phone that browses the internet and gets emails, but not all that well. Its best feature is the real keypad, which for me is critical. But the fact is, the sound quality sucks.

It’s not the network. I’m on Verizon, and after using MCI, Sprint and AT&T, I can tell you that at least in my experience, Verizon delivers the best quality in the tri-state area.

No, the problem is that the sound quality on most mobile phones sucks these days. Five years ago, on a basic LG flip phone, the sound quality was fine.

But today?

Think about how many calls you get from people on cell phones where you can barely make out what they’re saying, even if they’re not on the streets of NY but in their own home.

And what if you’re receiving the call on your own cell phone? The problems are magnified beyond measure.

We all make excuses for our phones. We overlook the problems: the abysmal quality, the crashes, the dropped calls, the lost contacts, the notepad messages that can’t be transferred, the calendars and text inboxes that seem to have smaller capacities for storing data than a 5 1/4″ floppy, and the battery life that runs out quicker than you can say “Honey, where’d I leave my charger?”

We move from phone relationship to phone relationship, hoping upon hope that the next one will be “the one.”

In other words, we are serial enablers.

I’m in the process of looking for my own “Mr. Right.” (Yes, my phone is a he. It’s not a ship, or a car, or an airplane. It’s a phone, and to me it’s a he. Lucky for me. If it were a she, she would have dumped my sorry, abusive self years ago.)

I’m probably going to get a Droid 2, because I need the keypad and the Droid 2’s is better than the original Droid’s, and I want a bigger screen and I want a tricked-out phone that runs Flash and has apps and syncs with my Google Calendar. Plus, I want to stay with Verizon, which means the Samsung Epic 4G is out.

So I’ll suffer through unfathomable interfaces and lousy call quality because I want all the technological bells and whistles. I want to be able to run Tweetdeck on my phone and connect to Facebook with a click and visit sites that use Flash and type with real keys instead of pounding my non-conductive fingertips against the touch screen until I’m red in the face and ready to stroke out.

But can someone please explain to me when it became not only acceptable but required to tolerate lower quality in return for “better” technology?


Did you see the bit on HLN about bedbugs infesting firehouses in Albuquerque, New Mexico, the other day?

What caught my attention wasn’t the bugs, which are popping up all over the place like Tea Party candidates.

Nor was it the fact that the Firefighter Wives Auxiliary Association went to a national high-end mattress company and asked it to donate 170 mattresses to the fire stations, which it did. (You can read the whole story here.)

What hooked me was that the mattress company has requested to remain anonymous.

That’s right — anonymous!

As some of you might know, I co-authored a book with Dr. Richard Steckel about cause-related marketing titled “Making Money While Making a Difference: How to Profit with a Nonprofit Partner.”

The entire book is about the positive bottom line benefits of cause-related marketing, an absolute win-win when done right, and while I wrote it over a decade ago, I’m pretty sure I didn’t put in anything about the benefits of anonymous donations.

BECAUSE THERE ARE NONE! At least not to marketing or sales. There are the tax benefits, of course, which must be monumental for 170 mattresses. And as my wife suggested, there may be a religious angle, which I guess would be good for your soul and future accommodations in whichever afterlife you may believe in.

But you have to agree that it’s an unusual move in this day and age, when organizations from NASA to Oakley were falling all over themselves to milk the publicity from helping out the Chilean miners. (Can you say $450 sunglasses, or $41 million in media exposure?)

I’m still dumbfounded by it. Companies are constantly on the lookout for opportunities to, well, make money while making a difference. Opportunities like this one.

Which leads me to wonder, can someone please explain whether I’m right, or whether I’ve become so jaded that I can’t see an act of charity as anything other than a missed marketing opportunity?


Magazines come and magazines go. In the first 9 months of 2011, 110 new magazines began publishing while 127 closed up shop, according to this Oct. 11 press release from MediaFinder.com. (Both numbers are down from the same period last year, when 259 launched and 383 folded.)

But of all the new launches this year, the breakout title for me is Inspire, the magazine of al-Qaeda. It’s published in English out of Yemen and edited by a 24-year-old Pakistani American from North Carolina, Samir Khan.

The cover for the second issue of Inspire, Fall 2010

With catchy articles like “Make a bomb in the kitchen of your mom” and instructions for turning your pickup truck into a steel-bladed “mowing machine” for mowing down enemies, the magazine is sure to deliver a unique audience.

Good for Inspire. One of the only ways a magazine can succeed in the post-print era is by delivering eyeballs nobody else can reach. Find an under-served niche and exploit it. (Speaking of which, please don’t confuse Inspire with InSpire Magazine, the women’s magazine published by, I kid you not, Niche Publications LLC.)

Does anybody know if Inspire sells ad space?

After all, there are plenty of companies that would want to reach terrorists and zealots. Fertilizer manufacturers, truck rental companies, used car dealers, exploding-shoe makers, box cutter suppliers and backpack makers come to mind, but I’m sure there are others. (Quite a few of the magazine ad sales people I’ve known would sell their own mothers for the commission, so I can’t imagine they’d have a problem with this.)

I wonder what kind of added value the magazine offers its advertisers. Laminated copies of the ads with the words “As seen in Inspire” slugged into the upper right corner? Free advertorials? Survey cards?

Maybe they let advertisers rent the subscriber list at a discount?

If that’s the case, can someone please explain to me how I can get my hands on that list? I can think of a Navy SEAL team that would love to hand-deliver a special promotion to the readers.


I was at SMX East Tuesday and attended a session on Facebook advertising. The experts on the panel were talking about how, in order to actually get useful results out of advertising on the world’s largest social network, they had to change their Facebook creative as often as 4-5 times a day to combat ad blindness, fatigue and annoyance.

Swapping out ads every few hours? Optimizing banner campaigns and paid search and websites on the fly? Managing brand reputations that can change in hours thanks to a viral video or a negative blog post?

When did advertising get so hard?

It used to be, you ran a TV spot on Must See TV and the whole world knew about your product.

It used to be, you rented a great mailing list, sent out a juicy catalog half the size of a phonebook, and watched the orders come rolling in over the phone or in the mail.

It used to be, you did your keyword research, put up a bunch of paid search ads in Google AdWords, and watched people come to your site and buy things.

It’s not like it used to be.

Advertising has gotten really tough. And it’s gotten tough because our target audiences stopped being targets and started being participants.

Now, you have to listen to them – but if you do, you can learn what you need to succeed.

Now you have to engage them – and when you do, they’ll reward you with the real version of the brand loyalty you thought you had before.

Now, you have to treat your customers like a Facebook Friend, a Twitter Follower, an engaged stakeholder – and if you don’t, they’ll find a company that does, but only after they tell everyone how shabbily you treated them. (5 years ago, if you said this to a client, they would have called you crazy and shown you the door.)

The bad news is that there are more channels, more touchpoints, and more tools than ever before, and they’re labor-intensive, difficult to quantify, and constantly changing. (Just keeping up with the changes to Google is a full-time job!)

The good news is that there are more channels, more touchpoints, and more tools than ever before at our disposal to change the way we relate to our customers.

So can someone please explain to me why, rather than change their methods to get the most advantage out of these newly engaged and empowered customers, so many advertisers are just trying to find a way to make the new media work like the old ones?


I just saw “The Social Network” and I loved it. Aaron Sorkin proved once again that he is the best dialogue writer in Hollywood (followed closely by Quentin Tarantino and Diablo Cody, IMHO). His words, and director David Fincher’s skill, kept the movie flowing and riveting, never once sounding anything but utterly real and believable.

And Jesse Eisenberg made Mark Zuckerberg into an everyman for our generation.

In the first scene, Zuckerberg tells his girlfriend that there are more geniuses in China than there are people in the US. We begin to see Zuckerberg as an everyman: even though he’s a genius, and knows it, that doesn’t guarantee entry into the members-only clubs where the cool people hang out.

“The Social Network” is about us, all of us, trying to fit in, looking for a place to belong, and finding our voice: collectively and individually. It’s a messy process, and there will be sins of commission and omission along the way.

I heard a reviewer on whatever cable channel was on at the time saying that this movie isn’t just the movie of a decade, it’s the movie of the generation, and that got me thinking.

We live in a time that future generations will look back on as revolutionary. And it’s not revolutionary because men like Bill Gates and Mark Zuckerberg built products and companies that changed everything: it’s revolutionary because society was ready to embrace the new world their creations helped birth.

That new world is the world of virtually simultaneous, planet-wide shared awareness, perception and discussion.

Think about it. How do you get your information now? How do you experience the world? And most importantly, how do you share it, and what’s the lag time between discovery and dissemination?

I used to be a newspaper junkie. Then a Google News junkie. Now, I have a News list on Twitter that gets the latest updates from the WSJ, The NY Times, Huffington Post, CNN, Mashable, TechCrunch and more. (The WSJ alone has dozens of Twitter feeds.) Now I can finally scan the news quickly and easily and know what’s going on everywhere instantly.

A few days ago, the shooting at the University of Texas was first reported on Twitter by students on campus. And as the situation developed, the local police were sending out their “official updates” to the news networks via Twitter.

The implications for Marketing and Advertising are sweeping. Because in the new era, ideas don’t spread because you throw money into spreading them. An idea spreads now because the wired-together world likes it and tells itself about it. The internet is littered with the corpses of bad ideas drenched in the blood of wasted marketing dollars.

Yes, getting heard among the rising background noise is hard. And at its most basic level, if you don’t know how to use the tools of social media, or don’t have the time, then marketers and advertisers can help.

But make no mistake: the ultimate success or failure of an idea, a product or a service is now dependent upon the quality of the idea, the product or the service. If people like it, they tell others. If they don’t, they don’t. And the way people find out about things these days is through a connected, always-on social network that exists online and off, via text and email and word of mouth across mobile phones and smart phones and laptops and computers, via Facebook and Twitter and Google.

It didn’t used to be that way, and that is sad for the good ideas that died stillborn and unheard, for lack of money or wherewithal. But I say, good riddance to the old world, and welcome to the new.

And yet, there are still those who resist the tide and cling to the ways they’ve always known, who look at multiple channels and only see fragmentation, who look at millions of people talking about what’s important to them and only perceive self-indulgent and distracting noise.

Can someone please explain to me how anyone can look at this time as anything less than a revolution, as the dawn of an era where a world of billions of individuals finally came together to know itself as a whole community greater than the sum of its parts?
