Marina Gorbis's Blog, page 1554

September 6, 2013

Don't Tell a Suffering Coworker "It Could Have Been Worse"

After the upsetting experience of receiving negative feedback on a task, research participants felt particularly bad, scoring an average of 4 on a 7-point positive-affect scale, if they were indirectly told that getting a low score was a "not serious" event. By contrast, those who could decide for themselves how serious the event was felt less bad (4.63), even though they too tended to classify the experience as "not serious." The research, by a team led by Kristin W. Grover of the University of Vermont, suggests that people who have suffered misfortunes feel worse when others minimize their experiences but feel better when they minimize those experiences themselves. Saying "It was for the best" or "It could have been worse" makes sufferers feel misunderstood and isolated, the researchers say.





Published on September 06, 2013 05:30

Our Self-Inflicted Complexity


People who make it their business to study large-scale problems (business theorists and economists among them) seem to be in broad agreement that the world is growing ever more complex — and that this trend makes their work harder. If this is true, then we should be grateful for their ongoing efforts and to a large extent let them off the hook for failing to make more progress. But is it true?



The claim can be hard to evaluate given the number of meanings that attach to the word complexity. But if we start from a solid, shared definition it becomes easier to consider. Of all the definitions, I like Peter Senge's old but simple one best. He spells it out in The Fifth Discipline, by way of explaining why seemingly sophisticated forecasting tools so often miss the mark:



[T]hey are all designed to handle the sort of complexity in which there are many variables: detail complexity. But there are two types of complexity. The second type is dynamic complexity, situations where cause and effect are subtle and where the effects over time of interventions are not obvious. Conventional forecasting, planning and analysis methods are not equipped to deal with dynamic complexity.


Senge's distinction between detail complexity (driven by the number of variables) and dynamic complexity (heightened by any subtlety between cause and effect) is not only key to explaining why some overhyped tools don't deliver. More broadly, it helps explain how knowledge in a field advances, and how that advance itself generates complexity.



The starting point for knowledge is mystery. Everything we now know started as a mystery in which we couldn't even discern the variables that mattered, and therefore had no capacity to understand cause and effect. Think of how the world was baffled, for example, in the very early days of the AIDS crisis. We didn't know how to think about this new and horrible condition.



But in due course, as is the case in many domains of knowledge, AIDS became less of a mystery. With hard work and study we advanced to a heuristic — that is, we started to understand which variables mattered and developed a sense of the cause and effect. We concluded that it is an acquired immune deficiency transmitted primarily through sexual contact. This enabled researchers to focus on the relevant variables and better understand cause-and-effect relationships — for example, the relationship between unprotected sex and transmission.



Some knowledge gets advanced all the way to algorithm, in which every relevant variable is specified and the cause-and-effect relationships are precisely defined. This has happened, for example, with polio. We figured out what causes it and developed a vaccine that, once administered, protects the individual against the disease for life. We haven't driven AIDS knowledge to an algorithm yet. It is not entirely clear what all the relevant variables are, and there are still plenty of subtleties between cause and effect to work out. But our understanding is far advanced from a mystery — and hence the many treatments that help HIV-infected patients avoid developing full-blown AIDS.



AIDS researchers, like every scientist since Aristotle, have attempted to ferret out cause and effect because they want to explain how the world works. They want to drive knowledge toward an algorithm like E = mc², with all the subtlety gone.



The question is: How do they do it? How do they eliminate the subtlety between cause and effect in order to drive knowledge toward algorithm? Typically, the approach is to tackle cause and effect (dynamic complexity) by reducing the number of variables considered (detail complexity).



My own clan — the economists — is particularly inclined in this direction. There are a thousand economists working on partial equilibrium problems for every one working on a general equilibrium problem. This is despite the fact that no one would contest that general equilibrium clarity is the most valuable knowledge by far. Why? Because it is really difficult to specify any general equilibrium cause-and-effect relationships.



Instead, most of the guns deployed in modern knowledge advancement are aimed at narrow problems for which the cause-and-effect relationship is specified with the famous "all other things being equal" proviso. Each narrow knowledge domain develops analytical tool-sets that deepen it further. Each develops ever more algorithmic knowledge, and those developing the knowledge are extremely confident that they are right because they are so specialized within their own domain. The liver expert is completely confident that he or she is correct even if it is the interaction with another condition that threatens your health most.



This approach has created another kind of complexity: inter-domain complexity. Every field is segmented into multiple domains, each with deep algorithmic knowledge, specialized tools, and experts in the domain who think they are absolutely right. And they are indeed right, as long as we ignore the reality of detail complexity.



However, the real world we live in, and have always lived in, is a world of detail complexity. So when we sacrifice dealing with detail complexity in order to focus on dynamic complexity, the solutions don't produce the outcomes that we really want. For all their great work, it is unclear that economists have actually helped government officials handle the complex task of managing a national economy any better than before. And despite massive advances in narrow domains of medical knowledge, actual health outcomes have been difficult to improve, especially in areas of high detail complexity.



This is, I believe, what makes it feel as though complexity has increased. I absolutely do not believe that the subtlety between cause and effect has increased at all in the world. But the negative manifestations of largely unaddressed inter-domain complexity make it feel as if massive, unaddressable complexity is overwhelming us.



In other words, we are bedeviled by manufactured complexity — complexity that could have been avoided but has instead been amplified by the pursuit of narrow knowledge in a broad world.



If we are to make progress against large-scale problems, therefore, it is vital that we figure out how to tackle inter-domain complexity.





This post is part of a series of perspectives leading up to the fifth annual Global Drucker Forum in November 2013 in Vienna, Austria. For more on the theme of the event, Managing Complexity, and information on how to attend, see the Forum's website.





Published on September 06, 2013 05:00

September 5, 2013

Ambitious Women Face More Obstacles than Just Work-Life Balance


For the past year and change, the American conversation about women and leadership has revolved around challenges of work-life balance — which most of the time actually means "work-family balance."



The women we're hearing from — Anne-Marie Slaughter, Sheryl Sandberg, and the rest — aren't jetting out of the office at 5:30 to train for a marathon or learn Chinese or even just binge-watch Law and Order: Special Victims Unit. They're leaving "early" to take care of their children. And so we talk about having it all, leaning in, or opting out — and we still talk about women who don't make it to the very top of their companies as if it's a personal choice.



The truth is — as many have pointed out — that lots of ambitious people, male and female, make personal choices that take them off the path of leadership. It's also true that when work and family clash, as they invariably do, women are nudged off this path, gently but firmly, more often than men. And that is a problem. Not just for the women, but for the companies missing out on the benefits of diversity and for an economy that isn't playing with a full talent deck.



But while that is a major obstacle to getting more women into senior roles, it's far from the only one, or even the most important. Yesterday, I interviewed HBR Editor Amy Bernstein about our current issue, which spotlights women in leadership. We agreed that it's time to shift our focus away from issues of work and life, and from personal career decisions about "sitting at the table" or "leaving before you leave," to look at some of the institutional barriers that women still face.





One of these challenges is what Herminia Ibarra, Robin Ely, and Deborah Kolb call "second-generation gender bias." The basic idea: we become leaders iteratively, by taking increasingly challenging roles, learning, and then having our performance affirmed by those around us. For women, this process is often interrupted for a simple reason: when women display leadership behaviors we consider normative in men, we see them as unfeminine. When women act more feminine, we don't see them as leaders.



A McKinsey study identified another barrier: women aren't given as many high-profile, big-budget, or international assignments as their male peers. These are the developmental projects that put talented women on the path to the C-Suite.



Work from Catalyst identified another challenge: women aren't sponsored by higher-ups to the same degree that men are, although women do tend to have lots of mentoring relationships. This translates to women receiving lots of well-meant advice, but not a lot of growth roles.



(The depressing list goes on. My colleagues at HBR have pulled together some of the latest research on these and other barriers, along with a curated reading list from HBR's deep archive on this issue.)



It would be disingenuous to say that none of these challenges are related to the joys and burdens of parenting, which still disproportionately fall to women. But increasingly, men share in those joys and burdens too. And the women we're talking about — ambitious mid- to senior-level executives with their eye on the C-Suite — can afford to mitigate a lot of those burdens. So I think it's also disingenuous to portray — as so much of the popular press does — the lack of women at senior levels as evidence of some personal choice on their part.



At the same time, it's not exactly that there's a glass ceiling (or a glass cliff, or a maternal wall): the days of blatant discrimination are (mostly) behind us. Today, it's more like a glass obstacle course of a hundred hard-to-see hurdles.



No wonder so many women seeking leadership roles suffer from bruised shins. No wonder so many of them never make it to the other side.



And yet, as Bernstein was quick to point out when I asked her whether it was depressing that we're still, in 2013, talking about this:



But we can deal with it. We can address it. Nissan addresses it, Avon addresses it, Merck addresses it. Big companies that don't turn easily address it, and they make a difference, and they have seen results. So yes, [gender bias] is bad, and no one wants to have to talk about it, but given that it's still out there, isn't it wonderful that we can figure out how to deal with it, how to address it, and how to overcome it? And then we can go on to the next thing.


Here's to overcoming it.





Published on September 05, 2013 11:13

Six Classes Your Employer Wishes You Could Take


School is back amid growing controversy and cynicism. The quality, validity and economic value of college degrees and MBAs have rarely been under such sustained assault. Employability of graduates has never been so dismal. Machines are clearly getting smarter at many of the things people traditionally do on the job. That means people need to become non-traditionally smarter at things machines are not quite yet ready to think about or do. And that means educators worldwide must revisit how they want to make their most important product — their students — more valuable.



Were I advising aspiring top-tier universities — or their students on what they should expect from their high-priced education — the following classes would represent excellent starting points for fundamental curricular reform.



Multimedia Editing. Increasingly, knowledge workers won't simply be creating or generating information but assembling, reorganizing and prioritizing information from others. In other words, they'll be editing. They will need to extract, abstract, synthesize and linearly present other people's — and machines' — work. Much of this information will be incomplete or inchoate. (Just ask my own editor.) The ability to write a sentence or shoot a video sequence is not the same as the ability to edit one. The ability to immerse oneself in terabytes of data, identify (individually or collaboratively) what's most important and restructure it in an accessible, meaningful and usable form for a variety of audiences will increasingly be an essential skill. What enterprise isn't interested in graduates capable of transforming a petabyte of information into a slick 12-minute interactive multimedia presentation?



Scenarios. In addition to knowing how to create a compelling narrative out of reams of data, there will be a premium paid to those who can paint vivid pictures of possible tomorrows. Scenario planning is as essential for strategy formulation as it is for the design of next generation technologies and industries. Thinking in terms of scenarios forces people to rigorously examine fundamental assumptions and unexpected risks. Scenarios demand expository and analytic, as well as literary, skills. What serious employer doesn't want to hire someone who can envision and articulate scenarios describing the future(s) of the enterprise?



Fantasy Sports Competition. Understanding probabilities, statistics and analytics is increasingly vital to identifying and effectively managing high-performing talent. As Moneyball and the rise of quants in professional sports worldwide attest, the ability to relentlessly improve the quality and specificity of performance analytics is key to success. Luck matters, but so do the data-driven odds. This class requires small teams of students to compete against each other in at least two data-rich team sports. The student teams, working within budgets and other constraints, have to assemble and field the best-performing teams they can and justify their investments and trades. Their grades are, indeed, dependent on their teams' "on-the-virtual-field/court" performance. Credit is given for the development of novel or innovative metrics for performance assessment (for example, attendance figures as a "team economics" variable). The goal is not to create teams or leagues of aspiring Nate Silvers but to ensure that students come away with the statistical savvy not to be probabilistically buffaloed by Nate Silver wannabes.



Reverse Engineering. This class looks at what makes experiments, inventions and artifacts tick and then takes them apart and rebuilds them. In other words, this is a hands-on class where students gain knowledge and skill by seeking to replicate and recreate things that work. What makes Amazon's web page work? What are the ingredients of a touchscreen? How does a mobile phone cam take a picture versus a video? What makes a prosthetic limb responsive? What makes a toaster toast? This is a class that puts things together by first deconstructing them. The goal is giving students a vocabulary and capability for interactively understanding the links between technology design and construction. Fixing, maintaining and/or repairing technologies is not the purpose; empowering students to identify and understand the fundamental physics, materials science and design technologies that combine into valuable outcomes is. Appreciating the essence of technology and the technology of essence is key. Grasping how difficult, challenging and important design integration can be — and why it must be managed well — is an essential takeaway.



Comparative Coding. Another blog on this site asks, "Should MBAs Learn to Code?" Alas, that's exactly the wrong question. The better question is: What aspects of coding should MBAs (and university students) learn? When the world is filled with cascading style sheets, XML, Erlang, Python, Ruby on Rails, Objective-C, C++, Java, SQL, etc., "coding" becomes a misnomer. The pedagogical challenge becomes: What are the most important things people need to understand about the grammar, semantics and culture of computer languages? If fluency isn't possible or practical, what is? Does value primarily come from the ability to code? Or from an ability to read, follow or grasp a program's limits and strengths? Designing a coherent course that gives people who use software genuine and actionable insight into the languages underlying the apps and services they use remains one of the great educational challenges of the digital era. Perhaps "comparative coding" will evolve into a class not unlike "editing," where the key to cognitive and conceptual success is the ability to identify the code that's most important and effectively tweak it. A classroom experience that gives students confidence that they know why — and how — their software works (and evolves) as it does will be of inestimable value to the students themselves and the employers who hire them.
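
To make the comparison concrete, here is one way a single exercise in such a class might look. This is only an illustrative sketch with invented data: the same question answered once in imperative Python and once in declarative SQL (run from Python via the standard sqlite3 module), so students can compare what each language's grammar makes easy to read, verify, and tweak.

# Illustrative "comparative coding" exercise: the same aggregation expressed
# two ways. Data and column names are invented for the example.

import sqlite3

orders = [
    ("north", 120.0),
    ("south", 75.5),
    ("north", 43.25),
    ("west", 99.0),
]

# 1) Imperative Python: the "how" is spelled out step by step.
totals = {}
for region, amount in orders:
    totals[region] = totals.get(region, 0.0) + amount

# 2) Declarative SQL: the "what" is stated; the engine decides the "how".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", orders)
sql_totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
)

# Both roads lead to the same answer; the point of the exercise is noticing
# how differently each language asks you to think about the problem.
assert totals == sql_totals
print(totals)

Even readers who never write production code can follow both versions, ask which is easier to audit or modify, and begin to see where each language's strengths and limits lie.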



Cooking Science & Technology. Another hands-on course integrating fundamental scientific principles with real-world knowledge challenges students to transform their understanding of food. Everybody eats. But too many people think that food comes from supermarkets and that cooking simply means heating up food according to a recipe. Understanding the chemical and material properties of ingredients is, indeed, a science. Appreciating the role of technologies in every stage of the cultivation, preparation, presentation and preservation of those ingredients requires a genuine grasp of high-tech engineering. The roles of local sourcing and global supply chains are integral to knowing why, how and how much food ends up on people's plates. Controversial GMO foods challenge notions of what's natural, while molecular gastronomy innovations transcend expectations about food tastes and textures. The ability to improvise is just as important as the willingness to follow a recipe to the gram. Planning and successfully executing a complex meal is an exercise in project management. Understanding how convection, radiation and microwave ovens work — and why — in relation to various ingredients represents the antithesis of perishable knowledge. The kitchen can and should be a laboratory for innovation. There are few better courses for combining scientific insights, raw materials, new technologies and customer satisfaction.



These classes strive to balance the transmission of knowledge in classroom environments with the cultivation of real-world skills. Coursework here demands an appreciation of how to collaborate; interact with more data and analytics; generate and communicate testable results; and improvise and innovate if things don't go as planned. Students are not simply studying for tests; they're testing their own ingenuity and intuition. Students achieving competence — let alone mastery — in their coursework would be both cognitively enriched and more economically desirable to potential employers. It wouldn't hurt their ability to be entrepreneurial either.





Published on September 05, 2013 09:00

Why We Need More (Women) Leaders

There's been a lot of talk this past year about why more women don't become leaders. About what our society needs to change to produce more female leaders. There's even been some discussion about why women are better leaders than men in some arenas.



Often overlooked is this basic reality: what the world needs is more leaders, of whatever gender or any other characteristic. We need more leaders at every level in every kind of organization — businesses, government, schools, neighborhood and professional associations, unions, religious entities, and charitable institutions.



With change, crisis, and complexity coming at us faster and faster from all directions, we dare not depend on just a few to lead us and we dare not eliminate any group of people from the opportunity to lead others to a better future.



John Kotter defines leadership as "creating a vision of the future and strategies for producing the changes needed to achieve that vision; aligning people around the vision; and motivating them to overcome barriers and produce the changes needed to achieve the vision."



How many people acting in these ways does the world need?



Millions.



And they need to come from everywhere.



In today's organizations, we need to unlock the leadership potential within, so that those who want to lead get to. From the senior information technology official who knows that her department can be a better partner to the business, to the first-shift line worker irritated that her shift can't seem to match the productivity of the third shift, we have seen that, under the right circumstances, leaders will step forward to make a difference.



What are the right circumstances?



A culture where the vision for the future is clearly understood throughout the organization. A culture where people are invited to step forward to help advance the vision in small and big ways. A culture where good-faith efforts that don't work out are treated as evidence of bigger barriers to tackle, or as a reason to re-examine the goals, rather than as failures that must be punished. A culture where transparency is the norm, barriers to progress are shared, and people are asked to help knock down those barriers. A culture where wins, both large and small, are widely celebrated. A leadership culture, where one seldom hears the phrases "That's not your job" or "That's not my job."



The key is that women (and men) in a position to influence others, either by virtue of the title they hold in an organization or because they have gained the necessary skills, insight and confidence, encourage, promote, lay the groundwork, communicate the need, celebrate steps along the way and otherwise create the conditions under which many more people can and will lead within their own spheres of influence:



The first-shift line worker who tells her boss she needs to watch the third shift to find out what they are doing differently is leading.



Her boss who says, "Great, let me know what you find out," is leading.



The plant supervisor is leading when he celebrates the line worker's initiative and acts on the information she found.



And the COO is leading when she inspires others to action by telling the whole story throughout the company.



So yes, we need more women leaders. And men, too.





Published on September 05, 2013 08:00

How CEOs Are Succeeding in Africa

Jonathan Berman, author of Success in Africa, busts media myths about the continent.



Download this podcast


A written transcript will be available by September 13.




Published on September 05, 2013 07:42

Make Your Knowledge Workers More Productive


With scarcely any help from management, knowledge workers can increase their productivity by 20%. When we interviewed 45 such people across 39 companies in 8 industries in the United States and Europe, we found that by identifying low-value tasks to drop completely, delegate to someone else, or outsource, the average worker gained back roughly one day a week to use for more important tasks. (We detail this process in our HBR article, "Make Time for the Work That Matters.")



If that's what your team can achieve without you, just imagine what they might do with your support.



Yet here is the challenge you face as a senior executive: You cannot manage your knowledge workers in the traditional and intrusive way you might have done with manual workers. Knowledge workers own the means of production — their brains. So large-scale re-engineering programs, productivity drives, and changes to the incentive system are unlikely to work: they can easily be resisted, ignored or gamed. But just letting your knowledge workers figure things out for themselves isn't a good model either — it is an abrogation of your responsibilities as a manager, and it allows people to either shirk their duties or lose focus chasing too many priorities.



You need to find the middle ground: judicious interventions that allow knowledge workers to help themselves. Our research and work with companies suggest three broad approaches you can try, each with its own pros and cons:



Enact a sharp "decree" to force a specific change in behavior. In 2011, Thierry Andretta, the CEO of French fashion company Lanvin, announced an initiative called "no email Wednesdays" because he thought people had stopped actually talking to each other. Atomic Object, a software company in Michigan, put in place a standing-only rule in meetings, to keep them focused and short. Marissa Mayer, Yahoo's CEO, ended the company's work-from-home policy to foster a more collaborative, innovative environment. Such decrees are risky; by forcing people out of their daily routine, you are bound to upset a few of them. For a decree to work, you need to be able to enforce it effectively, to have a good reason for it, and to have a thick skin. You should also position your decree as temporary — a way of forcing a change in behavior for a few months or a year, so that people can subsequently figure out for themselves whether the new way of working is sensible.



Build smart support systems. Smarter support systems are intended to de-bureaucratize work and to help people prioritize more value-added work. While they take some time to work through, their benefits are likely to be felt for many years. There are two main approaches: building something new or taking something else away.



In terms of building something new, consider the notion of task-sourcing and hyperspecialization, where a knowledge worker farms out specific tasks to a low-cost provider so she can focus on more value-added work. One of us (Jordan) had personal responsibility for setting up such a service at Pfizer; it became known as PfizerWorks, and its aim was simply to help employees be more effective in their jobs. Now, whenever a PfizerWorks user decides she needs support, she simply pushes a button, describes her need in a pop-up window, and presses send. Users then report the time and money saved after each task is completed. For example, a worker may want to understand which states in the US have similar drug substitution laws. This request would be routed to an analyst, who would first locate and download the drug substitution law from each of the 50 states' websites. The analyst would then review each law and group them by similar characteristics. The resulting deliverable could then be turned into a presentation or even a database for delivery back to the requester.



Possibly the biggest impact is the ease with which an employee can summon support, and the increased motivation that comes from getting back to interesting and impactful work.



In terms of taking something away, there are many management processes in large firms that are ripe for simplification. For example, a few years ago the top executive team at UBS Wealth Management realized that the biggest constraint on their future growth was their cumbersome and conservative budgeting process. By eliminating this top-down process, and pushing accountability for target-setting down to the individual heads of trading around the world, they enabled the business to grow more effectively and they got their knowledge workers — their client advisors — to take more responsibility than they had before. At a more micro-level, we saw a team at pharmaceutical company Roche recently experiment with a much simpler expense-claim processing system based around peer review rather than oversight, and again it was a useful way of getting rid of tedious and non-value-added activities.



Lead by example. A third approach is to follow Gandhi's dictum: be the change you wish to see in the world. Leading by example is about getting people to take responsibility for their own effectiveness. By pushing for significant changes in the balance of responsibility between your knowledge workers and the people who are nominally above them in the corporate hierarchy, you can help the knowledge workers to become more effective in their work.



Consider the case of Ross Smith, Director of testing for Microsoft Lync — the video conferencing and instant messaging service formerly known as Microsoft Office Communicator. Since taking the job four years ago, he has sought ways of giving greater responsibility to his 80-person division of software engineers. Last year, when Microsoft Lync 2010 was released, he was asked to reorganize his division so that testing of the next-generation product could begin. Rather than decide everything himself, he decided to let the reorg happen in a bottom-up way. He explained that individual contributors would select which of four teams they would like to be in. These 80 or so people became free agents looking for the position that was best for them. The team leaders could not offer more money, but they could offer employees opportunities to develop their careers, new technologies to work on, and new colleagues to work with.



The reorg — quickly dubbed a "WeOrg" — took longer than anticipated, as people really wanted to do the research to find the right fit and to interview their prospective bosses. There was some skepticism about whether it would work, but Smith had already built a sufficiently strong culture of trust for his people to give him the benefit of the doubt. He was also able to promise there would be no staff cuts because of the changes. The outcome was that 95 per cent of the team "liked" or "somewhat liked" the new method. But more broadly, the result was that Smith's 80-person team felt a much greater responsibility than usual for structuring their work in a way that was most effective for them and their colleagues.



While these three approaches are very different, all are forceful ways of pushing people out of their comfort zones to find more efficient ways of working. And who doesn't want more hours in the day?





Published on September 05, 2013 07:00

Should Higher Education Be Free?

In the United States, our higher education system is broken. Since 1980, we've seen a 400% increase in the cost of higher education, after adjusting for inflation — a steeper cost escalation than in any other industry, even health care. We have recently passed the trillion-dollar mark in student loan debt.



How long can a business model succeed that forces students to accumulate $200,000 or more in debt and cannot guarantee jobs — even years after graduation? We need transformational innovations to stop this train wreck. A new business model will only emerge through continuous discovery and experimentation and will be defined by market demands, start-ups, a Silicon Valley mindset, and young technology experts.



Neither the pedagogical model nor the value equation of traditional higher education has changed much in the past fifty years. Harvard, MIT, Yale, Princeton, and Stanford are still considered the best schools in the world, but their cost is significantly higher today than it was two decades ago.



According to Rafael Reif, MIT's president, who spoke at the Davos conference this past January, there are three major buckets that make up the total annual expense (about $50,000) of attending a top-notch university such as MIT: student life, classroom instruction, and projects and lab activities.



There is a significant opportunity to help reduce the lecture portion of expenses using technology innovations.



According to the American Institute of Physics (PDF), as of 2010 there were about 9,400 physics teachers teaching undergraduates every September in the United States. Are all of these great teachers? No. If we had 10 of the very best teach physics online and employed the other 9,390 as mentors, would most students get a better quality of education? Wouldn't that lead to a lower per-unit cost per class?



Yes, you might argue that the "classroom experience" is missing. But when it comes to core classes that don't require labs or much in-person faculty interaction, does the current model justify the value-price equation?



What is traditional college education really worth?



In a recent interview, Laszlo Bock, SVP of people operations at Google, said, "One of the things we've seen from all our data crunching is that G.P.A.'s are worthless as a criteria for hiring, and test scores are worthless — there is no correlation at all except for brand-new college grads, where there's a slight correlation." Even more fascinating is his statement that "the proportion of people without any college education at Google has increased over time," leading to some teams in which 14% have not gone to college. "After two or three years," Bock said, "your ability to perform at Google is completely unrelated to how you performed when you were in school, because the skills you required in college are very different."



Mr. Bock's comments suggest that smart people can figure out how to pass college tests if they can master what the professor wants, resulting in great test scores — but this skill and knowledge has very little relevance to solving daunting business problems with no obvious answers.



Once leading companies embrace what Google is already doing, seismic shifts and breakthroughs will occur in college education. Maybe a two year college degree will be sufficient instead of four. Imagine a business model where you take two years of courses online with the world's best teachers, followed by two years in structured problem-solving environments. Driven by market forces, such new business models could emerge faster than we expect.



So what is happening now? Who are some of the new education providers experimenting with new business models?



Emerging new education models

There are three strong players with millions of students and thousands of course offerings, all for free and available to anyone in the world. Coursera, Udacity, and edX have over four million enrolled students in their Massive Open Online Courses (MOOCs).



All three replicate the classroom experience, each in its own way. Each uses top-notch professors and technologies in a creative manner — but not without challenges. One of the authors (Jatin Desai) enrolled in a few courses to test out the environments and found that, just like in the traditional classroom, courses vary greatly based on who is teaching. Some professors use the technology brilliantly and others use it as minimally as possible. (Access to higher bandwidth greatly enhances the experience.)



These three are not the only ones in the MOOC movement; many others are quickly joining. In fact, the New York Times dubbed 2012 "The Year of the MOOC," and Time magazine said that free MOOCs open the door to the "Ivy League for the Masses."



According to a recent Financial Times article, many employers are unsure of what to make of MOOC education — unsurprising, since many new technologies and business models go through multiple evolutions. The good news, according to the article, is that 80% of respondents surveyed would accept MOOC-like education for their internal employee development. We can extrapolate from this survey that employer demand for online education exists — and, moreover, that it is only a matter of time before universities and well-funded venture capitalists respond to this white space in the market.



Georgia Tech, in fact, has already responded; in January, it will begin offering a master's degree in computer science, delivered through MOOCs, for $6,600. The courses that lead to the degree are available for free to anyone through Udacity, but students admitted to the degree program (and paying the fee) would receive extra services like tutoring and office hours, as well as proctored exams.



In the near future, higher education will cost nothing and will be available to anyone in the world. Degrees may not be free, but the cost of getting some core education will be. All a student needs is a computing device and internet access. Official credentialing from an on-ground university may cost more; in early 2012, MIT's MOOC initiative, MITx, began offering online courses with credentials available to successful students for "a small fee" — and we're eager to see how Georgia Tech's MOOC degree will transform the education model.



What's next? How far are we from new business models in which MOOC-type pedagogy dominates the first two years of the college experience? When will most employers begin to accept non-traditionally credentialed, MOOC-based education? And what will this mean for the education industry? With luck and ongoing innovation, the US's broken education system may yet be repaired.





Published on September 05, 2013 06:00

Money's Other Purpose: Easing Our Fear of Death

In an experiment, people who had been counting money indicated a lower fear of death than people who had been counting slips of white paper (about 5.3 versus 6.5 on a zero-to-12 scale), says a team led by Tomasz Zaleskiewicz of the University of Social Sciences and Humanities in Warsaw. Moreover, people's estimates of the sizes of coins were an average of 34% larger if they had been primed to think about mortality, presumably because thoughts of death intensify the subjective value attributed to money. People seem to desire money in part because it has the power to soothe fears of death, the researchers say.





Published on September 05, 2013 05:30

Better Data Can Help Advertisers Avoid Cringe-Inducing Ad Placement


You're watching TV with the family and having a nice moment. Then all of a sudden a commercial comes on for an erectile dysfunction drug, or a preview for a horror flick. You interrupt your blissful reverie with a reflex reaction, knocking over the chip bowl in a mad dash to change the channel. And now you have to explain both the commercial and your crazed reaction to your children. Fun times with the family indeed.



Parents too often find themselves caught off guard during commercial breaks by advertising that is inappropriate for children. This is obviously bad for the parent, but it's bad for advertisers too, because they have failed to reach their consumer.



For marketers, there is an easy solution: thoughtfully matching product advertising with compatible programming. The Association of National Advertisers (ANA) found that marketers can increase ad effectiveness by up to 30% by placing ads in the right programming context, and that 10.7% of viewers changed their opinion about purchasing a product based on the programming context in which the advertisement was displayed.



This insight was a game changer for Crown Media Family Networks, home of Hallmark Channel and Hallmark Movie Channel. By focusing on advertisers with family-friendly messages, Crown Media has driven up sales by 20% and nearly tripled profitability since 2010. In turn, by advertising on Hallmark Channel and Hallmark Movie Channel, these family-friendly brands are speaking directly to their priority audience and driving consumer sales, achieving a high ROI without additional spending.



But what about other, non-family-oriented brands and channels? This is where the palate concept comes in. Most people think about palates in the context of food and beverage. We like to think of a palate as your sense of "taste" across many categories, including media. The premise is that you can predetermine what you like or don't like, and that we can create distinct groups of people with similar palates.



Paul Ekman, a noted American psychologist, pioneer in the study of emotions and discoverer of the "micro-expression" chronicled in Malcolm Gladwell's Blink, identified six universal expressions or emotions — anger, disgust, fear, happiness, sadness and surprise. He also has a broader list of emotions that includes amusement, contempt, contentment, embarrassment, excitement, guilt, pride, relief, satisfaction, sensory pleasure and shame.



Because both programmers and marketers aim to engage their audience through one or more of these emotions, aligning congruent programming and advertising content ensures that everybody wins, including the viewer. For example, Hallmark might focus on happiness and sadness, the CSI series highlights anger, disgust, and fear, while reality TV might hit a wide variety of emotions. Every brand also has its version of a brand architecture or pyramid, with "emotional benefits" that aspire to provide or solve for one of the emotions above.



In a perfect world, media buyers would have access to an algorithm that would allow them to pinpoint the perfect programming environment for their product. Media spend would remain constant, while advertising effectiveness could increase by as much as 30%. This more precise approach would mark a significant improvement over the traditional demo-aligned media buying methodology.
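
As a rough illustration of what such an algorithm might look like, here is a minimal sketch that scores programs against a brand by comparing Ekman-style emotion profiles with cosine similarity. The profiles, program names, and scoring method are invented assumptions for the example, not anything buyers actually use.

# Hypothetical sketch: rank programming environments for a brand by comparing
# emotion profiles. All names and numbers are illustrative, not real data.

import math

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length emotion vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank_placements(brand_profile, program_profiles):
    """Return (program, score) pairs, best emotional match first."""
    scored = [(name, cosine_similarity(brand_profile, profile))
              for name, profile in program_profiles.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Illustrative profiles: one weight per emotion, in EMOTIONS order.
brand = [0.0, 0.0, 0.0, 0.9, 0.4, 0.1]  # e.g., a family-friendly brand
programs = {
    "family_drama":     [0.1, 0.0, 0.1, 0.8, 0.5, 0.1],
    "crime_procedural": [0.7, 0.6, 0.8, 0.1, 0.3, 0.2],
    "reality_show":     [0.4, 0.3, 0.2, 0.5, 0.3, 0.7],
}

for name, score in rank_placements(brand, programs):
    print(f"{name}: {score:.2f}")

A real system would weight many more signals, but even this toy version makes the intuition clear: the buy follows the emotional fit, not just the demographic.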



The good news is that much of this information is increasingly available thanks to big-data integration of what people buy and what people watch. Companies like Nielsen (parent company of my firm) and others are starting to offer these services. For instance, Nielsen Catalina Solutions has integrated data on what people are watching (via Nielsen's people-meter television data) and what people are actually buying (via store membership card data) into a single-source database of households. Brands are not only finding interesting insights about which shows their buyers are watching, but they are also using that information to drive incremental sales without spending more on ads.
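
To show what "single-source" means in practice, here is a toy sketch that joins hypothetical household-level viewing records with purchase records on a shared household ID. The field names and data are invented for illustration and bear no relation to the actual Nielsen Catalina dataset.

# Hypothetical single-source integration: link what a household watched with
# what it bought, keyed by an invented household ID.

from collections import defaultdict

viewing = [
    {"household_id": "H001", "program": "family_drama", "minutes": 90},
    {"household_id": "H002", "program": "crime_procedural", "minutes": 45},
    {"household_id": "H001", "program": "reality_show", "minutes": 30},
]

purchases = [
    {"household_id": "H001", "brand": "GreetingCo", "spend": 24.99},
    {"household_id": "H002", "brand": "SnackCo", "spend": 8.50},
]

# Index purchases by household so each viewing record can be enriched.
purchases_by_household = defaultdict(list)
for purchase in purchases:
    purchases_by_household[purchase["household_id"]].append(purchase)

# The "single source": one record per viewing event, carrying the same
# household's purchases alongside it.
single_source = [
    {**view, "purchases": purchases_by_household[view["household_id"]]}
    for view in viewing
]

for row in single_source:
    print(row["household_id"], row["program"],
          [p["brand"] for p in row["purchases"]])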



Deloitte noted that focused TV advertisements likely represent less than 0.1% of global TV advertising revenues — less than $200 million out of a $227 billion advertising market. Until the industry at large catches up, this signifies a tremendous opportunity for those marketers willing to shift their tactics to set a new bar and get a huge jump on their competition.





Published on September 05, 2013 05:00
