Marina Gorbis's Blog, page 1364

September 18, 2014

Better Teachers Receive Worse Student Evaluations

A 1-standard-deviation increase in university teachers’ effectiveness in boosting student performance reduces the students’ evaluations of their professors’ teaching quality by about half of a standard deviation, on average — enough to significantly reduce the teachers’ percentile ranking at the university, says a team led by Michela Braga of Bocconi University in Italy. Students, especially the least able, appear to respond negatively in their evaluations to the extra effort that good teachers require of them, a finding that casts doubt on universities’ reliance on student evaluations to inform faculty-promotion decisions. The researchers also found that student evaluations improve when there is fog and as the weather gets warmer, and they deteriorate on rainy days.




Published on September 18, 2014 05:30

Ken Burns on “The Roosevelts” and American Leadership

More than nine million viewers tuned in to watch the first episode of Ken Burns’s new film “The Roosevelts” on PBS earlier this week—a sign that even in an era of reality TV and critically acclaimed cable dramas, people want to understand more about real-life leaders. For Burns, the seven-part, 14-hour series (which is available via streaming video on the PBS website) is the latest in a career in which he’s trained his lens on leaders from Jefferson and Lincoln to Susan B. Anthony and David Sarnoff. Burns spoke with HBR about how his work as a filmmaker has influenced how he thinks about leadership. What follows are edited excerpts from our conversation:


HBR: Why did you decide to pair the two President Roosevelts in a joint documentary?


Burns: It’s sort of strange that after all these years they haven’t been paired together in some major book or film. They have incredibly related and intertwined narratives that taken by themselves are strong, but are even more powerful when put together.  I assume it’s just the laziness of traditional media culture that it hasn’t been done until now—because Theodore Roosevelt was a Republican and Franklin Roosevelt was a Democrat, people feel that you should put them in different silos.


Businesses typically avoid nepotism, but American voters seem to accept family dynasties. Why is that?


I think it’s less a dynastic thing than the younger generation of politicians taking advantage of the last name of the older one. While I think Americans are resistant, in some way, shape and form to the notion or idea of dynasties, there is a convenience to having a well-known name. George W. Bush would not have been president without his father preceding him, and Hillary Clinton would not stand out as much as a presidential aspirant had she not been First Lady and married to Bill Clinton. They are familiar faces in a media culture that self-selects to boldfaced names.


As a historian, do you think about how today’s leaders will be viewed in 100 years?


It’s an unfair task—you just don’t know. I’m in the story business, and because I’ve chosen to work in American history, the stories are usually concluded 25 years out from the present moment. That’s the nature of history.


Has your view of leadership changed since you began this work 30 years ago?


I think it’s remained fairly constant. What’s so delightful is what we call “leadership” comes in so many different varieties—so many different human beings, and so many varieties of human experience. Look at Abraham Lincoln and Franklin Roosevelt—Lincoln was born into poverty on the frontier, and FDR was born to such great privilege that he could have spent his life in idleness, as many of his relatives did.


How is online streaming video changing your business?


One of the ways we abolish the cacophony of noise and the information deluge is to binge watch, which is how streaming video is allowing us to control our content. When I met television critics this year, none of them complained that “The Roosevelts” is 14 hours. At every juncture of my professional life critics have said no one will watch long historical documentaries, but now they realize that people are starved for 14 hours of content, whether it’s “Orange is the New Black” or “House of Cards.” The same laws of storytelling apply—if it’s a good story, it’s a good story. Mine just have to be based on fact. I can’t make it up.




Published on September 18, 2014 05:00

September 17, 2014

The Chief Innovation Officer’s 100-Day Plan

Congratulations! Your energy and track record of successfully launching high-impact initiatives scored you a plum role heading up innovation. Expectations are high, but some skeptics in the organization feel that innovation is an overhyped buzzword that doesn’t justify being a separate function. So, what can you do in your first 100 days to start things off on the right track?


Over the past decade we’ve helped dozens of leaders through their first 100 days. Based on our experience, augmented by in-depth interviews with a few of the most seasoned practitioners with whom we have worked, we suggest that innovation leaders put the following five items on their 100-day punch list.


Spend quality time with every member of the executive committee. This should go without saying, but it’s vitally important to develop relationships with the CEO, business unit leaders, and other key executives to understand the company’s strategy, so that the innovation approach and projects you pursue align with overall corporate goals. Brad Gambill, who over the past few years has played a leading role in strategy and innovation at LGE, SingTel, and TE Connectivity, believes the first 100 days are an ideal time to “ask dumb questions and master the basics of the business.” He particularly suggests focusing on the things “everyone else takes for granted and thinks are obvious but aren’t quite so obvious to people coming in from the outside.” So don’t be afraid to ask why a decision-making meeting ran the way it did or challenge the wisdom of pursuing a certain strategy or project.


It is particularly important to understand these executives’ views of two things – innovation’s role in helping the company achieve its growth goals and your role in leading innovation. Is innovation intended to improve and expand the existing business, or is it meant to redefine the company itself and the industry in which it operates? Do executives expect you to establish and incubate new growth businesses, act as a coach to existing teams, or focus on establishing a culture of innovation so that new ideas emerge organically?


As you invest time with top executives, you should begin to understand the organizational relationship between your innovation work and the current business. Are leaders willing to give up some of their human and financial resources to advance innovation? Are you expected to recruit a separate team from within and beyond the company? Or are you expected to spin straw into gold by working without dedicated resources? Will leaders support you if you propose radical changes to people, structures, processes, and roadmaps, or are you supposed to change everything but in a way that no one notices?


Zero in on the most critical organizational roadblocks to innovation. Chances are, you won’t get the same answers to these questions from everyone you talk to. Those areas where executives disagree with one another will define the most immediate (and often the most fundamental) challenges and opportunities you’ll face in your role.


As quickly as possible within your first 100 days, therefore, you will need to understand where the fault lines lie in your company. Pay particular attention to the three hidden determinants of your company’s true strategy – how it funds and staffs projects, how it measures and rewards performance, and how it allocates overall budgets. A clear understanding of where leaders’ priorities fail to match what the company is actually funding and rewarding will help you identify the biggest hurdles to achieving your longer-term agenda, and where short-term workarounds are required.


Define your intent firmly but flexibly. You don’t need to have all the answers perfectly formulated from the beginning. But you should have a perspective – even on Day 1 – regarding how your role as the innovation leader can help the organization achieve its overall strategy. Look for ways to stretch the boundaries of current innovation efforts, but remember you are not the CEO or CTO. You need them to want to support you, not worry that you are gunning for their jobs. Gambill suggests one way to build this trust is never to bring up a problem without also proposing a solution. The CEO “has lots of people who know how to point out problems; it is important to establish yourself as a problem solver and confidant as quickly as possible.”


Determine how you plan to balance your efforts between developing ideas, supporting initiatives in other parts of the organization, and creating an overall culture of innovation. Those are related, but distinctly different, tasks. Don’t become too wedded to your initial perspective. Be as adaptable in your approach as you will be when you work on specific ideas.


Develop your own view of the innovation landscape around the company. Colin Watts, who has played a leadership role in innovation and strategy functions at Walgreens, Campbell Soup, Johnson & Johnson, and Weight Watchers, suggests getting a “clear market definition ideally grounded in customer insights.” Companies tend to define their world based on the categories in which they compete or the products they offer. However, customers are always on the lookout for the best way to get a job done and don’t really care what industries or categories the solutions happen to fall into. Understanding how customers make their choices often reveals a completely different set of competitors, redefining the market in which your company operates, its role in the market, and the basis for business success.


Watts also suggests zeroing in on the adjacencies that have the potential to shape your market. As he notes, “There is no such thing as an isolated market anymore.” Through an innovation lens you are likely to see early signs of change that the core business might have missed.


Develop a first-cut portfolio of short- and longer-term efforts, with a few planned quick losses. A key component of your job, of course, will likely be to advance a set of innovation initiatives. Some may already be in progress. There may be a backlog of ideas waiting to be developed. Or the raw material might be a bit rougher, existing primarily in people’s heads. Regardless, in the first 100 days you want to come up with a clear view of the specific things you plan to work on. Some of these might be very specific initiatives, like identifying product-market fit for a new technology. Some might involve investigating broader areas of opportunity (for example, “wearables”). Some may involve developing specific capabilities. One specific capability Watts suggests building as an “investment that will pay back for years to come” is a “fast and cheap way to pilot ideas and products.”


Savvy innovation leaders place some long-term bets that they start to explore while also quickly addressing some more immediate business opportunities to earn credibility. If your portfolio is filled entirely with near-term ideas, some people in the core organization might naturally ask why they can’t do what you are doing themselves. And you are probably missing the most exciting and possibly disruptive ideas in your space. But if the portfolio is filled only with further-out ideas, you run the risk that organizational patience will run out as you do the long, hard work of developing them.


When considering quick wins, don’t avoid quick losses. True innovation requires an organization to stop avoiding failure and see the benefits of learning from it. But failure remains very scary to everyone. Have enough things going on that you can tolerate a quick loss without damaging your overall pipeline. As Watts says, “You may be able to do it fast or do it cheap or do it reliably but not likely all three.” Make sure that you and your executive sponsors loudly and proudly celebrate the first project you stop when it becomes clear it won’t work.


That feels like a lot for 100 days, and it is. Innovation has the power to positively transform an organization, but no one said it was going to be easy.




Published on September 17, 2014 10:00

Too Many Marketing Teams Are Stuck in the Past

Many marketing organizations are still operating like it’s the 1990s — or even earlier. Duplicative marketing teams exist within the same company across multiple product lines. Digital marketing teams are centralized yet isolated from the broader organization. Marketing groups are splintered into communications, consumer marketing, brand marketing, and digital marketing units with no common thread in strategy and execution.


Over the past dozen years, I have participated in both the infusion of digital capabilities into traditional marketing organizations and the establishment and maturation of digital marketing organizations at Disney, J.Crew and, now, Conde Nast Entertainment, where I am VP of marketing-digital. Based on this experience, I see five areas that need to change in order for marketing to function effectively in the digital age.


Internal structure: Most marketing teams are organized by either functional expertise (such as social media marketing or marketing analytics) or brand. To be a successful digital marketing organization, your team needs to be organized by functional expertise rather than by brand, project or platform in order to deliver coherent, integrated campaigns across all consumer touchpoints. The customer who is a fan of your brand’s Facebook page should receive a more personalized email newsletter after visiting your website. She should be given a personalized promo code in her email to shop at your brick-and-mortar store based on her online shopping history, and later, be reminded with a push notification message on her mobile phone when the promo code is about to expire so she can take advantage of it online.


My team at J.Crew was organized by function such as affiliate marketing, paid search, email marketing, and search engine optimization. At Conde Nast Entertainment, my team is also organized by function across social media, paid advertising, earned/owned media, insights/analytics, and audience development. Each functional expert is responsible for all 12 brands that we work on. This structure is effective in a multi-brand environment with a centralized marketing team because each brand benefits from deep functional expertise as well as consistency across touch points.


Functional alignment: Many marketing organizations suffer from a failure of cross-functional collaboration. For example, IT decisions that affect marketing may be made without a thoughtful analysis about the resulting user experience beyond page load speed and server uptime. New product features may be introduced into an e-commerce site without understanding how they will impact traffic conversion rates and average order value.


Digital marketing teams need a seat at the table so they can infuse digital-first marketing insights into product and technology planning. Website feature changes should not be released without thoughtful analysis of the potential impact on traffic. Email marketing templates should not be altered for design reasons without a/b testing the impact of the change on click-through rates. Website page title changes should not be based solely on editorial considerations but should also account for search engine optimization competitiveness.
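To make the a/b testing point concrete, here is a minimal sketch (in Python, with made-up send and click counts, not real campaign data) of the kind of check a team might run before rolling out a redesigned email template: a two-proportion z-test on click-through rates.

```python
# Minimal sketch: compare click-through rates of the current email template (A)
# against a redesigned one (B) with a two-proportion z-test.
# Send and click counts below are hypothetical.
from math import sqrt, erf

sends_a, clicks_a = 20000, 460   # current template
sends_b, clicks_b = 20000, 395   # redesigned template

ctr_a, ctr_b = clicks_a / sends_a, clicks_b / sends_b
p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
z = (ctr_b - ctr_a) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal p-value

print(f"CTR A {ctr_a:.2%}, CTR B {ctr_b:.2%}, z = {z:.2f}, p = {p_value:.3f}")
# A small p-value suggests the redesign genuinely changed click-through,
# rather than the difference being noise between the two sends.
```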


Meritocracy vs. hierarchy: In traditional marketing organizations, job responsibilities and titles are hierarchical and rarely fluid. Each role is clearly defined and limited in scope. The new digital marketing organization thrives on a less hierarchical structure with more flexibility and an emphasis on meritocracy. Your Email Marketing Manager may also happen to be an expert in Instagram. Hence, your next email campaign may be highly integrated with social media. At Conde Nast Entertainment, digital marketing execution sometimes falls to whoever on my team can figure out the best way forward first.


When technology and consumer behavior patterns are changing so quickly, there may not be time to wait until the person assigned to the campaign gets around to the task.


Data-driven decision making: Compared to digital organizations, traditional marketing organizations have a longer feedback loop on their campaign performance and the results of their go-to-market strategy. In digital organizations, immediate data allows marketers to be smarter and faster in their decision-making. It is time to capitalize on the marriage of traditional and digital marketing data. Digital marketing insights can guide the strategy of traditional marketing, and vice versa.


At J.Crew, I would set my paid search marketing investments and choose which clothing categories to promote online based on in-store sales data. For example, if the mint green cashmere sweater was a top seller at stores in New York City, I would shift my paid search advertising to concentrate on relevant keywords and use geographic targeting and remarketing lists to reach customers in similar zip codes as they shopped through search engines.
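A minimal sketch of that decision rule, with hypothetical categories, regions, and sales figures (none of this is actual J.Crew data), might look like the following: pick each region’s top-selling in-store category and flag it for concentrated, geo-targeted paid-search spend.

```python
# Minimal sketch: use in-store sales to decide where to concentrate paid-search
# spend. Categories, regions, and figures are hypothetical.
in_store_sales = {
    ("cashmere sweaters", "New York"): 182_000,
    ("chinos", "New York"): 61_000,
    ("cashmere sweaters", "Chicago"): 54_000,
    ("rain jackets", "Chicago"): 97_000,
}

regions = {region for (_, region) in in_store_sales}
for region in sorted(regions):
    # Top-selling category in this region's stores
    top_category = max(
        (cat for (cat, r) in in_store_sales if r == region),
        key=lambda cat: in_store_sales[(cat, region)],
    )
    print(f"{region}: concentrate geo-targeted paid-search keywords on '{top_category}'")
```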


Governance: A few forward-thinking organizations are doing without a chief marketing officer and have instead given the job of leading marketing to a chief digital officer. The question of who owns digital marketing in an organization often has no clear answer. Accountability for digital revenue, digital product innovation, omnichannel strategy, and online audience growth blurs the lines among many traditional roles, from marketing to technology to product development to strategy.


Organizations that seek to be more digitally focused should first ensure alignment at the top between vision and execution. The CEO’s vision must prioritize digital marketing innovation. The execution of the vision could be governed by either a Chief Digital Officer or a CMO. A CDO role could make sense if the company is revenue- and product-focused with an advertising-supported business model. The CMO role could make sense if the company is consumer and content focused, because of the specialized knowledge required to drive an effective traffic and audience strategy.


What’s next? As a marketer, I have witnessed two camps of organizational transformations in the digital age.


Camp one is characterized by cycles of digital misalignment across the company. The company makes significant investments in digital marketing, infrastructure, product design and technology to optimize digital performance, only to gut everything and start over again every two to five years. This is disruptive not only to the people in the organization but also to your company’s bottom line.


Organizations in the second camp define a pivotal moment at which they will become a digital-first company with a commitment to invest in digital marketing, technology infrastructure, and digital talent. From this point on, the organization reorganizes its workforce, strategy roadmap and investments to build a new marketing organization that fully integrates traditional and digital marketing in a sustainable way.




Published on September 17, 2014 09:00

Algorithms Make Better Predictions — Except When They Don’t

Predictive analytics is proving itself both powerful and perilous. Powerful, because advanced algorithms can take a near-unlimited number of factors into account, provide deep insights into variation, and scale to meet the needs of even the largest company. Perilous, because bad data and hidden false assumptions can seriously mislead. Further, algorithms cannot (yet, anyway) tap intuition — the soft factors that are not data inputs, the tacit knowledge that experienced managers deploy every day, or the creative genius of innovators.


So what should managers, especially leaders, do? The obvious answer is to employ both computer-based programs and your own intuition. In this post, I’ll use a series of simple plots to explain how to tap the potential of predictive analytics, sidestep the perils, and bring both the data and your good judgment to bear.


To start, consider the figure below, “Performance since 2008,” a quarter-by-quarter time-series plot of results on a variable of interest (for example, it could be sales of a particular item, estimated resources to complete a certain project, etc.). We need to predict performance for the first quarter of 2015 (1Q15).


Performance Since 2008 chart


A quick glance might yield only, “Wow, I don’t know. Performance is bouncing up and down. How would I even guess?”


After staring at the plot a bit longer, most individuals (and all good analytics programs) will spot seasonality: down in first quarters, up in thirds. The next figure is a simpler plot, featuring first quarters only.




Performance First Quarters Only chart


This plot suggests that the first quarter is pretty mundane; except for 2014, performance is tightly contained in a 91 to 93 band.
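To see how little machinery this takes, here is a minimal sketch in Python with illustrative numbers (not the actual data behind the charts): group the series by quarter and compare the group averages, which is essentially how a program would surface the down-in-first-quarters, up-in-thirds pattern and isolate the first quarters.

```python
# Minimal sketch: surface quarterly seasonality by grouping a quarterly series.
# The values are illustrative, not the figures behind the charts above.
from statistics import mean

# (year, quarter, value)
observations = [
    (2012, 1, 92), (2012, 2, 96), (2012, 3, 101), (2012, 4, 97),
    (2013, 1, 91), (2013, 2, 95), (2013, 3, 102), (2013, 4, 98),
    (2014, 1, 97), (2014, 2, 99), (2014, 3, 104), (2014, 4, 100),
]

# Group values by quarter and compare the averages of the groups
by_quarter = {q: [v for (_, quarter, v) in observations if quarter == q]
              for q in range(1, 5)}
for q, values in by_quarter.items():
    print(f"Q{q}: mean {mean(values):.1f}")  # Q1 lowest, Q3 highest => seasonality
```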


So what’s the prediction for 2015’s first quarter? As the figure, “Potential Predictions for First Quarter, 2015,” depicts, I can argue for at least three:


Potential First Quarter Predictions chart



“We should expect 1Q15 to be like most first quarters. There were several huge snowstorms last year, so 2014 was an anomaly.” Perhaps the explanation of a seasoned veteran who’s learned it’s best to under-promise and over-deliver.
“2014 is the new normal. We got a one-time boost because we turbocharged the knurdle valve.” Perhaps the prediction and explanation of an engineer who is proud to have improved a piece of the variable in question.
“We started a new trend in 2014 and should expect to see similar gains in 1Q15.” Perhaps the prediction and explanation of a new product manager, aiming to score points with the boss, who is demanding across-the-board improvements.

The quandary here underscores the importance of algorithms. I have no doubt that each of these managers is smart, well-meaning, and doing his or her best. But at most, only one of them is “right.” One in three is an excellent batting average in baseball, but hardly up to the demands of competitive business. Algorithms offer distinct advantages. Good ones are unemotional and (largely) apolitical. They don’t care that it is best to under-promise and over-deliver or that the new boss is particularly demanding.


At the same time, they’re capable of digging deeper. They can help evaluate whether the weather really was a factor in 2014 and take weather forecasts into account in predicting 2015. Similarly, they can seek evidence for the “new trend” in the second quarter and in similar variables. They can also search for possible causes. (Note: Algorithms can only detect correlation, though. Individuals must work out causation.)


In a related vein, good predictions should feature ranges, such as 94.9 ± 2.4. To see why, take a look at the figure below. The plot features two cases, one exhibiting low variation (in gray — note that all past values are between 94 and 96), the second exhibiting relatively higher variation (in blue — values here range from 90 to 100). In both cases the mean is 94.9.


Variation Impacts Chart


Now suppose 1Q15 comes in at 98 (the hypothetical dot in the figure). This should come as a surprise in case 1 — 98 is far above anything that’s happened in the past. Not so in case 2 — several past values are greater. Thus prediction ranges (94.9 ± 2.4 for case 1 and 94.9 ± 9.5 for case 2) help managers understand just how close they should expect actual performance to be to the point prediction (94.9).


Calculating these ranges is quite technical. Few can do it by eyeball alone. But for good computerized algorithms, it is a snap.
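As one illustration of how a program might produce such a range, here is a minimal sketch that uses the mean plus or minus two sample standard deviations. That rule of thumb assumes roughly normal, independent observations; the ±2.4 and ±9.5 figures above come from the article’s own data and method, which these invented values only approximate.

```python
# Minimal sketch: a point prediction with a range, computed as the historical
# mean +/- 2 sample standard deviations. The series are invented stand-ins for
# the low-variation (case 1) and high-variation (case 2) histories in the text.
from statistics import mean, stdev

low_variation  = [93.8, 95.9, 94.4, 96.1, 94.0, 95.2]   # case 1 (hypothetical)
high_variation = [90.3, 99.8, 92.1, 97.6, 91.0, 98.6]   # case 2 (hypothetical)

def predict_with_range(history, k=2.0):
    """Return (point prediction, half-width of the prediction range)."""
    return mean(history), k * stdev(history)

for name, series in [("case 1", low_variation), ("case 2", high_variation)]:
    point, half_width = predict_with_range(series)
    print(f"{name}: {point:.1f} +/- {half_width:.1f}")
```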


These three abilities — to take emotion and politics out of the prediction, to seek deeper insights, and to quantify variation — are powerful, and leaders should seek to leverage them. That being said, managers should not be seduced into thinking that predictive algorithms are all-knowing. They are not.


Algorithms only operate on the inputs they’re provided. Snowstorms affect many things and may lie at the heart of the boost in the first quarter of 2014, as mentioned above. But if weather is not part of the algorithm, the suspected explanation cannot be taken into account.


Algorithms can also be remarkably sensitive to bad data. Consider the result if you were to change one data value by dropping a decimal place (e.g., a 95 became 9.5). The resulting prediction interval changes from 94.9 ± 2.4 to 91.4 ± 50, setting a trap for the unwary. At first glance, one might not challenge the 91.4 ± 50 and use it without too much thought. The impact, from preordering too much stock to missing an opportunity to reserving too few resources for completing an important project, may go unnoticed as well. But the costs can add up. Bad data is all too common and the impact on predictions can be subtle and vicious. At the root of the financial crisis, bad data on mortgage applications led banks to underestimate the probability of default — an issue that cascaded as those mortgages were packaged into complex products.
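A minimal sketch of that sensitivity, using the same mean-plus-or-minus-two-standard-deviations heuristic and invented values (the article’s 94.9 ± 2.4 versus 91.4 ± 50 come from its own series), shows how a single dropped decimal both shifts the point prediction and blows up the range.

```python
# Minimal sketch: one misplaced decimal dramatically widens the prediction range.
# Values are invented for illustration.
from statistics import mean, stdev

clean = [93.8, 95.9, 94.4, 96.1, 94.0, 95.2]
corrupt = clean.copy()
corrupt[3] = corrupt[3] / 10   # 96.1 recorded as 9.61: a dropped decimal place

for label, series in [("clean data", clean), ("corrupt data", corrupt)]:
    m, s = mean(series), stdev(series)
    print(f"{label}: {m:.1f} +/- {2 * s:.1f}")
```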


In addition, algorithms are also based on assumptions that may effectively be hidden in arcane technical language. For example, you may have heard, “We’ve assumed that variables are independent, homoscedastic, and follow normal distributions.” Such language can camouflage an assumption that is simply not true, since the terminology can scare people off from digging deeper. For example, the assumption that mortgage defaults are independent of one another held true enough (or didn’t matter) for a long time, until it was put to the test in the run-up to the financial crisis. As Nate Silver describes in The Signal and The Noise, this led those who held the mortgages to underestimate risk by orders of magnitude (and exacerbated the data quality issues noted above).


Thus, you should never trust an algorithm that you don’t understand. The same applies to the input data. The only way to truly understand the algorithm is to ask (either yourself or data scientists) a lot of questions. You need to understand the physical reality that drives the variables you’re interested in and the explanatory factors you’re using to predict them. You need to understand the real-world implications of the assumptions.


More than anything, you need to know when the algorithm breaks down. Plots like the one below help. The figure presents the time series for the “one misplaced decimal” situation I referenced above. I’ve also added the upper and lower prediction ranges (technically, this is a “control chart,” and the ranges are the lower and upper control limits, respectively). It is easy enough to see that 3Q12 was very strange indeed. There might be an explanation (e.g., bad data), or it may be that the underlying process is unstable. This is the key insight smart managers really seek. Until they know, smart managers don’t trust any prediction.


Control chart
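As a rough illustration of the check behind such a chart, here is a minimal sketch that flags any quarter falling outside limits computed from the other quarters. It uses the common three-standard-deviation convention for control limits (the article’s chart may use a different rule) and leaves the tested point out of its own limits so that a single extreme value cannot mask itself; the values are invented, with 3Q12 holding the misplaced decimal.

```python
# Minimal sketch of a control-chart style check: flag quarters outside
# mean +/- 3 standard deviations computed from the remaining quarters.
# Data values are invented; 3Q12 carries the misplaced decimal.
from statistics import mean, stdev

series = {
    "1Q12": 92.0, "2Q12": 95.5, "3Q12": 9.5, "4Q12": 96.1,
    "1Q13": 91.8, "2Q13": 95.0, "3Q13": 101.2, "4Q13": 97.4,
}

for quarter, value in series.items():
    others = [v for q, v in series.items() if q != quarter]
    center, spread = mean(others), stdev(others)
    lower, upper = center - 3 * spread, center + 3 * spread
    if not lower <= value <= upper:
        print(f"{quarter}: {value} falls outside [{lower:.1f}, {upper:.1f}] "
              f"-- bad data or an unstable process?")
```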


This picture also underscores the need to invest in data quality. Over the long run, nothing does more to improve predictions than knowing you can trust the data. Conversely, there is nothing worse than having a meeting about the implications of 1Q15’s predictions degenerate into a shouting match about whether bad data stymies everything.


Finally, you must develop a keen sense of smell for predictive analytics, the data, and your own intuition. Trust your intuition and use it to challenge the analytics and the data, and conversely, use them to train your intuition. If something just doesn’t “smell right,” become very, very skeptical.


Good algorithms make better predictions than people most of the time — except when they don’t. If you’re fighting the first half of this claim, you need to get over it. Stop thinking of the algorithm as your enemy. And if you doubt the second half, prepare for some very harsh surprises.




Published on September 17, 2014 08:00

Get Over Your Fear of Sales

When you graduate from college with a degree in communication studies and rhetoric, the business world can look very confusing. Unsure of where I fit in, I explored options. Many friends suggested sales. I was doubtful. I worried that being in sales would not carry the prestige and credibility I so badly wanted as I started my professional career. I was also having a hard time getting excited about selling any particular product.


Then I interviewed with a partner at a (then) Big Six consulting firm. He talked about an opportunity to work on “leveraging the most important assets in the firm—its people.”


“That sounds terrific,” I said, thinking that matched my interests, “What function is that?” The partner replied, “Human resource management.”


My 22-year-old self thought, How cool is that? It even has management in the title. That sounded way better than “Sales Rep for Acme Company.” Off to San Francisco I went to be a human resource management associate at a Big Six consulting firm.


After two years in this position, I had an epiphany. I was at an expense-account business lunch with a senior partner and an audit associate. The partner liked us both and remarked that we both had a lot of talent. He went on to say, “The main difference is that you (pointing to the audit associate) generate revenue, and you (pointing to me) are overhead.” Two of the three people at the table had a quick laugh, and my job search began as soon as we returned to the office. I came to a stark realization that day: sales is at the heart of every commercial enterprise, and being the revenue-generating engine of a business is actually a good thing. Maybe even something to be proud of.


In truth, though, I was afraid of sales. The perception. The quotas. I hated the idea of having to be pushy.


I’m hardly the only one with this misconception of sales. Twenty years later, with two stints as an executive vice president of sales along the way, I often see that despite the obvious need to sell their products, many companies encounter some form of resistance to “sales.” Ironically, this unwillingness to own and embrace a sales culture frequently comes from within the sales team itself. I hear sales professionals say, “I don’t really sell. I help clients make a buying decision.” Or “My job is more of being a consultant to my clients.” And my favorite, “I’m not in sales, I’m in business development.” Even professionals who have dedicated their careers to sales are afraid of sales. Or at least, they’re afraid of the label.  Why?


I’ve come to the conclusion that at least part of that fear stems from the persistence of an anachronistic definition of selling and a complete misunderstanding of what successful sales professionals actually do.


Many people equate sales with making people buy things they don’t want, don’t need, and can’t afford. That perception likely emerged from the days, at the turn of the 20th century, when huckster and peddler were among the few sales occupations listed on the U.S. census, and unfortunately this image still persists in some professions. The proverbial used-car salesman springs to mind.


But today there are over 28 census codes that reference professional sales specifically, many of which require tremendous expertise. For instance, a client of mine in the medical device industry employs sales professionals whom doctors consult about the proper application of their product while they are in surgery. Take that in. A doctor asking a sales professional questions during surgery. This is not your father’s salesman.


When I’m called on to help an organization with a sales transformation, I quickly gauge the culture and begin to address counterproductive beliefs that are holding them back from getting the performance they want. There are three key steps to overcoming a negative sales culture. You need to help people see that:


If you operate on the assumption that people will benefit from using your products and services, then sales is entirely about helping others. Done well, selling today is helping people identify and address their needs in order to achieve their goals: to improve efficiency in a business, to make something easier, to live a better life in retirement, to be safer, live longer, and so forth. In this way, sales is not simply an appendage of the organization responsible for distribution, but the conduit for showing how your clients benefit from your products or services.


How you sell is a vital part of the value you create for the customer. While conducting research and observing my own sales teams, I’ve sat in on over 1,000 meetings between sellers and buyers, and one of the things I’ve observed is that successful salespeople don’t “pitch” and they don’t “close.”  That is, they don’t prattle on about how great their offerings are, and they’re not pushy (what some have called the “spray and pray” method). This may sound like heresy to many sales professionals, particularly those who cut their teeth in sales before the 1990s. But it is true.


What they do instead is engage in a mutual dialogue about what a client is trying to accomplish, and then apply the solutions offered through their products or services to the client’s needs. The very best ask smart questions, helping clients to see problems they didn’t even know they had or opportunities around the corner.


One of the best examples I’ve seen of this was a sales rep for a major daily newspaper. Her job was to sell ad space in a highly competitive market where advertisers had ever-increasing alternatives to newspaper space. I had the opportunity to observe several of her sales calls as part of a consulting assignment for the paper, and I immediately noticed how little she talked versus how much she encouraged the client to speak. She told me that her objective was to help the client see why advertising with her newspaper would help him grow his business, and she asked insightful questions. When she did talk about advertising options, she focused specifically on the ideas the client expressed. The meeting lasted only 45 minutes because she didn’t spend any time talking about features or benefits that weren’t relevant. At the end, she simply expressed an interest in working with the prospect, offering two or three suggestions on how they might proceed. He opted to receive a proposal and agreed to review it the next week. How she sold her product was key to her success as one of the top five sales reps in the company.


Every employee is selling in some capacity — even if they don’t think they are — so they might as well get good at it. In his book To Sell Is Human, Dan Pink indicates that more than 40% of our professional time is spent selling. Not only selling the company’s products or services, but selling ideas, approaches, or a particular way to solve a problem. I have written here before about how sales professionals create value with customers. Your ability to create value is inextricably linked with your ability to sell, no matter what position you’re in.


When I work with professionals in customer service or IT who bristle at the idea of being in sales, I emphasize that done well, sales and service are very much alike, though one is typically proactive, and the other is reactive. While that difference is not trivial, consider that the outcome of a good service experience and a good sales experience is the resolution of some problem a customer has, or the identification of some opportunity for improvement.


Don’t run away from sales as I used to. Update your thinking to the 21st century. Sales is the engine powering all business. And sales professionals are the ones driving the train.




Published on September 17, 2014 07:00

Workers Don’t Have the Skills They Need – and They Know It

How do workers feel about the adequacy of their skills? Until now, few studies have examined their views. Today, a survey of employees is being released that provides strong confirmation of the notion that employees need better skills to do their jobs well, especially skills related to technology.


Over the past decade, employers have repeatedly reported that they have difficulty finding workers with the skills needed for today’s jobs. But influential voices have challenged this finding. For instance, The New York Times Editorial Board calls the notion of a skills gap “mostly a corporate fiction,” saying “don’t blame the workforce.” They claim that employers just “want schools and, by extension, the government to take on more of the costs of training workers that used to be covered by companies as part of on-the-job employee development.”


The new survey, commissioned by Udemy, a company that provides online training courses, sharply challenges the view that the skills gap is a corporate fiction. Polling 1,000 randomly selected Americans between the ages of 18 and 65, the survey found that 61% of employees also feel that there is a skills gap. Specifically, 54% report that they do not already know everything they need to know in order to do their current jobs. Moreover, about one third of employees report that a lack of skills held them back from making more money; a third also report that inadequate skills caused them to miss a promotion or to not get a job.


The most important skills that employees are missing are computer and technical skills. Of those reporting that they needed additional skills for their current job, 33% reported lacking technical skills, including computer skills. Management skills were the second most important.


Skills Employees Need chart


The skills gap is not mainly about too little schooling. Survey respondents made clear that the skills learned in school differ from those required on the job; so while schooling is important, it’s not sufficient preparation for success at work. Of survey respondents who went to college, only 41% reported that knowledge learned in college helps them succeed in their current job. Seventy-two percent of respondents report that they needed to learn new skills for their current job. More generally, respondents reported acquiring those new skills in a variety of ways: some took formal, in-person classes, some took online courses, and many relied on informal learning from colleagues and other sources.


According to the survey, employers generally play an important role in helping workers learn. Employers covered the costs for the majority of workers who reported taking paid online or in-person courses. And 30% of respondents said their employers were very helpful in enabling them to gain new job skills; another 46% said their employers were somewhat helpful.


The overall picture is consistent with the view that new technology — especially information technology — is raising the skill level needed to thrive in the workplace. Schools don’t teach all of these skills and consequently on-the-job learning is very important. Employers aren’t the only ones who recognize this challenge. Employees know the skills gap is real, and they’re trying to close it.




Published on September 17, 2014 06:00

People Are More Selfish and Dishonest After Doing Math

Research participants who had spent 15 minutes solving math problems were 4 times more likely to lie for personal gain in an ethics game than those who had answered randomly selected verbal questions from a standardized test, says a team led by Long Wang of the City University of Hong Kong. The act of calculating appears to crowd out people’s social and moral concerns, resulting in behavior that is more self-interested and even immoral. Stimuli such as family photos that prompt thoughts about social values appear to diminish these negative effects, the researchers say.




Published on September 17, 2014 05:30

What Apple Gets Right with Its Smartwatch

When people say Apple has built things people didn’t know they need, it’s not really true. Apple has built things that meet the needs people have always had. More than any other consumer company, Apple gets what people really, fundamentally need. That’s why announcements like last week’s Apple Watch tend to have the cultural impact they do.


When we think of needs and products we often go right to Maslow’s Hierarchy of Needs, the ubiquitous theory that human needs manifest in a specific sequence, from base survival to the pinnacle of self-actualization. Marketers have spent decades figuring out at what level of Maslow’s hierarchy their customers are stuck, and then offering products and marketing for that need. Think of Campbell’s “Mmm-mmm Good” campaign at one end and Lexus’s “Relentless Pursuit of Perfection” at the other. If Maslow was right, brands needed to target a single need, satisfy it well, and be done.


But it turns out that Maslow wasn’t entirely right. My own research at Forrester Research has focused on synthesizing a much more complete and empirical description of people’s fundamental needs based on research in psychology, economics, and neuroscience. When we talk about human needs, we use four categories:



Connection
Comfort
Uniqueness
Variety

Crucially, we’ve learned that these needs are not hierarchical. Think of yourself: You don’t wake up in the morning and only think about food, then worry about making money, then think about loftier pursuits. Neither your day nor your life unfolds like that. It’s messier, because of our adaptive and clever biology. Our hormones, our neurotransmitters, even our gut bacteria cause us to think about base needs like survival and loftier ones like personal fulfillment simultaneously. In fact, they compete with one another for our attention, and we prioritize and re-prioritize them on the fly, as context changes.


Apple’s understanding of this is what sets it apart when it comes to launching market-changing products, including the newly announced Apple Watch. Apple doesn’t lock into one need on the hierarchy (soup that satisfies hunger, or perfect luxury car), but instead builds and markets products that connect on all four of the human needs that we’re grappling with constantly. Let’s use the Apple Watch as an example:


Connection: Texts, finger-drawn emoticons, and even heartbeat sharing, the feature some consider hopelessly gimmicky, are all central to how the device keeps you connected.


Comfort: Connection to loved ones is part of comfort, and so is the built-in health and fitness tracking, which makes the device something of a coach in your quest to improve yourself.


Uniqueness: An easy box for Apple to check. Though many were surprised by the Apple Watch’s conventional look (which pundits immediately declared savvy), Apple actually took the traditional winding crown of a watch and with it created a unique UI and UX, making it a tool for zooming in and out of maps or menus. The same is true of Apple’s original touch interface, which distinguishes between a tap and a press, giving the small screen twice as much utility as it would otherwise have.


Variety: Design plays a big role here through interchangeable watch bands. We’ve recently seen examples, even in Apple’s own marketing, of customers celebrating uniqueness even though the products are remarkably uniform. Think of the commercial that flashes through the lids of dozens of MacBooks, each dressed up with its own clever stickers, literally wrapped around the company’s brand mark. Variety can of course also come from the suite of apps available to put on your watch.


But couldn’t other smartwatch entrants do the same thing? Forrester survey data shows that interest in wearing a wrist-based computing or sensing device had grown from an already-high 28% in 2013 to an impressive 42% in 2014, all before the Apple Watch was a thing. But ask an average person if they know about the Pebble, the Samsung Gear products, or the new Moto 360 and you’ll get blank stares in return. They may know the Nike FuelBand or Fitbit.


I’d argue that none of those devices delivers on our four needs as fully or as conveniently as Apple. For example, even though Pebble is aiming for all four needs, it has used less-convenient technology to deliver on those needs — admirable as the early entrant but insufficient at this stage in the market. Samsung, on the other hand, has created a device that promises to meet these four needs fully, but as a company it doesn’t have the market power to draw other app makers into the environment as quickly as Apple can, giving Apple an app variety advantage from its first day on sale — as the mobile payment system announcement demonstrates. And in the mind of the potential buyer, Samsung and the others suffer vis-a-vis Apple because none can offer the reassurance — itself a form of comfort — that the company behind it has delivered on this before.


That’s another secret to Apple’s dominance. Once it established itself as a company that could meet these needs, people came to trust the brand more — maybe more than it deserves, but certainly more than other entrants — giving it an advantage that other brands must fight just to get people to listen; that’s why so many competing companies literally use Apple in their marketing, comparing their own products and features to the ones that seem to hog all the attention: Apple’s. Apple seems to own the conversation. Other highly regarded smartwatches already exist, but now people are talking about Apple’s proposed definition of a smartwatch.


This was precisely the strategy that Apple used to sell the iPad, showing dialogue-free commercials that merely depicted the magical things the iPad could do for you. This made some people buy an iPad, and others know what they would want when they finally got a tablet from another manufacturer. Either way, Apple dominated by controlling the expectations the user had about what needs the iPad would fulfill.


The watch experience will be harder to illustrate than the iPad’s, to be sure, but I suspect Apple’s not done creating this experience. The smartwatch is the spearhead of a broader wearable experience, populated with a phone, a watch, an earpiece, health monitors, and more things that, deep down, you know you need.




Published on September 17, 2014 05:00

September 16, 2014

A Brief History of America’s Attitude Toward Taxes

The changing attitudes toward, and laws around, income taxes have been a major driver of the rise of America’s modern talent-based knowledge economy.


Two things strike me as I study the history. First, it is hard to see the historical development of US income taxation as a gradual evolution. Rather, it is characterized by major swings. Second, it is interesting to see a very consistent cycle in the tax treatment of the super-rich. I think that today we are approaching an inflection point. Unless we do something about the current setup, the tax system may end up as a major factor in the fall of that talent economy.


As I see it, the tax system has moved through four distinct eras over the last century and a half. During each era, government and society subscribed to a theory about what taxes were for, which was eventually replaced by another theory, flipping us into another era. Let’s look at how the pendulum has swung and how the treatment of the super-rich has changed. All the data is from the very handy Tax Foundation website. I have inflated all the incomes to 2013 dollars to make comparisons more easily understood.


The First Era: 1862-1915


From its inception in 1862 up to 1915, the personal income tax was not unlike a modern-day state sales tax: a percent or two of income, with richer folks paying a slightly higher level. For example, in 1915 a $1 million earner paid income tax at a 2% rate. Like a sales tax, income tax was seen primarily as a revenue earner and not as a tool for influencing behavior. It was only mildly progressive: the rate was 1% on incomes up to about $450,000. In this era, the rich (e.g., $1 million earners) were taxed exactly the same as the super-rich (e.g., $10 million earners).


The Second Era: 1916-1931


In 1917, with the First World War at its height, Congress passed the War Revenues Tax Act, which changed thinking about personal income taxation dramatically. The new theory was that personal income tax could fund the war effort. And within that funding, rich people could and should pay more — and the super-rich much, much more.


Under the new Act, rates skyrocketed: a $1 million earner paid a 16% rate and the top marginal rate, which kicked in at $36 million, was a hitherto unimaginable 67%. A year later the rates went up still further: 43% for the $1 million earner and a 77% top rate kicking in at a $15 million income level.


After the war ended, rates drifted back down (the top rate went down to 25% in 1925) though the prewar rates were gone forever. But interestingly, the level at which the top rate kicked in fell all the way to $1.3 million by 1925. So although there had been a meaningful distinction between the rich and super-rich during the height of the war, after the war, they were all lumped together and the $36 million earner, who in 1918 had paid at a rate over four times that of the $1 million earner, was paying at the same rate in 1925-1931.


The Third Era: 1932-1981


The Great Depression precipitated the next big swing. The rate for $1 million earners shot up from 22% to 35% in one year between 1931 and 1932 and the top rate from 25% to 63%. Within just a dozen more years (1944) those rates were 84% and 94% respectively, with the top rate kicking in at only $2.6 million. At those rates, the average present-day mid-level investment banker would be giving the federal government all but 6 cents of his/her last dollar earned, which would seem to us to be a huge disincentive.


But in the third tax era, income of that scale was not typically assumed to be something you could earn by working; it was something you derived by virtue of owning a particular asset, and earning from that asset what economists call a “rent.” According to the theory, most rich people were basically rentiers and their income from owned assets could — and should — be taxed at very high rates with no adverse impact on their behavior or the economy.


Financing WWII could have been used as an excuse for these highly confiscatory rates, but rather than dropping after the end of the war, they continued to rise. By 1963, the $1 million earner was paying 89%. So in the mid-1960s, anybody in America who would be considered reasonably rich was keeping a mere 11% of marginal earnings — and that is before paying all state, municipal and indirect taxes; with all of those added in, they probably kept less than 5 cents on the extra dollar.


From about 1960, however, the economy began to change, as I describe in this HBR article, with an increasing proportion of earnings and wealth being tied to value created by way of the exercise of talent through work. With this change there came a growing awareness that 90% personal income taxation had a disincentive effect. Between 1963 and 1981, therefore, the rate on a $1 million earner slid from 89% to 70%. But, somewhat paradoxically and echoing the 1920s, the level at which the top rate kicked in plummeted to $272,000 — meaning that by 1981 virtually everyone who was upper-middle class or above paid the top marginal rate.  There was no longer a distinction of any meaningful kind between rich and super-rich.


The Fourth Era: 1982-Present


It was not until the 1980s, by which time the idea that the economy was knowledge driven had firmly taken hold, that our lawmakers finally abandoned the prewar assumption that all rich people were rentiers and recognized that, at the prevailing rates, talented people were being discouraged from working. Instead, the new theory was basically that all income should be considered the product of exercising talent and that people should be taxed less so that they had a motive to work.


But with the abandonment of the rich-as-rentier concept, lawmakers no longer drew a distinction between the rich and other folks, making it easier to justify reducing tax thresholds to compensate for falling rates. This is exactly what happened: in 1982 the top rate dropped to 50% but kicked in at $101,000. By 1988, it had fallen to 28% and kicked in at $29,000, which meant that America effectively had a flat tax of 28% (the 15% rate for incomes below $29,000 would have applied to very few fully employed Americans). Since then the top rate has drifted up to 39.6%, kicking in at $220,000. But the progressivity of the system is still extremely modest.


Towards a Fifth Era?


A quick look at this brief history dispels a common misperception among American Baby Boomers, Generation Xers, and Millennials, who all think the current system is “the way America taxes” because it is the only thing they have ever known. It is actually a modern phenomenon — a product of the most recent theory change, in this case from a rentier theory, in which economic growth is seen as the product of exploiting assets, to a talent theory, in which growth is driven by the exercise of talent and the application of knowledge.


The history also demonstrates that the current system of equal treatment of the rich and the super-rich (and, in this case, of the upper-middle class as well) is not typical or normal. Rather, it happens to be at one of the two poles between which the system has oscillated over history.


So will the current system endure? I think not. In times of crisis, America has shown that it asks the super-rich to pay a lot more than the rich, and I think this will happen again, given the widespread feeling that this is a time of economic crisis in America. Also, although applying a rich-as-rentier theory (implying tax rates in the 70%-plus range for high incomes) isn’t really fit for purpose in a talent-driven economy, it’s also not justifiable to have a maximum rate that doesn’t distinguish between a mid-level executive and a hedge fund manager.


My bet is that the Fifth Era will look a lot like the early Third Era — after the height of the Great Depression but before the inception of WWII. That is, $10 million earners paying in the 75% range, $1 million earners in the 50% range and $500,000 earners in the 35% range.


How high or low the rates of the Fifth Era structure will be will depend, I think, on whether talented people are seen as engaging primarily in trading value or primarily in creating value for their fellow citizens (in terms of better products and services and more jobs). If it is the former, they will be taxed more highly as unworthy rentiers and there will be little concern for incentive effects. If the latter, they will be taxed as important economic assets whose incentives must not be dampened. Right now, sentiment is trending more in the former direction than in the latter — a perception that the talented people on the Forbes 400 list have done little to dispel.




Published on September 16, 2014 10:00
