Marina Gorbis's Blog, page 1591

June 7, 2013

How CIOs Can Change the Game

The findings are sobering: almost half of CEOs view their CIOs as out of step with the business and about the same percentage think IT should be a commodity service, purchased as needed. We tackled this thorny issue in a webinar sponsored by the Harvard Business Review, Dell, and CIO.com called "Change the Conversation, Change the Game." It was an enlightening conversation with business strategy guru Gary Hamel, Newport News Shipbuilding's CIO Leni Kaufman, and Walgreens' CIO Tim Theriault, with HBR editor Angelia Herrin moderating. The entire webinar is worth watching for its many golden nuggets, but here are a few key takeaways on what CIOs need to be doing differently to meet this brave new world in which IT can no longer afford to be just a service provider.



Don't talk IT. Talk business. As Leni Kaufman noted: "I think often people come into a conference room, they come into a meeting, and then they talk IT. Well, don't talk IT. Talk business. Talk about the goals of the company, the growth plan, the projection it's on, how you're going to improve profitability, talk about what the government is funding, what's happening with sequestration. Be part of that conversation, and then you become part of what is on the CEO's mind. You have to do the job that you're there to do, but really make it much bigger, much broader than that."



Talent management is critically important. "You need to make sure that your people, in your next line of the reporting structure, are absolutely top talent that can carry the agenda forward," said Tim Theriault. "I want to spend more of my time on continuous improvement and innovation. So the good news is it represents more opportunity for others — people who want to be CIOs someday. I get to focus on the things that really align to the CEO, but at the same time I have to make sure IT agenda is being carried out exceptionally well. You absolutely are still responsible. So your reliance on talent management is critically important."



Be a compassionate contrarian. "What does it mean to be a leader in this kind of environment today?" asked Gary Hamel. "Beyond all the technical skills and so on, for me there are three things that are really critical. One is, you have to be a contrarian in your heart. You have to be able to look at what everybody else takes for granted and say, is there another way of doing this? Number two, you have to have a lot of courage today. You have to be able to look beyond what everybody else takes as best practice. And I think the third and most important thing is, if you really want to be a change leader, you have to have compassion. People have to believe that you are not fighting your corner. This is not about IT; it's not even just about the business. It's about working from the customer backwards. And when people understand that that's who I'm here for, and that's my ultimate reference point, and how do I improve the quality of life, people will give you an enormous amount of runway to try things, to take risks, to experiment. I think that that contrarian heart and that compassionate spirit, that courage, those are huge multipliers for anybody today who's trying to be a leader in this chaotic world we're in."



My view is that the CIO is facing three dilemmas. One, I need to help the CEO figure out how to reorganize the company — not that he's going to specifically ask you that, but what are the technology capabilities and potentialities that can be used to create management innovation? Sometimes when you think innovation, you think products and services, but the reality is management innovation. The other piece of it is, how am I going to use technology to create the true value proposition? Because we have been so good at the efficiency gains that drew all of us into commoditization. Everything is a commodity now. Because we have been so good at this, at efficiency, now we have to move to efficacy. How are we going to do that? How are we going to integrate information into our products and services? And the entire time, by the way, we do have to keep the lights on. How can the CIO do all of these things at once? The answer is you can't. You've got to start prioritizing. The CIO needs to step up and become the mentor to the organization, because you should understand how the business operates. You also understand the potential and threats of the technologies, and you can act as the mentor in the C-suite on the change that's coming at us. Because it's coming, and the only question is whether you're going to change catastrophically or whether you're going to transform. It's really the choice you're facing.





Published on June 07, 2013 07:00

Your Smartphone Works for the Surveillance State

I was 10 years old when the Berlin Wall came down — old enough to grasp that something important was happening, but not really old enough to understand exactly what was happening. Like a lot of kids born around that time, I never saw the specter of communism as much of a threat. We would hear stories about how horrific life had been under such conditions, but only in the context of something that had already failed. It's only through history books and films that my generation has any grasp of what life must have been like.



Just recently, I had the chance to watch the German film The Lives of Others, which won the 2007 Oscar for Best Foreign Language Film. Not only is it a remarkable story, but it gave me the best glimpse I've had yet of what day-to-day life must have been like in a state like East Germany. The infamous East German secret police, the Stasi, managed to infiltrate every part of East German life, from factories to schools to apartment blocks — the Stasi had eyes and ears everywhere. When East Germany collapsed in 1989, the Stasi was reported to have over 90,000 employees and over 170,000 informants. Including the part-time informants, that made for about one in every 63 East Germans collaborating to collect intelligence on their fellow citizens. You can imagine what that must have meant: people had to live with the fact that every time they said something, there was a very real chance it was being heard by someone other than the person it was intended for. No secret police force in history has ever spied on its own people on the scale the Stasi did in East Germany. In large part because of that, those two words — "East Germany" — are indelibly imprinted on the psyche of the West as an example of how important the principles of liberal democracy are in protecting us from such things happening again. And indeed, the idea that it could happen again seems anathema to most people in the Western world today — almost unthinkable.
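
As a rough check on that one-in-63 figure, here is a minimal back-of-envelope sketch in Python. The employee and informant counts come from the paragraph above; the population figure (roughly 16.4 million East Germans in 1989) is my own assumption.

    # Back-of-envelope check of the "one in every 63" figure.
    # Employee and informant counts are from the text above; the
    # population is an assumed value for East Germany circa 1989.
    employees = 90_000
    informants = 170_000
    population = 16_400_000  # assumption

    watchers = employees + informants
    print(f"Roughly one in every {round(population / watchers)} East Germans")
    # -> Roughly one in every 63 East Germans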



And yet, here we are. In terms of the capability to listen to, watch and keep tabs on what its citizens are doing, the East German government could not possibly have dreamed of achieving what the United States government has managed to put in place today.



The execution of these systems is, as you'd expect, very different. The East Germans relied upon people, which, even if not entirely effective, must have been absolutely terrifying: if for no other reason than you weren't sure who you could and could not trust. There was always the chance that someone was reporting back on you. It might have been a colleague. A neighbor. A shopkeeper. A school teacher. Not knowing whether someone you couldn't see was listening to what you had to say, or whether those you could see might be passing it back to the authorities — that must have taken an incredibly heavy toll on people.



But as any internet entrepreneur will tell you, relying entirely on people makes scaling difficult. Technology, on the other hand, makes it much easier. And that means that in many respects, what has emerged today is almost more pernicious: that same technology has effectively turned not just some, but every single person you communicate with using technology — your acquaintances, your colleagues, your family and your friends — into the equivalent of those informants.



Think about the proportion of our lives that is now conducted online and digitally. Every tweet, every interaction on Facebook, every photo on Instagram. You search for directions with a myriad of online mapping options. You check in your location on Foursquare. You review restaurants you've visited on Yelp. You speak to people all over the world using Skype. Every time you have a question, you type it into Google, or perhaps ask it on Quora. An increasing share of your purchases is conducted on eBay or Amazon. You back up your laptop to the cloud. Almost everything you listen to or read is there too, or in iTunes. And while you might scoff at these as things that only early adopters use, even late adopters of digital technologies leave behind an incredibly detailed trail of their lives. Every minute you spend on the phone (in fact, every minute you carry it around in your pocket), every email you write, every instant message you send, every transaction that passes through your credit card: all of it is recorded.



With access to just one of these services, you could piece together quite an interesting picture of an average person's life. Interviewed recently on the Charlie Rose Show, Biz Stone, co-founder of Twitter, observed that for a lot of people, "email is the most intimate witness to our lives in some capacity. [It] knows a lot about our lives." But that's absolutely nothing compared to the portrait you could paint of somebody with access to the full range of all these services.



Which, we found out yesterday, is exactly what the NSA has.



But the technology alone isn't the problem. There has been a dramatic shift in mentality, and it doesn't take much to work out the date on which this happened: September 11, 2001.



Much has been said about what happened in the aftermath of that tragic event. The extent to which there was an extreme political response is understandable, if not entirely forgivable. What is more shocking, however, is that ten years on — with the risk of death from a lightning strike higher than from a terrorist attack, and with the election of a President who had railed against "a false choice between the liberties we cherish and the security we provide" — the problem hasn't gotten any better at all.



In fact, it's the opposite. It's got worse. Way worse.



Well before the revelations of the last two days, there had been serious hints that things were going astray. Like the data center off in the desert of Utah, apparently part of a network capable of storing yottabytes of data (I don't know about you, but I'd never even heard the term "yottabyte" before). A former employee of the NSA described the basic premise of the center as just to capture everything: "financial transactions or travel or anything... [and] the ability to eavesdrop on phone calls directly and in real time." Similarly, the Verizon revelations shouldn't come as that big a surprise; according to a former AT&T worker cooperating in an Electronic Frontier Foundation lawsuit, AT&T had provided the NSA "with full access to its customers' phone calls, and shunted its customers' internet traffic to data-mining equipment installed in a secret room in its San Francisco switching center" as far back as 2006.



And you could even see symptoms of the problem overseas. Europe — yes, the former home of East Germany — bravely proposed a series of changes to its laws that would enshrine the privacy of its citizens. These proposals would strike most people as reasonable: a right to get your information out of a provider in a form that could be taken to a rival provider, and a right to be forgotten by a provider. Who was opposed to this? Not China or Iran, concerned that their citizens might benefit from rules that would let them cover their footprints, or that companies would have a much stronger incentive to protect users' privacy. Instead, it was an all-American alliance: US technology firms, concerned that privacy might undercut their business models, and the US government, worried that its ability to surveil without issue might be disturbed. The scale of the lobbying effort to defeat these privacy regulations was so unprecedented that it prompted EU Commissioner Viviane Reding to say, "I have not seen such a heavy lobbying operation."



The government will undoubtedly argue that the way in which this surveillance is being conducted is very different from how it would be used in a non-democratic state; that is in fact exactly the line taken in an exit interview by Alec J. Ross, the State Department's outgoing senior adviser on innovation: "the truth of the matter is that there are laws and due process in the United States that protect our liberties to a degree that simply do not exist in 99% of the rest of the world." The evidence points to the contrary, however. For example, over the past ten years, the Supreme Court has prevented any challenge to the warrantless wiretapping of American citizens, relying on logic that could have been lifted straight out of Catch-22: to "properly challenge secret Government programs requires the very information the Government refuses to disclose" — in other words, nobody actually has the standing to challenge the policy. And even in the presence of evidence — say the Government had erroneously sent you documentation of the fact that you were being warrantlessly wiretapped — it will simply fall back on the argument of sovereign immunity to have the case dismissed.



There's not a thing that can be done about it.



Now, if this was an ideological principle — a deep and profound belief in transparency, and the disinfecting power of sunlight — then, again, at least it would be understandable. But it's not that, either. Simultaneously, while doing everything it can to watch you, the government is taking another page out of the East German playbook — doing everything it can to stop you from watching it.



The Washington Post ran a special back in 2010 entitled "Top Secret America" that detailed the extent to which this was taking place. "The top-secret world the government created in response to the terrorist attacks of Sept. 11, 2001, has become so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work." There is an entire industry that is simply out of the view of the public. Information about what the Government is doing — essential for people to be able to make an informed choice in a representative democracy — is simply buried within it. Over-classifying has been turned into an art form. Embarrassing information is increasingly being classified, so it never sees the light of day. In fact, just last year, US Government transparency hit an all-time low: the government cited national security "to withhold information at least 5,223 times — a jump over 4,243 such cases in 2011 and 3,805 cases in Obama's first year in office. The secretive CIA last year became even more secretive: Nearly 60 percent of 3,586 requests for files were withheld or censored for that reason last year, compared with 49 percent a year earlier."



The difficulty of prying open this world, and the consequences for doing so, are escalating rapidly, too. Whistleblowers are being strung up: say you leak information about government financial waste and mismanagement, and you could find yourself being charged under the Espionage Act — the same statute used to convict Aldrich Ames, the C.I.A. officer who, in the eighties and nineties, sold U.S. intelligence to the K.G.B. In fact, this administration has launched more prosecutions of whistleblowers under this law than every previous administration combined. It has sought rules to allow federal agencies to fire employees without appeal if their work has some tie to national security. FBI investigations into leaks just so happen to be conducted in a way that ensures a chilling of the relationship between government officials and journalists. Then there are the cases of Bradley Manning and John Kiriakou.



And while whistleblowers are being strung up, journalists are being hunted down, too. The DoJ secretly obtained two months of telephone records of AP journalists. Similarly, Fox News reporter James Rosen went from being a journalist to "an aider and abettor and/or co-conspirator" so that the government could get a subpoena for his private email account. As the New Yorker pointed out, it was "unprecedented for the government, in an official court document, to accuse a reporter of breaking the law for conducting the routine business of reporting on government secrets."



And we haven't even touched on the topic of Wikileaks. Despite it taking on the role of a publisher, using the power of the internet to avoid the need for a legacy print business, it wasn't long after it started peeling back all these layers of secrecy that it was denounced by some as a terrorist organization. One might wonder how the East Germans would have reacted to such an organization. Perhaps Lenin, whose 19-meter statue once stood in East Berlin's Leninplatz, might offer us some clues as to how they would have thought about it: "Why should freedom of speech and freedom of the press be allowed? Why should a government which is doing what it believes to be right allow itself to be criticized? It would not allow opposition by lethal weapons. Ideas are much more fatal things than guns. Why should any man be allowed to buy a printing press and disseminate pernicious opinions calculated to embarrass the government?"



It's a line of reasoning that befits a failed surveillance state. And yet today, it remains all too familiar.



Yesterday, when news of the PRISM program leaked into the public domain, two items struck me. The first, from the New York Times: "The defense of this practice offered by Senator Dianne Feinstein of California, who as chairman of the Senate Intelligence Committee is supposed to be preventing this sort of overreaching... said that the authorities need this information in case someone might become a terrorist in the future." And then, there was this, from the Washington Post: "They quite literally can watch your ideas form as you type."



Watching people's ideas form as they type, in order to protect against someone who might become a terrorist in the future. George Orwell, eat your heart out.



The thing about that wall that cleft Berlin in half is that it didn't just represent a means of keeping people from moving freely. It represented something much more — it was about ideas and principles. About the balance between security and freedom. About whether you were there to serve the state, or the state was there to serve you. What I can't seem to shake is the feeling that the country most responsible for tearing that wall down has somehow managed to rebuild one in its own backyard.




Data Under Siege
An HBR Insight Center





Welcome to the "Data Under Siege" Insight Center
Does Your CEO Really Get Data Security?
The Companies and Countries Losing Their Data
Hack-Proof Your Company's Social Media





Published on June 07, 2013 06:21


Don't Draw the Wrong Lessons from Better Place's Bust

The failure last month of green-tech start-up Better Place, which promised to free drivers and nations from oil dependence and revolutionize transportation, has generated both attention and derision.



But a blanket dismissal of its effort is a mistake. For entrepreneurs, investors, and policy makers, there is plenty to learn from both the strategy and the outcome.



There was good reason for the attention and funding (over $800 million) that Better Place attracted. While every other player in the electric car space was focused on innovating individual pieces — vehicles, batteries, charge spots — Better Place's strategy was unique in innovating the larger puzzle to deliver an affordable drive-anywhere, anytime solution. Its approach was the first to align the key actors in the ecosystem in a way that addressed the critical shortcomings — range, resale value, grid capacity — that undermine the electric car as a mass-market proposition. (Note to Tesla owners: you are not the mass market).



Better Place's most visible and best-publicized innovation was its switchable battery technology, a novel way to overcome the short-range limits and long recharge times dictated by existing battery technology. Skeptics initially doubted the engineering feasibility of fast battery switches, the ability to roll out infrastructure on a national basis, and the willingness of carmakers to come on board. Renault came on board as the first (but ultimately only) car manufacturing partner, and switch stations deployed along major traffic routes successfully offered an almost-instant range extension that held the promise of promoting the electric car from a secondary short-haul vehicle to a primary, and possibly sole, family car.



Less touted but more important than the physical separation of the battery from the car was Better Place's innovation of separating ownership of the battery from ownership of the car. EV advocates are quick to note that technology improvements in batteries will one day eliminate the range problem. What they often miss, however, is that these very same improvements will destroy the resale value of used electric cars with older batteries. Since resale value ranks high for mass-market buyers, this has all the makings of a deal breaker.



Better Place's solution eliminated this risk. Instead of buying batteries, consumers would buy subscriptions for miles (just as mobile-phone operators sell subscriptions to minutes). Better Place would then use these multi-year contracts to finance its infrastructure investments and battery depreciation.



Finally, adding a service dimension to what had been a pure product sale allowed Better Place to address the final roadblock to mass adoption of electric cars: the generation and distribution of electricity itself. If just 5% of drivers in Los Angeles County were to attempt to charge their batteries at the same time, they would threaten to bring down the power grid, adding a load equivalent to two midsized power plants in an already strained system. Better Place's model, which had the firm intermediating in real time between utilities and drivers, allowed it to control the battery-charging load that would be placed on the system at any given moment.
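
For readers who wonder where a claim like that comes from, here is a minimal order-of-magnitude sketch in Python. Every input is my own assumption (the article gives none of the underlying figures); the point is only that plausible numbers land in the same ballpark.

    # Rough order-of-magnitude check of the simultaneous-charging claim.
    # All inputs below are assumptions, not figures from the article.
    drivers_in_la_county = 6_000_000  # assumed number of drivers in LA County
    share_charging = 0.05             # 5% plugging in at the same time
    charger_kw = 3.3                  # assumed per-car draw in kW (typical onboard charger of the era)
    midsize_plant_mw = 500            # assumed output of a "midsized" power plant in MW

    added_load_mw = drivers_in_la_county * share_charging * charger_kw / 1_000
    plants = added_load_mw / midsize_plant_mw
    print(f"Added load: ~{added_load_mw:,.0f} MW, roughly {plants:.1f} midsized plants")
    # -> Added load: ~990 MW, roughly 2.0 midsized plants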



What Went Wrong?



The shallow answer is not enough customers. Better Place started selling cars in Israel and Denmark in late 2012. By May 2013 it had sold fewer than 3,000 vehicles. A small number, but these are also small markets. In relative market terms, the results looked far from dismal: in May, Better Place sales accounted for 1% of cars sold in Israel, and its single available model, the Renault Fluence ZE, was outselling Toyota's category-leading Prius. Moreover, Better Place's customer-satisfaction ratings were off the charts.



The deeper answer is not enough time to get enough customers. The clock, which started ticking in 2007, ran out. Which raises the question of why it took so long to get to market. Part of this time was spent, wisely, in perfecting the technologies (battery switch, network management, in-car intelligence) that would make the system run; customer satisfaction is a testament to the success of that technology. Part of the time was spent navigating the institutional hurdles that inevitably accompany every attempt at doing something new (zoning rules, insurance). But too much of this time was lost to wasteful efforts to establish toeholds and run pilots in a slew of new geographies (e.g., Australia, the Netherlands, California, Hawaii, Japan, China and Canada) before Better Place's two core markets, Israel and Denmark, had been secured.



When Better Place was founded, its strategy called for initial rollouts in Israel and Denmark. These were inspired choices to prove the viability of its strategy: They are small countries where gasoline is exceptionally expensive and purchase taxes on gasoline-powered cars are very high. They are superior to everyone else's target of California, where the high-end niche of rich environmentalists is attractive but cheap gasoline, vast driving distances, and an incredibly competitive car market undermine the appeal of electric cars for mainstream buyers.



Despite their relatively small populations, the economics in Israel and Denmark were such that even modest market success in just these two markets would have yielded the attractive financial returns critical for investors. Just as importantly, they would have yielded the meaningful sales volumes critical for retaining and attracting partners, most importantly automakers.



But in my conversations with Better Place executives over the course of the past three years, it was clear that the emphasis was shifting from "an idea this novel needs to demonstrate unquestionable economic viability," to "an idea this good needs to be deployed across the world as fast as possible." These were not opposing goals, but prioritizing the latter over the former would have profound implications.



As Better Place pursued new geographies it used up its limited resources: money, management attention, and, most precious of all, the patience of its partners, especially Renault. In early May, Renault announced that it was scaling back its commitment to switchable battery cars and that the long-awaited second model, the Zoe compact that Better Place had counted on to complement the mid-sized Fluence, would not be coming after all. This vote of no confidence would make it infinitely harder for Better Place to line up new car manufacturers, without which it was dead in the water.



In February 2013, after a series of mismanaged leadership transitions including the firing of founder and CEO Shai Agassi, Better Place finally reversed course. It announced its exit from all non-core markets to focus exclusively on Israel and Denmark. Within months of the decision it had captured 1% market share in Israel, but by then it was too late. It declared bankruptcy on May 26.



The tragedy is that Better Place delivered on the most novel aspects of its business model, succeeding both in developing the technology and in aligning the interests of the critical actors in the electric-car ecosystem. Its failure lies in lapses of discipline and execution. Entrepreneurs, investors, and policymakers should distinguish between the drivers of the failure and the elements that carry the seeds of (someone else's) future success.





Published on June 07, 2013 06:00

Minijobs Give German Employers New Flexibility

Germany's relatively strong economy has led to a proliferation of "minijobs," a special employment classification originally designed for stay-at-home mothers that allows people to earn up to 450 euros a month tax-free. About 7.4 million people, or nearly 1 in 5 working Germans, now hold these low-wage, part-time positions, which include restaurant and clinic work, says the Wall Street Journal. Proponents say minijobs give employers flexibility to adjust their workforces and keep wages low; opponents say they trap workers in marginal occupations.





Published on June 07, 2013 05:30

Leading People in an Anxious World

Safety is now Americans' overriding concern. Several years ago, as I sat in a secondary school board meeting, the visiting headmaster of a K-8 school was asked what he considered the highest priority for parents in choosing high schools. I was astounded when he said "safety" rather than, for example, "quality of education." But that was just a hint of how Americans' safety fears would blossom in the years to come. We have become the most anxious of nations, fearing terrorists, gun rampage, sexual assault, hurricanes, tornadoes, snowstorms, identity theft, discrimination, and germs, among other things, and not necessarily in that order.



CEOs and managers need to understand what that means for their organizations. Our colleagues at work harbor worries in varying degrees, and management faces the new challenge of non-judgmentally navigating this path to security while recognizing the cost of protection. While we might think that common sense should guide us in evaluating the appropriate action to take about heightened risks, my idea of "reasonable" might be one partner's concept of "reckless." I favor encryption when any sensitive numbers or identifying data are sent to clients, but would not safeguard the average email, whereas a few colleagues would raise our safety bar as high as possible, almost regardless of cost. As a firm, we debate these issues more now than we ever did before.



When it comes to physical safety, the sensitivity of the topic can make debate feel untouchable. Prior to one of many storms last winter, our governor suggested that people stay off the roads to make way for plows. Even if I felt the directive was alarmist, safety does and should take precedence. In that case, our firm settled on a compromise: people would use discretion in their travel to and from work, but the office would remain open. We can all estimate the cost in wages, rent and even missed opportunities to sit with clients or each other.



This new frontier demands an executive response aimed at making constituents feel secure, first, while also evaluating the cost of any added safeguards. Here are some considerations for managers that have been on my mind:



Recognize that we have widely different thresholds beyond which we begin to feel unsafe. In my financial services company, we have a broad range of attitudes about required security levels for everything from our office entrance to digital encryption to road safety during severe weather. It is essential to listen and not force our own attitude onto our colleagues and employees.



Try to design policies so that employees feel empowered to make some decisions themselves. For example, if the governor advises residents to avoid driving during a hurricane warning, let people make their own choices about coming to or leaving work. Otherwise, companies run the risk of people being resentful, anxious, or distracted.



Analyze situations carefully in terms of potential costs, liabilities, and benefits. A few years ago, I had a stalker. After this man aggressively charged into our office, running past the security officer and into the elevator, several of my colleagues not surprisingly felt that we needed to create a safer environment. I was less worried, despite being the target of his interest, believing that a night in jail and a severe warning from the judge might have an effect. However, we needed to address everyone's anxiety; that required studying access and surveillance systems for doors, elevators, and hallways. We selected a new entry locking system and convinced our building-mates to install access card security in the elevators. While the costs were not extreme, they were still meaningful, but the benefit in terms of everyone's comfort level was worthwhile.



Sometimes it's worth it to take a well-calculated risk. On the day of the manhunt for the second suspect in the Boston Marathon bombing, the governor of Massachusetts ordered a "lockdown" in the highest risk towns and advised, but did not order, residents of other communities to stay home from work for the day. I had already driven into work before knowing the extent of the restrictions, having written an email suggesting that my colleagues listen to the news, use discretion, and follow the governor's instructions.



Part of my decision hinged on a meeting planned that day with a client prospect, who, I suspected, might decide to come into Boston that day anyway. I wrote him to say that I was available if, by any chance, he was in the city, but that if not, we should reschedule at his convenience. He was already at work in his office a block away. While it was not the major reason, I believe that one factor in his choice to hire us was that we shared a similar, albeit contrarian, view of the widespread lockdown, and he read something into that about my work ethic and that of a few colleagues who rode bikes, walked or drove into the city.



Most importantly, managers must understand that this is a different world where we constantly face challenges related to safety. The cost to address security in the physical and digital workplace is simply a larger expense line than in the past, whether because of lost work days or because of the cost of securing computers, digital files, or buildings. I've realized that I must not judge anyone's decision to either come to work on a potentially dangerous day, or to stay home with their families. This is especially true when public transportation has been canceled, or when family dynamics factor into individual decision making.



As managers, we must be sensitive to our employees' fears and anxieties, which themselves reduce productivity and satisfaction, while also being aware of what that means financially. So when I suggest that we drive out to Six Flags and ride the roller coaster together as a bonding experience, I am ready to accept a toned-down tilt-a-whirl option, predetermined "designated drivers" for the ride home, and pre-screening of the amusement park's security protocols. This is our new world.





Published on June 07, 2013 05:00

June 6, 2013

Pricing Strategies People Love

An interview with Sandeep Baliga and Jeff Ely, professors at the Kellogg School of Management and Northwestern University. For more, see The Power of Purple Pricing.



Download this podcast


A written transcript will be available by June 14.




Published on June 06, 2013 15:56

You Have No Control Over Security on the Feudal Internet


Facebook regularly abuses the privacy of its users. Google has stopped supporting its popular RSS reader. Apple prohibits all iPhone apps that are political or sexual. Microsoft might be cooperating with some governments to spy on Skype calls, but we don't know which ones. Both Twitter and LinkedIn have recently suffered security breaches that affected the data of hundreds of thousands of their users.



If you've started to think of yourself as a hapless peasant in a Game of Thrones power struggle, you're more right than you may realize. These are not traditional companies, and we are not traditional customers. These are feudal lords, and we are their vassals, peasants, and serfs.



Power has shifted in IT, in favor of both cloud-service providers and closed-platform vendors. This power shift affects many things, and it profoundly affects security.



Traditionally, computer security was the user's responsibility. Users purchased their own antivirus software and firewalls, and any breaches were blamed on their inattentiveness. It's kind of a crazy business model. Normally we expect the products and services we buy to be safe and secure, but in IT we tolerated lousy products and supported an enormous aftermarket for security.



Now that the IT industry has matured, we expect more security "out of the box." This has become possible largely because of two technology trends: cloud computing and vendor-controlled platforms. The first means that most of our data resides on other networks: Google Docs, Salesforce.com, Facebook, Gmail. The second means that our new internet devices are both closed and controlled by the vendors, giving us limited configuration control: iPhones, ChromeBooks, Kindles, Blackberries. Meanwhile, our relationship with IT has changed. We used to use our computers to do things. We now use our vendor-controlled computing devices to go places. All of these places are owned by someone.



The new security model is that someone else takes care of it — without telling us any of the details. I have no control over the security of my Gmail or my photos on Flickr. I can't demand greater security for my presentations on Prezi or my task list on Trello, no matter how confidential they are. I can't audit any of these cloud services. I can't delete cookies on my iPad or ensure that files are securely erased. Updates on my Kindle happen automatically, without my knowledge or consent. I have so little visibility into the security of Facebook that I have no idea what operating system they're using.



There are a lot of good reasons why we're all flocking to these cloud services and vendor-controlled platforms. The benefits are enormous, from cost to convenience to reliability to security itself. But it is inherently a feudal relationship. We cede control of our data and computing platforms to these companies and trust that they will treat us well and protect us from harm. And if we pledge complete allegiance to them — if we let them control our email and calendar and address book and photos and everything — we get even more benefits. We become their vassals; or, on a bad day, their serfs.



There are a lot of feudal lords out there. Google and Apple are the obvious ones, but Microsoft is trying to control both user data and the end-user platform as well. Facebook is another lord, controlling much of the socializing we do on the Internet. Other feudal lords are smaller and more specialized — Amazon, Yahoo, Verizon, and so on — but the model is the same.



To be sure, feudal security has its advantages. These companies are much better at security than the average user. Automatic backup has saved a lot of data after hardware failures, user mistakes, and malware infections. Automatic updates have increased security dramatically. This is also true for small organizations; they are more secure than they would be if they tried to do it themselves. For large corporations with dedicated IT security departments, the benefits are less clear. Sure, even large companies outsource critical functions like tax preparation and cleaning services, but large companies have specific requirements for security, data retention, audit, and so on — and that's just not possible with most of these feudal lords.



Feudal security also has its risks. Vendors can, and do, make security mistakes affecting hundreds of thousands of people. Vendors can lock people into relationships, making it hard for them to take their data and leave. Vendors can act arbitrarily, against our interests; Facebook regularly does this when it changes people's defaults, implements new features, or modifies its privacy policy. Many vendors give our data to the government without notice, consent, or a warrant; almost all sell it for profit. This isn't surprising, really; companies should be expected to act in their own self-interest and not in their users' best interest.



The feudal relationship is inherently based on power. In Medieval Europe, people would pledge their allegiance to a feudal lord in exchange for that lord's protection. This arrangement changed as the lords realized that they had all the power and could do whatever they wanted. Vassals were used and abused; peasants were tied to their land and became serfs.



It's the internet lords' popularity and ubiquity that enable them to profit; laws and government relationships make it easier for them to hold onto power. These lords are vying with each other for profits and power. By spending time on their sites and giving them our personal information — whether through search queries, e-mails, status updates, likes, or simply our behavioral characteristics — we are providing the raw material for that struggle. In this way we are like serfs, toiling on the land for our feudal lords. If you don't believe me, try to take your data with you when you leave Facebook. And when war breaks out among the giants, we become collateral damage.



So how do we survive? Increasingly, we have little alternative but to trust someone, so we need to decide who we trust — and who we don't — and then act accordingly. This isn't easy; our feudal lords go out of their way not to be transparent about their actions, their security, or much of anything. Use whatever power you have — as individuals, none; as large corporations, more — to negotiate with your lords. And, finally, don't be extreme in any way: politically, socially, culturally. Yes, you can be shut down without recourse, but it's usually those on the edges that are affected. Not much solace, I agree, but it's something.



On the policy side, we have an action plan. In the short term, we need to keep circumvention — the ability to modify our hardware, software, and data files — legal and preserve net neutrality. Both of these things limit how much the lords can take advantage of us, and they increase the possibility that the market will force them to be more benevolent. The last thing we want is the government — that's us — spending resources to enforce one particular business model over another and stifling competition.



In the longer term, we all need to work to reduce the power imbalance. Medieval feudalism evolved into a more balanced relationship in which lords had responsibilities as well as rights. Today's internet feudalism is both ad-hoc and one-sided. We have no choice but to trust the lords, but we receive very few assurances in return. The lords have a lot of rights, but few responsibilities or limits. We need to balance this relationship, and government intervention is the only way we're going to get it. In medieval Europe, the rise of the centralized state and the rule of law provided the stability that feudalism lacked. The Magna Carta first forced responsibilities on governments and put humans on the long road toward government by the people and for the people.



We need a similar process to rein in our internet lords, and it's not something that market forces are likely to provide. The very definition of power is changing, and the issues are far bigger than the internet and our relationships with our IT providers.




Data Under Siege
An HBR Insight Center





Welcome to the "Data Under Siege" Insight Center
Four Things the Private Sector Must Demand on Cyber Security
Does Your CEO Really Get Data Security?
The Companies and Countries Losing Their Data





Published on June 06, 2013 10:00

What Anonymous Feedback Will (and Won't) Tell You


A survey evaluating a team's performance can be a powerful tool for making that team more effective. And the first message that consultants and HR professionals often communicate on these surveys is: "To ensure that the team gets the best data and feels protected, we will make sure responses are confidential." The widespread assumption is that if team members know their answers are confidential, they will respond honestly. But if you ask for confidential feedback, it might create the very results you are trying to avoid.



If team members are reluctant to have their names associated with their responses, then you've already identified what is probably the most significant problem in your team: lack of trust. Leaders routinely insist that team members be accountable as a team, so the logic follows that they should also be accountable for giving good, critical feedback. But enabling respondents to comment without being linked to their responses actually reinforces the very problem the survey is designed to overcome: it tries to build accountability through a process that lacks transparency and precludes accountability.



Without a basic level of trust among team members (including the team leader), team performance, working relationships, and individual well-being suffer. And when it comes to the negative outcomes of confidential surveys, there are some key unintended consequences you might recognize. I saw all of these outcomes early in my career, in part while working for the University of Michigan's Survey Research Center, where I designed and administered surveys and facilitated feedback meetings for leaders and their teams:



Even if the answers are honest, they aren't necessarily valid. Ratings may be inaccurate, biased, or even self-serving — but without survey-taker identification it's impossible to determine how each member responded and why. When results are presented confidentially and aggregated, they may be misleading; for example, one outlier response can greatly affect the team's average score.
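
As a simple illustration, here is a minimal sketch in Python using hypothetical ratings (not data from any real team): on a 1-to-5 item, a single outlier pulls the aggregated average well below what most members actually reported, and a confidential report gives the team no way to find out why.

from statistics import mean, median

# Five team members rate "Team members follow through on group decisions"
# on a 1-to-5 scale; one response is a strongly negative outlier.
ratings = [4, 4, 5, 4, 1]

print(f"mean:   {mean(ratings):.2f}")    # 3.60 -- looks like a middling team
print(f"median: {median(ratings):.2f}")  # 4.00 -- most members actually agree

# With confidential aggregation the team sees only the 3.60 average. It cannot
# tell whether the low score reflects a shared problem or a single dissenting
# (or self-serving) response, and no one can be asked to explain it.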



Team members can't identify specific behaviors to change on their own. For your team to become more effective, its members need to know the specific behaviors that influence their effectiveness. But responses to even well-crafted and validated survey items like "Team members follow through on group decisions" or "Conflict interferes with achieving goals" do not by themselves provide that level of specificity. Identifying specific behaviors that need to change requires that members talk directly with each other about what they might do differently.



Mixed messages create conflict and reduce trust. The second message that consultants and HR professionals communicate is: "The survey feedback session is an important time to discuss the results as a team and seek clarification from each other." But it's impossible to simultaneously clarify a response's meaning while maintaining confidentiality. Team members who expect anonymity are likely to feel threatened when those who are expecting clarification ask about their specific responses. The latter team members will inevitably become frustrated, further reducing trust.



When used well, surveys are a valuable tool for improving team effectiveness. I still use surveys with my leadership team clients to help them increase their effectiveness, but I design the survey administration and feedback to minimize these unintended consequences. So how do you resolve the tradeoff between confidentiality and accountability? Trust is the key, but there is no easy or foolproof solution. If you, as the team leader, are one of the primary sources of team mistrust, the situation is even more challenging. Nevertheless, these specific actions may help:




Raise the dilemma with your team. Test your assumption that team members want confidentiality by asking them directly. Explain the tradeoff of survey confidentiality and effectiveness, including the issues of validity, behavior specificity, accountability, and trust. Ask team members for their reactions.

Ask which conditions would need to be met for them to complete the survey using their names. Work to create those conditions.

If you learn that low trust is a significant issue, address it. Ask members to be accountable for stating their views but emphasize that no one will be coerced into sharing information they are not yet willing to share. Assure members that providing information and opinions, even if negative, will not have punitive consequences, and ensure that this is the case. Use a set of ground rules to make the conversation safe and productive.

Don't be afraid to use outside help. An internal or external consultant can help the team engage in this conversation in a way that simultaneously maximizes the psychological safety and accountability of team members. It is challenging to create this environment alone, particularly if you, as the team leader, are one of the sources of team mistrust.


The key to remember is that effective teams help each other improve. They provide each other with regular, specific feedback about how their behaviors are affecting the team and team goals. Research shows that regularly following up with colleagues is a powerful variable in creating long-term, sustainable change in leadership behavior. To provide this effective feedback and support, team members need to agree on the specific behaviors that each member will change.





Published on June 06, 2013 09:00

A Technique to Bridge the Gap Between Marketing and IT

Quick: when someone says "IT," what comes to mind? Usually, people think of the systems they use as employees inside organizations — not customers. Here's a familiar story: Marketing conducts research ("Big Data"! "Analytics"!) and uncovers new customer insights; it then turns to operations to translate the insights into action...and hits a wall. The people in operations are too focused on fulfilling internal requests and service agreements to worry about customers, the ones who pay real money.



This classic breakdown between marketing and IT is being bridged at a few leading companies such as ING, the Netherlands bank. ING's customer intelligence group does market research and "database" marketing — they develop sales campaigns for targeted customers. Kim Verhaaf, director of customer intelligence, told me: "A few years ago customer intelligence was mainly about finding customers for our products in order to push sales and conversion rates. The financial crisis changed the market conditions for banks and also people's attitudes towards banks. The Dutch market changed from a 'grow with the flow' market to a 'battle for share' market. And in a highly competitive market, emotional preference becomes important."



To learn more about emotions, the customer intelligence group took several banking products and processes and measured customers' reactions to them. They found that their offerings trigger more emotions than they had realized. Some provoked positive feelings, but others prompted frustration, anger, and stress. They also found that customer satisfaction does not depend solely on the functionality of a product or service; it also depends on reliability and ease of use. Verhaaf said they realized that "to create more positive reactions and build trust with our customers, we really need the help of IT in changing our products and processes."



While the customer intelligence group was learning about customers' emotions, Ron van Kemenade, the new chief information officer, was learning about ING's IT organization. He found IT to be completely detached from anything like a real end customer. The IT group treated the internal organization as their customers, rather than the actual bank customers, which meant they were focused on delivering internal processes. For example, using a standard process model for IT (ITIL), they had defined an incident management process (an incident being a problem that affects a customer) and were happy that 80% of incidents were resolved within two hours. But nobody looked at the statistics showing that incidents were increasing, and nobody looked at what those incidents were doing to the bank's customers. When they examined the real customer impact, they realized they needed to reduce incidents dramatically. They started at 1,000 incidents per week, a number that surprised and alarmed their CEO. By focusing on reducing incidents, they cut that by 40%, to 600 per week, within a year. Van Kemenade: "In our most recent report we are down to 450 per week. Customers can now depend on a more reliable bank, which is helping us regain their trust."
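
To make the distinction concrete, here is a minimal sketch in Python with made-up numbers (not ING's actual figures or systems): a service desk can hit its "80% resolved within two hours" target every single week while the number of incidents, and therefore the impact on customers, keeps climbing. Tracking the volume trend alongside the SLA is what exposes the problem.

# Hypothetical weekly stats: (week, incidents opened, resolved within 2 hours).
weekly_stats = [
    ("Week 1", 800, 640),
    ("Week 2", 900, 720),
    ("Week 3", 1000, 800),
]

for week, opened, fast in weekly_stats:
    sla_pct = 100 * fast / opened
    print(f"{week}: {opened:4d} incidents, {sla_pct:.0f}% resolved within 2 hours")

# The SLA holds at 80% every week, so an internally focused team looks fine,
# yet customers face 25% more incidents in Week 3 than in Week 1. Only the
# volume trend, not the resolution-time SLA, reveals the worsening experience.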



So how did marketing and IT at ING learn to marry these customer insights with operations? The answer: cross-functional collaboration using "Agile Scrum."



As I described in a previous post, ING's IT organization has been transitioning from the traditional development approach of (1) define functional requirements, then (2) design, then (3) build (the "waterfall" approach) to making quick, small changes to systems ("Agile Scrum"). Ron van Kemenade: "Our focus on customers has led us to reduce our development cycle from months to days. We've moved from a project orientation to continuous delivery, applying Lean Six Sigma approaches."



Agile and Scrum have allowed ING to respond quickly to signals from customers. But moving to continuous delivery is a struggle. Some business people who are used to the traditional waterfall method can fall into an unfortunate cycle: taking months to develop requirements, then waiting for IT to respond, then telling IT that's not what they wanted. At ING, the message now is: "Here's your team. You need to be in every daily or weekly Scrum cycle or sprint to decide if the work is meeting your needs." It demands more time from the business people, but they are engaged and they own the result.
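
The loop that message describes can be sketched schematically. The Python below is a simplified illustration, not ING's actual tooling or process: each sprint produces a small increment, and the business person on the team (the product owner) accepts or redirects it within days rather than after a months-long build.

from dataclasses import dataclass, field

@dataclass
class Sprint:
    goal: str                     # the small piece of work for this cycle
    accepted: bool = False        # did the business accept the increment?
    feedback: list = field(default_factory=list)

def run_sprint(sprint, product_owner_review):
    """Build a small increment, then let the business accept or redirect it."""
    sprint.accepted, notes = product_owner_review(sprint.goal)
    sprint.feedback.append(notes)
    return sprint

# Hypothetical review by the business owner, happening every cycle
# instead of once at the end of a long waterfall project.
def review(goal):
    if "campaign" in goal:
        return True, "Meets the marketing need; ship it."
    return False, "Not what customers asked for; adjust in the next sprint."

backlog = ["targeted savings campaign page", "legacy batch report"]
for goal in backlog:
    result = run_sprint(Sprint(goal), review)
    print(f"{goal!r}: accepted={result.accepted}, notes={result.feedback[-1]}")

The point of the sketch is the cadence: because the review happens inside every sprint, a misunderstanding costs one cycle rather than the months it costs under the waterfall sequence of requirements, design, and build.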



While Scrum has been employed primarily in software development, ING shows that it has broader management applications. They have used Agile Scrum as a key tool for collaboration across functions in processes such as developing new products and in marketing campaigns. And the frequent (daily or weekly) meetings accelerate decision-making.



Historically, people at ING were either internally focused or externally focused. They worked either to increase efficiency or to address customers' emotions. They have learned that to be the preferred bank, they have to do both at the same time. And that requires close and constant collaboration between marketing and IT.





Published on June 06, 2013 08:00
