Marina Gorbis's Blog, page 1353
September 29, 2014
Beware the Analytics Bottleneck
Within the next three years there will be over 20 billion connected devices (oil pipelines, smart cities, connected homes and businesses, and more) that can empower the digital enterprise — or intimidate it. With the pace of digital “always on” streaming devices and technology innovation accelerating, one might think technology would continue to pose a challenge for businesses. Historically, new technologies from the mainframe to client-server and ERP — while enabling organizations to pursue new business goals — became bottlenecks to progress, thanks to constraints like lengthy implementation processes and an inflexibility to adapt as business conditions changed. It turns out that isn’t the case today. There is a new, even more elusive, bottleneck: the organization itself and its ability to adopt and adapt big data and analytics capabilities.
Based on our work with clients in a variety of industries from financial services to energy, here are three ways we’ve seen organizations embrace the analytics opportunities of today and transform from being the constraint into being the change agent for their company’s future.
Don’t be overwhelmed — start slower to go faster: Given the ferocious pace of streaming data, it can be challenging for many organizations to glean data insights at the same speed and determine the right data-driven decisions and actions to take. To avoid getting overwhelmed by all the data and the possible opportunities it could uncover, companies should slow down and just focus on the things that matter — it’s much easier to focus on resolving five issues that could truly make a difference instead of 500 issues that might help the business.
Once the shortlist of focus areas is determined, organizations can more effectively chase their desired outcomes by doubling down on their analytics efforts in data automation and by embedding insights in decision processes, achieving the results they want more quickly. This should be done in tandem with continuing to drive analytics adoption across the business for an even bigger benefit.
An upstream energy equipment manufacturer, for example, used this approach to better understand the amount of time production equipment sat idling. The company knew there was huge value in solving the idle problem, but it could not do so with traditional technologies because the data volumes were too large (300,000 locations, approximately 20 machines per location, 200–300 data points per machine, and 45-millisecond sensor sample rates). Using a Big Data Discovery platform and methodology, within 10 weeks the team was able to show more than $70M in savings from analyzing a subset of the locations, and could analyze the data at high speeds (e.g. 13,500 sites, 20 TB, 15 seconds to render).
Technology doesn’t have to be exposed (Keep the complexity behind the curtain): Organizations shouldn’t be reluctant to explore new technologies and experiment with their data to improve the effectiveness of their analytics insights for key decision processes. Machine learning, or the growing set of data discovery and analysis tools used to uncover hidden insights in the data, is a sophisticated technology that can do just this. Its data exploration capabilities and simplicity are also becoming necessities for ensuring competitiveness in the connected world.
Machine learning techniques can help a company learn from past behavior and predict the behavior of new customers (e.g. risk models that predict a consumer’s likelihood of default), segment consumer behavior in an optimized, market-friendly fashion (e.g. customer lifestyles modeled from geo-location data on cellphones), or run crowd simulation models in which each customer’s response to a reward is modeled. This is just a snapshot of the possibilities; many other outcomes from machine learning are possible.
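To make the first of those concrete, here is a minimal, purely illustrative sketch of a risk model: a logistic scoring function that turns a couple of hypothetical customer features into a default-risk estimate. The features, weights, and bias are invented for illustration; a real model would learn its coefficients from historical customer behavior.

```python
import math

def default_risk(features, weights, bias):
    """Logistic risk score: a probability-like estimate that a new
    customer defaults, from a linear combination of features."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical model: features = (debt-to-income ratio, missed payments).
# These coefficients are assumed; in practice they are fit to past data.
weights = (2.0, 0.8)
bias = -3.0

low_risk = default_risk((0.1, 0), weights, bias)    # modest debt, no misses
high_risk = default_risk((0.9, 3), weights, bias)   # heavy debt, 3 misses

print(f"low-risk customer:  {low_risk:.2f}")
print(f"high-risk customer: {high_risk:.2f}")
```

The point is the shape of the technique, not the numbers: past defaults train the weights, and new customers are scored before credit decisions are made.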
For example, one retail bank applied machine learning to its customer analytics and achieved a 300% uplift on sales campaigns compared to a control group. Despite this lift, the bank was experiencing relatively slow adoption in the retail channel, with many branch managers still using traditional methods of relationship selling. To improve the adoption rate, the bank focused on a change program that simplified what a qualified lead meant and used a WIIFM (“What’s in it for me?”) approach to show managers how the new leads would help them achieve their goals.
Make faster decisions for faster rewards: It’s important for businesses to sense, analyze, interpret, and act fast on data insights, as competitive advantages will likely be fleeting rather than long-lasting in a hypercompetitive world. With this, we are seeing a fundamental shift in strategic decision making that is powered by big data discovery, a capability that accelerates the time to insight.
As an example, a large bank used a data discovery capability to gain deeper insight into its customer experience strategy and understand why there was a drop-off in customer satisfaction. The data discovery analysis took weeks instead of months, as a team of data scientists, functional experts, and business analysts worked to tag, filter, and find correlations in the data, and to see how they differed by customer segment. The analytics team discovered that the bank’s most affluent customer segments were the most digitally savvy, and they were dissatisfied with their digital experience, online and on their mobile devices. The bank had assumed service fees were the issue, and while fees were a significant issue across all customers, they were not the most important issue for the bank’s most profitable customers. As a result, the bank changed its customer experience strategy by altering its approach to service fee refunds and enabling wealth advisers to connect with customers digitally.
It’s a reality: Data is going to keep growing and technology options will follow the same trajectory. Organizations shouldn’t run from this new digital reality, but learn to embrace it by adopting and adapting their analytics strategies to remain competitive. By applying the power of data and analytics techniques such as machine learning, a firm can make smarter, faster decisions for their business and its customers, and actively disrupt their industry.



The Bash Bug Is a Wake-Up Call
By now we’ve all heard about the immediate threat posed by the Bash bug, which a security researcher discovered last week. Also known as the Shellshock bug, the flaw is a vulnerability in a standard piece of software called the Bash shell, whose functions give users command over computer systems based on Linux and Unix. That means attackers can exploit it to take control of your systems and run any command they wish.
Bash is a far bigger threat than Heartbleed. Heartbleed gave hackers access to personal data, like passwords. But Bash threatens the emerging Internet of Things—the global system of smart, connected products and services. That may include GE jet engines, wind turbines, and MRI machines; Apple’s MacBooks; Amazon’s ecommerce systems; JPMorgan Chase’s accounts; Nest thermostats; and even your DVR. All these can now be controlled by attackers. The good news is that companies, open source communities, and government organizations are rushing to fix the problem, and solutions have already been posted.
Bash won’t be the last threat of this magnitude, and that fact should force us to address a simple fact: The Internet of Things enables new levels of convenience and efficiency, but its comprehensive connectivity also exposes households, companies, and whole economies to attack. We can’t address every Bash bug-type discovery as a one-off. Company executives, open-source software community leaders, and government organizations have to get their act together. They must join forces and work proactively to create systems and processes that anticipate weaknesses, defend against attacks and enable rapid, coordinated fixes.
We need three levels of response to get ahead of these problems. First, all companies are at risk. Executives need to take this seriously and network security has to have a place on the CEO’s and Board of Directors’ agenda. Just as we need financial audits to ensure the integrity of business, we need continuous security audits of all IT-enabled products and services to ensure that customers and businesses are not at risk. This may even mean redesigning systems to enable automatic updates and easy lockdowns in case of an emergency.
Second, organizations need to create an emergency response team and plan that can swiftly react and solve problems once vulnerabilities are detected. It should work like any effective emergency preparation: Executives should plan for worst-case scenarios and run their organizations through the drills to ensure that they’re ready to handle problems as they may arise. Experience has shown that this can’t be relegated to the lower levels of the IT organization. Instead executives from all functions need to be involved in the response plan.
Third, companies, open source community leaders, and government organizations need to coordinate their activities to proactively detect weak spots in our digitized and networked devices, services, and infrastructure. The vast majority of the world’s internet and software infrastructure relies on solutions developed in open-source software communities. Over the past two decades these communities of developers, composed of volunteers and firm-sponsored employees, have proved themselves brilliant at creating code. However, no software system is perfect, and open source code — like proprietary solutions — can have mistakes and omissions, or simply fail to evolve with ever-changing networked computing systems. Open-source software works on the principle that “many eyes can find any bug”; however, both the Heartbleed bug and the Shellshock bug allowed entry into core systems because there were simply not enough people looking critically at open-source code to detect the flaws and defend the networks.
Now is the time for companies, communities, and governments to proactively add resources to the core computing infrastructure and test it for vulnerability in a systematic fashion. While academic and government organizations like CERT (at Carnegie Mellon University) are doing an admirable job in raising awareness about security threats, more must be done. Taking a cue from the banking system, the computing industry needs to develop an approach that prioritizes proactive stress-testing, detection, and updating to anticipate problems and prevent them from occurring. This proposal is not so far out of reach. Already, competitive companies cooperate in various open-source foundations (e.g. Linux, Apache, Perl) to create software solutions. To address the security problem will take similarly effective cooperation and organization among the participants, but it also requires the will to succeed. There is too much at stake not to do it.



The Ways Big Cities Think About Large-Scale Change
Change on a grand scale requires people to come together in new and different ways, and to reimagine what’s possible. This kind of change is hard, but it’s not impossible.
In 2010, Living Cities, a long-standing collaborative of 22 of the world’s leading foundations and financial institutions, created the Integration Initiative to accelerate the pace of change in U.S. cities. We worked with five cities tackling seemingly intractable challenges such as urban revitalization in Detroit and education and health in Newark.
In our work with America’s cities and a cross-section of public and private sector leaders, we are learning a lot about what works — and what doesn’t — to lead and drive this kind of change. But these aren’t just lessons for the social sector. Much of what we’ve learned is relevant to leaders of any type of organization or partnership that wants to catalyze change in the face of complex challenges.
Get the right players to the table. Change happens only when the right mix of partners — with the right experience, knowledge, and power — is at the table. The problems that the cities in our initiative were taking on could not be solved by one individual, institution, or sector. Too often, actors that were fundamental to achieving the desired results were not yet involved in existing efforts — that’s why the efforts weren’t working. We asked cities to start from the results that they wanted to achieve, and then to determine who needed to be at the table in order to achieve them. Often, this meant bringing people together who were not used to working together.
We saw the greatest success when tables identified strong chairs who had credibility in multiple sectors, were willing to push the group to prioritize, and were committed to changing how their own institutions worked in order to push others to do the same. Indeed, achieving their goals required significant behavior change from multiple players who didn’t necessarily see themselves as part of the same systems even though they served largely the same families and neighborhoods. For example, although a school superintendent and the head of a community development bank have historically had little occasion to talk to each other, they both play an important role in connecting underserved communities to jobs and essential services such as education, training, child care, health care and housing, and ensuring that those opportunities exist in the first place.
In Cleveland, our work was focused on leveraging the hiring power of universities and hospitals to create economic opportunity for neighboring residents. Senior leaders from all of the universities and hospitals came together, in an unprecedented collaboration, to create universal employee metrics that tracked employment goals and outcomes across the institutions so that they could better understand the hiring landscape, and design interventions to better equip residents for jobs.
Reimagine roles. Not only do you need to bring together folks with diversity of thought and experience, but those leaders also need to challenge long-held orthodoxies that can limit progress. Being open to reimagining roles allows all of the partners and resources at the table to be put to their best use.
In Minneapolis and St. Paul, local government and foundations took on roles that were far from what is generally expected of them. To realize The Corridors of Opportunity’s goal of ensuring that the imminent build out of the regional transit system provided benefits to low income people beyond access to public transportation, the Metropolitan Council, a quasi-governmental planning agency, established a $5 million annual grant program and a new five-person office to oversee projects that would provide access to jobs, housing, health clinics, child care and other essential services. The funding and oversight of these projects was a role typically played by philanthropy and nonprofits. But the partners knew that having the public sector lead this effort would be more sustainable than a loose collaborative of nonprofits with the need to constantly fundraise. And The Saint Paul Foundation filled financing gaps, using loans, loan guarantees and even equity investments — investments commonly associated with banks or other private investors, but likely too risky for traditional lenders until there was a more proven track record.
Build, measure, learn, and declare. The most successful cities have adopted a lean “build, measure, learn” approach. They use data to measure, in real time, whether their indicators are trending up, learn whether their approaches are working and then stay or change course as needed.
This is not an easy task. Standard performance metrics for these big audacious outcomes do not exist. For example, there is no agreed-upon way to measure urban revitalization. One promising example here, however, is StriveTogether’s Theory of Action for bringing about large-scale change in education, from cradle to career. The theory of action, which has been adopted by cross-sector partnerships in cities across the country over the past two years, is a specific articulation of quantitative, measurable outcomes or ‘community level outcomes and core indicators’.
The cities have taken this approach one step further. A willingness to “declare” or speak publicly about the desired outcomes is transformative because it puts all partners on the hook for the results. Through “learning communities” all five participating cities come together to share successes and failures, and to collectively grapple with questions. Where partners were reluctant to publicly commit to or prioritize outcomes, it took much longer to move from talk to action.
There is no formula or roadmap to follow for large-scale change. While each approach will be different, we’ve learned that a willingness to challenge the status quo, making sure the right people are at the table, using data to continuously improve, and making a public commitment to goals is essential to success.



How Companies Can Learn to Make Faster Decisions
SpaceX had a problem. Managers at the aerospace manufacturer wanted to make faster decisions for one of their big clients — NASA — by finding alternatives to the high volume of meetings and cumbersome spreadsheets used for tracking projects. Initially, NASA sent a fax (yes, a fax) whenever they had a query, which SpaceX added to a list of outstanding questions. The company then assembled a weekly 50-person meeting to review product status information contained in spreadsheets, addressing each question individually before sending the responses back to NASA.
SpaceX’s dilemma is not an uncommon one. In today’s organizations, the speed of decision making matters, but most are pretty bad at it. One-third of all products are delivered late or incomplete because of delays or breakdowns in decision making, according to research from Forrester Consulting and Jama Software. Analysts at Gartner cite “speed of decision making” as the primary obstacle to effective internal communication. No doubt you’ve been part of a team that waited… and waited… for a higher-up to make a decision before you could resume your work.
These small delays, compounded over thousands of decisions, soon become death by a thousand cuts. According to Forrester, for every hour a product team spends on heads-down work, it spends 48 minutes waiting on decisions. That equates to more than 3.5 hours of “wait time” in an average eight-hour work day. If a company cuts wait times in half, it can gain more than $370,000 annually in productive time across a 25-person team. This doesn’t even account for the bigger opportunity captured if the company can avoid delivering products to market late.
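The arithmetic behind those figures is easy to reproduce. The sketch below works through it; note that the fully loaded hourly cost and the number of working days per year are assumptions on our part, not numbers from the Forrester study.

```python
# Forrester figure: 48 minutes of waiting per hour of heads-down work.
wait_per_work_hour = 48 / 60          # 0.8 hours waiting per working hour
workday_hours = 8

# Split the day between work and the waiting that work generates.
work_hours = workday_hours / (1 + wait_per_work_hour)
wait_hours = workday_hours - work_hours
print(f"daily wait: {wait_hours:.2f} hours")   # just over 3.5 hours

# Value of recovering half of that wait across a 25-person team for a year.
team_size = 25
workdays_per_year = 250        # assumed working days
loaded_hourly_cost = 33        # assumed fully loaded cost per person-hour
recovered = (wait_hours / 2) * team_size * workdays_per_year * loaded_hourly_cost
print(f"annual value: ${recovered:,.0f}")
```

Under these assumptions the recovered time is worth roughly $370,000 a year, matching the claim; a different loaded cost scales the figure proportionally.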
The good news is that companies are finding ways to make better, faster decisions.
SpaceX increased communication in order to speed up their process. Using collaborative technology, NASA now has direct visibility into each project and can identify which SpaceX engineers are working on a specific component. Importantly, they can also start a conversation with these engineers in order to make decisions in real-time. The collaboration system allowed SpaceX to cut its average wait time for defining product requirements by 50% and eliminated the costly weekly four-hour status meeting.
Something similar happened at a large manufacturing firm, which set out to reduce wait times associated with development of its semiconductors. At this company, each product had 250 variations for global distribution, so the complexity quotient was high. Most of the review time was spent in a typical linear process, waiting for one person to make a decision or finish their part before the next person could start work. This approach was incredibly time consuming, requiring three people to drive the review process over a six-month period.
To speed this up, the company implemented advanced collaboration technologies to create review processes that would work in parallel with one another. Rather than waiting for a decision before the next person could begin work, the company brought many people into the process earlier in the cycle so several decisions could be made simultaneously. The collaboration system contained everything related to the product in a central location and used constructs similar to social networking — rather than emails and spreadsheets — to communicate and manage the complexity in real time. This new approach allowed everyone to look beyond their discrete tasks by providing more context about the entire project. As a result, the company fostered more discussions between individuals working on related parts of the product, and employees ultimately made faster and better decisions.
These deeper insights also allowed engineers to identify instances where work being done by one team duplicated or related to work by another, creating opportunities to reuse core design elements across multiple product lines. It also helped them avoid making many individual, time-consuming decisions in isolation.
The results were dramatic: the review cycle now requires just one person and a single month, where it previously took three people six months — an 18-fold reduction, from 18 person-months to one.
This firm and SpaceX, which are clients of Jama Software, both adopted an approach of purposeful collaboration using technology to streamline decision-making. And behind both approaches are a few key tenets that can apply to any company:
Offer a shared vision. Many companies relegate team members to performing specific tasks or communicate only certain pieces of information to various groups, without providing the big picture. This creates a very limiting, myopic view. SpaceX wanted to communicate better with NASA, so it brought NASA deeply into the process by sharing decisions as they were being made. By offering everyone on the team — even when it’s thousands of people across different companies — a shared vision, all stakeholders can be in sync and employees can deliver better results.
Encourage more problem solving. Managers should foster an environment where employees seek information to answer questions themselves, rather than sitting back waiting to be assigned their next task. Employees should communicate and collaborate freely with one another, locating experts in certain areas of focus so they can gain insights on their own. When a product designer at the semiconductor company had a question, for example, she knew exactly who was working on the related test case or software specification so she could get her questions answered immediately rather than waiting for information to be sent to her.
Make decisions in parallel. Decisions at organizations are typically made in a linear fashion, but it doesn’t have to be that way. When teams get stuck in the old way of doing things — Decision A first, then Decision B, then Decision C and so on — we create passive teams. In the case of developing and delivering semiconductors, the company enabled multiple teams to simultaneously make decisions related to a product, and insights from one team could help others make faster decisions.
New collaboration systems that engage teams in the decision-making process can be effective, even transformative, particularly for companies building complex software and technology products. However, improving decision-making relies on much more than implementing new technologies — it also requires a significant cultural shift. Managers need to let go of the command-and-control approach in which they dispense only the necessary information to specific teams for fear the teams may become distracted by additional insights. Rather, managers should empower employees by giving them all the information and trusting them to make the right decisions.



Sales Data Only Matters If It Helps You Take Action
In sales, as everywhere else in business, there is a buzz about big data and analytics. Vendors hype tools and mobile applications to help sales forces make sense of it all, while touting case studies that generated impressive improvements in sales force effectiveness.
Companies are eager to capitalize on the opportunity. While some jump in, many are reluctant to move forward. Some will remember, or have heard stories of, failed projects — big investments to give salespeople tablet computers, develop data warehouses, and implement CRM systems that ended up racking up huge costs while generating little value for customers and salespeople. We also hear concerns such as “the technology is too new — let’s wait until it matures,” or “we don’t want to invest in something that becomes outdated in a year.”
These are valid concerns, but here is the crux of it all. It’s not the data and technology that matter. What really matters is how technology, data, and analytics can help salespeople, sales managers, and leaders improve fundamental sales force decisions and processes.
Consider a few examples.
Helping salespeople. Consider account targeting. Traditionally, salespeople decide which customers/prospects to spend time with by examining a list of accounts in their territory and figuring out which ones to focus on to achieve a territory sales goal. But far too frequently, salespeople end up spending too much time with easy and familiar accounts, demanding customers with urgent needs, and friendly prospects. Ease and urgency trump importance.
Approaches that use data and analytics, structured around frameworks that capture the dynamics of customer/prospect needs and potential, help salespeople target the right accounts and spend time more effectively. Such an approach involves:
Identifying profile characteristics (e.g. type of business, number of employees) that predict account potential and developing an estimate of potential for each customer/prospect.
Using techniques such as collaborative filtering to identify customers/prospects with similar needs and potential (the “data doubles”) and suggest the best value proposition and sales approach for each account.
Closing the loop by providing an assessment of how effective account targeting was so as to inform better future decisions.
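As a sketch of the second step above, collaborative filtering at its simplest means finding the existing accounts whose profile vectors point in the same direction as a prospect’s, its “data doubles”, then borrowing what worked for them. The profiles, feature choices, and account names below are hypothetical:

```python
import math

def cosine(u, v):
    """Cosine similarity between two account profile vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical profiles: (employees, locations, annual spend in $K).
accounts = {
    "prospect": (120, 4, 300),
    "customer_a": (110, 5, 280),    # similar firm that responded to offer X
    "customer_b": (2000, 60, 90),   # very different profile
}

# Rank known customers by similarity to the prospect.
target = accounts["prospect"]
doubles = sorted(
    (name for name in accounts if name != "prospect"),
    key=lambda name: cosine(target, accounts[name]),
    reverse=True,
)
print("closest data double:", doubles[0])
```

Production systems would normalize features and use far richer profiles, but the intuition is the same: the value proposition that worked for the closest doubles is the best first suggestion for the prospect.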
Helping sales managers. Analytics can help sales managers have higher impact as coaches and make more-informed decisions about issues such as sales territory design, goal setting, and performance management. Traditionally, managers rank salespeople on criteria such as territory sales or sales growth, and tie rewards or corrective consequences to these rankings. But if territories don’t have equal potential, the rankings don’t reflect true performance. Salespeople with rich territories have an unfair advantage while those with poor territories are demotivated.
Data and analytics enable performance metrics that account for territory potential, so that sales managers can reward the best salespeople, not the best territories. Such an approach involves:
Developing measures of customer/prospect potential, using company and third-party data sources (e.g. business demographics) and sales force input.
Identifying the true best performers using techniques that separate the impact of territory potential from the impact of a salesperson’s ability/effort on performance.
Rewarding the true best performers, learning what they do that’s different from average performers, and sharing the learning across the sales team.
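A minimal sketch of the separation step above: regress territory sales on territory potential, then rank salespeople by the residual, the portion of performance that potential does not explain. All the numbers here are invented for illustration:

```python
# Assumed data: territory potential and actual sales, both in $K.
potential = [100, 300, 50, 200, 150]
sales     = [ 90, 240, 70, 160, 130]

# Ordinary least squares fit of sales on potential (one predictor).
n = len(potential)
mx, my = sum(potential) / n, sum(sales) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(potential, sales))
         / sum((x - mx) ** 2 for x in potential))
intercept = my - slope * mx

# Residual: actual sales minus what potential alone would predict.
residuals = [y - (intercept + slope * x) for x, y in zip(potential, sales)]

by_raw_sales = max(range(n), key=lambda i: sales[i])
by_residual = max(range(n), key=lambda i: residuals[i])
print("top by raw sales:", by_raw_sales)
print("top adjusted for potential:", by_residual)
```

In this toy data the raw-sales leader simply holds the richest territory, while the potential-adjusted leader is the rep who most outperformed a small territory, exactly the distinction between rewarding the best salespeople and rewarding the best territories.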
Helping sales leaders. Analytics can help sales leaders improve decisions about issues such as sales strategy, sales force size and structure, and the recruiting of sales talent. Consider how analytics can help sales leaders design a sales incentive compensation plan. Traditionally, incentive plans are designed by surveying salespeople about their satisfaction with the current plan, benchmarking against industry and company historical norms, and checking past incentive costs versus budget. This retrospective approach can blindside sales forces with undesired consequences in terms of sales force effort allocation and financial risk.
A better plan results when companies use data and analytics, structured around frameworks that link plan design to projected costs, sales force activity levels, and fairness under varied market conditions. Such forward-looking approaches improve the odds that despite an uncertain future, an incentive plan will motivate the sales force to focus effort on the right products and customers, and be fiscally responsible. Such an approach involves:
Using analytics to test the consequences of proposed plan designs, compare alternatives, and reveal unwanted side effects and financial risks.
Monitoring payout distributions and metrics showing a plan’s strategic alignment, motivational power, and costs.
Proactively making adjustments to keep the plan on track.
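One simple way to test consequences before rollout is Monte Carlo simulation: draw many plausible sales outcomes and see what each candidate plan would pay. The sketch below compares a capped commission plan with an uncapped one; the sales distribution, commission rate, and cap are all assumptions for illustration.

```python
import random

def simulate_payouts(rate, cap, trials=10_000, seed=7):
    """Monte Carlo cost of a commission plan: pay `rate` on sales,
    capped at `cap`, under uncertain per-rep sales outcomes."""
    rng = random.Random(seed)
    payouts = []
    for _ in range(trials):
        # Assumed per-rep sales distribution: mean $1M, sd $250K.
        sales = max(0.0, rng.gauss(1_000_000, 250_000))
        payouts.append(min(rate * sales, cap))
    payouts.sort()
    mean = sum(payouts) / trials
    p95 = payouts[int(trials * 0.95)]   # tail cost in a strong year
    return mean, p95

capped = simulate_payouts(rate=0.05, cap=60_000)
uncapped = simulate_payouts(rate=0.05, cap=float("inf"))
print(f"capped:   mean ${capped[0]:,.0f}, 95th pct ${capped[1]:,.0f}")
print(f"uncapped: mean ${uncapped[0]:,.0f}, 95th pct ${uncapped[1]:,.0f}")
```

Comparing the mean and tail payouts of alternative designs makes the trade-off explicit: the cap limits financial risk in strong years, but also blunts the incentive for top performers, an unwanted side effect the simulation surfaces before the plan goes live.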
It’s not about the technology or the data. Investments in sales data, technology, and analytics can only live up to their promise when sales forces focus first on understanding the dynamics of the fundamental decisions and processes that salespeople, sales managers, and leaders are responsible for.



What an Economist Brings to a Business Strategy
Although many business executives sat through one, or perhaps several, courses in economics while in school, most probably took away little more than the supply and demand graphs to which they were introduced early in their first course. Ask them if they apply much else from economics in their actual business careers, and you’re likely to hear “not much.”
They might be surprised at how certain economic notions have been directly applied in business, with largely positive results. Here are a few notable examples.
Auctions. Consider first the increasing use of auctions, which have a distinguished history in the development of economics. In the late 19th century, French mathematician-economist Leon Walras envisioned prices in a market economy being set by an auctioneer (since known as the “Walrasian auctioneer”) conducting continuous auctions for all kinds of commodities.
It may be tempting to think that the Walrasian auction is just a theoretical construct, useful primarily in a classroom setting for thinking about markets, and in the real world only for scarce commodities or unique items, of the kind put up for sale by Sotheby’s or Christie’s. But that would be a mistake.
The late Julian Simon (better known, perhaps, for his optimistic views about population growth and resource abundance) thought up the idea of having airlines auction off overbooked seats and persuaded the Civil Aeronautics Board, which used to regulate airline fares and entry, to permit the idea in the 1970s. Economist Ronald Coase proposed auctioning off segments of the electromagnetic spectrum in the late 1950s, a policy idea later adopted in the 1990s. Many economists since have been hired by the U.S. and other governments to help them design these often complicated auctions, and by telecommunications companies trying to figure out the best strategies for bidding.
Two well-known companies have also made auctions famous, and economists have played central roles in the success of each. Google generates most of its revenue through an auction-based system of selling ads that was developed by two engineers but validated by its chief economist, Hal Varian, a former consultant to the company who was also the first dean of the School of Information Sciences at the University of California at Berkeley. Varian has since overseen the hiring of a large corps of statisticians and economists who developed other innovations for the company, notably Google Trends, which tracks search-term volumes and can be helpful in predicting various real-world events (such as the progress of the flu or forthcoming official unemployment statistics).
Priceline introduced the “conditional price offer,” an economic notion developed by the company’s founder, Jay Walker, who put his undergraduate economics training to good use to form a company that has revolutionized travel. Walker’s innovation was to bind travelers to pay the prices they bid if the airlines and hoteliers on Priceline accepted the offers. That way, travelers took their offers much more seriously than if they simply could “name their price” without any purchase obligation.
Economics and logistics. All businesses seek to control costs; they don’t need an economist to tell them why it’s important or how to do it. But there are some very important exceptions. Companies in the transportation and communications businesses face complex optimization problems that mathematicians and economists have figured out how best to solve through “linear” (and later “non-linear”) programming methods. Firms in these industries benefit greatly, as do their customers, who enjoy lower prices (admittedly through processes they never see).
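To make the linear-programming idea concrete, here is a toy sketch in Python. It relies on the textbook fact that the optimum of a linear objective over linear constraints sits at a corner of the feasible region, so a two-variable problem can be solved by enumerating corners. The shipping routes, capacities, and costs below are invented for the example; real logistics problems have thousands of variables and are solved with industrial LP solvers, not corner enumeration.

```python
# Toy linear program: minimize shipping cost 4x + 3y subject to
#   x + y >= 10   (total demand must be met)
#   x <= 8        (capacity of route x)
#   y <= 6        (capacity of route y)
#   x, y >= 0
# Each constraint is written as a*x + b*y <= c.
from itertools import combinations

constraints = [
    (-1, -1, -10),  # x + y >= 10
    (1, 0, 8),      # x <= 8
    (0, 1, 6),      # y <= 6
    (-1, 0, 0),     # x >= 0
    (0, -1, 0),     # y >= 0
]

def intersect(c1, c2):
    """Corner point where two constraint boundaries meet (None if parallel)."""
    a1, b1, d1 = c1
    a2, b2, d2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= c + 1e-9 for a, b, c in constraints)

# The optimum of a linear objective lies at a feasible corner point.
corners = [p for c1, c2 in combinations(constraints, 2)
           if (p := intersect(c1, c2)) is not None and feasible(p)]
best = min(corners, key=lambda p: 4 * p[0] + 3 * p[1])
```

Here the cheapest feasible plan ships 4 units on the expensive route and fills the cheap route to its capacity of 6, a pattern (saturate the cheap resource first) that scales up to the real routing problems these firms face.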
Economists and big data. For several decades after World War II, economists used statistical techniques to build increasingly complex models to forecast key macroeconomic variables, notably GDP growth, inflation, and unemployment. Economists who had statistical skills worked at leading forecasting firms such as Data Resources, Inc. and Wharton Econometric Forecasting Associates (the two have since merged and been absorbed into Standard & Poor’s). Many large banks, other financial institutions, and some large manufacturing companies also had their own economic staffs.
This has all changed. Macro models are now largely out of vogue, though still used along with human judgment at institutions like the Federal Reserve Board and the International Monetary Fund. Forecasters never were very good at predicting turning points in the economy — recessions and recoveries — and it is not clear they will get better over time, though some will try.
Instead, the “Big Data” revolution ushered in by the ease of capturing, storing and analyzing large bodies of data has generated new demands for economists and statisticians. High tech companies like Amazon, Yahoo and Google, among others, now employ economists to sift through all kinds of data — retail transaction data, browsing patterns, mobile phone usage — to fine tune their product offerings, pricing and other business strategies.
Economists and market design. Most markets “clear” by having prices signal producers to make just enough that purchasers are willing to purchase. But a relatively new strand of economics, known as “market design” or “matching theory,” has focused on markets where “fit” is much more important than price in directing resources or decisions: the matching of medical residents to hospitals, organ donor banks, and online dating. For example, drawing on his Nobel Prize-winning work shared with Lloyd Shapley, Harvard Business School emeritus professor Alvin Roth has used matching theory to design the national medical resident assignment program and kidney donor exchanges.
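The workhorse of this field is the Gale-Shapley “deferred acceptance” algorithm, which underlies the resident-matching design mentioned above. A minimal sketch is below; the residents, hospitals, and preference lists are hypothetical, and real matching systems handle complications (couples, quotas, tie-breaking) that this toy omits.

```python
# Minimal sketch of Gale-Shapley deferred acceptance: proposers propose in
# order of preference; reviewers tentatively hold their best offer so far.
def deferred_acceptance(proposer_prefs, reviewer_prefs):
    """Return a stable matching as {proposer: reviewer}."""
    # rank[r][p] = how highly reviewer r ranks proposer p (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)                   # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}  # next reviewer to try
    engaged = {}                                  # reviewer -> current proposer

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in engaged:
            engaged[r] = p                        # reviewer tentatively accepts
        elif rank[r][p] < rank[r][engaged[r]]:
            free.append(engaged[r])               # reviewer trades up
            engaged[r] = p
        else:
            free.append(p)                        # rejected; p proposes again

    return {p: r for r, p in engaged.items()}

# Hypothetical example: two residents ranking two hospitals, and vice versa.
residents = {"ann": ["city", "mercy"], "bob": ["city", "mercy"]}
hospitals = {"city": ["bob", "ann"], "mercy": ["ann", "bob"]}
matching = deferred_acceptance(residents, hospitals)
```

The guarantee that makes this design so useful is stability: no resident and hospital would both prefer each other over their assigned match, so neither side has an incentive to bypass the system.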
In the online dating world, one well-known problem is that women can get flooded with more offers for dates than they can reasonably screen or may want to spend time screening. One online service, cupid.com, hired one economist and drew on the work of another to limit the number of “roses” (requests for dates) men could send per month to women. This greatly incentivized the men to be much more selective, and knowing that, women were much more likely to reply after the limits were put in place.
Economists increasingly are also using insights from matching theory to help companies better design systems for matching potential employees with employers, where finding the right “cultural fit” is as important as, or more important than, an employee’s initial specific skills.
Economists and finance. Finally, not surprisingly, economists have been active for decades in formulating and testing theories in the financial world, some of which have found their way into actual products (not all of them bad, like the complicated sub-prime mortgage securities at the heart of the financial crisis, which economists did not design). Examples include index funds, and their more recent variation, index-based exchange traded funds (ETFs). Index funds initially were brought to market by Vanguard founder Jack Bogle, whose idea for the S&P 500 Index fund was heavily influenced by two economists: the late, great MIT economist Paul Samuelson and Princeton’s Burton Malkiel, author of the classic A Random Walk Down Wall Street.
Even more directly, the growth in financial options can be traced largely to the ease of valuing them, which is due to the Nobel-prize winning work of Fischer Black (the MIT economist and later Goldman Sachs partner who died before he certainly would have shared in the award), Myron Scholes (formerly of Stanford) and MIT’s Robert Merton.
Admittedly, better pricing of options has been a mixed blessing. While it has greatly improved liquidity in the options market, and facilitated the formation and growth of many tech startups (where option grants are routinely used to compensate employees, directors and advisers), better options pricing may also have contributed to excessive option grants used by companies like Enron, Tyco, and WorldCom to manipulate accounting statements to show illusory profits.
What’s the larger point to be made here? Only the one I assert in my book Trillion Dollar Economists: that business managers may want to pay a bit more attention to the scribblings of academic economists. There can be strategic advantage in consulting the sources where those thoughts are published, especially as translated for a bit broader audience than the economics priesthood (such as here in the Harvard Business Review and even in some relatively accessible professional journals like the Journal of Economic Perspectives). The first step to succeeding wildly as a first mover may be to connect with a first thinker.



Repeated Texting Can Thicken the Tendons in Your Thumbs
The tendon that extends to the tip of the thumb showed significant thickening in research subjects who were frequent texters and who repeatedly flexed the interphalangeal joint, which is closest to the thumbnail, while texting, according to a medical study reported by the Wall Street Journal. The greater the number of texts, the thicker the tendon. Frequent texters (an average of 1,209 messages per month) reported greater thumb pain in the dominant texting hand than infrequent texters (50 per month).



The Unexpected Consequences of Success
Everybody loves a winner, right? No, unfortunately, not always. In my coaching practice, many executives and entrepreneurs vent their frustrations with the unexpected negative consequences of their success — such as their anxiety over being able to maintain their winning streak, the fear that they will be set up to fail, and the envy others feel toward them for their good fortune. Turns out that, according to recent research, these kinds of worries aren’t just in their heads — they’re very real. Here’s a summary of that research, along with suggestions for overcoming these traps.
Don’t do victory laps: A recent study shows that people judge expressive winners as arrogant compared to inexpressive winners and are less likely to want to befriend them. Being judged negatively for your success is justifiably an implicit fear. As a result, success can heighten ambivalence, even unconsciously, about winning. What can you do about this? Learn to moderate when and where you express happiness about your success. Share the good news with other successful people. And when you are succeeding, focus your conversation on other things you are developing, so as not to annoy people. Striking a balance between authentically admitting your happiness and pretending to “not care” is important. We should enjoy the motivation that comes from being successful rather than sabotage ourselves by being inauthentic. For example, Ray, a current client, often smiles in a pleased way when he announces good news to his company or the public, but always focuses on the unconquered path ahead. He avoids fist pumps and overt signs of victory even when he is overjoyed, and reserves these for conversations with select people in his life.
Focus on the value you bring, not on winning per se: Another study found that when people are similar but superior to us in their achievements, our brain’s conflict center is activated, leading to envy. In addition, when these people fail, our brain’s reward center is activated, leading to feelings of schadenfreude (pleasure when someone else falls from grace). When we win, we assume that others will feel similarly, as we project our own feelings onto them. This fear may be unconscious or conscious and may disrupt our confidence, causing anxiety about the effect of our success on other people. To counteract this fear of someone else wishing we would fall, focus instead on the value that you bring to the world rather than on winning per se. This will help boost your confidence despite the fear. For example, Cathy, a CEO whose meteoric rise to the top left other people gasping, “distracted” people from their shock by focusing on the value that the company brought to the world.
Stay in the “here and now”: Anticipating future reactions from others may actually prevent us from achieving or maintaining success, and dwelling on those reactions can keep us from adequately controlling our emotions. To manage this consequence of success, stop overthinking the success. Focus on the “here and now.” Let go of worrying about the future and rationalizing the past. Obsessing over the past is distracting, rarely helpful, and prevents you from clearing your mind. The study above shows that when we integrate what we are anticipating into the here and now, we manage our emotions more effectively. This means enjoying, accepting, and motivating ourselves with our successes. Joe, an entrepreneur, always “recalibrates” after each round of funding by setting new goals and focusing on what he has to execute on now, rather than obsessively trying to “psychologize” his prior victories. He chooses a time to let go and moves on.
Reach higher: Finally, when we are at the summit of our careers, we may become bored to the point that we slow down too much and become disoriented. This is called “the summit syndrome.” To prevent boredom, you have to always be looking for stimulating ways to apply your mastery. When you have mastered something, ask yourself: How can you innovate around this? Watch out for your own boredom, as it can lead you to sabotage yourself, and also watch out for reactive lateral shifts in job hierarchy made simply to escape the boredom of mastery. Huang, a fund manager, sticks to his investment process within his company, and after a streak of major wins, he raises the bar even higher for himself and engages in this “reaching.”
People often prepare for failure, but rarely prepare for what they will do when they succeed. Even when we consciously want to be successful, enjoying that success can be a challenge. By following the suggestions above, you can create a framework for managing success so that you can more reliably sustain your success when it occurs. If you are conscious about these factors, you will create far more opportunities to sustain your success over time. More importantly though, as a society, we are likely to have more sustained wins if we manage our feelings of envy and schadenfreude. If we do this, we, and those whom we care about, will fully enjoy and savor those winning streaks.



September 26, 2014
Why the Fed Is So Wimpy
Regulatory capture — when regulators come to act mainly in the interest of the industries they regulate — is a phenomenon that economists, political scientists, and legal scholars have been writing about for decades. Bank regulators in particular have been depicted as captives for years, and have even taken to describing themselves as such.
Actually witnessing capture in the wild is different, though, and the new This American Life episode with secret recordings of bank examiners at the Federal Reserve Bank of New York going about their jobs is going to focus a lot more attention on the phenomenon. It’s really well done, and you should listen to it, read the transcript, and/or read the story by ProPublica reporter Jake Bernstein.
Still, there is some context that’s inevitably missing, and as a former banking-regulation reporter for the American Banker, I feel called to fill some of it in. Much of it has to do with the structure of bank regulation in the U.S., which actually seems designed to encourage capture. But to start, there are a couple of revelations about Goldman Sachs in the story that are treated as smoking guns. One seems to have fired a blank, while the other may be even more explosive than it’s made out to be.
In the first, Carmen Segarra, the former Fed bank examiner who made the tapes, tells of a Goldman Sachs executive saying in a meeting that “once clients were wealthy enough, certain consumer laws didn’t apply to them.” Far from being a shocking admission, this is actually a pretty fair summary of American securities law. According to the Securities and Exchange Commission’s “accredited investor” guidelines, an individual with a net worth of more than $1 million or an income of more than $200,000 is exempt from many of the investor-protection rules that apply to people with less money. That’s why rich people can invest in hedge funds while, for the most part, regular folks can’t. Maybe there were some incriminating details behind the Goldman executive’s statement that alarmed Segarra and were left out of the story, but on the face of it there’s nothing to see here.
The other smoking gun is that Segarra pushed for a tough Fed line on Goldman’s lack of a substantive conflict of interest policy, and was rebuffed by her boss. This is a big deal, and for much more than the legal/compliance reasons discussed in the piece. That’s because, for the past two decades or so, not having a substantive conflict of interest policy has been Goldman’s business model. Representing both sides in mergers, betting alongside and against clients, and exploiting its informational edge wherever possible is simply how the firm makes its money. Forcing it to sharply reduce these conflicts would be potentially devastating.
Maybe, as a matter of policy, the United States government should ban such behavior. But asking bank examiners at the New York Fed to take an action on their own that might torpedo a leading bank’s profits is an awfully tall order. The regulators at the Fed and their counterparts at the Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation correctly see their main job as ensuring the safety and soundness of the banking system. Over the decades, consumer protections and other rules have been added to their purview, but safety and soundness have remained paramount. Profitable banks are generally safer and sounder than unprofitable ones. So bank regulators are understandably wary of doing anything that might cut into profits.
The point here is that if bank regulators are captives who identify with the interests of the banks they regulate, it is partly by design. This is especially true of the Federal Reserve System, which was created by Congress in 1913 more as a friend to and creature of the banks than as a watchdog. Two-thirds of the board that governs the New York Fed is chosen by local bankers. And while amendments to the Federal Reserve Act in 1933 shifted the balance of power in the Federal Reserve System from the regional Federal Reserve Banks (and the New York Fed in particular) to the political appointees on the Board of Governors in Washington, bank regulation continues to reside at the regional banks. Which means that the bank regulators’ bosses report to a board chosen by … the banks.
Then there’s the fact that Goldman Sachs is a relative newcomer to Federal Reserve supervision — it and rival Morgan Stanley only agreed to become bank holding companies, giving them access to New York Fed loans, at the height of the financial crisis in 2008. While it’s a little hard to imagine Goldman choosing now to rejoin the ranks of mere securities firms, and even harder to see how it could leap to a different banking regulator, it is possible that some Fed examiners are afraid of scaring it away.
All this is meant not to excuse the extreme timidity apparent in the Fed tapes, but to explain why it’s been so hard for the New York Fed to adopt the more aggressive, questioning approach urged by Columbia Business School Professor David Beim in a formerly confidential internal Fed report that This American Life and ProPublica give a lot of play to. Bank regulation springs from much different roots than, say, environmental regulation.
So what is to be done? A lot of the classic regulatory capture literature tends toward the conclusion that we should just give up — shut down the regulators and allow competitive forces to work their magic. That means letting businesses fail. But with banks more than other businesses, failures tend to be contagious. It was to counteract this risk of systemic failure that Congress created the Fed and other bank regulators in the first place, and even if you think that was a big mistake, they’re really not going away.
More recently, there’s been a concerted effort to take a more nuanced view of regulatory capture and how to counteract it. The recent Tobin Project book, Preventing Regulatory Capture: Special Interest Influence and How to Limit It, sums up much of this thinking. While I’ve read parts of it before, I only downloaded the full book an hour ago, so I’m not going to pretend to be able to sum it up here. But here’s a thought — maybe if banking laws and regulations were simpler and more straightforward, the bank examiners at the Fed and elsewhere wouldn’t so often be in the position of making judgment calls that favor the banks they oversee. Then again, the people who write banking laws and regulations are not exactly immune from capture themselves. This won’t be an easy thing to fix.
update: The initial version of this piece listed the Office of Thrift Supervision as one of the nation’s bank regulators. As David Dayen pointed out (and I swear I knew at some point, but had totally forgotten), it was subsumed by the OCC in 2011.



Ello Is a Wake-Up Call for Social Media Marketing
To understand upstart social network Ello, which burst into the spotlight this week — growing from just 90 members in August to a reported 30,000 new users per hour — let’s start with its manifesto:
Your social network is owned by advertisers.
Every post you share, every friend you make and every link you follow is tracked, recorded and converted into data. Advertisers buy your data so they can show you more ads. You are the product that’s bought and sold.
We believe there is a better way…We believe a social network can be a tool for empowerment. Not a tool to deceive, coerce and manipulate—but a place to connect, create and celebrate life.
Even if you’re cheering for this phenomenon as a social media user, the view from inside any business that relies on social media advertising may be less enthusiastic.
Businesses need to take Ello and its manifesto as a wake-up call to rethink the way they use social networks to reach customers. The intense interest and discussion engendered by this manifesto attests to the profound misgivings many of those customers now have about the networks that occupy a growing place in our work, our relationships and our lives.
Those misgivings are evident in the sign-ups for networks like Ello and Diaspora; in the emergence of anonymous, private and non-persistent platforms like Secret and WhatsApp; and in the growing number of Internet users who report taking steps to obscure their digital footprint.
We have a long way to go before Ello and its ilk pose a significant threat to established players like Facebook and Twitter — if they ever get there. But companies still need to pay attention to the growing public discomfort with advertiser dominance and algorithm-driven user experiences. As Internet users are growing uncomfortable with the now-established model of “you get free social networking, we get your data and eyeballs,” businesses need to do more than tinker with their social media strategies: they need to rethink their core approach to social media itself.
That means stepping back from the relentless quest for followers, clicks, and mentions, and instead thinking about why brands got involved in social media in the first place. In its early days, the promise of the social web lay in the ability of companies to have direct and ongoing relationships with their customers — to become more responsive, more accountable and more attuned to the things their customers really cared about. Instead, companies have found a world in which their old intermediaries (broadcasters, publishers, journalists) have simply been replaced by a new set of intermediaries (social networks, bloggers).
This shift provides companies with a chance to rethink their own use of the social web; the smart ones will seize this opportunity to forge a new kind of relationship with their customers.
But because any successful relationship has to be built on trust, companies will have to begin by addressing the trust gap that has emerged out of the past five or ten years of social media marketing — a trust gap that is clearly conveyed in the Ello manifesto. That gap is about more than privacy or invasive ads: it reflects the frustration with the steady commercialization of our online interactions and spaces. Instead of elbowing their way into Ello with branded accounts and “content” that takes the place of ads, companies need to recognize that our online world needs non-commercial spaces as well as ad-friendly networks, just as the offline world has room for both libraries and bookstores. Instead of relying on algorithms and ad targeting to get dollars out of their customers’ wallets, companies need to think about the value they can offer to their customers’ online lives.
Just because advertisers are unwelcome on some parts of the social web, that doesn’t mean businesses are necessarily unwelcome, though: consumers simply want businesses to engage with them in some way that goes beyond a pitch. That could mean inviting customers into your product development process through co-creation. It could involve convening meaningful conversations on topics that resonate with your customers and your brand. It could look like partnering with your customers to make the products they want, or offer the services they need, or help them sell their stuff to other people like them. All of these are ways to engage with your customers that align with the spirit of the social web, instead of treating it as a billboard.
But you’re not going to get that kind of engagement by moseying up to the social media drive-thru and asking for a double order of customer engagement, please. You can’t leave it to the established social networks to create the platform that helps you connect with your customers; you need to find a way to convene the conversations you want, in a context that will actually work for both you and the customers you serve. And as the sudden rise of Ello suggests, that will probably need to be a context in which your customers feel like you are treating both their data and their attention with the greatest respect.
And you can begin with your own version of the Ello manifesto:
Your customer relationships are owned by other companies — companies like Facebook, Twitter and Google.
Every interaction you have, every customer you acquire and every ad you place is tracked, recorded and converted into data that can serve your competitors — or the social network itself. You dedicate your ad dollars, your customer relations team and your very best content creators to building a social network that somebody else controls. You are the customer, but your own customers are the product that is bought and sold.
We believe there is a better way…We believe the social web can be a tool for customer engagement. Not a tool to deceive, coerce and manipulate — but a place to connect, create and celebrate what we can do together.



