Michael S. Malone's Blog

September 9, 2011

Mr. President, Listen to Tech

Guest commentary from Scott Budman, technology reporter for KNTV-NBC in San Jose, California.


President Obama owes the tech industry. It helped put him into office, and, thanks to high-profile visits with the likes of Steve Jobs and Mark Zuckerberg, it helped him stay relevant and hip.


Now, the tech industry is weighing in with some job advice. And in the wake of the very high-profile collapse of Solyndra, hand-picked by Obama to spotlight the green tech industry, it’s advice worth taking, whether you’re a President, or someone in need of a job.


First off, there are still lots of green-tech jobs out there. The field is no longer a quick, sure-shot path to IPO riches, but several companies in it are doing well, just growing a lot more slowly (and more thoughtfully) than Solyndra did.


We also heard from BranchOut, a job creator linked to Facebook. Not surprisingly, they talk about social networking, and how important it is to landing that next gig. Whether you’re using Twitter, GigWalk, or Craigslist, the advice from BranchOut is to do it yourself – sort of a “create your own luck” message, with no government funding involved.


Then there’s LinkedIn, not only a big job creator around these parts, but a company that’s creating a lot of wealth through an extremely successful IPO. They point out that tech companies like Google, Oracle, VMware, even Yahoo account for several hundred job openings in Silicon Valley alone. Are you an engineer? A marketer? A salesperson? Don’t wait for the government to help you; get out there and tap your contacts.


As I write this, the FBI is raiding the Solyndra campus, possibly to collect some of the items your tax dollars paid for. Some $530 million of your tax dollars.


Not to say that government intervention can’t help with job creation. It just doesn’t have the track record that innovation and entrepreneurship have. Need evidence? Just ask the job creators.


Scott, who feels fortunate to have a job, is on Twitter: @scottbudman
Published on September 09, 2011 13:55

July 20, 2011

The Biggest Show in Silicon Valley (again)

Guest contributor:  Scott Budman, technology reporter at NBC-KNTV in San Jose, Calif.


Apple, to paraphrase Jay-Z, Kanye West, and Rihanna, is still running this town.


As the Cupertino consumer-tech giant once again opens its books to the public, its stock price is red-hot, its market capitalization is nearing $400 billion, its cash pile continues to grow, and its products are flying off store shelves — economic slowdown be damned.


But as with any successful company in the tech industry, investors are not historians, but are rather concerned about the future. They quickly move beyond “What have you done for me lately?” and straight into “What will you do for me next?”


And once again, Apple has answered with stunning financials.  Great news — but, as always, with Apple we are left wanting to know more.


First off, how is Steve Jobs? The CEO may not be on the conference call, because he’s still on a medical leave of absence, but he’s been known to show up at launches lately, which always stokes the faithful.  And what about the rumor that just ran here on Forbes.com that the Apple board may be quietly interviewing potential replacements for Jobs?


Secondly, how are sales? Lately, easy to answer — iPads and iPhones are moving like hotcakes. So what’s next? Is the planned release of the Lion OS and MacBook Air updates on schedule? Will we see the iPhone 5 soon? And, with all that cash, how about a dividend? Surely, Apple is mature enough to share the wealth with its patient (and admittedly very satisfied) shareholders.


And — but probably less important to shareholders — what about these pesky patent lawsuits? Will any of them stick?


Bottom line: Apple, like Silicon Valley itself, is setting a pace way ahead of the rest of the economy. The stock price has been hot for a while, and investors will probably demand staggeringly good numbers before lifting the price much higher. That said, as yesterday’s news showed, Apple is still capable of delivering home runs every three months.


But it still feels like a high-wire act.
Published on July 20, 2011 00:08

June 1, 2011

Zynga Adds New Game to its “Empire”

Inside the San Francisco headquarters of Zynga, engineers who haven’t shaved – or, really, slept – for days huddle around screens, checking for updates.  The whole building has a feeling of anticipation, like the next few hours could change their lives.


No, it’s not the social gaming giant’s IPO — let’s not get ahead of ourselves.  The talk today is not about stock, but about the newest Zynga game release, called “Empires and Allies.”  Launching on June 1 in 12 languages, “E&A” boasts more action than your typical farm or city, instead letting players build up their island fortresses, and then defend them in battle.


In fact, “Empires” really spreads things out.  The game’s executive producer, Amer Ajami, told me the new game is like CityVille meets Risk, “because you have multi-player city building, along with more traditional hardcore game elements like fighting, and telling an involved story with many characters and villains.”  You can choose to help your friends get ahead, or fight them.  Either way, the ultimate goal is to beat back the evil warlord named “The Raven.”


Also, like all Zynga games, you start for free, then can build up your defenses by spending real money.  Your real cash buys you virtual “points” you can use to snag tanks and airplanes, and leapfrog over levels you find too boring, or too taxing.


Yes, Zynga makes a ton of cash with these games, and given LinkedIn’s recent IPO success, it makes sense that Zynga would take its turn basking in the public market.  While you wait for its stock filing, there’s a new way to keep yourself busy.


Scott can be found on Twitter: @scottbudman
Published on June 01, 2011 00:31

May 24, 2011

Is LinkedIn the Next Netscape?

Note from Mike Malone:  I’ve invited the noted Silicon Valley technology correspondent, Scott Budman of NBC-KNTV, to regularly publish his reporter’s notes on this blog.  Here is his first entry:


It’s the question on everyone’s lips this week:


Is LinkedIn the next Netscape? As usually asked, the question carries a negative connotation — i.e., will this be the new bubble-starter that leads us into greed, ruin, etc., etc.?


My take? Yes, LinkedIn is the next Netscape, for all the right reasons.


When I started to cover the Silicon Valley tech beat, Netscape was close to going public. When it did, it unleashed a huge wave of IPOs, but more importantly, it literally took the tech industry public.


Everyone wanted in (and yeah, that wasn’t the best thing), but more importantly, everyone learned about the internet, and recognized it as a force to be reckoned with, both on the productivity side, and the financial side.


And that’s what LinkedIn is doing. Yes, the stock price seems a little frothy, and yeah, the company’s underwriters probably left a bunch of money on the table, but you really can’t predict investor sentiment, so let’s look at it this way: LinkedIn is now showing the world that social networking companies can deliver when it comes to the big dollars.


If you’re Zynga or Facebook, this is great news, and probably not a surprise. But if you’re a smaller social networker, this is an opportunity to get things in order, staff up, get profitable and take advantage — now, everyone knows about you, expects a lot from you, and is ready to reward you.


 


Scott is on LinkedIn, and on Twitter: @scottbudman
Published on May 24, 2011 14:37

May 10, 2011

The Future Still Lives

  


            In a week of big stories, the biggest didn’t take place in Pakistan or Washington, D.C., but in Santa Clara, California.  Unlike Osama bin Laden, we managed to dodge a bullet.  If we hadn’t, it wouldn’t have ended modern civilization, but it might have sent it off on a much different, and much less happy, path.


            You probably didn’t read this story.  Put simply, Intel Corp. announced Wednesday that Moore’s Law isn’t going to end anytime soon.  Because of that, your life, and the lives of your children and grandchildren, are going to be a whole lot better than they might have been.


            Today, almost a half-century after it was first elucidated by legendary Fairchild and Intel co-founder Dr. Gordon Moore in an article for a trade magazine, it is increasingly apparent that Moore’s Law is the defining measure of the modern world.  Every other predictive tool for understanding life in the developed world since WWII – demographics, productivity tables, literacy rates, econometrics, the cycles of history, Marxist analysis, and on and on – has failed to predict the trajectory of society over the decades . . .except Moore’s Law.


            Alone, this oddly narrow and technical dictum – that the power and miniaturization of integrated circuit chips will, together, double every couple of years – has done a better job than any other in determining the pace of daily life, the ups and downs of the economy, the pace of innovation and the creation of new companies, fads and lifestyles.  It has been said many times that, beneath everything, Moore’s Law is ticking away as the metronome, the heartbeat, of the modern world.


            Why this should be so is somewhat complicated.  But a simple explanation is that Moore’s Law isn’t strictly a scientific law – like, say, Newton’s Laws of Motion – but rather a brilliant observation of an implied contract between the semiconductor industry and the society it serves.  What Gordon Moore observed back in the mid-1960s was that each generation of memory chips (in those days they could store a few thousand bits, compared to a few billion today), which appeared about every 18 months, had twice the storage capacity of the generation before.  Plotting the exponential curve of this development on logarithmic paper, Moore was pleased to see a straight line . . .suggesting that this developmental path might continue into the foreseeable future.
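The arithmetic behind that straight line is simple enough to sketch. In this toy illustration the starting capacity and number of generations are my own illustrative assumptions, not figures from Moore's article; the point is only that doubling per generation means the base-2 logarithm of capacity climbs by exactly one step per generation, which plots as a straight line on log paper.

```python
# Toy sketch of Moore's observation: capacity doubles each chip
# generation, so log2(capacity) grows linearly with the generation
# index -- a straight line on logarithmic paper.
import math

def capacity(gen0_bits, generation):
    """Bits per chip after `generation` doublings from a starting point."""
    return gen0_bits * 2 ** generation

# "A few thousand bits" in the mid-1960s; 4096 is an assumed example.
gen0 = 4096

for g in range(6):
    bits = capacity(gen0, g)
    # log2 increases by exactly 1 per generation.
    print(g, bits, math.log2(bits))
```

Run forward enough generations and the same doubling rule carries a few thousand bits into the billions the column mentions.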


            This discovery has been rightly celebrated for years.  But often forgotten is that there was no technological determinism behind the Law.  Computer chips didn’t make themselves.  And so, if the semiconductor industry had decided the next day to slow production or reduce its R&D budgets, Moore’s Law would have died within weeks.  Instead, semiconductor companies around the world, big and small, and not least because of their respect for Gordon Moore, set out to uphold the Law – and they have done so ever since, despite seemingly impossible technical and scientific obstacles.  Gordon Moore not only discovered Moore’s Law, he made it real.  As his successor at Intel, Paul Otellini, once told me, “I’m not going to be the guy whose legacy is that Moore’s Law died on his watch.”  And that’s true for every worker in the semiconductor industry.  They are our equivalent of medieval workers, devoting their entire careers to building a cathedral whose end they will never see.


            And so, instead of fading away like yet one more corporate five-year plan, Moore’s Law has defined our age, and done so more than any of the more celebrated trend-setters, from the Woodstock generation to NASA to the personal computer.  Moore’s Law today isn’t just microprocessors and memory, but the Internet, cellular telephony, bioengineering, medicine, education and play.  If, in the years ahead, we reach that Singularity of man and computer that Ray Kurzweil predicts for us, that will be Moore’s Law too.  But most of all, the virtuous cycle of constant innovation and advancement, of hot new companies that regularly refresh our economy, and of a world characterized by continuous change – in other words, the world that was created for the first time in history only about sixty years ago, and that we can now hardly imagine living without – is the result of Moore’s Law.


            When Gordon Moore first enunciated his Law, only a handful of industries – the first minicomputers, a couple of scientific instruments, a desktop calculator or two – actually exhibited its exponential rate of change.  Today, every segment of society either embraces Moore’s Law or is racing to get there.  That’s because they know that if only they can get aboard that rocket – that is, if they can add a digital component to their business – they too can accelerate away from the competition.  That’s why none of the inventions we Baby Boomers expected as kids to enjoy as adults – atomic cars!  personal helicopters!  ray guns! – have come true; and also why we have even more powerful tools and toys instead.  Whatever can be made digital, if not in whole then in part – marketing, communications, entertainment, genetic engineering, robotics, warfare, manufacturing, service, finance, sports – will be, because going digital means jumping onto Moore’s Law.  Miss that train and, as a business, an institution or a cultural phenomenon, you die.


            So, what made this week’s announcement by Intel so important?  It is that almost from the moment the implications of Moore’s Law became understood, there has been a gnawing fear among technologists, and those who understand technology, that Moore’s Law will someday end – having run up against the limits of, if not human ingenuity, then physics itself.  Already compromises have been made – multiple processors instead of a single one on a chip, exotic new materials to stop leaking electrons – but as the channels get narrower and bumpier with molecules and the walls thinner and more permeable to atomic effects, the end seems to draw closer and closer.  Five years away?  Ten?  And then what?  What will it be like to live in a world without Moore’s Law . . .when every human institution now depends upon it?


            But the great lesson of Moore’s Law is not just that we can find a way to continuously better our lives – but that human ingenuity knows no bounds, nor can ever really be stopped.  You probably haven’t noticed over the last decade the occasional brief scientific article about some lab at a university, or at IBM, Intel or HP, coming up with a new way to produce a transistor or electronic gate out of just two or three atoms.  Those stories are about saving Moore’s Law for yet another generation.  But that’s the next chapter.  Right here and now, this week, the folks at Intel were almost giddy in announcing that what had been one of those little stories a decade ago – tri-gate transistors – would now be the technology in all new Intel chips.


            I’m not going to go into technical detail about how tri-gate transistors work, but suffice to say that since the late 1950s, when Jean Hoerni, along with the other founders of the semiconductor industry at Fairchild (including Gordon Moore), developed the ‘planar’ process, all integrated circuits have been structurally flat, a series of layers of semiconductors, insulators and wiring ‘printed’ on an equally flat sheet of silicon.  For the first time, Intel’s new tri-gate technology leaves the plane of the chip and enters the third dimension.  It does so by bringing three ‘fins’ of silicon up from beneath the surface, having them stick up into the top, transistor, layer.  The effect is kind of like draping a mattress over a fence – and then repeating that over a billion fences, all just inches apart.  The result is a much greater density of the gates, lower power consumption, faster switching and fewer quantum side-effects.  Intel claims that more than 6 million of these 22 nanometer Tri-Gate transistors can fit in the period at the end of this sentence.
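Intel's "transistors in a period" figure is easy to sanity-check with rough numbers. In the sketch below, the printed period's diameter and the per-transistor footprint are my own assumptions, not Intel's published figures; the point is only that the claim is the right order of magnitude.

```python
# Order-of-magnitude check of "more than 6 million 22 nm transistors
# fit in a printed period."  Both input numbers are assumptions.
import math

dot_diameter_mm = 0.3                     # assumed diameter of a printed period
dot_radius_nm = dot_diameter_mm * 1e6 / 2 # 1 mm = 1,000,000 nm
dot_area_nm2 = math.pi * dot_radius_nm ** 2

transistor_pitch_nm = 100                 # assumed footprint edge at the 22 nm node
transistor_area_nm2 = transistor_pitch_nm ** 2

transistors_per_dot = dot_area_nm2 / transistor_area_nm2
print(f"roughly {transistors_per_dot / 1e6:.0f} million transistors per period")
```

With these assumed numbers the estimate lands at roughly seven million, comfortably consistent with the "more than 6 million" claim in the column.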


            The first processors featuring Tri-Gate transistors will likely appear later this year.  And you can be sure that competitors, with similar designs, will appear soon after.  But that’s their battle.


            What counts for the rest of us is that Moore’s Law survives.  The future will arrive as quickly as ever. . .

Published on May 10, 2011 14:44

March 30, 2011

Entrepreneur for Life

 


When a man dies at 84 after a long bout with cancer, it’s natural to assume that his best days were far behind him. 


And so, when the news was carried around the world last weekend that Paul Baran – the man who really could be credited with inventing the Internet – had died, readers couldn’t be blamed for concluding that Baran’s single claim on history had been made a half-century ago.


But that would be wrong.  I believe that future generations will look back and see Baran not just as the inventor of the seminal networking technology called ‘packet switching’.  Packet switching, devised when Baran was a young scientist working for RAND Corp., did indeed make global networks from the Internet to cellular telephony practical.  And for that revolutionary achievement, he rightly deserves an honored place in the history of the digital revolution – and the National Medal of Technology and Innovation that President Bush presented to him three years ago.


But Paul Baran also achieved something else of such magnitude that its implications may not be fully recognized for another generation:  he was the first true lifelong entrepreneur.  In that, he may very well prove to be a pioneer of a cultural phenomenon that will help define the rest of this century.


Baran created his first enterprise in 1968.  He was working on his last, one of the most ambitious of his career, on the day he died.  In between, Baran, often teamed with his business partner, Steve Millard, and later his son Dave, founded as many as a dozen companies.  As with any entrepreneur, many of these companies failed.  But Baran also had as many hits as anyone.  Once, after I introduced him as having founded four $1 billion public companies, he quietly corrected me:  “Only three.  The fourth was only $700 million.”


Packet switching was a brilliant invention, but Baran’s real genius lay in a deep understanding of technology combined with a perfect sense of business timing.  The greatest fear of most high-tech entrepreneurs is that they will fall behind the technology curve – that is, that by the time they get their inventions to market, Moore’s Law will have already rendered them obsolete.  What Baran understood is that there is an even greater danger in being too far ahead of that same curve – that’s when you run out of money vainly waiting for your components and your customers to catch up.


Baran never lost that exquisite timing, even in his eighties.  He had an almost supernatural ability to know when an advancing technology and a needy market were about to collide . . .and he positioned himself there just before impact.  Cable modems, computer printers, airport metal detectors, wireless Internet, smart electrical meters, medical home diagnostics – he was almost always in place (usually with a pocket full of patents) before his future competitors even identified the opportunity.


But timing is only part of what makes a great entrepreneur.  As decisive as Baran was in creating his companies, he was equally decisive – even ruthless – about walking away from them.  Never a great manager, he sold off his successful companies when they stopped being innovative and dynamic.  And more than once – most recently with a smart home electric-metering design – he abandoned brilliant inventions because he had neither the time nor the patience to deal with the obstacles (usually government bureaucracies) that stood between them and reality.


I first met Paul Baran three years ago when I was invited to sit in on an early planning meeting of a new start-up company.  Five of us sat in the corner of a hotel lobby and, accustomed to dealing with Web 2.0 start-up teams composed of post-adolescents, I was astonished to find that, in my mid-fifties, I was the youngest person there.  Baran, whom I only knew as a legend, led the meeting like a lion tamer.  I went in expecting to meet an old man past his prime; I came away realizing I had not only been with the purest entrepreneur I’d ever met, but also the most ferocious.  With his ambition, fearlessness, willingness to fail, and commitment to the task, the tough old man put the dreamy children to shame.


For the first time, I understood that entrepreneurship could be not just a job or a career, but a lifelong approach to the world.  And that the work of starting new enterprises wasn’t just for the young.  On the contrary, old folks had certain advantages – experience, perspective, stability, personal wealth, and a lack of ego – that youngsters could never duplicate.  Paul Baran taught me – and I suspect his example will teach millions in the years to come – that there is no set age or duration to being an entrepreneur.


For the last six months, I have been part of a Paul Baran start-up – once again as a ‘junior’ member.  Amusingly, we met in a retirement home dining room . . .and anyone passing by probably assumed that we were a bunch of retirees reminiscing about the old days.  In fact, the pace of these meetings, the ego-free teamwork, and the decisiveness in dealing with the next challenge were dazzling.  And at the center, as always, was Paul Baran.  He knew he was dying, but he never complained.  Instead, he continued to file patents on his newest ideas, even as he drove us forward.  “Let’s go!” he would say at every meeting, “Let’s make this thing happen!” 


He remained audacity personified right up to the very last day of his life, when he was simultaneously working on a new invention and preparing for a business presentation to one of the world’s biggest companies. 


Paul Baran helped invent the Internet; but in the end, he also taught us how to live our own lives, from beginning to end, in the Internet Age.  And that may prove to be even more important.


Michael S. Malone is a veteran Silicon Valley journalist.
Published on March 30, 2011 12:06

January 14, 2011

The Photograph

 


I’ve just been re-introduced to my childhood self after a separation of nearly a half-century.


While I was living a busy, but tightly circumscribed, life in California going to junior high school, playing Little League baseball and camping with the Boy Scouts, my self – or more precisely, my image – joined that of my childhood best friend and travelled the world.  It was even viewed by millions of people during one of the iconic events of the 20th century.


And then, as I grew into adulthood and began my own explorations into the bigger world, my image retreated to the hermetic world of an envelope in a desk drawer . . .only to emerge decades later, almost magically, at the very moment I lost my final connection to our shared childhood.


My mother died last July after a long and remarkable life.  I delivered her eulogy to a large crowd at the Museum that had been my late father’s dream, and of which my mother had been both a volunteer and the largest benefactor.  I turned that eulogy into this column, which was picked up by the Web and carried by blogs around the world.  I also sent copies with the announcement to my mother’s long list of friends and family.


One of those notes went to Scott Christopher, the noted photographer based in Santa Fe, New Mexico.  Scott and I had been best friends as boys in Falls Church, Virginia, and my mom had been like a second mother to Scott.  In the intervening decades she had done a much better job than I of keeping in touch with Scott – so I knew that the news would affect him deeply.


I wasn’t surprised when, a few weeks later, Scott sent a touching note offering his condolences.  But I was surprised – indeed, stunned speechless – by the image on the other side of the note:  It was a photograph of Scott and me, deep in conversation, sitting on a fence at what appeared to be a farm.  We looked to be about age eight.


This was no weekend snapshot taken with the family Instamatic.  This was a professional photograph, with beautifully saturated colors, tonal balancing only a darkroom could achieve, and a composition that bore the mark of a master photographer.  The instant I saw the image, I knew who took it:  Frank Christopher, Scott’s dad.


Frank Christopher – ‘Cheetah’, we later called him – was the most eccentric, and intriguing, figure in my neighborhood.  The housing development was called ‘Pine Springs’; we arrived there in 1957, after my father’s Air Force intelligence career had taken us from Germany (where I was born) to Spokane, Washington, and finally to an old government office building located where the Air & Space Museum now stands.  Pine Springs was a new development of modest homes with interesting modernist architecture that stood on the edge of a seemingly endless forest stretching north and west to Tysons Corner (then just a gas station and road house) and beyond.


Scott and I played and explored in that forest.  We caught turtles and crayfish (some of which ended up in the Christophers’ bathtub), built forts, and brought home jars of tadpoles that would inevitably surprise us by turning into a chaos of tiny frogs in the garage and house.  When we weren’t being Tom and Huck, Scott and I played in pick-up football and baseball games, or just took off by ourselves on journeys no 21st-century suburban child would ever be allowed to take.  When Scott wasn’t at my house, I was at his, and when we were apart we still found a way to connect – including a tin can phone with a 300-foot string.


It was, as Scott has written, “truly magical”.  It began the moment the school bell rang and continued until we were ordered inside from the growing darkness – and often not even then.


Into this self-contained little world, where an entire day could be spent looking for four-leaf clovers or attacking a huge wasp’s nest or damming a rain-choked street gutter, adults only made brief . . .and mostly unwelcome . . .appearances.  My father, like most of the other dads, awoke to a cough and the click of a Zippo lighter, a quick breakfast, then was off in our ’48 Jeepster or big-finned ’57 Chrysler in his brown suit and skinny tie to the ‘office’ – only to return in the early evening, pour himself a cocktail and, after dinner, fall asleep in the Eames chair while watching Huntley and Brinkley.


But Scott’s dad was different.  I would sometimes see Frank Christopher, still lying in bed in the afternoon, watching movies on a black & white TV with a busted vertical hold.  Or he was off playing golf in the middle of the day.  But then, other times, he would disappear for several weeks at a time, his departure a flurry of activity.  I knew that he carried a camera with him, but I don’t remember him ever taking a photograph.  And when I did see one of his prints – “Strike Three”, a 1959 photo of Scott in oversized hand-me-downs taking a mighty hack, and one of the most honored images of childhood sports ever taken – I completely missed the artistry and laughed at the fact that Scott had missed the ball.


I did notice, in the summer of 1961, that Scott’s dad was gone for a long time – but it would be another decade before I realized that he had gone to Moscow to put on the first American photography exhibition in the Soviet Union – a historic event in the slow thawing of the Cold War in which my own father was a soldier.


The blizzard of 1962, combined with my father’s first heart attack, proved to be a turning point.  My father retired from the Air Force and took a job with NASA in California.  Perhaps because we sensed that everything was coming to an end, that spring still remains in my mind – and I know in Scott’s too – as the most idyllic moment of our childhood.  And it was on one of those days that spring, on a trip to a farm in Chantilly, VA, that Frank Christopher took the photograph.  Here it is:


[Photograph: “On the Farm”, by Frank Christopher]


At first glance, “On the Farm”, as it was entitled in exhibitions, appears to be a casual photo.  But it is nothing of the sort.  The composition is far too complex; so I have no doubt that Frank, in his genius, placed us on that fence – and then just waited for us to get lost in conversation.  That’s my red shirt – I wear it in my 4th grade school portrait — but as Scott has noted, he was told to wear this specific sweatshirt in order to balance the color. 


Needless to say, what is most compelling about this photograph is not just the obvious relationship between the two boys, but that amazing fence zig-zagging into the distance.  Scott, in his note to me with the photograph, wrote:  “The fence in this photograph represents life to me:  twists and turns, up and downs, but always moving forward.”  I think he’s right, but I also think it captures the conversation of two young best friends – bouncing around ideas, spinning out dreams and plans, finishing each other’s sentences.  It crackles like a bolt of lightning.


I think Frank Christopher knew all of that when he took that photo.  He understood that he was capturing a singular moment of childhood that would never be repeated; and that he was also sharing an understanding with his adult viewers of what was to come for the two of us.


Two months later, at dawn, we pulled out of our driveway, loaded for California.  The neighborhood was silent and asleep . . . except for a little face in a distant window:  Scott crying and waving good-bye.


A decade later I would visit Scott in Virginia one last time.  I drove up in my dune buggy, with long hair and tie-dyed shirt.  During most of my visit, Scott wore a baseball uniform – he was playing in the All American Amateur Baseball Association national championship.  Though we have corresponded in the decades since, we have never again met in person.  I’m not sure we even need to.


But the zig-zags of life were already taking their course.  Celebrated at age 12 in Sports Illustrated for his athletic skills, by eighteen Scott was already paying for a tumble he’d taken at age seven into a pile of broken glass while playing with me behind a backstop.  He severed all seven tendons in his right (throwing) hand, and only through years of dedicated therapy did he manage to come back, even earning a baseball scholarship and two MVP awards at the University of Maryland.  The tendons had healed, but his right hand had been left slightly smaller and weaker than his left . . .just enough of a flaw for pro scouts to mark him down for having a ‘withered’ hand.


At age 26 he wrote to me and described his final at-bat playing AA ball (with Cal Ripken) for the Charlotte Orioles.  Reaching first on a fielder’s choice, he had stolen second; then he broke for home on a single and made a spectacular scoring slide at the plate.  Knowing it would never again be as good as this, Scott brushed off both dust and tears, walked through the dugout shaking everyone’s hand, cleared out his locker, and left baseball forever.


Scott married young, had three beautiful daughters, and was widowed while still in his early thirties.  But then he found, in Elizabeth, the great love of his life – and in photography, his lifelong career.  He was Michael Jackson’s photographer for a while, as well as special assignment photographer for the Discovery Network; and has taken many of the photographs you see on posters and prints.  These days, he lives in Santa Fe, and with Elizabeth has produced a movie and started a foundation for the arts.


And the photograph?  Though never as famous as the baseball image, it nevertheless was part of Frank Christopher’s award-winning, international travelling exhibition.  It was almost certainly shown at the 1964 New York World’s Fair – a legendary event I only saw on television from 3,000 miles away.


I knew none of this, of course.  But those years have never been far from my mind.  A childhood best friendship is unlike any other relationship in one’s life – purer, seemingly infinite in its possibilities, and forever bathed in a golden glow.  And to have that friendship play out against a backdrop of such freedom and adventure was indeed “truly magical.”


The hard part about such a childhood is the certain knowledge that it is a paradise lost that can never be regained.  The good part is that, wherever that fence leads, you always have that time in your heart to fall back upon whenever you need it.  And when, this Christmas, a package from Santa Fe was opened to reveal a large print of “On the Farm”, I was able to turn to my wife and sons and say, “There’s my childhood”.  And to tell myself:  It was real.

Published on January 14, 2011 14:18

December 24, 2010

Yes, Virginia . . .

Dear Forbes.com –


I am 8 years old.  Some of my friends say there is no Santa Claus.  My dad says, “If you read it on the Web, it must be so.”  Please tell me the truth, is there a Santa Claus?


Virginia O’Hanlon


Virginia, your friends are wrong.  We live in a very strange time, in which very clever, but cynical people, claim there is no such thing as the truth – and yet never miss a chance to tell young people what that truth is.  They tell this same story over and over, in forms as different as songs and cartoons and video games, to you and your friends.  Your friends have listened and accepted; to your credit, you have listened and questioned what you’ve heard.


Yes, Virginia, there is a Santa Claus.  I know if you surf the Web you’ll be linked to more web pages and blogs that suggest that he is just a myth – or worse, a joke – than that he is real.  Saddest of all are those sites that argue that Santa Claus is impossible, that reindeer can’t fly or that no one could visit so many homes in a single night.  These last stories are written by confused adults who don’t believe in miracles and want to force children to think as they do.  They call it “being realistic”. 


But Virginia, how can anyone not believe in miracles?  Look around you.  There are miracles everywhere – oddly enough, many of them created by the same people who tell you not to believe in them (grown-ups are funny that way).  Think about this:  it is very possible that the entire universe is made of invisible strings.  These strings vibrate in such a way as to create galaxies and bluebirds, atoms and daffodils . . .in other words, Virginia, the entire universe may be made out of music!  Isn’t that a miracle?  And isn’t it a miracle too that human beings — tiny creatures on a tiny planet in a corner of the Milky Way – could even imagine such a thing?


Oh, Virginia, there are so many miracles.  Think of that computer chip in your Wii or iPad that goes through as many thoughts in a second as you will have heartbeats in your entire life.  Or of those thousands of people in the world now who carry around transplanted hearts and livers and lungs.  Or the fact that one of our spacecraft, Pioneer 10, launched long before you were born, has now just left the solar system – and has become our emissary to outer space.  Or the fact that you hold in your hand a little machine that can connect you with almost anyone else in the world, instantly.  These are miracles, Virginia, every one of them.


But there are other miracles too, Virginia, much closer to home. Every day, when you walk down the street or through the mall you meet many people who carry with them enormous burdens – some have terrible pasts, some are sick or frightened, others have cares and responsibilities they can hardly bear, some are even dying – yet you would never know by looking at them.  Some of them may even live in your own house.  Even in this difficult year, when so many people have lost their homes and jobs, when the future is so uncertain and the world just seems to get more and more dangerous, these good people still get up each morning, put on a smile and go out into the world and try to make it a better place.  They are very, very brave people – and what they do each day is no less a miracle than the birth of stars.


Not believe in Santa Claus!  You might as well not believe in Facebook, or electrons, or black holes.  Nobody sees Santa Claus, but that doesn’t mean there is no Santa Claus.  No one has seen a quark either, or a computer bit, but that’s no proof they aren’t there.  The most real things in the world are those that neither children nor adults can see.  Great new discoveries and wonderful acts of human kindness are made every day.  Nobody can conceive or imagine all the wonders that are unseen or unseeable in this world.


If, in this age of science and technology, we have learned anything it is that there is a veil covering the invisible world which not even the most powerful computer or space probe or microscope can penetrate.  Only faith, hope, art and love can push aside this curtain and let us see the beauty and glory beyond.  Is it all real?  Oh Virginia, there is nothing more real.


No Santa Claus?  Of course there is.  He has been with us now for a thousand years.  As long as little boys and girls like you believe in miracles, Santa Claus will gladden the heart of childhood.  And he will live forever.

Published on December 24, 2010 21:31

December 16, 2010

Why Can’t We Do Big Things Anymore?

 


by Michael S. Malone and Tom Hayes


The recent quick fade of the Deficit Commission was the latest reminder that America no longer seems to have the stomach for big challenges.  There was a time – was it just a generation ago? – when Americans were legendary for doing vast, seemingly superhuman, projects:  the Interstate Highway System, the Apollo Missions, Hoover Dam, the Manhattan Project, the Normandy invasion, the Empire State Building, Social Security. 


What happened?  Today we look at these achievements much as Dark Age peasants looked on the mighty works of the Roman era, feeling as if some golden age has passed when giants walked the Earth.  Even when we can still see the aged survivors of that era sunning themselves outside the local convalescent home – or sitting down with us for a family holiday dinner – it’s hard not to believe that there was once something larger-than-life about them that they failed to pass on to us.  The ‘Greatest Generation’, and those before them back to the birth of this country, seemed able to do big things, and think big thoughts, in a way that is now beyond both our abilities and our desires.


We no longer build the world’s tallest buildings – other countries do.  We are no longer reaching toward the moon – other countries are.  And when we do attempt something big – universal health care, alternative energy, improved educational standards, mass transportation – the initiative inevitably snarls up in bad planning, corruption, political pay-offs, lack of leadership, impracticality and sheer incompetence.  The comparatively tiny Lincoln Administration managed to win the Civil War, open up the Great Plains through the Homestead Act, and kick off construction of the transcontinental railroad . . . all in four years.


Why are things so different now?  Why can’t we seem to do big things well anymore?  We think there are a number of reasons, some consoling, others worrisome:


Big isn’t big anymore:  Big has, in many cases, become Small:  nanotechnology, microelectronics, the human genome project, distributed networks, ‘smart’ objects – and there is a lot more reward these days in developing a smaller, more power-efficient microprocessor than in pouring a million yards of cement for a new dam.  So, perhaps much of our sense of failure in achievement is, in fact, merely a failure of perspective.


Collective individualism: Today’s technology, which allows us to connect and communicate directly with each other, makes us less inclined to centralizing themes and collective action. Our networked world gives equal voice to every person, while marginalizing intermediaries, including political parties . . .making it much harder to win policy consensus for really big problems.  Worse, in a paradox of our times, the more connected we get the more divided we become.  The most vocal, outraged group wins. 


The Way of the Wiki:  The most important organizational innovation of the last quarter century, and our new defining social metaphor, is ‘the cloud’.  The Cloud is bigger than Big, but it is also amorphous and composed of millions of tiny, discrete elements.  It is good in bursts, but weak in follow-through.  In the wisdom of the cloud, there is an expert for everything.  Hammers are always in search of a nail – and so, armed with this new decentralized, horizontal ‘Army of Davids’, we tend to attack (and sometimes create) problems that respond to a ‘wiki’ strategy.


Been there, done that:  Watching Malaysia, Hong Kong and Dubai compete to build the world’s tallest building can be both thrilling and depressing – i.e., cool constructions, but why isn’t the U.S. in this race?  One answer is:  we’ve already run that race, and won, several times, so why not move on to other challenges?  Edifice construction seems to be a phase in the development of successful modern nations; ditto national transportation and communications infrastructures.  We passed through that phase fifty years ago – and all that’s left now are occasional upgrades.  On the other hand, you can’t help noticing that this type of epic construction is also synonymous with national ambition and confidence, two things that seem sorely missing in modern American life. 


Analog is messy:  You may not have noticed, but over the last half-century almost every successful U.S. industry has found a way to climb aboard Moore’s Law of semiconductors and take advantage of its exponential growth curve.  This has inevitably rewarded pure digital plays, such as the Internet, while conferring only partial advantages on physical – analog – industries, such as medicine, automobiles and construction.  Big projects tend to be very physical activities . . . and our economy now directs smart players elsewhere, to more immediate rewards.


Everybody’s a winner:  The recruiting ad for the Pony Express said:  “Orphans Preferred.”  The ugly fact is that the building of America cost a lot of lives by putting men (and sometimes women) in dangerous, high-risk situations.  We don’t seem to have the intestinal fortitude for that kind of sacrifice anymore – and even if we did, our robust system of tort law would make it too expensive to pursue anyway.  You probably can’t conquer outer space with a society that doesn’t keep score in youth soccer games, hands out participation trophies, and sues over every cut and bruise.  After all, the virtual bullets in a Halo gunfight don’t hurt.


Big has gotten harder:  Fusion power is infinitely more complicated than internal combustion, and a laptop computer inhabits a different universe from an adding machine.  Almost everything big we attempt now is much, much more complex and expensive than anything our ancestors could have ever imagined.  On the other hand, they probably said exactly the same thing . . .and then went ahead and built it anyway.


Nowism: Big projects require both patience and a belief in history.  Our society appears to have neither.  Instant downloads, endless channels and movies on demand have trained us to want exactly what we want, when we want it.  All the forces that satisfy our consumer desires make us less able to invest in tough, unglamorous, inconvenient things – especially if they take time.


Put them all together and what can we learn about ourselves and our seemingly growing inability to do anything big and important? 


First of all, we are doing big new things; they just aren’t like the big old things.  The customer base of Facebook is now bigger than the populations of all but two countries in the world.  We’ve mapped the entire human gene sequence.  With a few keystrokes in a Google search we can now find almost any piece of information on the planet.  We make microprocessors so small that their walls are bumpy with molecules and that can perform ten billion computations in a second. 


That said, however, the world is still a material place – and it is with big physical projects that we seem to be slowly losing both our competence and our nerve.  Part of this is due to the loss of intellectual capital as skilled veterans of past Big Projects fade away; part of it is a new economy that offers better incentives elsewhere; and part of it is a growing national aversion to physical risk, discomfort and deferred gratification.  But most of all, it is the lack of the very confidence that once made America and its leaders willing (to quote one of those leaders) to damn the torpedoes and to charge into the future under full steam. 


Sure the digital world is exciting, engaging and often quite rewarding.  But someday a whole new set of roads and canals and bridges will need to be built – maybe on Mars, maybe on a ruined Earth.  We just might want to start practicing for that day right now. 


[Tom Hayes, co-author of this article, is a vice-president of Marvell Semiconductor, and co-author with Mr. Malone of the book No Size Fits All:  From Mass Marketing to Mass Hand-Selling (Portfolio)]


 

Published on December 16, 2010 14:15

November 29, 2010

Hear! Hear!

You are looking at . . .well, reading . . . the current reigning Oxford Union debate champion, a position I and my three teammates will hold until, well, this Thursday.


Eleven years ago I gave a speech at the brand-new Said Business School at Oxford University.  The topic was ‘Entrepreneurship and Freedom.’  Afterwards, then-dean Anthony Hopwood invited me to come back the next year “and bring some of my friends.”


The result was Silicon Valley comes to Oxford, which has just completed its tenth annual gathering.  Among my ‘friends’ who have attended over the years are Elon Musk (Tesla, SpaceX), Jeff Skoll (eBay and Participant films), Ev Williams and Biz Stone (Twitter) and many more.  In the process, SVCO has done its part to transform venerable old Oxford itself.  The Said Business School is now perhaps the most entrepreneurial MBA program in the world, and Oxford’s own undergraduate Entrepreneurs club now has 7,000 members – more even than Stanford’s.  Today, Oxford and the Said Business School stand as the most important center for entrepreneurship education in Europe.


Two years ago, we added an Oxford Union debate into the program, just to keep things exciting.  For that first one, I acted as the initial questioner from the audience.   This year, I was asked to be on the team.  I was more than honored to do so — especially as the topic:  “Resolved:  Silicon Valley is dead; Long live Green Valley” seemed perfectly designed to make this lifelong Silicon Valleyite’s blood boil.


The two teams were an interesting match-up.  For the ‘Ayes’, the line-up was Joe DiNucci, Valley legend from his days at MIPS and Silicon Graphics and now advising Coulomb Technologies; attorney Maria Sendra from Baker & McKenzie; former banker and now Skoll Scholar in social entrepreneurship, Sean Holt; and Malcolm McCulloch, Oxford engineering professor and head of its Electric Power Group.  The ‘Noes’ were Tom Hayes, author and VP of marketing at Marvell Semiconductor; Xavier Helgesen, Skoll Scholar and co-founder of Better World Books; me; and Reid Hoffman, co-founder of LinkedIn.  The assumption was that the Ayes had the edge because, after all, who wants to vote against Green?  It’d be like voting against puppies and unicorns.  But I knew that in Reid we had a secret weapon batting clean-up: his supercomputer-like mind would easily gather our opponents’ arguments and systematically destroy them – which is exactly what happened.


So, we put on our tuxedos and made our way down the chilly side street that led to the neo-Gothic buildings of the Oxford Union.  As was customary, we drank champagne, posed for a group photo that will go on the wall near the likes of Gandhi, Churchill and even Malcolm X, then had dinner with the University Vice-Chancellor.  Making the experience even more fun was that I was joined by my oldest son, Tad, who is currently studying in Oxford. 


At the appointed time, the two teams lined up and we marched into the Union through the doors marked Ayes and Noes – and through which, in the opposite direction, the audience would exit to cast their votes.  The debate hall is arrayed a lot like Parliament.  At one end there are two sets of benches, on which the opposing teams sit and face each other, backed by their supporters.  Between them, but set back, is a large table, on which sit two dispatch boxes – both from Parliament and donated by Winston Churchill.  These boxes serve as lecterns for the speakers’ notes.  Behind the table, and raised up as if on thrones, sit the young president, secretary, librarian and treasurer of the Union, all looking like Edwardian dandies in white tie and tails.  The other half of the hall is filled with seats for the audience, and a balcony runs all the way around the hall.  It’s all a bit dizzying to the participants, especially after all that champagne and wine.


Each speaker is given ten minutes.  You don’t win Oxford Union debates solely on logic and argument.  Shallow theatrics and cheap entertainment are also useful – for both of which I have some talent, which may be why I was selected for my team.  I left the actual debating to the smart guys.


I wish I could post the entire debate for you, as it was quite compelling.  Unfortunately, I only have my own notes, so you’ll have to settle for them.  Here’s my speech (and I’ll tell you the final vote at the end):


*


Good evening.  It’s an honor to be here tonight – and I wish to thank the Oxford Union and the Said Business School for making this debate possible.  It is especially appropriate that tonight, during the 10th Annual Silicon Valley comes to Oxford gathering – a program that has helped to make Oxford one of the entrepreneurial centers of Europe – we debate a topic that is, ultimately, about the nature of entrepreneurship itself.


The topic at hand is whether Silicon Valley has been replaced by something called “Green Valley.”  I’m in a unique position to talk about this topic because I have probably written more about Silicon Valley than anyone alive.  But even more important, I’m pretty sure that I’m the only person in this hall who ever got drunk with the man who named Silicon Valley and heard the actual story of that naming.


I think that understanding the etymology of the term “Silicon Valley” is crucial to this debate because, as we have already seen, our worthy opponents are attempting to narrow that definition in order to make it easier to topple.  So, let’s go back to 1970, when the term was coined.  Don Hoefler, before he became notorious as a technology gossip writer, was a reporter for Electronic News magazine.  That year, he was sent by the magazine to the Santa Clara Valley in the San Francisco Bay Area to do a series of articles on the explosion of new chip companies – companies like Intel, National Semiconductor and Advanced Micro Devices – that had spun out from the disintegration of Fairchild Semiconductor.


It was while he was working on one of the stories that Hoefler found himself sitting in the lobby of the Santa Clara Marriott hotel and overheard a couple of salesmen talking about the very subject he was covering.  “Wow,” said one of the salesmen, “There sure are a lot of new semiconductor companies around here.”  “Yeah,” replied the other, “This place is turning into a regular Silicon Valley.”  And with those words, Don knew he had the dateline for his series.  Needless to say, the phrase took hold.


 In other words, the name Silicon Valley had less to do with any fundamental role that silicon semiconductors played in the region, and more with the fact that chips just happened to be the hottest new industry around at that moment.  Had Don found himself in that same overstuffed chair in 1976, he might have named his series “Calculator Valley”, in 1979 “Disk Valley” and in 1984 “PC Valley.”  The name, Silicon Valley, therefore, was just a coincidence of time and place. 


 I might add that silicon-based companies weren’t even the dominant enterprises in the Valley in 1970 – test & measurement instrument companies like Hewlett-Packard and Varian were.  Strictly speaking, Silicon Valley has never really been “Silicon” Valley.


 So why then did the term stick?  It did because it captured the fundamental spirit of the region, as exemplified by that first generation of semiconductor entrepreneurs.  These men, like Bob Noyce, Charlie Sporck, Tom Bey, Andy Grove and Jerry Sanders, were technically brilliant, fiercely independent, fearless risk-takers, and almost insanely competitive.  They were wildcatters, cowboys – and Silicon Valley was their Wild West.  And they won – big – and in the process changed the world.  And when we speak of Silicon Valley today, we are talking about their attitude, not the fact that they made 16K bit CMOS DRAM memory chips.


 So, what was this Silicon Valley attitude? 


 First, it was entrepreneurial.  The founders of the semiconductor industry were obsessed with remaining masters of their own lives.  That’s why they walked out of fast-track careers with big East Coast companies and came to the Valley – and why, when the parent company of Fairchild refused to give them stock – that is, a stake in their own success – they walked out and started their own companies.


Second, it was independent.  The founders of the chip industry refused to be beholden to anybody.  When larger companies tried to buy them, they threatened to destroy their own companies rather than sell out.  When they sold devices to the military, they consistently refused to sell anything but off-the-shelf parts.  They reinvested dividends rather than give them to shareholders.  And most of all, until they grew too successful to remain invisible, they resisted any contact with the federal government.


 Third, it was innovative.  The chip industry was focused on technological and product breakthroughs – category creators – not merely incremental improvements.  That attitude infected the rest of the Silicon Valley – which is why we live in a world of personal computers, cell phones, the Internet and social networks.  None of that would have happened if the chip companies had settled for making the next generation of transistors just a little better than the last one.


 Fourth, and perhaps most important, it was dedicated to profitability.  The founders of the semiconductor industry didn’t begin with a vision of a better world, but of new levels of engineering performance for which customers would be willing to pay a premium price.  They had no choice but to do so, because the venture capitalists and other investors who underwrote them demanded a quick return on investment.  In the end they did change the world, but only because the bigger the revolution they could create in chips, the bigger the financial success their companies would enjoy.


Ladies and gentlemen, I would submit that these traits still characterize the Valley – and much of the rest of the global technology industry.  I see it every day in new entrepreneurial start-ups, both large, like Qik, and small, like the hundreds of individuals writing new Android and iPhone apps at this very moment.  That was Silicon Valley then, and that is Silicon Valley now – only the technology has changed, not the attitudes.


My honored opponents wish to convince you that all of this has now been abandoned, replaced by a fundamentally new focus in the Valley on alternative sources of energy and power conservation – a revolution so complete that the very name of ‘Silicon’ Valley, and the attitudes it represents, is now obsolete.


I submit to you that just the opposite is the case.  That, in fact, most current green companies and initiatives are antithetical to the Silicon Valley attitude:  they are born beholden to the government or non-profit foundations, they place political goals and utopian visions before pragmatic solutions, they mostly favor incremental improvements over free-swinging innovation and category creation, and most will never, ever, be profitable.  In the end they are built as much on dreams and a vision of a better world as on the down-and-dirty job of attracting customers, capturing market share and turning a profit.


Indeed, the best and most successful ‘green’ companies of Silicon Valley, like T.J. Rodgers’ SunPower and Joe DiNucci’s Coulomb Technologies – and perhaps soon Elon Musk’s Tesla Motors – have succeeded precisely because they are more ‘silicon’ than ‘green’ in both technology and attitude.


And so, while it may be nice to dream that somehow Silicon Valley has abandoned its fifty-year-old cut-throat culture to dedicate itself from now on to energy conservation and a better world, and as lovely as the title “Green Valley” sounds, wishing won’t make it so.  And I submit, given the real Silicon Valley’s track record for successful innovation, that its old attitude – as found in places like Tom Hayes’ Marvell Semiconductor, the greenest chip company on the planet, but still a silicon enterprise first and last – will, in the end, make the world more Green than all of those new Green companies my worthy opponents are celebrating tonight.


 So, ladies and gentlemen, not only isn’t Silicon Valley dead, it is more alive than ever – and not just in Silicon Valley, California but in places like Oxford, England, and in all of the technology communities of the world. 


 And thank God for it!


*


Okay, I’ll admit the end was a little much.  But it wasn’t nearly as over the top as Dr. McCulloch comparing the Silicon Valley philosophy to South African apartheid . . .or, frankly, to Reid telling those audience members voting Aye to throw away their cell phones and close their Facebook accounts — or even Tom telling the crowd that Marvell would be accepting job applicants behind the Noes door.


In the end, we Noes won, by almost two-to-one.  We all headed off to the drawing room to celebrate with even more champagne.  The Oxford Union has, notoriously, on occasion voted against God and Country, but on this night at least, it still voted with its feet for Entrepreneurship.

Published on November 29, 2010 16:08
