Colin Strong's Blog, page 4

February 16, 2015

Bats, big data and market research



It’s well known that the marketing industry has serious ambitions for big data, hoping it can generate a fundamentally new way of engaging with consumers through ever more accurate prediction and targeting.  A huge industry has built up around this, with large-scale investment being undertaken by brands to realise the promise of enhanced growth and profits.  But just how far can data go in helping us to understand consumers?  Is there a point at which our ability to leverage growth from big data starts to plateau?  Undoubtedly there are opportunities, but to make best use of big data we also need to understand the boundaries.


To explore this let’s turn to philosopher Thomas Nagel, who famously asked ‘What is it like to be a bat?’ (stay with me here!).  His paper brought into the mainstream the idea that an organism is conscious only if ‘there is something it is like to be that organism’.  To unpick this, what he means is that we need to recognise the validity of our own experience of the world.  A pebble, for example, does not have an experience of the world, but as humans we do.  Knowing what that subjective experience is can, of course, be hard to establish.  In his thought experiment, we may know everything there is to know about a bat but still not know what it is like to be a bat.  This notion is core to Nagel’s book, The View from Nowhere, in which he argues that our perceptions cannot be reduced to an objective view of the world and that an objective viewpoint does not replace a subjective perspective.


In a related thought experiment, a fictional scientist called Mary studies the world from a room which is purely black and white.  She has full knowledge of how our sight functions to see colour, and indeed of the physics of how colour is created, but can only see a world of black and white in her room.  It is only when she leaves the room and sees the world in all its glory that she has the direct experience of what it is like to see colour.


So what does this have to do with data?  There are many implications, but at its heart is the point that whilst data represents a view of the world, it does not replace the subjective, human perspective.  According to Nagel there is no such thing as a completely objective viewpoint; there is no ‘view from nowhere’.  This matters because it is tempting to think that data gives an ultimate insight from which we can shape marketing strategy.  We may know everything there is to know about a shopper from their behaviour, but we do not know what it is like to be them.  Yet it is this experience that surely shapes much of our consumer behaviour.


So how does this fit into the market research landscape?  I would argue that big data has led sentiment in consumer research to switch from the recording of experiences (market research) to the observing of behaviours (big data).  However, if we accept Nagel’s argument, then we can be as creative as we like with big data but it is not necessarily the full picture; as such, we may start to find that if we do not consider consumers’ subjective experiences then we may come to fundamentally wrong-headed conclusions.


I am a big advocate of the use of big data in market research, but to me this is a core reason why it needs to sit alongside other approaches.  Perhaps we need to think about a continuum of ‘experience capture’, running from qualitative research (within which there is a range, of course), through the more nuanced end of survey research, to big data analytics.  Seen this way, we need a portfolio of approaches to properly understand consumers.  Big data can deliver big insights, but we ignore the understanding of consumer experience at our peril.



A good article exploring some of these philosophical concepts can be found here.


Published on February 16, 2015 06:50

February 9, 2015

What’s the future for CPG in a data economy?



Data is fundamentally changing the nature of our relationships.  The use of social media now underpins the way in which we talk to each other, whilst ecommerce and advertising platforms are changing the way we communicate with brands.  At times, however, the consumer packaged goods (CPG) market appears hesitant about fully engaging with the emerging data economy.  Whilst some brands such as Coca-Cola may have embraced social media, the number of CPG brands whose customer relationships are primarily mediated by data rather than via a retailer appears limited.


Data now feels so central to business success that, without an ongoing, data-mediated relationship with their customer base, data-poor brands may struggle to compete effectively.  This is a structural challenge that manufacturers face in a variety of sectors – the intrinsic nature of a product-based customer relationship is that very little data is typically generated for the manufacturer.  The retailer of the product will usually own the relationship and therefore the customer data.


This is one of the reasons why the creation of services rather than products appears so attractive.  For brands offering a service, there are many more opportunities for relationships with consumers, and all the data capture that brings.  It is of little surprise, therefore, that in some categories products are rapidly being overtaken by the development of services.  This is largely driven by technology creating new capabilities, but the opportunities promised by data capture are starting to be a driver in themselves.  There are other reasons to develop services – they are typically associated with a range of positive attributes such as higher margins, stable revenue streams and a greater ability to respond flexibly to customer needs.  But customer data is now rapidly becoming a driver for brands to launch new services.


We are seeing the ‘servitisation’ of products all around us, notably in consumer categories such as music (iTunes and Spotify) and books (Amazon Kindle) but also in business services with companies such as Xerox moving from photocopiers to document services and IBM moving from hardware and software goods to business solutions.


But there is one sector that has appeared more resistant to the move from products to services – CPG.  Historically, CPG brands’ relationships with their consumers have been mediated via retailers, who hold huge amounts of customer data.  Of course this is shared with brands (anonymised and at a price), but the CPG companies’ relationship with consumers is still largely mediated via retailers.  This is changing, with companies like Kimberly-Clark building their own databases so they can track the behaviour of their customers.  They are using this to explore the way in which people redeem digital offers, but also how widely they share them and through which channels.  But given that these sorts of databases are generated through marketing activities such as competitions and social media, or bought in via database marketing companies, it can be hard for any brand to maintain that relationship.  Indeed, there is an argument that these work more in favour of the consumer than the brand, as active participants are generally brand loyalists who would probably continue to purchase the product anyway.


Surely there are bigger opportunities available to CPGs in creating services from their highly successful products.  This is an opportunity that has not escaped the notice of some new competitors that CPG brands should be worried about.  Dollar Shave Club is one such example: for a monthly subscription, this US-based company delivers razors and other personal grooming products by mail.  Since launching in 2012 it has gained more than 700,000 subscribers.


Another well-known example is, of course, Nespresso (a Nestlé brand), the coffee machine and capsule business.  Its coffee makers can be purchased across a variety of bricks-and-mortar and online channels, but to purchase the coffee capsules you have to sign up to the Nespresso Club, thus participating in an ongoing service relationship.


An interesting variation is the US company Blue Apron.  It delivers ‘meal kits’ – chilled boxes of pre-measured ingredients with recipes.  Members subscribe to the service for about $60 a week, for which typical customers get three meals for two people.  Naturally there is a choice of recipes, which changes each week, and the company emphasises the freshness and provenance of its ingredients.


An interesting UK example is Graze.  They offer subscriptions to boxes containing healthy snacks that are delivered weekly.  There is a wide selection of snacks to choose from and there is a strong social media element where customers can let Graze know, amongst other things, what new snacks they would like to see offered.


Although these are small fry in comparison to the dominant CPG brands, they nevertheless turn existing business models on their heads in an extremely interesting way.  And this has the potential to be a real threat, because as a category moves into a digital environment, the rules of the game change completely.  Consumer decision-making becomes much more opinion-led via social media, potentially diluting the brand equity carefully nurtured via expensive advertising.  Niche brands can quickly gain market share via smart manoeuvring with search rankings, rather than making a heavy investment in shelf space.  The barriers to entry are also much lower, with new competitors bringing in novel product and marketing thinking, making life hard for established brands used to bricks-and-mortar marketing strategies.


There clearly are challenges for CPG brands in this space, not least because some of these service propositions could come dangerously close to being alternative channels to market rather than new revenue streams.  When Procter & Gamble collaborated with Amazon’s ‘Amazon Mom’ nappy subscription service in the US, the Wall Street Journal reported that the major US retailer Target retaliated by downgrading P&G’s positioning within its stores.  It’s a delicate balancing act for CPG brands, as their revenues are heavily dependent on relationships with retailers who may see experimentation with these sorts of service propositions as a threat.


One thing is clear, the idea of CPG as a service has potential to disrupt the entire category.  This is a real opportunity for brands as data-mediated services offer a means of generating growth through deep customer relationships.  But there is also a serious threat for CPG brands that do not move quickly enough as the traditional rules of the category look set to change beyond recognition.


This post first appeared on Wired Insight Innovation blog 


Published on February 09, 2015 01:20

February 6, 2015

Do I really want what I think I want?



I am pretty convinced that the shiny new tablet PC I have been playing with in the shop around the corner from work will bring me a lot of satisfaction – just think of all the new things I will be able to do.  And that new on-demand film service I signed up for at the weekend will make a real difference to the happiness of me and my family.  In fact, a lot of the purchase decisions I make are made because I think they will make me happier.  Indeed, our quest for future happiness seems to figure particularly strongly in many technology purchases, where consumers often make a huge investment in the happiness that a new device or service will bring them in the future.


Of course this will typically be relevant to discretionary purchases, but the choices made within even non-discretionary purchases can be for this reason (e.g. by choosing this broadband supplier I will have greater peace of mind = happiness).


The difficulty comes, of course, when you find that you don’t actually like the item that you wanted so much.  Or maybe you like it but you don’t seem to like it as much as you thought you would.  This mismatch between ‘wanting’ and ‘liking’ has been called ‘miswanting’ and has big implications for technology markets.


So what causes this discrepancy between wanting and liking?  Surely in a rational world we would be well-informed and sensible consumers who purchase rationally, so that we end up liking the things we so desired originally?  Psychologists Daniel Gilbert and Timothy Wilson identified three ways in which miswanting can occur:



Imagining the wrong event:  Research suggests that people tend to imagine a particular scenario for their ‘want’ and underplay alternatives.  So, I might imagine the purchase of a new tablet to be my passport into a new world of entertainment.  However, I may then find that the format of the new device does not really make any difference to my enjoyment of what turns out to be pretty much the same content I could access on my now-abandoned laptop.  The very different scenario I imagined was, in reality, at odds with my actual experience.



Using the wrong theory:  There may, of course, be situations that we are more familiar with and whose unfolding we can predict.  However, even though we know precisely what ‘event’ will be waiting for us, we can still end up liking the subsequent purchase less than anticipated due to our inadequate theories about ourselves.  To use our tablet example, I may purchase a tablet because I think I want an entertainment device, accepting that the device will not be optimal for running Microsoft Office.  However, after a period of using the tablet I realise that I actually have to spend most of my time using it for word processing and spreadsheets, and only occasionally find time to watch movies.  My theory about myself and how I would use the device was therefore wrong; what I actually needed was a laptop that could do both.  There are many examples of this mechanism at play, as my overgrown allotment and lapsed gym memberships will testify.



Misinterpreting feelings:  When we imagine a future event we will often have an emotional response.  I may consider that the new tablet PC will make me happier, as I will be able to video call my family.  That evokes a warm, pleasant emotion that I tend to assume will be the same emotion I will feel when I am experiencing the video call.  The difficulty here is that the emotion I experience when ‘wanting’ can be influenced by all sorts of factors – a TV advert, a good night out with friends, a promotion at work.  We are all aware of the dangers of supermarket shopping when hungry, for example!  I may find that whilst it’s pleasant to talk to my family over my tablet PC, I quickly realise that the format makes no more difference to my enjoyment of the experience than an ordinary phone call.  We cannot, therefore, always tell whether how we feel about a future event is due to the event itself or to other factors.  As such, it is easy to develop a miswant.

And miswanting does not just have implications for the way we think about things in the short term.  This is because we often overestimate the way in which things will impact on us in the long term (both positively and negatively).  So we tend to ‘over-focus’ on the impact of a new tablet on our future state of mind and give it a disproportionate influence on our predictions of our longer-term happiness (above and beyond the myriad other things that can influence our happiness).  But the reality is that humans are extremely adept at assimilating new experiences, so our reactions to any new device, service or experience are typically not as enduring as we might expect.  We can thus have a scenario where a significant level of wanting has built up that no experience is going to deliver in the long term, as we simply get used to the new device.  The same principle applies to lottery winners and those who move to sunnier climates, although knowing this does not stop me from wanting both!


So what does this mean for brands selling technology products and services to consumers?  There are clearly signposts for brands that wish to make a quick sale – encourage consumers to imagine a very aspirational setting for the use of the product, throw into the mix ideas for using it that appear attractive and set it all to some feel-good music.


And this may be fine if these are consistent with the way in which consumers live their lives but if not, there is a danger of creating a miswant.  In the longer term, creating miswants can seriously damage the brand as expectations that have been raised and then not fully delivered on will create a negative reaction.


The huge body of research on the ways consumers forecast their future happiness has a major implication for brands wishing to manage for long term success:



 understand your consumers’ needs.  This deceptively simple recommendation allows brands to effectively match the (often multifaceted) features and benefits of technology goods with the needs of target consumers.  In this way, the risk of consumers experiencing miswants based on inaccurate predictions of events or erroneous theories about themselves will be lowered (it is impossible to eliminate them altogether, of course)



take care when interpreting consumer emotions related to your proposition.  These are highly vulnerable to the context in which consumers find themselves and as such it is possible to misinterpret whether the emotions that are being expressed in relation to the proposition will be pulled through to the actual experience



market your products and services in a way that engages with genuine consumer needs, rather than focusing on generating heightened emotional states.  This will help to eliminate the miswants brought about by consumers misinterpreting their feelings.


Of course, these recommendations tend to go against the flow of much of today’s sales and marketing practices but it’s important to remember that truly great brands are built on fantastic user experiences that are intimately shaped to meet the real needs of consumers. Market research has a critical role to play in generating an intimate understanding of consumer needs, perceptions and emotions and reducing the potential for miswanting.


And perhaps the miswanting mechanism may at times actually be beneficial in technology markets, where devices are enabling new and unexpected behaviours.  What can happen is that the ‘anticipated events’ for the use of the device are far surpassed by the much broader portfolio of events that the device is actually able to deliver.  So, for example, when my colleague Ryan bought an e-book reader he thought it would just be a more convenient way to carry and read the usual sorts of books he purchased.  What he actually found was that the e-book reader facilitated the discovery of new books and authors, which means he now reads a lot more than he used to – a highly positive outcome for him.  Perhaps miswanting can therefore operate in a disproportionately positive way when it goes in the consumer’s favour.


So whilst miswanting has dangers for brands, maybe there is also the potential for technology brands to use the mechanism to their advantage – not only to satisfy their customers but to delight them.


This article was first published on GfK’s blog.


Published on February 06, 2015 02:17

December 16, 2014

Is it time to rethink time?

All the signs are that we are becoming an impatient nation, with attention spans that are rapidly declining.  This is certainly the position of noted technology sceptic Nicholas Carr, who cites a variety of evidence to support this claim.  First, back in 2006, researchers found that a third of online shoppers abandon a retail site if its pages take four seconds or more to load.  Subsequent studies by companies like Google and Microsoft found that a delay of just 250 milliseconds in page loading is enough for people to start abandoning a site.  Considering that it takes about 400 milliseconds to blink an eye, this does not allow much time.  And a more recent study of online video viewing by researchers Shunmuga Krishnan and Ramesh Sitaraman suggests that advancing technology is reducing our patience.  They examined over 23 million online video views and found that people rapidly started abandoning a video after a two-second delay.  But perhaps the more interesting finding is that they identified a relationship between higher connection speeds and higher abandonment rates.  So, as Carr points out, ‘As we experience faster flows of information online, we become less patient people’.  This certainly supports the concept that the tools we use affect the way we see the world: ever faster connectivity and processing speeds are reducing our patience and attention spans.  On this basis we feel we have ever-decreasing amounts of time available, and so there is an arms race among advertisers and technologists to get ever more efficient at getting a message across as quickly as possible.


But is it really this simple?  We don’t talk very much about time, perhaps because we are a little like fish swimming in water.  It’s such a fact of life that we don’t even notice it, and when we do, we assume everyone else thinks about it the same way we do.  But as Carr notes, Henry James pointed out in 1890 that ‘Time seems subject to the law of contrast.’  In other words, the way we perceive time is context- and person-dependent.  And if this is the case, is it something that advertisers need to consider more carefully as part of their repertoire?


In his essay ‘Brain Time’, David Eagleman cites many examples of the way in which time is actually a brain construction.  He notes how we have all experienced, when glancing at a clock, that the second hand sometimes seems to take longer than normal to move – as though the clock has momentarily stopped.  Or, when something terrifying happens, time feels as if it is slowing down.  In the laboratory it’s possible to distort perceptions of time using rapid eye movements, or after watching a flickering light.  If we introduce a delay between an action and its sensory feedback, it can feel as if the timing of the action and sensation is actually reversed.  So time is not as straightforward as it first appears; our perception of it seems eminently malleable.


One of the leading thinkers in this area is Philip Zimbardo.  He identified three broad orientations in which we think about time – past, present and future.  Someone who has a past orientation tends to believe that what happened in the past influences their present thoughts, feelings and behaviour.  Those living in the present tend not to consider that what they currently do will have any implications for them in the future, and those with a future orientation tend to defer gratification for future rewards.  It’s easy to see that in the West we are typically future-oriented, and as such we tend to be impatient and time-obsessed, as the ‘present’ is always a means to an end at some point in the future.  Other regions, such as Mexico, famously have a mañana attitude based on the feeling that, as we cannot affect the future, we will do today only those things that must be done today.  So, fundamentally, the way we think about time is very much based on socio-cultural influences.


And what about the situations in which we find ourselves?  One of the key findings in time research is that time slows down when we are engaged in new experiences and environments.  So go away for a week’s holiday, doing a lot of different things, and it feels as if you have been away for much longer.  On the other hand, time passes much more quickly when you are engaged in an activity that occupies your attention.  This may be the active absorption that psychologist Mihaly Csikszentmihalyi calls ‘flow’, when we are concentrating on a task such as playing a musical instrument, or passive absorption, such as when we are watching television or browsing online.


So it is little wonder that technology is shaping the way in which we experience time, as we are often in a state of passive absorption when using our many different devices.  In this state time passes quickly and, as Carr points out, the changing nature of the technology itself is shaping expectations about time passing.  Our online environment, combined with our Western attitude towards time, is causing us to feel increasingly time-poor, giving us shorter attention spans and making us more impatient with delays – and, ultimately, leading us not to engage with the subject matter.


So what is an advertiser to do?  On the one hand it’s tempting to experiment more with the online advertising medium to see if there are ways to refocus consumers – to attempt to capture that same feeling of the second hand of the clock having frozen.  But ultimately it’s perhaps about recognising that brands need to create relationships and experiences where there is that sense of flow for consumers.  This is the rationale for brands such as Nike and Apple focusing their retail outlets on experience rather than shifting boxes.  And it is why, by incorporating singing and dancing into the onboard recitation of safety rules, Virgin Airlines tries to shift customers into a more positive bond with the company.


Declining attention spans have perhaps drawn our attention to the way we think about time.  Maybe now marketers can spend some quality time thinking how to make best use of it.


This article first appeared in BRAD 60th Magazine


Published on December 16, 2014 07:38

June 3, 2014

The big data challenge: From cargo cults to unknown unknowns



The era of big data has firmly reached the media industry with Sky’s AdSmart and YouView being prime examples of the way in which the traditional broadcast advertising model is being disrupted by data driven technology platforms.  Big data surely offers brands real opportunities to enhance their business and drive up both revenues and margin.


But in the midst of the optimism and excitement I often hear media brands expressing concern that they don’t make enough use of the data that is available to them.  On the one hand, all too often they stick to the familiar, using traditional metrics which have traction in the business, selecting data points because they can rather than because they are sure they are the right ones.  Conversely, brands also seem to assume that simply having swathes of data to analyse will necessarily lead to findings that are grounded in reality, that really make a difference.  But simply running off vast numbers of correlations, or looking for patterns in the data, has its own problems in terms of false positives – an issue that Nate Silver highlighted very clearly some time ago.


For me the concern here is that if we are not careful we can fumble the big data opportunity and become members of the ‘cargo cult’.  By that I mean falling victim to the illusion that something is scientific when it has no actual basis in fact.  The term comes from a supposedly true story of a group of islanders in the South Seas who watched the American military busily build and maintain airstrips on their islands as bases from which to defend against Japanese attacks following Pearl Harbor.


After the war, and the departure of the Americans, the islanders wanted to continue to enjoy all the material benefits the American airplanes had brought: the ‘cargo from the skies’.  So they built replica runways, a wooden hut and a wooden headset for their version of a controller, in the hope that it would all return.  But, of course, the airplanes never came, even though the islanders went about it ‘scientifically’.  In other words, the data they used as an input was flawed.


The moral of this story is that without properly considering the context, or really figuring out what questions we want answered, the data we collect can often prove meaningless.  As Alastair Croll and Benjamin Yoskovitz point out in their book Lean Analytics, it’s far too easy to fall in love with ‘vanity’ data points.  These are the ones that consistently move up and make us feel good but don’t really help us make decisions that affect actual performance.  Well-known examples include number of hits, visits, followers, friends, likes, time spent on site and so on – all cases where the data collected often (but not always) bears no real relationship to the success or otherwise of the business model.
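To make the distinction concrete, here is a minimal sketch of one simple sanity check on a candidate metric: correlate it against an outcome the business actually cares about.  The figures and metric names are entirely hypothetical, and correlation is only a first filter, not proof either way – but a metric that climbs steadily while conversions stay flat is a strong candidate for the ‘vanity’ bin.

```python
from math import sqrt
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly figures: page hits climb steadily (feels good to
# report), while actual conversions barely move.
hits        = [1000, 1200, 1500, 1900, 2400, 3000]
conversions = [40, 42, 39, 43, 41, 40]

r = pearson(hits, conversions)
print(f"hits vs conversions: r = {r:.2f}")
```

Here the correlation comes out close to zero: on this (made-up) evidence, hits tell us nothing about performance and earn no place in reporting.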


So we need to find ways to sift out these vanity metrics, helping us to organise our thinking, navigate the mass of data and avoid joining the cargo cult.  Fortunately, there are a number of frameworks available to help identify the appropriate data for your own organisation and your particular goals.  One such framework comes from the former US Secretary of Defense Donald Rumsfeld, who famously said:


“… there are known knowns; there are things we know that we know. There are known unknowns;  that is to say, there are things that we now know we don’t know. But there are also unknown unknowns – there are things we do not know we don’t know.”


He made this comment at a press briefing in 2002, addressing the absence of evidence linking the government of Iraq with the supply of weapons of mass destruction to terrorist groups.  His somewhat unusual phrasing got huge coverage, to the extent that he used it as the title of his subsequent autobiography, Known and Unknown: A Memoir (Rumsfeld, 2012).  Opinions were divided about his comments: they earned him the 2003 Foot in Mouth Award and were criticised as an abuse of language by, among others, the Plain English Campaign.  However, he had his defenders – among them Croll and Yoskovitz, who made good use of Rumsfeld’s phrase to design a way of thinking about data.  Their view is that analytics have a role to play in all four of Rumsfeld’s quadrants:



Things we know we know (facts). Data which checks our assumptions — such as open rates or conversion rates.  It’s easy to believe in conventional wisdom that ‘we always close 50% of sales’, for example.  Having hard data tests the things we think ‘we know we know’.
Things we know we don’t know (questions). Where we know we need information to fill gaps in our understanding.
Things we don’t know we know (intuition). Here the use of data can test our intuitions, turning hypotheses into evidence.
Things we don’t know we don’t know (exploration). Data which can help us find the nugget of opportunity on which to build a business.

There is something quite appealing about this approach, not least because of the way it engages an audience to think about the different challenges of data.  It also introduces the concept of exploratory analysis, an important distinction to make but one which is often confused.  ‘Reporting’ data points support the business in optimising its operation to meet the strategy, while ‘exploratory’ data points set out to find the nugget of opportunity.  These ‘unknown unknowns’ are where the magic lives.  They might lead down plenty of wrong paths, but hopefully also toward some kind of ‘eureka!’ moment – a brilliant idea that disrupts markets.


Nevertheless, in any kind of business, both types of data analytics are of course essential.  In smaller start-ups the balance will often be tilted more towards the ‘things we don’t know’, while in more established businesses there may be more focus on measuring the ‘things we know’.  But any business ignores either side of this at its peril.


Media brands often struggle to determine the best way to navigate the immense oceans of data at their disposal.  By taking a more structured approach to the design of the data analytics process, brands can avoid becoming members of the Cargo Cult, with all the business implications that can lead to.


 


Feynman, Richard, Surely You’re Joking, Mr. Feynman!, W.W. Norton & Co., 1997


Published on June 03, 2014 08:45

May 30, 2014

Are brands entering an Uncanny Valley?



Marketers are increasingly engaged in a technology arms race, finding ways to derive ever more intimate details of consumers so that advertising can be targeted with pinpoint accuracy.  This explains why, according to the IAB, brands spent a record £3.04 billion on online advertising in the first half of 2013.  And because of the positive ROI implications of targeted advertising, we are now seeing brands finding ways to turn what were previously offline activities into data.  So, for example, it has been reported that Tesco intends to install Amscreen's facial recognition software at their petrol pumps, enabling advertising to be targeted based on personal characteristics such as age and gender.


But just how comfortable are consumers with these sorts of activities?  Amid the excited talk of algorithms and programmatic advertising it is easy to forget that at the end of the chain sits a sometimes emotional, often irrational and typically unpredictable human.  And they are not always convinced about the way in which technology is being used to target them.  At GfK we found, for example, that 68% of UK online consumers find it creepy the way that brands currently use information held on them.  And perhaps a tangible manifestation of this is another recent finding that 38% of consumers now use some form of online ad blocking, sending a very strong message about their unwillingness to receive personalised marketing.


It may be possible to claim that these are merely the symptoms of a mindset that is taking time to adjust to new forms of advertising.  After all, it can be hard to find people who warm to TV advertising, whilst there is a long history of research demonstrating that consumers both accept it and are influenced by it.  However, is there something different about advertising that targets consumers based on their personal details?  It is a question worth asking, as there is a growing debate about whether brands are peering into an 'uncanny valley' where consumers come to reject highly personalised marketing approaches.


The term 'uncanny valley' was first used in 1970 by Japanese roboticist Masahiro Mori, who noted that although we tend to warm to robots that have some human features, we tend to be disconcerted by them if they start becoming too realistic.  And whilst there is little empirical evidence to support this claim, it has nevertheless gained traction ever since.


The uncanny valley effect has since been blamed for the failure of a number of films that used CGI characters that were very human-like whilst the audience remained aware that they were in fact animations.  The Polar Express is often cited as an example where the effect left it with lacklustre box office sales, whereas films such as Brave or The Incredibles used characters that were clearly not human and fared much better.  Circumstantial evidence to be sure, but interesting speculation nevertheless.


So do brands face their own uncanny valley of personalised advertising?  In principle we may well expect to see the same issues that Mori described, given that the advertising comes via computers generating increasingly personal interactions based on ever more intimate knowledge of their targets.  From a brand relationship perspective, it perhaps reflects the way in which personalisation creates a basic imbalance between the players.  Whilst the brand gets increasingly personal with the consumer based on all manner of information it holds, the consumer is unable to generate the same intimacy with the brand, which can lead to a lack of trust and to suspicion.  Just as we are likely to be suspicious of people we barely know who seem to know all about us.


This may then explain why we tend to find such negative reactions to personalised marketing.  A possible example of the uncanny valley in action comes from Qantas: flight attendants now have very detailed data on the airline's highly valued frequent flyers displayed on their onboard tablets.  Despite this being something actually sought by Qantas staff, they found it difficult to incorporate the information into their interactions in a way that felt natural.  So instead of making the airline's most valued customers feel looked after, this data-driven approach too often 'creeped them out'.  In another example, Urban Outfitters attempted to personalise their site so that the website's product displays matched a user's gender.  Instead of appreciating the move, many of their consumers felt it came too close for comfort.


Research we undertook at GfK has identified an 'uncanny valley' phenomenon in relation to marketing: initially consumers enjoy the personalisation of marketing communications, with brand attachment steadily improving as personalisation increases. However, there then appears to be a line which is crossed where there is too much personalisation for consumers' comfort, and brand attachment rapidly declines, falling into an 'uncanny valley'. This has to be a serious concern for brands, who often assume that more personalisation can only ever be a good thing. The early signs from our research clearly indicate that this is not necessarily the case.
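One way to picture this pattern is as a curve that climbs and then collapses. The sketch below is purely illustrative: the threshold, slopes and numbers are invented to show the shape of the effect, and are in no way fitted to GfK's actual data.

```python
def brand_attachment(personalisation):
    """Toy sketch of the 'uncanny valley' pattern: attachment climbs
    with personalisation up to a comfort threshold, then falls sharply
    into the valley. All numbers are invented for illustration.

    personalisation: 0.0 (none) .. 1.0 (hyper-personalised)
    returns: attachment score between 0.0 and 1.0
    """
    threshold = 0.6  # hypothetical comfort line
    if personalisation <= threshold:
        return personalisation / threshold  # steady climb to 1.0
    # past the comfort line, attachment collapses rapidly
    return max(0.0, 1.0 - 4 * (personalisation - threshold))

for p in (0.0, 0.3, 0.6, 0.8, 1.0):
    print(p, round(brand_attachment(p), 2))
```

The point of the shape is simply that 'more' is not monotonically better: attachment at maximum personalisation can end up below attachment at none at all.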


And where brands tip over into an uncanny valley may well vary depending on a variety of factors.  Some categories may be more associated with hyper-personalisation and therefore more accepted: what is appropriate for Google may not be right for an FMCG brand.  And we also find very clear differences in receptiveness to targeted advertising across different population segments, partly but not wholly based on demographics such as age, gender and lifestyle.


Marketers have enthusiastically adopted the mantra that personalised marketing is the right strategy for their brand.  And whilst this often has much to recommend it, there is currently little understanding of the optimal way it should be deployed.  It is increasingly looking as if marketers need to start asking where the uncanny valley begins for their brand, because your brand might just be doing the opposite of what you intend and actually turning consumers off.


A version of this article first appeared on CMO.com


Published on May 30, 2014 09:04

May 28, 2014

When doing nothing works in your favour



There is a fundamental problem at the heart of many industries – consumers seem to be unwilling or unable to do anything to switch their supplier despite the savings or enhancements to their service that could accrue by doing so. For example, less than 5% of consumers have switched their current account in the last year and despite somewhat higher levels of churn in energy and telecoms, it’s overall still pretty anaemic.


Is keeping it simple the answer?


So why don’t consumers switch? If you asked people would they be willing to put in an hour or two of work to save what could amount to in some cases hundreds of pounds then it seems like a no-brainer that they would do so. Regulators have long been scratching their heads on this issue. They increasingly see the problem as being due to the way brands in these sectors often seem to build complexity into their offerings, making it harder for consumers to compare propositions. The Office of Fair Trading has called this aConfusopoloy, effectively accusing suppliers of attempting to keep consumers confused and therefore adverse to engaging in the market. Regulators have tried to fix this by simplifying things – so Ofcom is proposing to simplify switching across telecoms markets, the Payments Council’s Current Account Switch Service aims to make it easier to switch bank accounts and Ofgem’s Retail Market Review is aiming to take steps to help consumers find better deals.


More radical solutions are called for


But, as a recent report published by Consumer Futures points out, all these reforms seem to suggest that eager consumers are waiting in the wings for the processes to become easier, whereupon they will flood the marketplace seeking out new suppliers. The problem is that this places the burden purely on the shoulders of the consumer, the assumption being that once the clouds are lifted, our rational economic sides will kick in and propel us to make sensible decisions. But we know it is not that simple: a big part of the reason for the lack of activity is that we don't think rationally, so it's not just about making the process simpler. If it were, then surely a smaller brand seeking to generate market share would have cleaned up long ago.


Wired for inaction


Unfortunately, it seems that we are wired for inertia, with behavioural economics showing us time and again that the way we think about these decisions favours doing nothing. For example, 'hyperbolic discounting' means we tend to over-emphasise short-term costs relative to long-term benefits. So the hassle of having to find our usage details and search for the best offer pretty much always outweighs the attractiveness of savings which may accrue over the course of the year. Another example is 'regret aversion', where people worry about making a decision that they fear they will come to regret in the future. People are concerned not only with what they have but with how it compares to what they might have had, so again they do nothing. There are many other examples of the ways in which our brains are wired in favour of the status quo.
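Hyperbolic discounting is often written as V = A / (1 + kD): the present subjective value V of an amount A shrinks sharply with delay D. The toy calculation below, with entirely made-up figures for the saving, the delay and the impatience parameter k, shows how a decent annual saving can 'feel' smaller than a modest hassle cost felt today.

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """Present subjective value under hyperbolic discounting:
    V = A / (1 + k * D). Higher k means a more impatient discounter.
    The value of k here is illustrative, not an empirical estimate."""
    return amount / (1 + k * delay_days)

# A GBP 200 annual saving realised over the coming year (call it
# ~180 days away on average) versus roughly GBP 30 worth of
# immediate hassle comparing tariffs today. All figures invented.
saving_now_worth = hyperbolic_value(200, delay_days=180)  # = 20.0
hassle_cost = 30  # felt immediately, so undiscounted

print(saving_now_worth < hassle_cost)  # True: so we do nothing
```

On these (invented) numbers the £200 saving is subjectively worth only £20 today, which is why the hour of form-filling wins and the switch never happens.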


So it seems highly unlikely that regulators' efforts to simplify the decision process will make any real difference to the market; the problem is getting consumers to engage in the first place. But a new breed of companies is set to change all this: intermediaries which sit between brands and consumers to help find the best deal. Of course we are all used to the first generation of intermediaries, price comparison sites. But in their current form these are limited. They focus on price, one service at a time, and, importantly, don't always have consumers' trust that they will always see the best deal.


A disruptive innovation coming this way


The new breed of intermediaries has a much more rounded offer. They not only look at price but also at supplier performance from third-party rating sites, and they use personal data volunteered by individuals about their preferences. And as well as drawing on a wider range of information, the fundamental business model changes: it is no longer about one-off or occasional use but an ongoing relationship that checks you are on the right tariff with the right supplier, and not just for one but for a whole range of services. So you can be continually assured that you are getting the best deal on the terms you specify in advance (whether that be level of price saving, ethical suppliers, quality of service, etc.). And not only that, but these services increasingly do the actual switching on your behalf when a better supplier is found, thus removing another hurdle that otherwise means consumers do nothing. So these services turn consumer inertia on its head: the very fact that consumers are inert becomes something that works in their favour, as these 'choice engines' constantly work on their behalf without their having to do anything.
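In outline, a choice engine of this kind scores each offer against preferences the consumer has specified in advance. The sketch below is a minimal, hypothetical illustration: the field names, weights and tariffs are all invented, and real services would draw on live market and rating data.

```python
def best_tariff(tariffs, weights):
    """Toy 'choice engine': score each tariff against the consumer's
    pre-specified preference weights and return the highest scorer.
    Field names and the linear scoring rule are hypothetical."""
    def score(t):
        return (weights["saving"] * t["annual_saving"]
                + weights["service"] * t["service_rating"]
                + weights["ethics"] * t["ethics_rating"])
    return max(tariffs, key=score)

tariffs = [
    {"name": "BudgetCo",  "annual_saving": 120, "service_rating": 2.5, "ethics_rating": 2.0},
    {"name": "GreenPlus", "annual_saving": 60,  "service_rating": 4.0, "ethics_rating": 4.8},
]

# A consumer who weights ethical supply heavily picks GreenPlus,
# even though BudgetCo offers the bigger cash saving.
prefs = {"saving": 0.2, "service": 5.0, "ethics": 10.0}
print(best_tariff(tariffs, prefs)["name"])  # GreenPlus
```

The key design point is that the preferences are set once, up front; the engine can then re-run the comparison continually, which is exactly how inertia stops being a cost to the consumer.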


Examples of these sorts of intermediaries include Incahoot's concierge service, where you send in household bills and they take responsibility for overhauling your household finances. Another is billmonitor, which examines your historical mobile phone transaction data, identifying trends you may not be aware of, to put you on the best deal. Intently allows consumers to broadcast their purchase intentions to the market, letting sellers approach them in a privacy-friendly way.


There are a growing number of these sorts of services, and the market is surely set to grow as government regulation concerning personal data comes into force. In the UK the government's midata programme requires firms to make consumers' transaction data available in a machine-readable, transportable format. A similar programme called Green Button is underway in the US, providing consumers with online access to personal energy-usage data. These developments mean it will be easier than ever for the new breed of intermediaries to collect information on your behalf. Currently you either have to enter it yourself or operate in a somewhat legally grey area by handing over your online account details. Spreadsheets are available from suppliers in some sectors, but these can be pretty cumbersome and onerous for the average consumer to navigate.


Implications are immense


The implications of these changes are huge. If it works as many expect, then sectors which are often considered not to be working in the consumer's favour, such as the utilities market, can expect to be massively disrupted. There are serious implications for brands, which will need to rely less on consumer inertia sustained by obfuscated propositions, and rise to the challenge of differentiating themselves through new products and services as well as helping consumers make the best decision. And last but not least, consumers should expect these services to deliver real empowerment as well as money in their pockets. This is powerful stuff, which brands ignore at their peril.


This article first appeared on Huffington Post


Published on May 28, 2014 12:37

Music choice: Humans versus algorithms



Growing up in rural Devon my teenage experience with the world of music was limited to rifling through the discount cassette tapes at Woolworth’s or deliberately hunting out obscure bands by listening to John Peel’s radio show at night.  It’s so different for today’s teenagers who have a huge variety of ways to ‘discover’ music from YouTube through to any number of music streaming services with their own recommendation services. And yet, when we ask people where they find new music, the vast majority continue to say the radio.


In a world of multiple channels and algorithmic music recommendations, it seems strange that a broadcast medium would still be the place most people hear new music. Why are more people not finding new music from sites where they can curate their preferences and have matching music streamed to them?


I would argue that it’s partly because we still have a strong affection for music based radio – a low effort source of entertainment that is tapped into as part of our daily routine. The fact that all our cars are equipped with FM radio helps of course. And it’s also a communal experience – we typically enjoy a sense of sharing around music.


But it does raise the question of whether recommendation systems will ever be able to properly reflect our musical tastes. Because if they can, why are we not using them more?


A recent study by GfK found that music discovery via computer-based means, such as music streaming service recommendations, is still much lower than via the radio.  But as Nate Silver, a man you might expect to be a strong advocate of computer-based prediction, has said: "It is questionable whether any computer will be able to capture the subtlety and personalisation that real human beings demonstrate across social contexts."


So is music preference too slippery to really be able to pin down using technology? Are we simply too unpredictable, too dependent on the context we find ourselves in, to be able to accurately predict our preferences?


Duncan Watts, a scientist at Microsoft, explored this very issue, specifically wanting to understand the role of social influence in music preferences. To this end he created an artificial music site with over 14,000 consumers recruited from a teen-related website. All participants in the study were asked to rate a list of previously unheard songs from unknown bands, assigning each song a rating and then being given the option to download it. This corresponds to an 'individual' model of decision making, in which we make decisions without reference to others.  As might be expected, there was a normal distribution of preferences across the different songs, with the most popular songs being around three times as popular as the least popular.


The second group of individuals did exactly the same task but with a crucial difference: they could see the number of times that the songs had been downloaded by others. When the consumers could see the preferences of others (in the form of downloads) there was a significant shift in consumers’ preferences, with just a few songs being hugely popular and the majority of songs getting much lower ratings. In this scenario the ratio between the most popular and the least popular was at least thirty to one.


And the tracks that were popular when selected individually (i.e. without seeing what had been downloaded) bore little relationship to the tracks selected when consumers could see what others had downloaded. So we can see that interactions between individuals ended up drastically amplifying small fluctuations, producing outcomes that would have been very difficult to predict.
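The dynamic at work here can be sketched as a simple cumulative-advantage simulation. To be clear, this is a toy model invented for illustration, not Watts' actual experimental design: each simulated listener downloads one song, and in the 'social' condition a song's chance of being picked grows with its download count, so early luck compounds.

```python
import random

def simulate(n_steps, qualities, social=False, seed=42):
    """Toy download market. In the independent condition, choice
    probability is proportional to a song's intrinsic quality alone;
    in the social condition it is proportional to
    quality * (1 + downloads so far), so popularity feeds on itself.
    Purely illustrative; not Watts' MusicLab code."""
    rng = random.Random(seed)
    downloads = [0] * len(qualities)
    for _ in range(n_steps):
        if social:
            weights = [q * (1 + d) for q, d in zip(qualities, downloads)]
        else:
            weights = list(qualities)
        choice = rng.choices(range(len(qualities)), weights=weights)[0]
        downloads[choice] += 1
    return downloads

qualities = [1.0, 1.5, 2.0, 2.5, 3.0]  # hypothetical intrinsic appeal
independent = simulate(10_000, qualities, social=False)
social = simulate(10_000, qualities, social=True)

ratio = lambda d: max(d) / max(min(d), 1)
print("independent max/min:", round(ratio(independent), 1))
print("social max/min:", round(ratio(social), 1))
```

Run it and the gap between the most and least popular songs is far wider in the social condition than in the independent one, mirroring the shift from roughly three-to-one to at least thirty-to-one that the study reported.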


So, on this basis, music preferences may often have less to do with attributes of the music itself and more to do with a shared experience with others. Hence, whilst sites relying on algorithms based on musical features alone will help 'sift' the huge choice available, it looks debatable at best how effective they are at identifying music that really hits the spot, as this is primarily a social, rather than individual, choice.


We would therefore expect social recommendation to become an increasingly popular source of music discovery, and indeed the likes of Spotify, Rdio and Songbird now include 'frictionless' integrations with Facebook. We are also seeing the rise of purpose-built sharing sites such as 'This Is My Jam'.


Whilst these services theoretically help to deliver more powerful recommendations, at the moment many still create more noise than value because the user simply has too many social connections. You may not take your music cues from your family, for example. However, in principle, discovery that integrates the social cues of your friends, or of those you consider 'experts', would seem a sensible bet for the future.


If we accept the premise that music discovery is primarily a social phenomenon, then this explains not only the continued popularity of radio but also the rise of 'human-curated' sites such as Pitchfork. Targeted at music enthusiasts, Pitchfork's feisty music reviews provide guidance around independent music. There are also a number of bespoke expert services fulfilling a similar role; record store Rough Trade has started a subscription service whereby they send subscribers their six recommended tracks of the week.


It’s this kind of personalised, trusted experience (created by humans rather than algorithms) that may well end driving the market. And this has got more sophisticated as the likes of Shuffler.fm offer ‘audio magazines’ pulling together tracks from a wider selection of influential blogs from a virtual panel of music ‘tastemakers’.  And surely these very same principles explains the continued success of radio – after all, it is a human curated music channel which has a long history of social engagement.


So whilst huge investments are currently being made to distil preference from analysis of the tracks played, there is plenty of evidence that social effects are equally, if not more, critical in determining choice.  At GfK we are developing analysis techniques which explore the type of social influence at play, to assess whether it is mediated by, for example, 'indiscriminate copying' or an 'influencer effect'. By identifying the 'species' of social influence in operation for particular genres of music, it should be possible for recommendation sites to shape their offering more accurately.


I would argue that a better understanding of these effects and the associated marketing implications is an area which increasingly needs to be a focus of activity, not only for music sites but for content discovery more generally.


This article first appeared on MediaTel’s Newsline


Published on May 28, 2014 07:31