Quirk's Blog

Why Sony turned to user-generated content

Editor’s note: Brent Robinson is a social media strategist at Bazaarvoice, Austin, Texas. This is an edited version of a post that originally appeared here under the title, “Creating community with user-generated content: An interview with Sony.”

Sony Electronics is responsible for some of the most popular consumer electronics in history. Recently, they started a new program to showcase user-generated content in stores and on their Web site store.sony.com. We had the chance to sit down with Katie Babineau, senior communications strategist at Sony, to learn how Sony is using social content to highlight the Sony community and the meaningful role their products play in their customers’ everyday lives.

How do you think user-generated content (UGC) has changed the consumer journey over the past few years?

Brands are now really starting to listen and understand the power of UGC. I think user-generated content has helped transform the consumer journey in a way that forces brands to listen to the consumer, take that feedback and make improvements.

In a way, it has put the customer even more at the forefront of the journey by driving home the point that the customer is a valuable asset. These voices are very valuable, especially for companies that are producing products where this feedback is used to make improvements. That is pretty priceless.

What inspired you to incorporate user-generated content into your marketing, and did you have any challenges convincing internal stakeholders to adopt UGC?

What convinced us as a company and brand internally was really pulling up our hashtag #SonyAlpha and seeing the tremendous content. The talent and passion of our community shines through. We have a number of product-specific hashtags we tap for content on various product detail pages. We are lucky in the sense that our products have such passionate communities behind them that people are naturally creating content and sharing it online.

For us, it was all about tapping into that close-knit community, showcasing their talents and giving them the spotlight. Moderation obviously is a big element in allowing us to get that content on our e-commerce site and to know that there is a mechanism in place to filter for content that aligns with our brand and our various product groups. Once we were able to showcase some of that content and show it to our internal stakeholders, it seemed like a no-brainer.

There is a huge interest right now in photo-sharing platforms like Instagram as a source of user-generated content. Could you tell us a little bit about the experience you have had so far sourcing visual content from social media?

Sony is very fortunate to have such a passionate, talented community that is already creating content with our products. They want to share it and they are very excited to be highlighted across our channels. Sourcing content from social is very organic for us. A majority of the content that we’re seeing on Curations is content that’s actually taken and created with our Action Cam by our community. They’re owners; they’re users; they’re advocates, which is really cool to see.

There is also that sense of instant gratification for the content creators in our community. I think that goes hand-in-hand with how it is a mutually beneficial relationship. They are creating and sharing this content, which is awesome to see. Then we are able to offer them a spotlight across some of our e-commerce channels.

I wanted to talk a little bit about how Sony is leveraging user-generated content in stores. You are displaying UGC in Sony retail locations. How has this added to the in-store experience for your customers?

It’s been good, and we definitely rolled it out across the country in our Sony stores. The goal was a little different at retail because the execution was different. We wanted to not only pull in social content in the form of tweets and Instagram images but also pull in product reviews.

We chose to place curations behind the cash wrap to have that positive affirmation of: “Hey, you’ve just joined this really passionate community of other Sony users, and here are examples of what they’re doing with this product. And by the way, we want to spread the notion of writing reviews to help others online as well.” We use this UGC as a way to reinforce positive purchase affirmation and also as a reminder to visit us online and review the product that they are purchasing in-store. In a way, it provides a unique perspective, one that brings the voice of the customer into the store and allows customers to feel like they are part of a larger community.


Posted in Advertising Research, Brand and Image Research, Consumer Research, Product Research, Promotion Research, Public Opinion/Social Research, Shopper Insights | Comment

Ads may benefit from Snapchat’s disappearing act

Editor’s note: Nigel Hollis is chief global analyst at market research firm Millward Brown, New York City. This is an edited version of a post that originally appeared here under the title, “Snapchat users positive about advertising on the app.”

The range of potential social communication platforms is growing and marketers need to know which ones will best reach and engage their target audience. However, when dealing with emerging platforms, opinions differ on which ones will work well for brands and when. When it comes to the role of different social platforms, anecdotes abound, never mind whether new ad formats are effective or not. Recently Dan Calladine highlighted an interesting post by an “actual teen,” Andrew Watts, giving his view on social media.

Calladine cautioned that anecdotes are not data but even so I found myself momentarily swept along with the narrative; yes, what is the point of Twitter? Then a couple of days later I came across this post by danah boyd trying to put the genie back in the bottle. Her attempt to contextualize Andrew Watts’ comments is a legitimate reminder of the need to make decisions based on more than just anecdotes.

That of course is where research comes in. If you had asked me whether Snapchat users would be open to advertising on the platform I would have said no. Wrong! The evidence suggests they are. Snapchat is famous for the fact that photos and other content sent with the app disappear from recipients’ devices shortly after viewing. On Snapchat, one form of advertising is when brands sponsor Snapchat’s “our stories.” Sponsors can add 5 to 10 seconds of their own photo and video content. The other form is advertising that appears in people’s recent updates. Our analysis has shown that the campaigns were received positively by Snapchat users: 60 percent of “our stories” viewers and 44 percent of “brand stories” viewers enjoyed the ads, higher than expected based on general attitudes toward mobile advertising. This article in Digiday gives a good description of the type of ads that have appeared on Snapchat, including those tested in the study.

We measured the first six advertising campaigns to appear on Snapchat and found that the average increase in ad awareness was 16 percentage points, ranking in the top quarter of similar campaigns in our mobile database. Most notable was the fact that online exposure was linked to subsequent offline behavior. Viewers of the Dragon Age video game brand story were 7 percent more likely to buy the game, and NBCUniversal’s Ouija and Dumb and Dumber To brand stories drove a 14 percent increase in actual attendance among those exposed.

So why is Snapchat an effective advertising medium? First and foremost it represents a good means to reach a younger target audience. However, reach alone does not guarantee effectiveness. There are a couple of reasons why I believe the new ad formats might get more attention than the typical display ad. First, if unopened, the ad sits there in someone’s feed for 24 hours. It’s tempting to simply see what it has to offer and get it out of there. Second, you can only view the ad while your finger is on the screen, something that almost demands you pay attention to what is shown. Besides, a few seconds of advertising is a lot easier to endure if you know you won’t have to watch it time after time after time.

Of course, there is also the need to take the “new factor” into account. Unfortunately, fifteen years of testing new digital ad formats has taught us that interest will wane, but Snapchat’s ad format may hold up better than many. What do you think? Please share your thoughts.

Posted in Advertising Research, Consumer Research, Market Research Findings, Public Opinion/Social Research | Comment

IVR’s ability to reach and engage respondents

Editor’s note: Matthijs Visser is a principal consultant at Advanis, an Edmonton, Alberta-based market and social research firm. This is an edited version of a post that originally appeared here under the title, “A case for IVR.”

IVR data collection continues to surprise me. Before explaining why, let’s take a step back and let me quickly explain what data collection via IVR looks like. An IVR (interactive voice response) survey consists of a series of pre-recorded questions administered via phone. As an example, a “likelihood to recommend” question could be asked as follows:

How likely would you be to recommend [brand X] to a friend or family member on a scale of one to five, where one means “Not at all likely to recommend” and five means “Extremely likely to recommend?” Please press any number between one and five.

Administering a survey via IVR is a highly cost-effective approach when reaching out to respondents via phone is needed (e.g., when no e-mail address information is available for respondents). The question that usually comes up, though, is what response rates we’re seeing for IVR surveys: Doesn’t everyone hate these automated calls? Are respondents at all willing to do a survey via IVR?

And this is where the IVR continues to surprise me. As it turns out, respondents are quite open to completing a survey via IVR and for some demographics in particular, the response rates are actually quite staggering.

Here are some of the studies that convinced me of IVR’s ability to reach and engage respondents:

On one of our studies, we executed three data collection methodologies side-by-side: e-mail-to-Web, SMS and IVR. The survey pertained to a recent visit to a retail store and the SMS and IVR methodologies reached out to respondents on their cell phones. The response rates (calculated as completed surveys/everyone invited) we saw were as follows:

Response rates

Across the three data collection methodologies, the IVR obtained the highest response rate.
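The response rate definition above (completed surveys divided by everyone invited) can be sketched in a few lines. The counts below are hypothetical, for illustration only; the study's actual figures are shown in the chart.

```python
# Response rate = completed surveys / everyone invited, per methodology.
# These counts are made up; the real study's numbers are in the chart above.
invited = {"email-to-web": 4000, "sms": 4000, "ivr": 4000}
completed = {"email-to-web": 320, "sms": 280, "ivr": 460}

response_rates = {mode: completed[mode] / invited[mode] for mode in invited}

# Rank methodologies from highest to lowest response rate.
for mode, rate in sorted(response_rates.items(), key=lambda kv: -kv[1]):
    print(f"{mode}: {rate:.1%}")
```

With these hypothetical counts, IVR comes out on top, mirroring the side-by-side result described above.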

Certain demographics are more receptive to the IVR methodology than others though. To illustrate, on one of our studies we gave respondents different options for completing the survey, as follows:

  • On day one, we sent respondents a text message that invited them to do the survey via SMS or via a mobile-optimized Web survey.
  • On day two we gave those who hadn’t responded to the invite a follow-up call via IVR.

IVR chart

So while SMS is, not surprisingly, most popular among younger demographics, the IVR is most popular among older demographics and in particular among individuals 65-years-old and older.

In light of this, one example of applying the IVR methodology is a study for a home health care provider whose client base consists largely of older demographics. On this study, we follow up with clients who recently received a call from the provider or who called the provider. The response rates here look as follows:

Inbound calls

So depending on the service and the type of call the client made or received, the response rates we are seeing can be as high as 50+ percent.

One last powerful application of the IVR methodology is to access low-incidence populations. Trying to reach these populations using live interviewers can be quite cost-prohibitive. We have leveraged the IVR methodology to reach a wide range of low-incidence populations, such as smokers, home health care users and gym members.

IVR proves to be an effective and cost-efficient methodology, in particular in cases where:

  • the survey is short, e.g., for customer experience measurement programs;
  • phone contact information is more readily available than e-mail information;
  • target respondents are less tech-savvy (e.g., older demographics); and
  • the target population’s incidence is low.


Posted in Consumer Research, Health Care Research, Market Research Best Practices, Market Research Findings, Market Research Techniques, Telephone Interviewing | Comment

Net neutrality: All in favor?

Editor’s note: Arun Mamtani is founder and CEO of market research firm ResearchFidelity, Inc., Dallas.

At the heart of the Internet is the premise that all traffic be treated equally. This principle, known as net neutrality, has been built into the fabric of the Internet since its creation. It guarantees a level playing field for users and content providers by ensuring that no one has to pay more for better access to online content. A free and open Internet is key to promoting innovation and driving entrepreneurship, to sharing ideas and to freedom of speech.

As Internet usage has grown, so has the need to ensure rapid delivery of content to the end user. Large Internet service providers (ISPs) are starting to charge Web companies like Netflix and Google in order to offer speedy delivery of their content. Proponents of net neutrality worry that ISPs have too much power and can break the way the Internet works. They are asking the FCC to stop it.

In May 2014, the FCC voted to move forward with a net neutrality proposal that would allow broadband providers to charge Web sites for faster delivery of their content – in essence allowing the creation of “fast lanes” on the Internet. The proposal also defined rules for ISPs to remain transparent about their performance and prevented them from blocking/slowing any content (such as from competitors). But the fast lane options would essentially allow the creation of two tiers of Internet service.

This proposal received near unanimous opposition from the public, as 3.7 million comments were filed on the FCC Web site. Internet advocates fear that the proposed rules would undermine equal treatment of all Internet traffic, and some have pushed for reclassification of the Internet as a public utility under Title II of the Telecommunications Act, which would give the FCC more power to regulate broadband providers.

President Obama joined the debate, expressing his opposition to the proposed rules and asking the FCC to keep the Internet open and free by preventing blocking, throttling or paid prioritization of content. He has requested that the FCC classify broadband service under Title II while forbearing from rate regulation and other provisions that are less relevant.

In response to this feedback, FCC chairman Tom Wheeler has postponed the net neutrality decision to 2015, when he is expected to offer a new proposal.

In June 2014, we ran a consumer survey in the top ten U.S. DMAs to measure public understanding and opinion on various topics related to net neutrality. We surveyed nearly twenty-eight thousand respondents in these top ten markets and analyzed results by demographic segments and geography. Our findings were somewhat in contrast to what was being reported based on the public comments filed with FCC.

We found that public opinion is not as unanimously in favor of net neutrality as would appear from the comments to FCC. Nearly one in four adults in the top ten DMAs are in favor of paid content prioritization, and just under half of the respondents agree that broadband providers should be regulated like utilities. We found that consumers in southern DMAs such as Dallas, Houston and Atlanta are more in favor of implementing fast lanes and less in favor of Title II than those in other parts of the country. Our analysis revealed that affluent households are more likely to support higher Internet fees for higher bandwidth activities.

Why were the findings different from the unanimous opposition being reported from the FCC comments? We believe it’s because a majority of the comments to the FCC (at least 60 percent of the 800,000 comments that were analyzed by independent analysts) were form letters written by organized campaigns. Our survey, in contrast, was fielded to a random sample of adults in the top ten DMAs representing the overall population. The results are weighted to demographic benchmarks in those DMAs to account for sample bias due to non-coverage and non-response. Furthermore, we ran conditional logit and generalized linear models controlling for a full range of demographic attributes in order to identify key predictors of public support for net neutrality. Building on the model output, we also highlighted contrasts and insights by population segment.
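The idea of weighting to demographic benchmarks can be illustrated with a simple post-stratification sketch. All the shares and support figures below are hypothetical (the actual benchmarks and cells used in the study are not given here); the point is only the mechanics: over-represented groups are weighted down and under-represented groups are weighted up.

```python
# Post-stratification sketch: weight each group so the sample's demographic
# mix matches known population benchmarks. All numbers are hypothetical.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # benchmark
sample_share     = {"18-34": 0.40, "35-54": 0.35, "55+": 0.25}  # achieved sample

# Weight per group = population share / sample share.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# A weighted estimate of support for some position (hypothetical values):
support_by_group = {"18-34": 0.80, "35-54": 0.70, "55+": 0.55}
weighted_support = sum(
    sample_share[g] * weights[g] * support_by_group[g] for g in population_share
)
unweighted_support = sum(
    sample_share[g] * support_by_group[g] for g in population_share
)
print(f"unweighted: {unweighted_support:.1%}, weighted: {weighted_support:.1%}")
```

Because the hypothetical sample over-represents the most supportive group (18-34), the weighted estimate comes out lower than the raw one, which is exactly the kind of correction that separates a weighted survey from a pile of self-selected comments.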

The net neutrality decision is far from over. As the debate continues, market researchers must continue to monitor and track public opinion and deliver new insights on public understanding and opinion of a complex and evolving topic.

Net Neutrality

Posted in Consumer Research, Market Research Techniques, Public Opinion/Social Research | Comment

Auto industry demand slowing in Brazil

Editor’s note: Janaina Gilsoul is a research consultant and industry analyst for Atlanta-based Geo Strategy Partners. This is an edited version of a post that originally appeared here under the title, “Will Brazil remain the country of the future?”

In the last decade the automotive industry in Brazil experienced sales growth averaging 10 percent per year while the economy grew 4 percent. This success attracted a wave of investment between 2008 and 2012 in new plants and the expansion of existing units. However, the Brazilian automotive industry is now facing strong headwinds resulting from the poor performance of the Brazilian economy in 2014 coupled with the collapse of exports to Argentina.

“Brazil has gone in reverse,” said Matthias Wissmann, president of the German Association of the Automotive Industry (VDA). “The high revenues we have seen to date in important markets will ease off somewhat.” Vehicle production fell 15.3 percent in 2014 compared to the previous year, reaching the lowest level since 2009, according to data released by the Brazilian Association of Manufacturers (ANFAVEA).

Brazilian automotive production

Industry demand is slowing just as new capacity begins to come online, increasing supply while demand remains on a downward trend. The industry estimates it will soon experience approximately 40 percent idle capacity, almost double that of when the market was booming. According to data from ANFAVEA, 12,400 automotive workers were laid off in 2014. General Motors announced it is cutting off exports to Argentina and forecasts manufacturing the same number of vehicles in 2015 as in 2014.

In contrast to declines in Brazil (and Argentina), car sales will expand globally by another 2 percent this year to 76.4 million vehicles, according to the 2015 forecast by VDA. China expects a rise of 6 percent to 19 million new cars and U.S. sales will increase by 2 percent to 16.4 million vehicles. At least in this industry, things are returning to the norm – the U.S. and China lead the pack.

What about the future of Brazil’s automotive industry? After growth of 263 percent in vehicle sales over the last 12 years, it is natural that the pace would slow. Brazil’s automotive industry needs to explore new markets to compensate for its sluggish domestic market and the big economic crash in Argentina, which had been responsible for 80 percent of exports.

We see Colombia as the best market opportunity for Brazil’s automotive industry. Colombia boasts the fourth-largest automotive market in Latin America after Brazil, Mexico, and Argentina. General Motors, Renault, and Mazda manufacture in Colombia but their combined capacity can meet only 35 percent of domestic demand; the balance must be imported. Will Colombia absorb excess Brazilian automotive capacity? Will the 2016 Olympics cause resurgence in the Brazilian economy? The future is uncertain and unfortunately Brazil has always been a country of the future.

Posted in Automotive Research, Behavioral Research, Consumer Research, Product Research | Comment

Will path-to-purchase data change merchandise planning?

Editor’s note: Brian Kilcourse is managing partner at Retail Systems Research, San Francisco. This is an edited version of a post that originally appeared here under the title, “Is the customer dimension in merchandise planning over-hyped?”

I remember the first time I ever heard the phrase customer-dimension. The year was 1998 and our IBM account executive had recommended that I meet Mike Blyth, who at the time was a principal at a company called 1.2.1 Marketing (Blyth is now the COO at Aginity, a Chicago-based big data analytics company). Mike’s presentation of his company’s value proposition referenced a book by Don Peppers and Martha Rogers entitled The One To One Future. Essentially, the gist of their position was this: while product-centric marketing focuses on how to maximize the value created by each product, customer-centric marketing focuses on maximizing the value created by each customer. This idea intrigued me so I introduced Mike and his colleagues to our vice president of merchandising. And that is how I started down the customer centricity path.

Since that time, it has become increasingly obvious that retailers must consider the needs of their customers when making assortment decisions, at a level of granularity appropriate to the need that the retailer’s brand addresses. If that seems like a mouthful, then consider this chart (apologies to all who have seen it before):

customer needs/value chart

When applied to the notion of customer centricity, what this chart alludes to is that for basic needs, the granularity of customer data needed to make assortment decisions is low, whereas for highly discretionary needs (things that aren’t needed to sustain life but certainly make life a little sweeter), the level of granularity of customer data needed is high, sometimes even approaching one-to-one, just as Peppers and Rogers espoused.

All of that is well and good, and at this point in our industry’s evolution most people know that there is no one-size-fits-all for customer centricity. But consumers threw a significant wrinkle into our thinking about how much customer data is too much when they started using the digital domain in their paths-to-purchase. The reason this is such a big deal is that today it’s not uncommon for consumers to do their investigations and even make product selections outside of the physical domain of the store, and so it’s in retailers’ best interests to get the right value propositions in front of the consumers most likely to see them as relevant, at the right moment in each path-to-purchase.

To understand how to do that that effectively, many of today’s retailers are capturing non-transactional data generated during consumers’ digital wanderings (that’s where all that big data is coming from). There’s no question that this kind of information really helps marketers to target value messages to consumers in the digital domain. But there has also been a lot of thinking around the idea that data captured in the digital space can help with assortment planning, as retailers strive to position physical inventory to maximize the profitability of their increasingly important omni-channel fulfillment processes.

Okay, that’s the theory anyway. But we wanted to test just how real that theorizing has become, so in a study RSR conducted, sponsored by JDA, Assortment & Planning: Changing Times, New Opportunities, we asked retailers to rank the factors used in assortment planning against whether or not consideration of those factors is automated. Here’s what we found out:

Assortment & Planning: Changing Times, New Opportunities study

Basically, the study responses indicated that those factors that might be considered directly related to consumers’ paths-to-purchase (in particular, “channel-driven affinities” and “cross-channel demand aggregation”) are not highly valued by most retailers and even less automated. The inference is clear: when it comes to using non-transactional big data from the customer dimension for merchandise planning, the drum major is a little too far in front of the marching band.

All of this leads to the inevitable question: Are the so-called thought leaders over-hyping the concept of using the customer dimension in merchandise planning? I think not – but it’s early and, as I said at the start, there is no one-size-fits-all right way to do it. It comes down to what value the brand is fulfilling. For example, while it may not take a lot of customer-centric smarts for a jeweler to know to carry watchbands, it takes very customer-specific knowledge to know whether to offer Rolex, Breitling, Fossil or Timex at a particular store. While you might argue that all those products exist to help people tell time, in the case of Rolex and Breitling the value is about esteem. And that implies an intimate knowledge of the local customer base – you certainly don’t want a display full of Rolexes in a store where they won’t sell!

That’s pretty intuitive and exactly how most retailers have made such decisions in the past. But in today’s world, the customer is anywhere and the points of fulfillment can be anywhere. So it behooves retailers to try to position inventory in places where the demand can be most profitably fulfilled. That in turn implies knowledge of consumers’ favored paths-to-purchase and that knowledge is generated by data that comes from the customer dimension.

Bottom line: while retailers certainly are well on the way to using the customer dimension appropriately to market to consumers, they have a lot further to go before they unlock the value in that dimension for merchandise planning.

Posted in Advertising Research, Behavioral Research, Big Data, Brand and Image Research, Business and Product Development, Consumer Research, New Product Research, Product Research, Retailing, Shopper Insights | Comment

Avoiding 3 common conjoint analysis pitfalls

Editor’s note: Liz White is an analyst with Chadwick Martin Bailey’s advanced analytics team in Boston. This is an edited version of a post that originally appeared here under the title, “Conjoint analysis: 3 common pitfalls and how to avoid them.”

If you work in marketing or market research, chances are you’re becoming more and more familiar with conjoint analysis: a research technique used to predict customer decision-making relative to a product or service. When conducted well, a conjoint study provides results that make researchers, marketers and executives happy. These results:

  • are statistically robust,
  • are flexible and realistic,
  • describe complex decision-making and
  • are easy to explain and understand.

If you need a quick introduction or a refresher on conjoint analysis, I recommend Sawtooth Software’s video, which can be found here.

However, as with any analytical approach, conjoint analysis should be applied thoughtfully to realize maximum benefits. Below, I describe three of the most common pitfalls related to conjoint analysis and tips on how to avoid them.

Pitfall 1: Rushing the design

Rushing the design is the most common pitfall but it is also the easiest one to avoid. As anyone who has conducted a conjoint study knows, coming up with the right design takes time. When planning the schedule for a conjoint analysis study, make sure to leave time for the following steps:

1. Identify your business objective and work to identify the research questions (and conjoint design) that will best address that objective.

2. Brainstorm a full list of product features that you’d like to test. Collaborate with coworkers from various areas of your organization – including marketing, sales, pricing and engineering as well as the final decision-makers – to make sure your list is comprehensive and up-to-date.

You may also want to plan for qualitative research (e.g., focus groups) at this stage, particularly if you’re looking to test new products or product features. Qualitative research can prioritize what features to test and help to translate product-speak into language that customers find clear and meaningful.

If you’re looking to model customer choices among a set of competitive products, collect information about your competitors’ products and pricing.

Once all the information above is collected, budget time to translate your list of product features into a conjoint design. While conjoint analysis can handle complex product configurations, there’s often work to be done to ensure the final design (a) captures the features you want to measure, (b) will return statistically meaningful results and (c) won’t be overly long or confusing for respondents.

Finally, be sure to budget in time to review the final design. Have you captured everything you needed to capture? Will this make sense to your customers and/or prospective customers? If not, you may need to go back and update the design. Make sure you’ve budgeted for this as well.

Pitfall 2: Overusing prohibitions

Most conjoint studies typically involve a conversation about prohibitions – rules about what features can be shown under certain circumstances. For example:

Say brand X’s products currently come in red, blue and black colors while brand Y’s products are only available in blue and black. When creating a conjoint design around these products, you might create a rule that if the brand is X, the product could be any of the three colors but if the brand is Y, the product cannot be red.

While it’s tempting to add prohibitions to your design to make the options shown to respondents more closely resemble the options available in the market, overusing prohibitions can have two big negative effects:

  1. Loss of precision when estimating the value of different features for respondents.
  2. Loss of flexibility for market simulations.


The first of these effects can typically be identified in the design phase and fixed by reducing the number of prohibitions included in a model. The second is potentially more damaging as it usually becomes an issue after the research has already been conducted. For example:

We’ve conducted the research above for brand Y, including the prohibition that if the brand is Y, the product cannot be red. Looking at the results, it becomes clear that brand X’s red product is much preferred over their blue and black products. The VP of brand Y would like to know what the impact of offering a brand Y product in red would be. Unfortunately, because we did not test a red brand Y product, we are unable to use our conjoint data to answer the VP’s question.

In general, it is best to be extremely conservative about using prohibitions – use them sparingly and avoid them where possible.
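The brand/color example above can be made concrete with a small sketch: enumerating a full-factorial design and then applying the prohibition shows exactly which profile becomes unobservable. The brands, colors and rule below come straight from the example; everything else is illustrative scaffolding.

```python
from itertools import product

# Full-factorial design for the brand/color example: 2 brands x 3 colors.
brands = ["X", "Y"]
colors = ["red", "blue", "black"]
full_design = list(product(brands, colors))

# The prohibition from the example: if the brand is Y, the product cannot be red.
def prohibited(brand, color):
    return brand == "Y" and color == "red"

design = [(b, c) for b, c in full_design if not prohibited(b, c)]

print(len(full_design), "profiles before the prohibition;", len(design), "after")
# ("Y", "red") never appears in the fielded design, so no data is ever
# collected on it and a later simulation of a red brand-Y product is impossible.
```

One prohibition in a tiny design already removes a sixth of the profile space; in a realistic design with many attributes, stacked prohibitions carve out far larger regions that can never be estimated or simulated afterward.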

Pitfall 3: Not taking advantage of the simulator

While the first two pitfalls are focused on conjoint design, the final pitfall is about the application of conjoint results. Once the data from the conjoint analysis has been analyzed, it can be used to simulate virtually any combination of the features tested and predict the impact that different combinations will have on customer decision-making, which is just one of the reasons conjoint analysis is such a valuable tool. All of that predictive power can be distilled into a conjoint simulator that anyone – from researchers to marketers to C-suite executives – can use and interpret.
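At its core, one common kind of simulator sums each product's part-worth utilities and converts utilities to shares of preference with a logit rule. The sketch below is a minimal illustration of that mechanic, not the study's actual model; every part-worth value and attribute level is hypothetical.

```python
import math

# Hypothetical part-worth utilities for the brand/color example, plus a
# made-up price attribute. In practice these come from the conjoint estimation.
part_worths = {
    ("brand", "X"): 0.6, ("brand", "Y"): 0.2,
    ("color", "red"): 0.5, ("color", "blue"): 0.1, ("color", "black"): 0.0,
    ("price", "$99"): 0.4, ("price", "$129"): -0.4,
}

def utility(profile):
    # A product's total utility is the sum of its levels' part-worths.
    return sum(part_worths[(attr, level)] for attr, level in profile.items())

def shares(profiles):
    # Logit rule: share of preference is proportional to exp(utility).
    exp_u = [math.exp(utility(p)) for p in profiles]
    total = sum(exp_u)
    return [e / total for e in exp_u]

# A "what-if" scenario: two competing products.
scenario = [
    {"brand": "X", "color": "red", "price": "$99"},
    {"brand": "Y", "color": "blue", "price": "$129"},
]
for profile, share in zip(scenario, shares(scenario)):
    print(profile, f"{share:.1%}")
```

Swapping a level in `scenario` and re-running is exactly the kind of "what-if" exercise described below; the shares always sum to 100 percent across the products in the scenario.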

Once you receive a conjoint simulator, I recommend the following:

  1. Distribute copies of the simulator to all key stakeholders.
  2. Have the simulator available when presenting the results of your study and budget time in the meeting to run “what-if” scenarios then and there. This can allow you to leverage the knowledge in the room in real time, potentially leading to practical and informed conclusions.
  3. Continue to use your simulator to support decision-making even after the study is complete, using new information to inform the simulations you run. A well-designed conjoint study will continue to have value long after your project closes.
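At its core, a conjoint simulator typically applies a share-of-preference rule to the estimated part-worth utilities: each product’s total utility is the sum of its level utilities, and shares come from a logit transform. The sketch below is a simplified, hypothetical version – the utilities and level names are made up, and real simulators usually work with respondent-level utilities rather than a single aggregate set:

```python
import math

# Hypothetical aggregate part-worth utilities from a conjoint study.
partworths = {
    "brand": {"X": 0.6, "Y": 0.2},
    "color": {"red": 0.5, "blue": 0.1, "black": 0.0},
    "price": {"$99": 0.4, "$149": -0.4},
}

def total_utility(profile):
    """Sum the part-worth of each level appearing in the profile."""
    return sum(partworths[attr][level] for attr, level in profile.items())

def shares_of_preference(profiles):
    """Logit rule: share_i = exp(U_i) / sum_j exp(U_j)."""
    exps = [math.exp(total_utility(p)) for p in profiles]
    total = sum(exps)
    return [e / total for e in exps]

# A "what-if" scenario: two competing products.
scenario = [
    {"brand": "X", "color": "red", "price": "$99"},
    {"brand": "Y", "color": "blue", "price": "$149"},
]
for profile, share in zip(scenario, shares_of_preference(scenario)):
    print(profile, f"{share:.1%}")
```

Running a new “what-if” scenario is just a matter of editing the profile list and recomputing – which is why the simulator keeps paying off long after the fieldwork ends.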

Posted in Consumer Research, Data Processing, Market Research Best Practices, Qualitative Research, Quantitative Research | 1 Comment

5 trends for qual in 2015

Editor’s note: Ray Fischer is the CEO of Detroit-based online marketing research platform Aha! This is an edited version of a post that originally appeared here under the title, “5 emerging trends for online qualitative research in 2015.”

2014 proved to be a great year for online qualitative research technologies, as more and more marketers, agencies and researchers discovered the benefits of using online qual to mine consumer insights. Here are my top five predictions on emerging trends for 2015.

1. Consumers will continue to invest in technology, further opening the window into their world.
The international landscape has changed rapidly in the past five years with the continued adoption of smartphones and broadband around the world. As recently as 2010 we were still sending out Flip video cameras so respondents could videotape their world and upload it to us digitally. Now respondents have super sophisticated and inexpensive smartphones and tablets enabling them to share with greater ease. This has given researchers a larger pool of qualified respondents to connect with and the ability to deliver richer insights – faster and cheaper. As a result, digital ethnography (video) will play a greater role in the online research methods mix.

2. Client adoption and use of online qualitative will accelerate.
Reports suggest there are still some clients out there who have yet to make the leap to online qual. Shocking, yes, but with online methods becoming so much more sophisticated, effective and arguably cheaper than face-to-face methods, the laggards will make the leap. 2015 will be the year that holdout marketers and researchers cross the threshold into online qual.

3. Recruiting will get better with more refined panels and better respondents.
Right now, studies lasting more than one day or one activity are best recruited the old-fashioned way – focus group style. The large quant sample panel providers have been understandably reluctant to cede contact/control to researchers on studies that last more than a few days. Multi-day and multi-week studies involve a very high level of engagement between the researcher and respondents, requiring the higher level of connectivity that focus group style recruiting delivers. Unfortunately, focus group style recruiting is typically more expensive than quant sample. I believe this will begin to change in the coming year as the quant sample houses push to get in on the online qual action, modifying some of their respondent contact/control constraints to make longitudinal studies more viable and less expensive. Or traditional online qual recruiting will get less expensive. Or both!

4. Data Management tools will become more powerful and easier to use.
Those who regularly conduct online qual studies know how much data they can produce. Text analytics software cannot write the report for you (yet!) but it certainly can help shape the narrative and give more credence to your findings. These tools continue to get more effective and easier to use. You need to try them now if you haven’t already.

5. Researchers are evolving – and new talent is entering the MR space.
Go to any conference and you will see first-hand the new wave of research talent becoming established in the industry. They are smart, open-minded and fully embrace technology and all of the promise it holds for qualitative research. They grew up on computers, tablets and smartphones, recognizing these tools as an essential part of everyday life and valuing the role these omnipresent technologies play in the lives of consumers and respondents. Their innovative ideas on how online tools work are having a dramatic impact and are shaping new methods.

Posted in Consumer Research, Market Research Best Practices, Market Research Techniques, Online Surveys and Research, Qualitative Research, Research Industry Trends | Comment

Why customer satisfaction isn’t enough

Editor’s note: Greg Mishkin is vice president of research and consulting at Market Strategies International, Atlanta. This is an edited version of a post that originally appeared here under the title, “To trust or not to trust: building loyalty in telecom.”

For nearly a decade I have watched the largest and smallest telecommunications companies attempt to gauge their customers’ experience and overall satisfaction. Measures like Net Promoter Score (NPS) have been all the rage for the past several years, where company leadership views likelihood to recommend as a proxy for brand health and happy customers. We also focus a lot on the quality and value of the services they provide. Reliability, accessibility, fewest dropped calls, network speed and cost give a robust picture of which parts of the service offering are working best and which are in need of attention. Taken together, these measures provide valuable information and critical key performance indicators (KPIs) that the C-suite uses to guide their decisions and maximize shareholder value.
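For readers unfamiliar with the metric, NPS is derived from a 0-10 likelihood-to-recommend question: the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6), reported on a -100 to 100 scale. A minimal sketch with invented ratings:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..100 scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical survey responses on the 0-10 recommendation scale.
ratings = [10, 9, 9, 8, 7, 7, 6, 5, 3, 10]
print(net_promoter_score(ratings))  # 4 promoters, 3 detractors, 10 responses -> 10.0
```

Note that passives (scores of 7-8) are counted in the denominator but do not move the score either way.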

But is it enough to have a strong product and satisfied customers? I argue that it is not. There’s a missing ingredient – one that is essential to any long-lasting relationship: trust.

When companies had your back

Back in the mid-1990s when I was a Sprint customer, they provided my local and long-distance landline phone service and, ultimately, my cellular phone service. I do not remember how good the quality of Sprint’s service was back then – it’s all a bit hazy in my aging memories. What I do remember is that Sprint told me I was one of their most valuable customers. I was part of a group they called “Sprint Gold.”

As a member of this esteemed group of customers, I received a personalized phone call from Sprint every six months to review my service and plan. Sometimes I learned that I was already on an optimal plan and there was no need for adjustment. Other times, I learned of new programs that improved my experience and saved me money. Sprint never tried to cross sell me or push for upgrades I didn’t need. No, they were looking out for me (or so I always believed) and weren’t contacting me just to get me to spend more.

I trusted them because I believed they were looking out for my best interests, even if it meant I would be paying them less money. This trust led to loyalty. I remained a loyal Sprint customer until I started working for Cingular Wireless in 2006, and I often think back to those semi-annual calls from Sprint and long for those days.

Connected – but not bound – by trust

Fast forward to today, and I am happy to say that my telecommunications services are light years better than they were in 1995. AT&T is my mobile provider and Comcast provides my high-speed Internet. Like so many, I am connected to these networks nearly 24 hours a day. In fact, I recently purchased a Microsoft Band that, when connected to my iPhone, ties me to telecommunications more than ever. Additionally, I have two laptops and an iPad that are all connected more often than not. I am very satisfied with the services my telecommunications providers offer and, from the vast research I have seen, most other customers are reasonably satisfied with their providers.

The missing ingredient today is trust. Sure, they provide great services with unbelievable up-time percentages. And I would even go so far as to say that they do so at a very fair price. But the industry as a whole has acted in untrustworthy ways for a long time. Think about some of these common customer complaints:

  • complicated bills that hide inexplicable fees;
  • complicated price plans;
  • long contracts and high early termination fees;
  • locked phones that can’t be easily transferred to other carriers’ networks;
  • “cramming” third-party charges to bills;
  • poor customer service; and
  • mergers and acquisitions that leave customers confused about who is actually providing the services (see Merger Mania).


Beneath all of these complaints is distrust. For the most part, telecom companies have not shown that they care about their customers. Rather, their actions have shown that they care most about making money. Granted, this is starting to change with the elimination of – or reduced reliance on – contracts and clearer billing, but some might say this is too little, too late.

Building trust in a changing market

As telecommunications services become increasingly commoditized, carriers are realizing that they need to expand into ancillary offerings and markets in order to maintain the growth rates that shareholders have come to expect. Telecom carriers are now offering services such as home automation/security (e.g., AT&T’s Digital Life), streaming content (e.g., Verizon’s partnership with RedBox Instant) and financial services (e.g., Softcard). In many cases, these businesses have failed to gain the expected market share due, at least in part, to the consumer simply not trusting telecom carriers to provide these services.

Without trust, carriers will struggle to find new areas of growth. Yet, many carriers today don’t think to research and measure customer trust. Telecommunications is a rapidly changing and increasingly competitive environment, and telecom researchers need to adapt to this change by adding critical trust measurements and KPIs to their research portfolio. As the old adage goes, “You cannot manage that which you cannot (or do not) measure.”


Posted in Advertising Research, Brand and Image Research, Consumer Research, Customer Satisfaction | Comment

Top retail trends to watch in 2015

Editor’s note: Kelly Short is the director of global communications for San Diego-based Interactions Marketing and the editor in chief of Retail News Insider. This is an edited version of a post that originally appeared here under the title, “Retail outlook 2015: Industry experts highlight the top trends to watch.”

The world of retail is evolving. Though many industry insiders initially brushed these trends off as passing fads, the expansion of mobile apps and m-commerce, the use of tracking beacons for in-store personalization, and the widespread integration of omnichannel solutions amongst both brick-and-mortar and online retailers have all become mainstream in the retail industry in 2014. So what’s on tap for 2015? We asked five industry experts to give us their predictions for the trends that will have the biggest impact on the U.S. retail environment in the coming year.

Catering to the now generation

Susan Reda, editor at STORES Media, thinks the biggest trend in retail in 2015 will be the rise of same-day and next-day delivery. “We live in a society where consumers … want convenience and control, and they want it now,” she says. “While next-day and same-day delivery really challenges retailers – supply chains were never set up for that – we have already seen folks adjusting to [the demand from consumers].”

“A number of upstart companies, like Uber, WeDeliver and ShopRunner, who didn’t have a hand in retail before are now the ones who are going to change how quickly products are getting into the hands of consumers,” continues Reda. “Right now these services are only available in big cities like Chicago and New York, and they’re expensive. But we’re seeing that it’s a luxury consumers are willing to pay for.”

Reda also notes that “as much as [this is] important for online retailers, I think it’s even more important for brick-and-mortar retailers. If they’re in the cities where this is becoming desirable by consumers, this is their opportunity to get that bit of revenge they’ve been looking for.”

“In the past, brick-and-mortar retailers were not set up to do deliveries as fast as online retailers. But now if [a traditional retailer like] Macy’s gets equipped to do this and can connect with a service like Uber and deliver an item in 3 hours, they can satisfy the customer and drive consumer loyalty. When you are on the receiving end of a gift that arrives just in time or an item that saves the day – it makes all the difference [in your perception of that retailer].”

Convenience stores become one-stop-shops for brick-and-mortar and e-commerce

Danny Hongyi Chen, vice president of Asia Retail Services for Interactions, has a strong background in the Asian retail market. When asked what overseas trends he’s currently seeing that he thinks will influence the U.S. retail market in 2015, Chen cites the expanded role of convenience stores as a go-to shopping destination.

“There’s an interesting trend in convenience stores in Taiwan,” Chen says. “Typically, these stores have the advantages [over supermarkets and hypermarkets] of store location and a robust supply chain. The big players in convenience are now offering more things because they are closer to the consumer. They’re starting to change their selection two or three times a day and to offer new things, like fresh [food]. 7-Eleven in Taiwan launched a very strong fresh [food] program. They offer ready-to-eat and ready-to-cook items.”

“This is significant because it used to be rare to see convenience stores have a direct relationship with produce and seafood providers,” explains Chen. “But now they do. And they want to do more. They’re also selling things like concert tickets, train tickets and other services.”

Chen also thinks we’ll see a convergence of e-commerce and convenience stores. “Some e-commerce players in Asia are partnering with convenience stores to do the delivery,” he says. “In the future, they may start taking some of the shelf space in the convenience store to display seasonal items. The shopper could buy an item right there. Essentially, convenience stores will become the showroom or experience center for e-commerce players.”

A year of fragmentation and catering to the customer

“2015 is going to be a year of fragmentation,” says Virginia Morris, vice president of global consumer innovation and strategy for Daymon Worldwide. “It’s been tricky to keep primary shoppers in our stores. But it’s getting even more challenging. Consumers are getting more choices in destinations and options to fill their shopping needs. Because of this fragmentation, it’s more important than ever to keep shoppers in your ecosystem, whether it’s online, mobile or in-store.”

One thing retailers are going to have to deal with in their efforts to keep shoppers coming back is an increasing demand for transparency. “Retail continues to go under the microscope,” says Morris. “There’s increased sharing of information and a quest for understanding from a source perspective and an ingredient perspective. For example, right now a short ingredients label is perceived to be healthier. [In the coming year] additional transparency is going to become a point of entry, not differentiation. It’s about traceability, supply chain and retail innovation. It has pricing implications … and social media amplification implications. [Going forward] retailers have to be having a conversation with consumers, not talking at the consumer.”

Data security becomes top of mind

Paula Rosenblum, managing partner at Retail Systems Research, predicts the biggest trend that will influence the retail market in 2015 is an obsession with data security. “Retailers are going to have to face the fact that keeping customer data safe is now an ongoing job,” says Rosenblum. “It’s going to be a time and money distraction.”

“You can’t have an expectation that you won’t have an intruder. You must have an expectation you will have an intruder,” she warns. “You have to presume that the firewalls will be breached and figure out how to catch the intruders right at the start so they do minimal damage – unlike the recent Home Depot breach where they milked data from self-checkout terminals for five months before they got caught.”

Rosenblum also notes that this focus on security won’t just affect retailers. “You can expect that consumers will go online [in response to data breaches],” she says. “Currently fraud is not very high online in the U.S. But we’ve learned from other countries, once you squeeze fraud out of the stores, it will go online.”

Blurring the lines between virtual and reality

“In the next year, we’re going to see the lines between online, mobile and the real world begin to blur,” predicts Abhi Beniwal, senior vice president of IT for Interactions. “There are new technologies coming out that will change the experience of online and how you look at, touch and feel a product. The reason people go into the store today is to touch, feel and see. That line of differentiation will start getting blurry.”

“These technologies aren’t just augmented reality,” continues Beniwal. “They’re more than that. For example, Google just invested in Magic Leap, a technology that’s not augmented reality or virtual reality but something that’s actually making it hard to see where the physical world ends and the online world begins.”

According to its CEO and founder Rony Abovitz, Magic Leap is a mobile wearable computing system that works with the human visual perception system to create realistic 3D images that will revolutionize the way people communicate, purchase products, learn, play and share.

“Technologies like this are not only going to be online but also in-stores,” says Beniwal. “It will be interesting to see which retailers will make the first breakthroughs with this technology because there are a few in the forefront. The rest will be watching and following.”

Though these trends seem quite diverse, one thing all of our experts agree on is that consumers are the driving force behind these – and the majority of other – retail changes. As Morris puts it, “These are consumer-led trends. And retailers are having to respond. Retail is shifting to meet the ever-changing needs of the consumer.”

Posted in Behavioral Research, Brand and Image Research, Consumer Research, Customer Satisfaction, Data Privacy, Retailing, Shopper Insights | 2 Comments