Benchmarking Case Study PPT Slide

For the third year running, NCS has collaborated with 4INFO to benchmark the performance of mobile advertising where it counts: at the cash register.

Compiled from 248 studies of CPG brands covering mobile display and video, our 2017 Mobile Benchmarks highlight some interesting trends right off the bat. Most notably, the ROAS (Return On Ad Spend) of mobile campaigns has increased 30% since our last report, as the industry quickly ramps up its expertise in mobile as a medium and technological innovation improves both platforms and operations.

And yes, the industry is indeed getting smarter. Mobile campaigns now generate positive ROAS 90% of the time — a notable improvement and continued positive trend over recent years.
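For readers newer to the metric, here is a minimal sketch of the arithmetic behind ROAS, assuming the common convention that ROAS is incremental sales divided by ad spend and that a campaign counts as "ROAS positive" when it returns more than a dollar per dollar spent. The function name and all numbers below are purely illustrative, not figures from the benchmark report.

```python
# Illustrative ROAS arithmetic -- hypothetical numbers, not figures from the report.
# One common convention: ROAS = incremental sales attributed to the campaign / ad spend,
# and a campaign is "ROAS positive" when it returns more than $1 per $1 spent.

def roas(incremental_sales: float, ad_spend: float) -> float:
    """Incremental sales dollars generated per ad dollar spent."""
    if ad_spend <= 0:
        raise ValueError("ad_spend must be positive")
    return incremental_sales / ad_spend

campaign_spend = 250_000.0      # hypothetical media spend
incremental_sales = 310_000.0   # hypothetical sales lift attributed to the campaign

ratio = roas(incremental_sales, campaign_spend)
print(f"ROAS: ${ratio:.2f} of incremental sales per $1 of spend")
print("ROAS positive" if ratio > 1 else "ROAS negative")
```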

Now, there’s plenty more where that came from, and you can download the full Mobile Benchmarks report here. But there’s an additional wrinkle to the value of ROAS as a primary performance metric, and in a recent AdAge webinar, we discussed how sales measurement can clear up the discussion around fraud.

When one considers the requirements of a sales-based outcome — namely, that the viewer of the ad must also be connected to a frequent shopper card, and that card must report a purchase event — the fraud-fighting value of measuring on sales performance becomes clear.

In the webinar, Bought Vs. Bot, 4INFO laid out a strategy for vetting publishers’ fraud prevention capabilities by using results at the register as the gauge. Through incremental sales and ROAS, media buyers can more effectively examine and compare inventory and publishers with the confidence that bots don’t buy products. In a sector increasingly focused on accountability, these metrics are powerful assets.

Get into the details by checking out our Bought Vs. Bot webinar on-demand. Or, download the scripted presentation here.

NCS and Nielsen are proud to present the most comprehensive analysis of what drives advertising effectiveness—creative, reach, targeting, recency and context—based on nearly 500 studies from 2016 and 2017 and over a decade of experience in linking advertising to sales results: Five Keys To Advertising Effectiveness.

This report evaluates the sales contribution of these five key drivers—and how their roles have changed since the ambitious Project Apollo studies of 2006. It also incorporates a separate study that examines the reach of TV & digital for nearly 900 cross-media campaigns.

Key Takeaways:

  • As in 2006, creative quality is the most important factor for driving sales, but, most likely due to new breakthroughs in data and technology, media now plays a much larger role than before.
  • Less than half of all campaigns do a good job of targeting buyers of the brand or category: 80% of TV campaigns are On-Buyer Target, versus only 31% of digital campaigns.
  • TV ads generally have consistently high-quality creative, whereas digital ads span a much wider range of quality, both much higher and much lower.
  • For large cross media campaigns, reach still comes primarily from television.
  • Understanding consumer purchase cycles and timing advertising closer to purchases can boost sales dramatically.

Attend the On-Demand Webinar here, or download the Webinar presentation.

Download the full Five Keys metastudy here.

Download the Infographic here.

In concert with Viant and Joel Rubinson, NCS has used its purchase data to unearth further evidence of the value of targeting actual brand buyers, and of the impact of reaching them at the right moment.

This three-brand CPG study crafted a consumer segment, dubbed “the Persuadables”, composed of heavy brand buyers who were about to mature in their purchase cycles, and pitted their results at the register against an array of other, more traditional consumer segments. The outcome was nothing short of incredible:

  • The Persuadables (heavier volume shoppers who are close to their next purchase) generated a 16x greater ROAS than the other exposed audiences
  • Targeting heavy brand buyers provided a significant lift in ROAS, strong evidence that targeting brand buyers rather than non-brand buyers is consistently the better strategy

NCS and our clients have long known that heavy buyers bring the lion’s share of incremental sales, and The Persuadables study is one more proof point for that argument. What makes this body of work unique is how it quantifies what many of us have always suspected: recency plays a significant role in the success of a campaign.

And so, we find new wrinkles still tell the same tales: in CPG, past purchase behavior is the #1 predictor of future purchase behavior… and timing is everything.

Head over to Viant’s website to download the full study, or contact NCS to learn more.

At this year’s ARF Re!Think Conference, NCS debuted its Learning Lab on Viewability and Attention to extend the industry’s understanding — and opportunity — around the effectiveness of viewable ads.

Along with an all-star cast including Kellogg’s and Yahoo, Nielsen Catalina Solutions worked with Moat to reveal how two-second standards for video viewability don’t tell the whole story. Using in-store incremental sales as the KPI, early results illustrate the correlation between attention and outcomes: as Time In View increased, so did Sales Impact. Further, even the 0-2 second window, which would not qualify as a viewable impression, drove some return.

What’s the implication for brands and publishers? For one, stopping at the current viewability standards leaves a lot of effectiveness on the table. Brands should (and can, with NCS and Moat) assess inventory performance second by second to find the sweet spot for future creative and media buys. In the same breath, publishers can more accurately inspect and promote the value of their inventory against results at the register.
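To make that second-by-second idea a bit more concrete, here is a rough sketch of bucketing impressions by Time In View and comparing average incremental sales per bucket. The records, bucket edges, and dollar values are hypothetical, not NCS or Moat data or methodology.

```python
# Rough sketch: bucket impressions by time in view and compare sales impact per bucket.
# Records, bucket edges, and dollar values are hypothetical, not NCS/Moat data.
from collections import defaultdict

impressions = [
    # (seconds_in_view, incremental_sales_dollars) -- toy records
    (0.8, 0.02), (1.5, 0.03), (2.4, 0.05), (3.9, 0.06), (6.1, 0.09), (11.0, 0.12),
]

buckets = [(0, 2), (2, 5), (5, 10), (10, float("inf"))]  # arbitrary time-in-view windows

totals = defaultdict(lambda: [0.0, 0])  # bucket -> [summed sales, impression count]
for seconds, sales in impressions:
    for low, high in buckets:
        if low <= seconds < high:
            totals[(low, high)][0] += sales
            totals[(low, high)][1] += 1
            break

for (low, high), (sales, count) in sorted(totals.items()):
    label = f"{low}s+" if high == float("inf") else f"{low}-{high}s"
    print(f"{label}: avg incremental sales per impression = ${sales / count:.3f}")
```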

To expand your campaign’s view into attention metrics, get in touch with NCS here.

In an age where the phrase “cross media” is used as a business development buzzword masquerading as innovation, the industry has eagerly awaited a solution that truly crosses the major arenas of ad spend: linear TV and digital. NCS Cross Media Sales Effect On Audience Link, announced at ARF Re!Think 2017, is that solution, measuring the campaign’s results at the register, as always.

To understand the effectiveness of advertising across TV and digital (including mobile), it’s critical to be aware of the following:

  • The unduplicated reach delivered by your campaign
  • How each medium contributes to sales results individually
  • The synergistic effect of the combined media that drives higher sales, as sketched below
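As a simplified illustration of how those three pieces fit together, the sketch below decomposes the lift of a hypothetical cross-media campaign into TV-only, digital-only, and synergy components. The group-level lift numbers are invented for illustration; NCS’ actual Sales Effect modeling is far more involved.

```python
# Hypothetical decomposition of cross-media lift into TV, digital, and synergy pieces.
# The group-level lifts are invented; actual Sales Effect modeling is far more involved.

# Average incremental sales per household, each group vs. an unexposed control:
lift_tv_only      = 0.18   # households reached only by the TV flight
lift_digital_only = 0.10   # households reached only by the digital flight
lift_both         = 0.40   # households reached by both

# If the two media acted independently, the "both" group would show roughly the sum
# of the single-medium lifts; anything beyond that is the synergy component.
expected_if_independent = lift_tv_only + lift_digital_only
synergy = lift_both - expected_if_independent

print(f"TV-only lift:      ${lift_tv_only:.2f} per household")
print(f"Digital-only lift: ${lift_digital_only:.2f} per household")
print(f"Both-media lift:   ${lift_both:.2f} per household")
print(f"Synergy:           ${synergy:.2f} per household beyond an additive expectation")
```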

The insights revealed thus far through studying several Facebook campaigns show, above all else, that brand characteristics and creative have a major impact on cross media synergy. In short: every media campaign is different, so every measurement matters.

Download the full ARF study here. Or, to find out if your campaign qualifies, get in touch with NCS.

You asked for more yogurt case studies, you got ’em.

Okay… so no one really asked for more yogurt case studies, but the work we’ve done here with TNT is well worth the second serving.

In a recent TV study for a major yogurt brand, TNT leveraged NCS buyergraphic audiences and Sales Effect measurement for a closed-loop solution that drove nearly $5MM in retail sales above and beyond what consumers would have spent had the ads not run.

A big part of that success came from TNT’s TargetingNOW technology. In their words:

Turner uses an in-house, proprietary model called CAE (Competitive Audience Estimation) in order to optimize the schedule. CAE is a predictive model that ingests a variety of data sets and builds 30 minute impression level estimates against the target – it is the most granular audience estimation tool in the industry.

One of the unique findings of this particular study was the high ratio of sales driven by non-loyal buyers, a somewhat rare outcome in the CPG world and a testament to strong alignment between the campaign goal, the creative, and the target audience.

Are you looking for a smarter approach to this year’s upfronts? See the full TNT case study to learn how buyergraphic audiences and measurement at the register can boost your in-house research.

In our second industry-first case study this month, NCS paired up with Yahoo and Chobani to measure the incremental sales impact of search advertising, and the resulting 1.3% conquest of market share was enough to sweeten anyone’s day.

This was a closed-loop campaign, meaning that Chobani not only used NCS to measure results at the cash register, but also took a smarter approach to its media plan from the very beginning by targeting NCS’ buyergraphic audiences (Chobani buyers, in this case) on the Yahoo search platform. An incremental sales lift to the tune of 9% was the reward for a job well done on everyone’s part, optimizing the media plan and quantifying the outcomes.

If you’re a bit hazy on how incremental sales are measured, read the AdAge coverage or allow our friend Marty to drop ninety seconds of knowledge:

In a first-of-its-kind study, NCS came together with TapInfluence’s influencer marketing platform to measure the in-store sales lift generated for WhiteWave Foods’ Silk brands.

This was an especially interesting study for us, not only because it was the first to tie influencer marketing to attributable sales lift at the cash register, but because it blurred the line between what traditional media would consider a pageview vs. an impression.

To test the TapInfluence platform and influencer content, WhiteWave selected over 250 bloggers to create recipes themed around “Meatless Monday Nights”, wherein a Silk product would be featured among the ingredients. NCS then measured the in-store behavior of consumers who read the articles vs. a control group who did not, resulting in a 10% incremental sales lift and $285 of incremental sales per 1,000 pageviews.
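For anyone curious about the per-1,000-pageview figure, it is simply total incremental dollars normalized by pageviews. The totals in the sketch below are hypothetical and chosen only so the ratio matches the reported $285.

```python
# Normalizing incremental sales to a per-1,000-pageview figure.
# Totals are hypothetical and chosen only so the ratio matches the reported $285.

total_incremental_sales = 570_000.0   # hypothetical incremental dollars across the program
total_pageviews = 2_000_000           # hypothetical pageviews of the recipe articles

per_thousand = total_incremental_sales / total_pageviews * 1_000
print(f"Incremental sales per 1,000 pageviews: ${per_thousand:.0f}")
```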

Even more interesting was the fact that WhiteWave did not ask to have the Silk brand mentioned in any of the bloggers’ social shares linking to their articles — so while the audience made a conscious decision to read the content, their exposure to Silk was unsolicited, much like a traditional impression.

Download the full case study here, and check out TapInfluence’s webinar coverage of this study to learn more about influencer marketing.

Kellogg’s has released notable results from a holiday cross-device mobile campaign powered by Opera Mediaworks, LiveRamp, and Nielsen Catalina Solutions.

The Rice Krispies® brand wanted to stay top-of-mind as a key ingredient in the wide world of holiday treats a mom could cook up with her kids, and moms responded to the tune of a 28% incremental sales lift, driven primarily by increased store trip frequency. The campaign simultaneously succeeded in extending the brand’s equity and driving in-store sales, with Kellogg’s overall return on ad spend (ROAS) surpassing 62%.

The 28% incremental sales lift was measured through NCS’ industry-leading test-and-control methodology, analyzing over 500 variables to isolate the consumer dollars influenced by the campaign, above and beyond the dollars that would have been spent had the ad not run.
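As a simplified picture of the test-and-control idea (not the 500-variable NCS model itself), the sketch below compares average household spend between an exposed group and a matched control group. The household spend figures are made up for illustration and do not reproduce the Kellogg’s results.

```python
# Simplified test-vs-control sketch of an incremental sales lift calculation.
# Real studies match exposed and control households on hundreds of variables;
# here the groups are assumed pre-matched, and the spend figures are made up.

exposed_spend = [12.40, 9.10, 15.75, 11.20, 13.60]   # $ per exposed household
control_spend = [10.10, 8.95, 12.30, 10.40, 11.05]   # $ per matched control household

avg_exposed = sum(exposed_spend) / len(exposed_spend)
avg_control = sum(control_spend) / len(control_spend)

incremental_per_hh = avg_exposed - avg_control        # lift attributable to exposure
lift_pct = incremental_per_hh / avg_control * 100

print(f"Avg exposed spend:   ${avg_exposed:.2f}")
print(f"Avg control spend:   ${avg_control:.2f}")
print(f"Incremental spend:   ${incremental_per_hh:.2f} per household ({lift_pct:.1f}% lift)")
```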

The Rice Krispies® brand’s success came at the expense of its primary competitors, which not only serves as a short-term win but also informs future campaigns about the potential for competitive conquest. Download the full case study for more details.

Learn more about measuring mobile campaign performance at the register:

Most metrics in the advertising world suffer from being bland and uninspiring. In other words… tastes like chicken.

Campbell’s Swanson wasn’t satisfied subjecting their holiday campaign to such a grim outcome, and so they cooked up a much bolder recipe for campaign insights with the help of Millennial Media and Nielsen Catalina Solutions (NCS).

What’s the secret ingredient, you ask? Well, if we told you, then it wouldn’t be a secret… but you look like a trustworthy person, so perhaps we can let it slide just this once. The secret ingredient is: people. More specifically, the actual incremental sales dollars generated by those people at the cash register.

NCS and Millennial delivered Swanson a 7% lift in incremental sales and a 4x return on their ad spend, outperforming CPG campaign benchmarks by 43%. These sumptuous insights were the result of a measurement methodology which isolated the behavior of buyers exposed to the ad vs. the control group of unexposed buyers.

Check out the full case study here.

Balanced Scorecard and Benchmarking

Last updated on September 27, 2016, first posted on January 20, 2010 by BSC Designer Team

Title: Balanced Scorecard and Benchmarking

Summary: The Balanced Scorecard and Benchmarking document reviews the ideas behind benchmarking, the possibility of using the Balanced Scorecard for benchmarking, and real-life examples of applying the Balanced Scorecard approach to benchmarking. This presentation is part of the Balanced Scorecard Toolkit.

Number of slides: 65. Formats: PPT (MS PowerPoint), Adobe PDF

  • Introduction to Balanced Scorecard and Benchmarking
  • Benchmarking structure and procedures
  • Pitfalls to be avoided while benchmarking
  • Using Balanced Scorecard for Benchmarking
  • Best Practices in Benchmarking
  • Example to illustrate the use of BSC for Benchmarking
  • Balanced Scorecard needs to be designed for benchmarking
  • Benchmarking stages: Planning, Evaluation, Action, Revise
  • Checklists, Case studies and FAQs
  • Benchmarking and Balanced Scorecard tools

Buy the full version as part of the Balanced Scorecard Toolkit

Sample slides

Slide 14. Benchmarking details. Processes.
Slide 24. Efficiency frontiers of organizations.

Presentation Content by Slides:

  1. Balanced Scorecard and Benchmarking
  2. Using Balanced Scorecard for Benchmarking
  3. Understanding Benchmarking Process
  4. What is Balanced Scorecard
  5. Benchmarking using Balanced Scorecard perspectives
  6. Benchmarking using Balanced Scorecard indicators
  7. Balanced Scorecard vs. earlier concepts
  8. What is Benchmarking?
  9. Why Benchmarking?
  10. Benchmarking and comparison
  11. Benchmarking and competitive analysis
  12. Benchmarking structure
  13. Benchmarking procedures
  14. Benchmarking details. Processes.
  15. Real-life Benchmarking
  16. Using surveys in Benchmarking
  17. Pitfalls to be avoided while benchmarking
  18. Using Balanced Scorecard for Benchmarking
  19. Best Practices in Benchmarking
  20. Inter-relationships of management practices
  21. Example to illustrate the use of BSC for Benchmarking
  22. Using BSC for Benchmarking. Formulas.
  23. BSC for Benchmarking. The range of application.
  24. Efficiency frontiers of organizations
  25. Balanced Scorecard needs to be designed for benchmarking
  26. Sharing positive effects with benchmarking
  27. Process of Benchmarking
  28. Planning stage in benchmarking
  29. Using metrics during planning stage
  30. Planning stage checklist
  31. Planning stage. Case study.
  32. Using Balanced Scorecard for benchmarking on planning stage
  33. Evaluation stage of benchmarking
  34. Evaluation task in benchmarking
  35. Evaluation stage in Benchmarking – checklist
  36. Benchmarking evaluation stage. Case study
  37. Case study. Moving from planning to evaluation stage.
  38. The Action stage in the Benchmarking process
  39. Release the planned program
  40. Action stage in Benchmarking – Checklist
  41. Action Stage of Benchmarking. Case Study
  42. Benchmarking in action. Case Study
  43. Revise. Stage of benchmarking.
  44. Analyze results obtained from benchmarking.
  45. Benchmarking Revise. Checklist
  46. Benchmarking results analysis. Case Study
  47. Benchmarking and Balanced Scorecard tools
  48. Using the BSC Designer for benchmarking
  49. Indicators and weights in BSC Designer
  50. Weighted indicators for scorecard or benchmarking
  51. Creating evaluation indicators with BSC Designer
  52. Measure and evaluation unit in BSC Designer
  53. Min and max values for scorecard indicator
  54. Balanced Scorecard and Benchmarking. Frequently Asked Questions (FAQs)
  55. What changes should be made to scorecard for benchmarking? Frequently Asked Questions (FAQs)
  56. Choosing right indicators and activities to benchmark. Frequently Asked Questions (FAQs)
  57. Does “benchmarking” mean “comparison”? Frequently Asked Questions (FAQs)
  58. Using Balanced Scorecard for Benchmarking. Case Study 
  59. Benchmarking and TQM. Case Study
  60. Successful implementation of benchmarking. Case study.
  61. Data collection for benchmarking. Case study.
  62. Metrics in Balanced Scorecard and Benchmarking
  63. Results of Balanced Scorecard for Benchmarking. Case study.
  64. Results and Conclusions
  65. Using Balanced Scorecard for Benchmarking – Checklist
