Many performance measurements and KPIs mean little in and of themselves without context, so competitive benchmarks provide valuable objectivity. They help an organisation evaluate its position within an industry and set SMART performance targets.
At Zing we’ve developed a series of benchmarks for the events and exhibitions sector. We review our benchmarks on an annual basis to ensure they are as up-to-date and relevant as possible.
This year, we’ve further enhanced our benchmark by weighting each event according to its size and scale, so that every event has proportionate influence on the overall benchmark figures.
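To illustrate the idea (not Zing’s actual methodology, which isn’t published here), a size-weighted average can be sketched as follows. The event attendances and scores are invented purely for illustration:

```python
# Hypothetical sketch of a size-weighted benchmark average.
# All attendance figures and scores below are invented for illustration.

def weighted_benchmark(events):
    """Average a metric across events, weighting each by its attendance."""
    total_weight = sum(size for size, _ in events)
    return sum(size * score for size, score in events) / total_weight

# (attendance, NPS) pairs -- illustrative numbers only
events = [(50_000, 45), (20_000, 10), (5_000, -15)]

# A simple mean treats every event equally, regardless of scale
simple_avg = sum(score for _, score in events) / len(events)

# Weighting shifts the figure toward the larger events
weighted_avg = weighted_benchmark(events)

print(round(simple_avg, 1))    # ~13.3
print(round(weighted_avg, 1))  # ~31.7
```

With these made-up numbers, the large, high-scoring event pulls the weighted figure well above the simple mean, which is exactly the kind of shift weighting can produce in a benchmark.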
As a result, we’ve seen significant differences in some benchmark levels year on year. For example, the average consumer show Net Promoter Score® has increased from +18 to +30.
Tracking our consumer show benchmarks over time, we’ve seen a shift in the balance of new and returning visitors at consumer events, with a growing reliance on the latter. Perhaps as a result, attendance decisions are now made earlier: fewer visitors make their final decision to attend at the last minute, and significantly more decide more than six months out.
While NPS® has increased, overall experience ratings remain relatively static. Value-for-money ratings, however, have increased, perhaps reflecting the growing proportion of AB visitors, who now represent almost two-thirds of consumer show visitors.
Trade event benchmarks have seen less movement year on year, although there are some clear trends. For instance, events are typically attracting more senior decision-makers, and trade visitors report more objectives for attending, suggesting a greater need to justify event attendance, even to themselves!
Benchmark figures are useful and valuable in their own right, but it’s even more helpful to understand where within the range your event sits. That understanding can make a huge difference when setting performance improvement targets: knowing your event achieves an NPS® above the benchmark level is one thing; knowing where it sits within the benchmark range is another. An overall average consumer NPS® of +30, for example, masks a range from -15 through to over +60, and the number and scale of events scoring above +30 must be considered to understand your event’s percentile rank.
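The percentile-rank idea can be sketched in a few lines. The benchmark scores below are an invented distribution, not Zing’s data:

```python
# Hypothetical sketch: where does one event's NPS sit within a benchmark set?
# The benchmark scores are an invented distribution for illustration only.

def percentile_rank(score, benchmark_scores):
    """Percentage of benchmark events scoring at or below this score (0-100)."""
    at_or_below = sum(1 for s in benchmark_scores if s <= score)
    return 100 * at_or_below / len(benchmark_scores)

benchmark = [-15, 5, 18, 25, 30, 34, 41, 50, 55, 62]  # invented scores

# An NPS of +32 beats the +30 average, yet sits only mid-pack here
print(percentile_rank(32, benchmark))  # 50.0
```

This is the point of the paragraph above: a score above the benchmark average can still sit surprisingly low in the distribution, which is why the range matters as much as the mean when setting targets.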
Why does it matter? Because setting unrealistic expectations can quickly demoralise, demotivate and create insecurity within even the strongest team. Get it right and they’ll all be keen to soar to the heights with you.
Posted by Lisa Holt