Which multiplier do NZ H&S professionals typically use for comparison - 200k or 1m?
Does anyone have a good and current source of comparison by industry?
I have seen both 200k and 1m. 1m is the OSHA standard, and pretty much the internationally accepted standard, which means you can source industry comparisons from their website to benchmark if that's what you want to do. The Citi Spotlight report (closer to home; it includes NZ companies but mainly the Aussie ASX 100) also uses 1m, and they adjust for companies that report using 200k. I think the business leaders forum in NZ was using 200k for some reason; when I was reporting Fonterra data to them we had to adjust down from 1m.
If you are only using it internally to understand relative improvement year on year then it doesn't really matter. If you are trying to benchmark then it's better to use 1m, or you run the risk of comparing different stats and thinking you are better or worse than you are. The Spotlight report is pretty good, and the OSHA stats are very comprehensive.
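To make the adjustment Chris describes concrete: a frequency rate is just injuries divided by hours, scaled by a base, so converting a 200k-based rate to a 1m base is a factor of 5 (1,000,000 / 200,000). A minimal Python sketch, with the injury and hours figures invented purely for illustration:

```python
def injury_frequency_rate(recordable_injuries, hours_worked, base=1_000_000):
    """Injuries per `base` hours worked (1m and 200k are the two common bases)."""
    return recordable_injuries * base / hours_worked

# Illustrative figures only
injuries = 12
hours = 2_400_000

rate_1m = injury_frequency_rate(injuries, hours, base=1_000_000)    # 5.0
rate_200k = injury_frequency_rate(injuries, hours, base=200_000)    # 1.0

# A published 200k-based rate converts to the 1m base by multiplying by 5
assert rate_200k * 5 == rate_1m
```

The point being: the two bases carry the same information, but comparing a 200k-based figure directly against a 1m-based one overstates or understates performance five-fold.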
Hi Paul, we are in the Transport and Logistics sector.
The calcs and multipliers used were historical for us so it has made sense to continue to make internal comparisons.
Thanks Chris. However, I have googled the Citi Spotlight report several ways and can only find references to it, not the report itself. Is it publicly available, or is it a subscription report?
Paul
I am slightly surprised no one has yet jumped in to say that LTIFR and TRIFR are poorly regarded measures these days. They are backward-looking and are not at all good indicators of where your critical risks are brewing, undetected. Oh well, looks like I've said it.
We use 1m, but I hope very few organisations worth their salt use this metric in any meaningful way. It has been shown to be flawed, subject to manipulation, and a very poor indicator of performance. Keep it in the background to measure your low-consequence injuries, for sure, but don't use it as a measure of success, or indeed failure.
Paul, you need to look at the hours your organisation works. If they do not exceed 1 million per year then that is a pointless measure and you would be better off using 200,000. TRIFR and LTIFR are only worthy as a point-in-time comparison to other organisations, a check to see whether you are better or worse than anyone else when the same measure is applied. Taking Peter B's view, proactive measures are far more worthwhile for organisations wanting to continuously improve.
When benchmarking lagging indicators we use the 200,000 rate, as it is the closest to our organisation's man-hours per year. If we applied the 1 million rate we would be suggesting that if our workforce worked 5x the man-hours it would be expected to have 5x the injuries. Man-hours are only one risk factor, and a planned increase in man-hours (with controls in place) does not automatically increase injury rates. For internal reporting we focus more on leading indicators.
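It may help to spell out the assumption being objected to here. Any frequency rate, whatever the base, implicitly treats injuries as scaling linearly with hours worked: at a constant rate, 5x the hours predicts 5x the injuries. A small sketch of that implicit model, with all figures invented for illustration:

```python
def predicted_injuries(rate_per_million, hours):
    """What a constant per-million-hours rate implies for a given exposure."""
    return rate_per_million * hours / 1_000_000

rate = 5.0  # illustrative: 5 injuries per million hours

small_site = predicted_injuries(rate, 400_000)    # 2.0 predicted injuries
large_site = predicted_injuries(rate, 2_000_000)  # 10.0 predicted injuries

# 5x the hours -> 5x the predicted injuries: the linearity assumption
assert large_site == 5 * small_site
```

Whether that linear-exposure assumption holds in practice is exactly the question raised above; the arithmetic bakes it in regardless of which multiplier you choose.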
I think it's worth pointing out that EVEN IF you think that reported injury counts are a good measure of safety, dividing them by another uncertain number (the hours worked) can only ever make them a worse indicator.
Any organisation that genuinely cares about the number of injuries as a statistic should report the raw number, and then, if they think any change is due to a change in exposure, discuss that. As Denise points out, hours worked is a terrible measure of exposure anyway. Who is spending the hours? Where are they spending the hours? What type of work are they doing? Whether you divide by 100,000, 200,000 or 1m, you're claiming that all of those hours are fundamentally the same. It's a factually incorrect claim, and we have a professional obligation not to communicate misleading information about risk.
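The claim that dividing one uncertain number by another can only make things worse can be illustrated with a rough Monte Carlo sketch. All the numbers below are invented: a "true" count of 10 injuries with some counting noise, and hours worked known only to within about 10%. The relative spread of the resulting rate exceeds that of the injury count alone, because the hours uncertainty adds on top:

```python
import random

random.seed(42)

def simulate_rate_spread(n=10_000):
    """Monte Carlo: noisy injuries / noisy hours -> spread of the rate."""
    rates = []
    for _ in range(n):
        injuries = 10 + random.gauss(0, 2)               # ~20% counting noise
        hours = 1_000_000 * (1 + random.gauss(0, 0.1))   # ~10% hours uncertainty
        rates.append(injuries * 1_000_000 / hours)
    mean = sum(rates) / n
    sd = (sum((r - mean) ** 2 for r in rates) / n) ** 0.5
    return mean, sd

mean_rate, sd_rate = simulate_rate_spread()
# Relative spread of the rate is larger than the ~20% spread of the raw count:
# the two sources of uncertainty compound rather than cancel.
print(sd_rate / mean_rate)
```

This is only a toy model, but it shows the direction of the effect: normalising by hours cannot remove noise from the injury count, and any error in the hours figure is added on top.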
Many companies here seemingly rely on these figures, but they don't really indicate how safe a company is; they merely indicate common reporting practice, and the figures can be manipulated. I'm with Peter on this: it's a rear-view mirror and not that useful. I've come across organisations that have 'buried' incidents/events in order to gloss over a stat, creating a parallel reporting structure. I prefer transparency and clarity. Even investigations can become "learning groups" to move forward with.