Unless I’m really misreading this, this paper reaches the utterly unsurprising conclusion that when you have a larger dataset, an individual datapoint must deviate further from the mean in order to have a lower probability of being noise? They’ve concluded that a standard deviation is a standard deviation?
There's zero dispute that the planet's mean temperature has been increasing since the 1970s (and earlier).
It's simply stating that the sudden jump in the last year is insufficient on its own to conclude that the undisputed rate of warming has itself increased (a sudden acceleration of heating).
A couple more years of unusual increases and we'll know for sure that things are getting hotter even faster. Until then, we settle for: things are getting hotter by at least the regular rate of increase due to human changes to the atmosphere.
>The paper does not dispute that there is warming at some rate. What they want to determine using statistics is when the rate changes; as in, the change changing faster, or the change changing slower. Although the trend is warming, there is a lot of noise, random accelerations and braking, so they claim it is statistically difficult to pinpoint when the changing change changes.
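A toy sketch of the point (not the paper's actual method, and all numbers here are invented for illustration): simulate a steady warming trend plus interannual noise, add a hypothetical record jump in the final year, and check how many sigma that year sits from the fitted trend. A ~2-sigma residual occurs by chance a few percent of the time, so one such year cannot by itself establish a change in the underlying rate.

```python
# Toy illustration: why one hot year is weak evidence for a *change in trend*.
# Assumed values (trend, noise level, jump size) are made up for illustration.
import random
import statistics

random.seed(42)

years = list(range(50))
trend = 0.02          # assumed steady warming, degC per year
noise_sd = 0.1        # assumed interannual variability, degC
temps = [trend * t + random.gauss(0, noise_sd) for t in years]
temps[-1] += 0.2      # a hypothetical "record jump" in the last year

# Ordinary least-squares slope and intercept via the closed-form formulas.
mx = statistics.mean(years)
my = statistics.mean(temps)
slope = (sum((x - mx) * (y - my) for x, y in zip(years, temps))
         / sum((x - mx) ** 2 for x in years))
intercept = my - slope * mx

residuals = [y - (intercept + slope * x) for x, y in zip(years, temps)]
resid_sd = statistics.stdev(residuals[:-1])  # noise estimate, excluding the jump year

z = residuals[-1] / resid_sd
print(f"fitted slope: {slope:.4f} degC/yr")
print(f"last-year residual: {residuals[-1]:+.3f} degC ({z:+.1f} sigma)")
```

Under these assumptions the final year typically lands around the 2-sigma level: noticeable, but not enough on its own to reject "same old trend plus noise" in favour of acceleration.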
Perhaps currently accelerating effects (e.g. methane release) and decelerating effects (e.g. oceans) cancel out so far?