I was on Sean’s case like crazy regarding ERA+, as many of you know. Basically, while every other index stat in the world takes the value of the metric divided by the “average”, ERA+ took the league average divided by the ERA of the player. In effect, instead of ER per IP, it was doing IP per ER. What made it worse was when people started using it in calculations, taking simple averages of it and so on. The math did not work out.
I had proposed that he do it the consistent way, which would mean someone who gives up runs at half the league average would show as 50, rather than 200. Sean was rightfully concerned that people are used to “bigger is better”, and so that would be sticker shock.
Guy proposed something very simple: 2 minus ERA/lgERA, then times 100. This way, what would look like 50 for me would show up as 150. And the top end is 200 in the Guy method (or 0 in my method). And Sean did just that.
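To make the three scales concrete, here is a minimal sketch comparing them for a pitcher who allows runs at half the league rate. The function names are my own invention for illustration; the formulas are the ones described above.

```python
def era_plus_old(era, lg_era):
    """The original ERA+ convention: league ERA divided by player ERA."""
    return 100 * lg_era / era

def era_index_consistent(era, lg_era):
    """The consistent index: player ERA divided by league ERA (lower is better)."""
    return 100 * era / lg_era

def era_plus_guy(era, lg_era):
    """Guy's method: 100 * (2 - ERA/lgERA), keeping 'bigger is better'."""
    return 100 * (2 - era / lg_era)

# A pitcher allowing runs at half the league rate, e.g. 2.25 vs a 4.50 league:
era, lg = 2.25, 4.50
print(era_plus_old(era, lg))          # 200.0
print(era_index_consistent(era, lg))  # 50.0
print(era_plus_guy(era, lg))          # 150.0
```

Note how the old ERA+ says 200 while the consistent index says 50 for the same pitcher, and Guy's method splits the difference at 150, with a hard ceiling of 200 (a 0.00 ERA) instead of the unbounded top end of the old scale.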