Designated Hitter
May 03, 2007
Was the 1990s Home Run Production Out of Line?
By David Vincent

In the last five years, baseball fans have read and heard a lot of commentary from politicians and the media about what a travesty the home run totals have been since the mid-1990s. The average fan, having heard this mantra so much, has come to believe it is true. But is it?

In order to examine this question, we need a way to compare eras, and raw counting totals will not suffice. The method employed here is a "home run production rate": not home runs divided by at bats, in the manner of batting average, but home runs hit per 500 plate appearances. The 500 plate appearance standard was chosen because the official minimum performance standard for individual batting championships, as listed in rule 10.22(a) [in the 2007 edition of the rules], is 3.1 plate appearances times the number of games scheduled for each team. Thus, under the 162-game schedule, 502 plate appearances is the minimum, rounded here to 500 for simplicity. The home run production rate generates numbers that can be compared to figures already familiar to the reader, such as a 30-homer season by a batter.
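The calculation described above can be sketched in a few lines. The season totals below are illustrative placeholders, not actual league data:

```python
def hr_production_rate(home_runs: int, plate_appearances: int) -> float:
    """Home run production rate: homers hit per 500 plate appearances."""
    return 500 * home_runs / plate_appearances

# Hypothetical league season: 5,000 homers in 185,000 plate appearances
rate = hr_production_rate(5000, 185000)
print(round(rate, 1))  # 13.5
```

Because the rate is scaled to 500 plate appearances, a league-wide figure of 13.5 reads directly against an individual benchmark like a 30-homer season.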

Figure 1 shows a graph of the home run production rate for all major league players each year since 1919. One can easily see a gradual increase from 1919 to the present. The numbers in the charts do not represent the total homers hit in the major leagues for any one season but rather the home run production rate (homers per 500 plate appearances).

Figure 1 - Home Run Production Rate (1919-2006)

The fact that the home run production rate in the major leagues has increased steadily from 1919 to the present should not come as a surprise to many people. Many factors have affected the production rate, including rules changes, equipment changes and even some events outside of baseball. For a complete discussion of Figure 1, please read Home Run: The Definitive History of Baseball's Ultimate Weapon, from which the figure is taken.

Figure 2 adds a trend line to Figure 1, and this trend line shows the steady increase in home run production from 1919 through 2006. The movement of the rate line around the trend line documents the pendulum effect of the production through the years. The home run rate topped 10 for the first time in 1950, when it reached 10.7 homers per 500 plate appearances. It dipped below 10 in the next two seasons, but from 1953 through 1966 the production rate was above 10 each season. This time period is the bubble above the trend line about halfway across the chart from left to right.

Figure 2 - Home Run Production Rate with Trend Line (1919-2006)

In 1994, the production rate reached 13.8 homers per 500 plate appearances, only the second time in history that the rate climbed above 13.0. From 1994 through the present, the production rate has been above the trend line with the exception of 2005. The highest point in the chart is 2000 when the production rate reached 15.0. However, it is evident from looking at Figure 2 that the period from 1950 through 1966 is further above the trend than is the period starting in 1994. Both periods follow time frames when the home run production rate was well below the trend line, further accentuating the explosion of homers in the following era.

As a side note about the last 13 years, Figure 3 shows the home run production rate from 1994 through 2006. The rate has held fairly steady through the period and, contrary to pronouncements by the commissioner, the production rate has not dropped in the years since Major League Baseball instituted its drug testing policy. This is clearly shown by Figure 3 as the rate has held steady since 2001, slowly undulating around the 14.0 per 500 plate appearance line.

Figure 3 - Home Run Production Rate (1994-2006)

Another series of negative comments made in the last few years concerns the number of players joining the 500 Home Run Club. From August 5, 1999 through June 20, 2004, five players joined the club: Mark McGwire (1999), Barry Bonds (2001), Sammy Sosa (2003), Rafael Palmeiro (2003) and Ken Griffey, Jr. (2004). That is five sluggers in about five years. Let's compare the period from September 13, 1965 through September 13, 1971. In those six years, seven players joined the 500 Home Run Club: Willie Mays (1965), Mickey Mantle (1967), Eddie Mathews (1967), Hank Aaron (1968), Ernie Banks (1970), Harmon Killebrew (1971) and Frank Robinson (1971). Thus, more players (seven) joined the club in six years during the late 1960s than the five who joined in the first part of the 21st century. These 12 sluggers are the players primarily responsible for the surge in the home run rate in the 1950s and the 1990s. Four hitters are poised to join the club in 2007: Frank Thomas, Alex Rodriguez, Jim Thome and Manny Ramirez.

It is clear that the production rate of the late 1990s is closer to the trend line than was the rate during the 1950s. Perhaps the emotional statements at the beginning of the twenty-first century are overblown and misleading, since they are not based on factual evidence but rather on conjecture, and are more inflammatory than informative.

SABR member David Vincent, the "Sultan of Swat Stats," is the recognized authority on the history of the home run. He is the author of Home Run: The Definitive History of Baseball's Ultimate Weapon, published by Potomac Books, Inc.


As far as the increase in the number of 500-homer players, I see three factors: the lowering of the pitching mound after 1968; the shrinking strike zone; and expansion (which cuts two ways: more players to hit homers, and dilution of pitching skill at the #4 and #5 spots in the rotation).

One of the reasons for the magnitude of the curve above the trend line in the 1950s could be the expansion of the player pool to include African American players. The players who entered the 500 home run club following this period reflect that change.

I'd be interested in seeing Figure 1 broken out differently. At first glance, it appears that there are three different established levels of HR/500PA. The first level, 1919 - 1949, appears to have an established rate of about 6 HR/500PA. The second level, 1950 - 1994, seems to have established a level around 10 HR/500PA. The third level is well represented in Figure 3 with an established level of about 14 HR/500PA.

If indeed this is the case, and I think there is a good chance that it is, it gives an indication of why people began to notice and complain in the late 1990s that home run totals were getting out of control. It appears as if we are in a different era than the one prior to 1994, and since the previous era began nearly 50 years before, most people assumed the game would never change.

Though the data is limited, can you show us HR/strike and HR/swing?

HR per PA is not a strong measure if we are looking for indications of changes in production, in that a ball cannot be hit out until it is hit, period, and batting averages have fluctuated dramatically over the various eras. A better indicator is HR/H; better yet, I think, is TB/H (sometimes called Power Factor), because it includes rises in other extra-base hits, which would derive from the same factors that affect home runs.
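For concreteness, the two rate stats mentioned here can be sketched as follows; the hit totals are made-up, not any real batter's line:

```python
def power_factor(singles: int, doubles: int, triples: int, homers: int) -> float:
    """TB/H, sometimes called Power Factor: total bases per hit."""
    hits = singles + doubles + triples + homers
    total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
    return total_bases / hits

def hr_per_hit(singles: int, doubles: int, triples: int, homers: int) -> float:
    """HR/H: the fraction of hits that are home runs."""
    hits = singles + doubles + triples + homers
    return homers / hits

# Hypothetical 161-hit season: 100 singles, 31 doubles, 2 triples, 28 homers
print(round(power_factor(100, 31, 2, 28), 3))  # 1.739
print(round(hr_per_hit(100, 31, 2, 28), 3))    # 0.174
```

Since both metrics are conditioned on a hit having occurred, they are insulated from era-to-era swings in batting average in a way that HR per PA is not.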

The linked image shows an annotated PF graph running to the year 2000. The occasional sudden, discontinuous jumps, owing without serious doubt to changes in the ball, are distinctly visible, including the one in the early 1990s.

What never seems to be mentioned in these discussions is the concept of an "ideal" R/G figure, but I believe there is such a thing, though there can be minor disagreement over just what it is. Its essence is that when a team scores a run, it should be neither the crux of the game nor a ho-hum event. I reckon that 7 or 8 runs (both teams combined) is a good figure, but the point should be considered; the current thinking, if we may so dignify the process, is the more the better: parity with football scoring, no doubt eventually to be followed by parity with basketball scoring.

I use HR/H a lot, as well as HR/XBH. All of these, including HR/strike and HR/swing, have their merits. HR/swing would be my preference, but it should also be done for XBH/swing and H/swing.

As for the "ideal", it should be based on Leverage Index.

I think your trend line is off a little bit. It works for your argument, but I don't like that it puts the '70s, '80s, and early '90s almost entirely below the trend line. It seems to me it should be dropped down a few degrees on the modern end, showing another surge in the 2000s similar to the late '50s and early '60s.
The first surge is likely explained by the expanded player pool, the second by steroids (example: the 1996 and 2001-2004 NL MVPs).

Yes, the 1950s surge is due to integration. As I said in this very short article, please see my book for more discussion. The chapter on the late 1940s and 1950s is titled: "Integration." I have no doubt that there are many ways to compare eras offensively. I was looking for a simple way to do that, one that would make sense to the average person. I was not looking for a complicated formula as this book was not targeted at the highest level of SABR thinkers. Thanks for the comments.

Here is an interesting analysis using, as others have suggested, HR/H as a measure.

The number of 500+ HR hitters and the surge in record-breaking performances in recent years is explained by the hypothesis that HR-hitting ability is distributed not along a Gaussian curve but along a power-law curve, i.e., one with a thick tail. Using HR/H, there isn't really anything unexpected in the recent data; that is, the data don't suggest that steroids or some other factor have increased HR hitting.

Why did you change the y-scale on the last graphic? Best to leave it the same for comparison purposes.

That second sentence should be rephrased to: I always thought it was best to leave it the same for comparison purposes. I'm not the expert here.

That HR Production chart looks awfully similar to a chart of global temperatures of the same time period.