Bill James has conducted, and posted on his website, a new study of players' general offensive streakiness in baseball. Using a large sample of player-years (e.g., Lou Brock in 1967), James ranked a given player's games in a given year from highest to lowest in runs created. As James explains:
Let us suppose that the player plays 160 games; we rank the 160 games 1 to 160 in order of the number of runs that he has created in each game, and we divide those into his 80 best games and his 80 worst games.
Labeling each of the 80 best games as "good" and each of the 80 worst games as "bad," James then looked at the player's games in chronological sequence, identifying consistent sequences (e.g., good-good-good or bad-bad) and inconsistent sequences (e.g., good-bad-good). Consistent sequences were assigned positive point values that grew with the length of the stretch, whereas inconsistent sequences were assigned negative values that grew more sharply negative with length.
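To make the mechanics concrete, here is a minimal sketch of the labeling step in Python (my own illustration, not James's code), assuming a chronological list of per-game runs-created values. James does not say how ties at the median are broken, or where the extra game goes when a player's game count is odd, so this version breaks ties by chronological order and puts the odd game in the "bad" half:

    def label_games(runs_created):
        # runs_created: chronological list of per-game runs-created values.
        # Returns chronological "good"/"bad" labels: the top half of games
        # by runs created is "good", the bottom half "bad".
        n = len(runs_created)
        # Rank games from highest to lowest runs created; ties broken by
        # chronological order (an assumption -- James does not specify).
        order = sorted(range(n), key=lambda i: (-runs_created[i], i))
        labels = [None] * n
        for rank, i in enumerate(order):
            labels[i] = "good" if rank < n // 2 else "bad"
        return labels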
The results? "The average player in the study played 141 games, with an average positive streak score of 139, and a negative streak score of 131.5." Stated differently, "There are 34,683 cases within the data in which a player followed a good game with a good game or a bad game with a bad game, and 33,626 cases in which the two games did not match—50.8% 'matches', 49.2% 'non-matches'."
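Those figures are internally consistent: 34,683 + 33,626 = 68,309 consecutive-game pairs, and 34,683 / 68,309 ≈ 50.8%. A tally along the lines James describes (again my own sketch, not his code) would look like this, run per player-year on the labels from the function above:

    def count_matches(labels):
        # Compare each game's label with the next game's label.
        pairs = list(zip(labels, labels[1:]))
        matches = sum(a == b for a, b in pairs)
        return matches, len(pairs) - matches

    # Checking the reported totals:
    matches, non_matches = 34683, 33626
    total = matches + non_matches              # 68,309 pairs
    print(round(100 * matches / total, 1))     # 50.8
    print(round(100 * non_matches / total, 1)) # 49.2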
James characterizes the results as showing "some clustering of good games and bad games within a player’s season." The difference is small in absolute terms, though: about 1.6 percentage points. Further, James acknowledges several possible extraneous factors that could inflate the apparent clustering, including facing the same (good or bad) opponent for multiple games in a series. I would focus more specifically on the quality of opposing pitching. If batters faced the mid-1990s Atlanta Braves, with Greg Maddux, Tom Glavine, and John Smoltz pitching on successive days, you bet they would be highly likely to exhibit a cluster of subpar offensive games!
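For what it's worth, small in magnitude is not the same as indistinguishable from chance. With more than 68,000 pairs, a 1.6-point gap sits roughly four standard errors above a 50/50 coin flip. This back-of-the-envelope check is my own, not James's, and it treats the pairs as independent trials; the schedule effects just described are exactly the kind of dependence that undercuts that assumption:

    from math import sqrt

    matches, total = 34683, 68309
    p_hat = matches / total        # observed match rate, ~0.508
    se = 0.5 / sqrt(total)         # standard error under a fair-coin null
    z = (p_hat - 0.5) / se
    print(round(z, 1))             # ~4.0

So the clustering probably is not pure noise, but nothing in that z-score says it reflects player streakiness rather than, say, the quality of the opposing rotation.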