From time to time, I come across a message board or blog comment from someone who doesn't "get" the title of this site -- someone who actually thinks I believe that non-conference games with SEC teams consist of Southern gazelles lapping plodding Midwestern anteaters.
This is not entirely true. I've said as many times as I care to that the site's name is tongue in cheek -- not entirely a joke, but by no means entirely serious. There are certainly some very fast teams in the SEC, Florida chief among them. But is the average SEC player at any position faster than the average Big XII or Pac-10 player at that position? I have no idea. And I have no idea how you would tell.
Even so, it catches my eye when a fellow SBN blogger posts "The 'SEC Speed' Myth Debunked." Usually, I would fold something like this into Sprints with a snarky comment or pull out the rhetorical Howitzer and blast away. But it's obvious that dishingoutdimes put some serious thought into his post and actually marshals facts -- facts! -- to support his argument, so I'll respond seriously. I should also note before I start that I'm not trying to prove that the SEC is the fastest or the best conference in the land -- just that I don't think the case that it isn't has been proven.
The post starts simply enough, with the question of whether the South is faster at all. As a measure, dishingoutdimes decides to go with 40 times.
But a better metric for speed, and for NFL success, among running backs is the speed score. ... Of the RB prospects at this year's combine, only 7 running backs had a speed score above 100. None - I repeat none - were from the SEC. 2 were from the ACC, 2 were from the Big Ten, 1 was from the Big East, 1 was from the WAC, and the last from Tennessee State.
The fastest 40 time at the combine this year was posted by Darrius Heyward-Bey of Maryland (ACC) who ran a 4.30. In fact, the top time posted at any of the skill positions (WR, RB, CB, S) was not from anyone in the SEC. But none of that should matter anyways, because the 40 time is irrelevant when it comes to competing on the football field. [Emphasis from the original]
So SEC players don't run faster 40 times, but 40 times are completely irrelevant, which means this proves -- what, exactly? Nothing. If you assume that the measure we're talking about is irrelevant, as dishingoutdimes does, then there's no proof here that the SEC is or isn't faster.
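For readers unfamiliar with the metric quoted above: the "speed score" is generally credited to Football Outsiders, and it adjusts a running back's 40 time for his weight -- weight in pounds times 200, divided by the 40 time raised to the fourth power. A minimal sketch of that formula, using made-up weights and times rather than actual combine results:

```python
def speed_score(weight_lbs, forty_time):
    """Football Outsiders-style speed score: (weight * 200) / (40 time)^4.
    Rewards heavier backs for posting the same 40 time as lighter ones."""
    return (weight_lbs * 200) / (forty_time ** 4)

# Hypothetical players, not real combine data:
# a 220-lb back running 4.45 outscores a 195-lb back running a faster 4.40
print(round(speed_score(220, 4.45), 1))  # 112.2
print(round(speed_score(195, 4.40), 1))  # 104.1
```

The 100 threshold mentioned in the quote is the conventional cutoff for a "good" score under this formula.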
Now, I'm not really going to disagree with the idea that the 40 time is irrelevant, because it is, but not always for the reasons dishingoutdimes highlights here and later. The reason I don't like 40 times has as much or more to do with the variance between conferences' draft classes and other factors as with the flaws of the 40 as a football measure itself.
In fact, any effort to compare conferences' relative speed by any combine measure suffers from two key flaws:
(1) The average player might not even be represented at the combine. If the average player is represented, he is not going to be in the middle of the pack, but probably near the low end. The combine, by definition, contains the best players in the country among seniors and declared juniors. Nothing from the combine can be taken as a serious measure of a conference from top to bottom, because the bottom half or more isn't even there.
(2) All draft classes are not created equal. What happens if the fastest Big XII running back or the quickest Big East WR is a sophomore or a junior who remains in school? Now, take this a step further. Let's say the 1st, 9th and 10th fastest Big Ten RBs are at the combine and the 2nd, 3rd and 4th fastest Pac-10 RBs are at the combine. Again, even assuming the NFL scouts could come up with a perfect metric for speed, what would it tell us if the Pac-10 average were better than the Big Ten average? But there would be no way to know which players rank where in each conference because, once again, the only players being measured would be those at the combine. The only way to make sure the samples basically balanced out would be to measure over some longer period than one year. Then again, you would still run into problem No. 1 -- only some of the players are being measured, even under the larger sample size.
dishingoutdimes also falls into the trap of assuming that the idea of SEC speed -- true or not -- has something to do with skill positions, a point aptly rebutted by mlmintampa, among others.
The difference is in the lines. The DEs and LBs at Florida and LSU are faster than the ones in the Big Ten. As for the offensive lines, SEC teams rarely run a true pro-set offense. With the exception of Ole Miss' Michael Oher and Bama's Andre Smith, SEC teams do not have that massive anchor lineman. Instead are tall linemen who use a quick punch to open holes. These guys also have to pull and get up field quicker. That is the speed difference.
No one disputes that WRs and RBs and other skill players all over the country are fast -- that's why they're WRs and RBs and other skill players. The difference, if there is one, is found elsewhere on the field.
dishingoutdimes responds to this in the comments section of his post.
Someone actually compared Florida, LSU and Ohio State by position back in 2007 and using 40 times. The Ohio State players were actually faster than either of the SEC schools at every defensive position other than defensive tackle. Perhaps the defensive players aren’t much faster in the SEC?
Again, though, that proves only that one Big Ten team's defensive players were faster than two SEC teams' defensive players -- again, by a measure of speed that dishingoutdimes himself says is irrelevant. That doesn't really tell us that the average Big Ten defensive player is faster or slower than the average SEC defensive player in running the 40, much less tell us anything about "playing speed."
Having proven that some SEC players are slower in an exercise that proves nothing about playing speed, dishingoutdimes proceeds to argue that speed is unimportant.
What does that little exercise mean? It means that someone who is perceived as slow (4.65 40 time) could get to a spot on the field just as fast as a player who is perceived as fast (4.40 40 time), if the fast player made only a slight error in the angle that they took.
Taking good angles is all about good coaching and player instincts.
All of this is very perceptive, but it has two major holes. First, dishingoutdimes makes no attempt to actually argue that non-SEC players take better routes or angles than SEC players. Second, it has nothing to do with whether the SEC is faster or not. Even if you somehow grant that running crisp routes and taking good angles is part of playing speed, as hooper at Rocky Top Talk points out:
The comparison shows how a slower guy might make up the difference by running smarter, but what if the faster guy makes proper decisions and takes the right angles on the field as well?
Or, as some are mentioning, what if a whole team of relatively fast guys take the right angles?
So far, dishingoutdimes has still persuasively argued only that the SEC's 2009 draft class is not as good as other conferences' at an irrelevant measure of playing speed, even though playing speed really doesn't matter.
Having not proven that SEC speed is a myth, he's off to prove that the SEC's lack of speed is instead demonstrated by on-field results.
Against non-BCS conferences, their winning percentage is undoubtedly high. That is to be expected. However, against BCS conference teams, the playing field is decidedly more even. SEC teams have a winning percentage of 60% over BCS conference teams. It is important to note that that is inflated by the fact that 1 out of every 2 games against a BCS conference opponent was against an ACC team -- the ACC being a historically weaker conference with less powerhouse teams than say the Big 10, Pac 10 or Big 12. Subtract out the ACC games and the SEC only has a winning percentage of 54%.
But of course people want to take a snapshot of the recent past. How about the last 2008 college football season? Suite101.com compiled a list of the records of BCS conference teams plus Mountain West Conference Teams against each of those conferences. Which conference came out on top? The Mountain West! The ACC was 2nd, and then in order SEC, Big 12, Big East, Pac 10 and Big 10.
So, to repeat: The ACC is a historically weak conference, so the SEC's results against the ACC shouldn't count. Now, this other measure dishingoutdimes is using to prove his point shows the second-best conference in the country to be -- the ACC?!? Either the SEC's victories against the ACC should count, or dishingoutdimes can't use Suite101's list, which shows the ACC to be strong. He cannot have it both ways.
The SEC in their schedule had a combined winning percentage of 56.6%. That may seem stellar, but consider that the Big 12 and Big East both had slightly better, but comparable winning percentages in their schedules around 57%. The ACC and Mountain West were not far behind either (several percentage points).
Consider also that the SEC was a combined 31-1 against Conference USA, Independents, the MAC, the Sun Belt, the WAC and FCS teams. The SEC was 10-13 against everyone else in non-conference play. Not remarkable at all.
But wait -- there's more. From the same list dishingoutdimes uses, the SEC was 6-6 against the ACC, while the Big XII was 1-4 against that either historically weak or titanically strong conference (we're left to decide for ourselves which one) -- so SEC RULLZ!!!! HOORAY SELECTIVE USE OF STATISTICS!!!!
Here's the only thing that a chart like this one lacks: Any form of context. To use a fictitious scenario, let's say Oklahoma beats Vanderbilt and Mississippi State by 1 while Florida cruises past Texas 52-10. Does the Big XII's 2-1 record against the SEC then tell us which conference is better? Of course not.
But even if we assume that all of these matchups were relatively even, dishingoutdimes still can't say that the SEC's record "against everyone else in non-conference play" was "not remarkable at all" -- because, from the evidence he's presented, we don't know how that ranks when compared to other leagues. For example, the Big Ten went 7-11 against that same group of teams that "count" -- which for some reason now includes the ACC again -- for a winning percentage of .389 compared to the SEC's .435.
Unless you go through each league's non-conference record for comparison's sake, the SEC's record alone tells you nothing. And then, unless you find some way to weight the results for the first problem I mentioned -- i.e., the relative strength of the teams from each conference in each game -- it's still largely meaningless. Finally, you could only really count it if you added back in all the "cupcake" games -- weighted somehow, of course. Those games might not be quality games, but they still give some relative indicator of how good a team is. All other things being equal, a team that convincingly defeats a Sun Belt team is likely better than a team that loses convincingly to the same Sun Belt team. You can't just throw out those games because you want to.
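The percentages in the comparison above are just win-loss arithmetic; a quick sketch, using the records cited in this post (SEC 10-13 and Big Ten 7-11 against the same group of non-conference opponents):

```python
def win_pct(wins, losses):
    """Winning percentage as wins divided by total games played."""
    return wins / (wins + losses)

# Records cited above for non-conference games against the teams that "count":
print(round(win_pct(10, 13), 3))  # SEC, 10-13: 0.435
print(round(win_pct(7, 11), 3))   # Big Ten, 7-11: 0.389
```

The point stands either way: a raw percentage like this says nothing about opponent strength or margin of victory until it's compared against the other leagues' figures.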
Let's go a step further, though. Let's concede that every single point dishingoutdimes makes about the schedule, the SEC's record and what it says about the conference's relative strength is true. Then let's take a look at the "kill sentence" in the post: "Statistics and 40 times can lie, but game results don't."
Except speed doesn't always decide games. Sometimes tackling does. Sometimes turnovers do. Sometimes making good decisions is the difference. A hundred different things can decide a game. Team speed is just one of them.
Give credit to dishingoutdimes where credit is due. He tries to move beyond just regional chest-thumping to provide a fact-based argument for why the SEC isn't the fastest league in the country or the best. He just comes up short.