Do battery manufacturers use the same standards (allowable voltage drop and load) to test batteries when establishing amp-hour ratings?
My understanding is that a manufacturer can inflate an amp-hour rating by allowing a deeper voltage drop during the rating test. If one manufacturer uses a 12.3V cutoff while another uses an 11.8V cutoff, they may both call their products 105 amp-hour batteries, but you're not going to get the same performance from them. I would think there is an industry standard for setting amp-hour ratings, but I'm not aware of one.
The amperage used by the manufacturer to establish the amp-hour rating can also have an effect. You can achieve a higher amp-hour rating by drawing a lower current over a longer time, as the sketch below illustrates.
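For what it's worth, this rate dependence is usually modeled with Peukert's equation, t = H * (C / (I * H))^k. Here's a rough sketch of the effect; the exponent k = 1.2 is just a typical value for a flooded lead-acid battery, not from any particular datasheet:

```python
# Rough illustration of Peukert's law: the same battery delivers fewer
# total amp-hours when discharged at a higher current.
# Assumptions: a "105 Ah" battery rated at the 20-hour rate, k = 1.2.

def hours_at_current(rated_ah, rated_hours, current, k=1.2):
    """Peukert's equation: t = H * (C / (I * H)) ** k."""
    return rated_hours * (rated_ah / (current * rated_hours)) ** k

rated_ah, rated_hours = 105.0, 20.0

for amps in (5.25, 10.0, 21.0):  # the 20-hour rate, then heavier loads
    t = hours_at_current(rated_ah, rated_hours, amps)
    print(f"{amps:5.2f} A -> {t:5.2f} h -> {amps * t:6.1f} Ah delivered")
```

At the 5.25 A (20-hour) rate this gives the full 105 Ah, but at 21 A it delivers only about 80 Ah, so the current chosen for the rating test matters a great deal.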