Abstract
This study assessed whether the distance-time relationship could be modeled to predict
time to exhaustion (TTE) during intermittent running. Thirteen male distance runners (age:
33±14 years) completed a field test and three interval tests on an outdoor 400 m athletic
track. The field test involved trials over 3 600 m, 2 400 m, and 1 200 m with a 30-min
rest between each run. Interval tests consisted of 1 000 m at 107% of critical speed (CS)
interspersed with 200 m at 95% of CS; 600 m at 110% of CS with 200 m at 90% of CS; and
200 m at 150% of CS with 200 m at 80% of CS. Interval sessions were separated by 24 h of
recovery. Field-test CS and D′ (the finite distance capacity available above CS) were
applied to linear and non-linear models to estimate the point of interval session termination.
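The abstract does not state the model equations; as context, a minimal sketch of the standard linear (two-parameter) distance-time formulation, where S denotes a constant running speed and t_lim the predicted TTE (both symbols introduced here for illustration only):

\[
d = D' + \mathrm{CS}\, t
\quad\Longrightarrow\quad
t_{\mathrm{lim}} = \frac{D'}{S - \mathrm{CS}}, \qquad S > \mathrm{CS}
\]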
Actual and predicted TTE using the linear model were not significantly different in the
1 000 m and 600 m trials. Actual TTE was significantly lower (P=0.01) than predicted TTE
in the 200 m trial. Typical error was high across the trials
(range 334–1 709 s). The mean balance of D′ remaining at interval session termination
was significantly lower when estimated from the non-linear model (−21.2 vs. 13.4 m,
P<0.01), but was no closer to zero than that from the linear model.
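As an illustration of how session termination can be estimated from these parameters, below is a minimal Python sketch of linear D′-balance bookkeeping, in which D′ depletes above CS and reconstitutes below it; the non-linear variant commonly replaces the linear reconstitution with an exponential term. The function name and all values (cs, d_prime, session paces) are illustrative assumptions, not the study's data or code.

# Minimal sketch of D'-balance bookkeeping during intermittent running.
# Assumes the linear model: D' depletes at (speed - CS) above CS and
# reconstitutes at (CS - speed) below CS, capped at the starting D'.

def d_prime_balance(segments, cs, d_prime):
    """Track D' balance over (distance_m, speed_m_per_s) segments.

    Returns the balance after each segment; exhaustion is predicted
    when the balance reaches zero.
    """
    balance = d_prime
    history = []
    for distance, speed in segments:
        duration = distance / speed
        balance -= (speed - cs) * duration   # negative term = recovery
        balance = min(balance, d_prime)      # cannot exceed starting D'
        history.append(balance)
        if balance <= 0:
            break
    return history

# Example: 1 000 m at 107% of CS alternating with 200 m at 95% of CS.
cs, d_prime = 4.0, 200.0                     # m/s and m (illustrative)
session = [(1000, 1.07 * cs), (200, 0.95 * cs)] * 10
print(d_prime_balance(session, cs, d_prime))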
Neither the linear nor the non-linear model could closely predict TTE during intermittent running.
Key words
critical speed, interval training, modeling performance