
Showing papers in "British Journal of Sports Medicine in 2002"


Journal ArticleDOI
TL;DR: Various risk factors were shown to be positively associated with a risk for, or protection from, specific injuries, and future research should include a non-injured control group and a more precise measure of weekly running distance and running experience to validate these results.
Abstract: Results: Most of the study group were women (54%). Some injuries occurred with a significantly higher frequency in one sex. Being less than 34 years old was reported as a risk factor across the sexes for patellofemoral pain syndrome, and in men for iliotibial band friction syndrome, patellar tendinopathy, and tibial stress syndrome. Being active for less than 8.5 years was positively associated with injury in both sexes for tibial stress syndrome; and women with a body mass index less than 21 kg/m² were at a significantly higher risk for tibial stress fractures and spinal injuries. Patellofemoral pain syndrome was the most common injury, followed by iliotibial band friction syndrome, plantar fasciitis, meniscal injuries of the knee, and tibial stress syndrome. Conclusions: Although various risk factors were shown to be positively associated with a risk for, or protection from, specific injuries, future research should include a non-injured control group and a more precise measure of weekly running distance and running experience to validate these results.

1,510 citations


Journal ArticleDOI
TL;DR: The Concussion in Sport Group (CISG) as discussed by the authors was formed from experts invited to address specific issues of epidemiology, basic and clinical science, grading systems, cognitive assessment, new research methods, protective equipment, management, prevention, and long term outcome.
Abstract: Recommendations for the improvement of safety and health of athletes who may suffer concussive injuries. In November 2001, the first International Symposium on Concussion in Sport was held in Vienna, Austria. This symposium was organised by the International Ice Hockey Federation (IIHF), the Federation Internationale de Football Association Medical Assessment and Research Centre (FIFA, F-MARC), and the International Olympic Committee Medical Commission (IOC). The aim of the symposium was to provide recommendations for the improvement of safety and health of athletes who suffer concussive injuries in ice hockey, football (soccer), and other sports. To this end a range of experts were invited to address specific issues of epidemiology, basic and clinical science, grading systems, cognitive assessment, new research methods, protective equipment, management, prevention, and long term outcome, and to discuss a unitary model for understanding concussive injury. At the conclusion of the conference, a small group of experts were given a mandate by the conference delegates and organising bodies to draft a document describing the agreement position reached by those in attendance at that meeting. For the purpose of this paper, this group will be called the Concussion in Sport Group (CISG). This review seeks to summarise the findings of the Vienna conference and to provide a working document that will be widely applicable to sport related concussion. This document is developed for use by doctors, therapists, health professionals, coaches, and other people involved in the care of injured athletes, whether at the recreational, elite, or professional level. During the course of the symposium, a persuasive argument was made that a comprehensive systematic approach to concussion would be of potential benefit to aid the injured athlete and direct management decisions.1 This protocol represents a work in progress, and, as with all other guidelines or proposals, it must undergo revision …

672 citations


Journal ArticleDOI
TL;DR: The injury definition of this study does not produce incidence rates that are complete for all minor injuries, but the determination of an injury is made by a single entity in exactly the same manner for all teams, which overcomes a significant methodological flaw present in other multiteam injury surveillance systems.
Abstract: Objective: To describe the epidemiology of injuries in the Australian Football League (AFL) over four seasons. Methods: An injury was defined as “any physical or medical condition that caused a player to miss a match in the regular season.” The rationale for this definition was to eliminate a previously noted tendency of team recorders to interpret injury definitions subjectively. Administrative records of injury payments to players who did not play matches determined the occurrence of an injury. Results: The seasonal incidence of new injuries was 39 per club (of 40 players) per season (of 22 matches). The match injury incidence for AFL games was 25.7 injuries per 1000 player hours. The injury prevalence (percentage of players missing through injury in an average week) was 16%. The recurrence rate of injuries was 17%. The most common and prevalent injury was hamstring strain (six injuries per club per season, resulting in 21 missed matches per club per season), followed in prevalence by anterior cruciate ligament and groin injuries. Conclusions: The injury definition of this study does not produce incidence rates that are complete for all minor injuries. However, the determination of an injury is made by a single entity in exactly the same manner for all teams, which overcomes a significant methodological flaw present in other multiteam injury surveillance systems.
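
The incidence and prevalence measures quoted above follow standard injury surveillance definitions. A minimal sketch of those calculations is shown below; the exposure figures are invented for illustration and are not the AFL study's data.

```python
# Hypothetical example of the standard injury surveillance metrics used above.
# The exposure figures below are illustrative, not the AFL study's actual data.

def match_injury_incidence(injuries: int, player_hours: float) -> float:
    """Injuries per 1000 player hours of match exposure."""
    return 1000 * injuries / player_hours

def injury_prevalence(players_missing: int, squad_size: int) -> float:
    """Percentage of the squad unavailable through injury in an average week."""
    return 100 * players_missing / squad_size

# e.g. 50 match injuries over 22 matches, ~36 players on field for ~2 hours each
player_hours = 22 * 36 * 2.0          # assumed exposure, for illustration only
print(match_injury_incidence(50, player_hours))   # ~31.6 per 1000 player hours
print(injury_prevalence(6, 40))                   # 15.0%
```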

572 citations


Journal ArticleDOI
TL;DR: Heart rate monitoring during soccer specific exercise is a valid indicator of actual exercise intensity, and soccer specific exercise using ball dribbling or small group play may be performed as aerobic interval training.
Abstract: Background: In professional soccer, a significant amount of training time is used to improve players' aerobic capacity. However, it is not known whether soccer specific training fulfils the criterion of effective endurance training to improve maximal oxygen uptake, namely an exercise intensity of 90–95% of maximal heart rate in periods of three to eight minutes. Objective: To determine whether ball dribbling and small group play are appropriate activities for interval training, and whether heart rate in soccer specific training is a valid measure of actual work intensity. Methods: Six well trained first division soccer players took part in the study. To test whether soccer specific training was effective interval training, players ran in a specially designed dribbling track, as well as participating in small group play (five a side). Laboratory tests were carried out to establish the relation between heart rate and oxygen uptake while running on a treadmill. Corresponding measurements were made on the soccer field using a portable system for measuring oxygen uptake. Results: Exercise intensity during small group play was 91.3% of maximal heart rate or 84.5% of maximal oxygen uptake. Corresponding values using a dribbling track were 93.5% and 91.7%. No higher heart rate was observed during soccer training. Conclusions: Soccer specific exercise using ball dribbling or small group play may be performed as aerobic interval training. Heart rate monitoring during soccer specific exercise is a valid indicator of actual exercise intensity.
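
The claim that heart rate is a valid proxy for exercise intensity rests on the individual heart rate-oxygen uptake relation established on the treadmill. A rough sketch of that calibration step follows; all values are invented, and the study's own regression data are not reproduced here.

```python
import numpy as np

# Hypothetical treadmill calibration data for one player (not from the study).
hr_lab  = np.array([120, 140, 160, 180, 195])        # heart rate, beats/min
vo2_lab = np.array([25.0, 35.0, 45.0, 55.0, 62.0])   # oxygen uptake, ml/kg/min

# Fit a linear HR-VO2 relation, as is commonly done for such calibrations.
slope, intercept = np.polyfit(hr_lab, vo2_lab, 1)

# Estimate %VO2max on the pitch from a measured training heart rate.
vo2_max = 62.0          # assumed maximal oxygen uptake, ml/kg/min
hr_training = 183       # assumed mean heart rate during small group play
vo2_est = slope * hr_training + intercept
print(f"Estimated intensity: {100 * vo2_est / vo2_max:.1f}% of VO2max")
```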

494 citations


Journal ArticleDOI
TL;DR: The effects of hormonal factors and growth on bone mineral change during puberty are examined, and the possibility of a critical period during which bone is especially adaptable to exercise is discussed.
Abstract: This systematic review examines and compares the bone mineral changes in children and adolescents, as measured by dual energy x ray absorptiometry, reported in exercise intervention studies. The effects of hormonal factors and growth on bone mineral change during puberty are examined, and the possibility of a critical period during which bone is especially adaptable to exercise is discussed.

359 citations


Journal ArticleDOI
TL;DR: Sclerosing neovessels appears to be an effective treatment for painful chronic Achilles tendinosis, suggesting that neovessels play a key part in causing chronic tendon pain.
Abstract: Background: The mechanism that causes pain in chronic Achilles tendinosis is not known. However, high resolution colour Doppler ultrasound has shown that neovascularisation may be involved. Objective: To investigate if sclerosing the neovessels would affect the level of tendon pain. Methods: The effect of colour Doppler ultrasound guided injection of a sclerosing agent, polidocanol, against neovessels was studied in 10 patients (seven men and three women, mean age 55 years) with painful chronic mid-portion Achilles tendinosis. Results: Eight patients were satisfied with the results of treatment. There was significantly reduced pain during activity (reported on a visual analogue scale (VAS)) and no remaining neovascularisation after an average of two injections. Two patients were not satisfied, and neovascularisation remained. At the six month follow up, the same eight patients remained satisfied and could perform Achilles tendon loading activities as desired. Their VAS score had decreased from 74 before treatment to 8 (p<0.01). Conclusions: Sclerosing neovessels appears to be an effective treatment for painful chronic Achilles tendinosis, suggesting that neovessels play a key part in causing chronic tendon pain.

350 citations


Journal ArticleDOI
TL;DR: Prevention of preseason injury is important to ensure availability of players for the commencement of the season and to decrease the risk of injury later in the season, and the implementation of a risk management policy for this purpose is recommended.
Abstract: Objectives: To conduct a detailed analysis of preseason football injuries sustained in English professional football over two competitive seasons. Methods: Club medical staff at 91 professional football clubs annotated player injuries. A specific injury audit questionnaire was used together with a weekly form that documented each club’s current injury status. Results: 17% (1025) of the total number of injuries over the two seasons were sustained during the preseason; the mean number of days absent per injury was 22.3 days. Younger age groups (17–25 years) were more likely to sustain a preseason injury than more experienced players (26–35+) (p<0.01). There were relatively more “slight” and “minor” injuries (as defined in the methodology), overuse, and tendon related injuries sustained during preseason compared with the in season (p<0.01). The thigh (23%), knee (17%), and ankle (17%) were the most common locations for injuries during the preseason, and there was a relatively greater number of lower leg injuries (15%) during the preseason (p<0.05). Achilles tendonitis was most prevalent in the preseason, with 33% of all Achilles related injuries sustained during this period (p<0.01). Muscle strains were the most common injury during preseason (37%). Rectus femoris muscle strains were observed twice as frequently during the preseason relative to the in season (p<0.01). Ligament sprains were the second most common injury during preseason (19%). Non-contact mechanisms were the cause of significantly more injuries during the preseason (p<0.01), with relatively more preseason injuries sustained while running or shooting (p<0.01). For 70% of the injuries reported during the preseason, the ground condition was described as dry. Conclusions: Players are at a greater risk of slight and minor injuries, overuse injuries, lower leg injuries (especially the Achilles tendon), and rectus femoris strains during the preseason period. Prevention of preseason injury is important to ensure availability of players for the commencement of the season and to decrease the risk of injury later in the season; we recommend the implementation of a risk management policy for this purpose. Areas requiring further investigation include methods of prevention for the common preseason injuries that have been identified, a detailed analysis of preseason and closed season training programmes, and a smaller study involving exposure data.

286 citations


Journal ArticleDOI
TL;DR: Injury rate increases at higher levels of play in rugby union, and most injuries are now seen in the third quarter of the game, a finding that may reflect new substitution laws.
Abstract: Objectives: To assess injury patterns and incidence in the Australian Wallabies rugby union players from 1994 to 2000. To compare these patterns and rates with those seen at other levels of play, and to see how they have changed since the beginning of the professional era. Methods: Prospective data were recorded from 1994 to 2000. All injuries to Australian Wallabies rugby union players were recorded by the team doctor. An injury was defined as one that forced a player to either leave the field or miss a subsequent game. Results: A total of 143 injuries were recorded from 91 matches. The overall injury rate was 69/1000 player hours of game play. The injury rates in the periods before (1994–1995) and after (1996–2000) the start of the professional era were 47/1000 player hours and 74/1000 player hours respectively. The lock was the most injured forward, and the number 10 the most injured back. Most injuries were soft tissue, closed injuries (55%), with the head being the most commonly injured region (25.1%). The phase of play responsible for most injuries was the tackle (58.7%). Injuries were more likely to occur in the second half of the game, specifically the third quarter (40%). The vast majority of injuries were acute (90%), with the remainder being either chronic or recurrent. Conclusions: Injury rate increases at higher levels of play in rugby union. Injury rates have increased in the professional era. Most injuries are now seen in the third quarter of the game, a finding that may reflect new substitution laws. There is a need for standardised collection of injury data in rugby union.

282 citations


Journal ArticleDOI
TL;DR: Playing actions with high injury risk were linked to contesting possession, and injury risk was concentrated in the areas of the pitch where possession of the ball is most vigorously contested: specific attacking and defending zones close to the goal.
Abstract: Objective: To assess the exposure of players to injury risk during English Premier League soccer matches in relation to selected factors. Methods: Injury risk was assessed by rating the injury potential of playing actions during competition with respect to (a) type of playing action, (b) period of the game, (c) zone of the pitch, and (d) playing either at home or away. In all, 10 games from the English Premier League 1999‐2000 were chosen for analysis. A notation system was used whereby 16 soccer specific playing actions were classified into three categories: those inducing actual injury, those with a potential for injury (graded as mild, moderate, or high), and those deemed to have no potential for injury. The pitch was divided into 18 zones, and the position of each event was recorded along with time elapsed in the game, enabling six 15 minute periods to be defined. Results: Close to 18 000 actions were notated. On average (mean (SD)), 1788 (73) events (one every three seconds), 767 (99) events with injury potential (one every six seconds), and 2 (1) injuries (one every 45 minutes) per game were recorded. An overall injury incidence of 53 per 1000 playing hours was calculated. Receiving a tackle, receiving a “charge”, and making a tackle were categorised as having a substantial injury risk, and goal catch, goal punch, kicking the ball, shot on goal, set kick, and heading the ball were all categorised as having a significant injury risk. All other actions were deemed low in risk. The first 15 minutes of each half contained the highest number of actions with mild injury potential, the last 15 minutes having the highest number of actions with moderate injury potential (p<0.01). The first and last 15 minutes of the game had the highest number of actions with high injury potential, although not significant. More actions with mild injury potential occurred in the goal area, and more actions with moderate and high injury potential occurred in the zone adjacent to the goal area (p<0.001). There was no significant difference between home and away with regard to injury potential. Conclusions: Playing actions with high injury risk were linked to contesting possession. Injury risk was highest in the first and last 15 minutes of the game, reflecting the intense engagements in the opening period and the possible effect of fatigue in the closing period. Injury risk was concentrated in the areas of the pitch where possession of the ball is most vigorously contested, which were specific attacking and defending zones close to the goal. Injury potential was no greater in away matches than at home.

274 citations


Journal ArticleDOI
TL;DR: There was a significant effect of age and playing level on playing experience, body mass, muscular power, speed, agility, and estimated maximal aerobic power, with the physiological capacities of players increasing as the playing level increased.
Abstract: Objectives: To investigate the physiological characteristics of subelite junior and senior rugby league players and establish performance standards for these athletes. Methods: A total of 159 junior (under 16, 15, 14, and 13, n = 88) and senior (first grade, second grade, and under 19, n = 71) rugby league players (forwards, n = 80, backs, n = 79), competing at a subelite level, underwent measurements of body mass, muscular power (vertical jump), speed (10 m, 20 m, and 40 m sprint), agility (Illinois agility run), and estimated maximal aerobic power (multistage fitness test). Data were also collected on match and training frequency and playing experience. Results: There was a significant effect (p<0.05) of age and playing level on playing experience, body mass, muscular power, speed, agility, and estimated maximal aerobic power, with the physiological capacities of players increasing as the playing level increased. Forwards were heavier than backs for all junior and senior teams. Forwards and backs had similar estimated maximal aerobic power, except for under 16 players, for whom significant (p<0.05) differences were detected (mean (95% confidence intervals) 42.9 (40.1 to 45.7) v 49.5 (46.4 to 52.6) ml/kg/min for forwards and backs respectively). Scores for speed, muscular power, and agility were not significantly different between forwards and backs for any of the junior or senior teams. Conclusions: The results show that there is a progressive improvement in the physiological capacities of rugby league players as the playing level increases. These findings provide normative data and performance standards for subelite junior and senior rugby league players. Further studies on the sociological, physical, psychological, and personal predictors of talent in rugby league are warranted.

245 citations


Journal ArticleDOI
TL;DR: It was concluded that, by adhering to current guidelines for physical activity and expending about 4200 kJ of energy a week, women can postpone mortality.
Abstract: A computer assisted literature search was performed (Medline, 1966–2000) to examine the association of physical activity with all cause mortality in women. It was concluded that, by adhering to current guidelines for physical activity and expending about 4200 kJ of energy a week, women can postpone mortality. The magnitude of benefit experienced by women is similar to that seen in men.
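
The 4200 kJ weekly figure corresponds to roughly 1000 kcal, the threshold commonly cited in physical activity guidelines; the conversion (1 kcal ≈ 4.184 kJ) is shown below for reference.

```python
# Convert the weekly energy expenditure quoted above from kJ to kcal.
KJ_PER_KCAL = 4.184
weekly_kj = 4200
print(weekly_kj / KJ_PER_KCAL)   # ~1004 kcal/week
```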

Journal ArticleDOI
TL;DR: Precooling studies confirm that increasing body heat is a limiting factor during exercise, but it seems that precooling is only beneficial for endurance exercise of up to 30–40 minutes rather than intermittent or short duration exercise.
Abstract: Precooling studies confirm that increasing body heat is a limiting factor during exercise. However, it seems that precooling is only beneficial for endurance exercise of up to 30-40 minutes rather than intermittent or short duration exercise.

Journal ArticleDOI
TL;DR: Injuries occurring to the state and national teams were surveyed prospectively between the seasons 1998/1999 and 2000/2001, and the three preceding seasons were surveyed retrospectively to describe and analyse injuries and illness occurring in Australian cricket.
Abstract: Objective: To describe and analyse injuries and illness occurring in Australian cricket at first class level. Methods: Injuries occurring to the state and national teams were surveyed prospectively between the seasons 1998/1999 and 2000/2001, and the three preceding seasons were surveyed retrospectively. The definition of an injury was detailed and generally required the player to miss playing time in a major match. Results: Average injury match incidence in the seasons studied prospectively varied from a low of 19.0 injuries per 10 000 player hours in first class domestic matches to a high of 38.5 injuries per 10 000 player hours in one day internationals. The average seasonal incidence was 19.2 injuries per squad (25 players) per season (20 matches). Injury prevalence (the percentage of players missing through injury at any given time) was 14% for pace bowlers, 4% for spin bowlers, 4% for batsmen, and 2% for wicket keepers. The most common injuries were hamstring strains, side strains, groin injuries, wrist and hand injuries, and lumbar soft tissue injuries. Bowlers who had bowled more than 20 match overs in the week leading up to a match had an increased risk of sustaining a bowling injury (risk ratio 1.91, 95% confidence interval (CI) 1.28 to 2.85). A further risk for bowling injury is bowling second in a match—that is, batting first (risk ratio 1.62, 95% CI 1.04 to 2.50). A risk factor for injury in fielding is colliding with the boundary fence. Conclusions: Further study is required to determine ways to minimise the risk of injury in fast bowlers. Cricket grounds should mark a boundary line on the playing field to prevent players colliding with fences in the field.
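
The bowling workload finding is expressed as a risk ratio with a 95% confidence interval. A minimal sketch of how such a ratio and its interval can be computed from a 2×2 table is given below, using the standard log-based interval; the counts are invented, since the paper's raw data are not reported here.

```python
import math

# Hypothetical 2x2 table: exposure = >20 match overs bowled in the previous week.
a, b = 30, 170    # exposed: injured, not injured (invented counts)
c, d = 25, 275    # unexposed: injured, not injured (invented counts)

risk_exposed   = a / (a + b)
risk_unexposed = c / (c + d)
rr = risk_exposed / risk_unexposed

# Standard error of log(RR) and a 95% confidence interval on the log scale.
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```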

Journal ArticleDOI
TL;DR: In both sexes, smoking, irregular breakfast eating, attending vocational school, and poor self perceived current health were significantly associated with persistent inactivity.
Abstract: Objective: To examine the association between leisure time physical activity over a three year period and health related behaviour, social relationships, and health status in late adolescence as part of a nationwide longitudinal study. Methods: Five birth cohorts of adolescent twins aged 16 at baseline (n = 5028; 2311 boys and 2717 girls) participated in the study. Questionnaires on leisure time physical activity, other health related behaviour, social relationships, and health status were sent to the twins on their 16th and 17th birthdays and six months after their 18th birthday. The combined response rate to the three questionnaires was 75.8% for boys and 81.7% for girls. Those who answered in all three questionnaires that their frequency of physical activity was 4‐5 times a week or more were defined as persistent exercisers, and those who exercised at most twice a month in all three were defined as persistently inactive. Logistic regression analyses were used to identify baseline variables associated with outcome measures. Results: Overall, 20.4% of boys and 13.0% of girls were persistent exercisers and 6.5% of boys and 5.3% of girls were persistently inactive. In both sexes, smoking, irregular breakfast eating, attending vocational school, and poor self perceived current health were significantly associated with persistent inactivity. Conclusions: Persistent physical inactivity in adolescents is associated with a less healthy lifestyle, worse educational progression, and poor self perceived health. Tailoring methods to promote physical activity may prove useful for influencing other health habits. Such programmes are indicated for vocational schools in particular.
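
The baseline predictors of persistent inactivity were identified with logistic regression. A minimal, hypothetical sketch of that kind of model is shown below; the variable names and data are invented, and the paper's actual model specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented data standing in for the cohort: 1 = persistently inactive.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "inactive": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
    "skips_breakfast": rng.integers(0, 2, n),
    "vocational_school": rng.integers(0, 2, n),
    "poor_health": rng.integers(0, 2, n),
})

# Logistic regression of persistent inactivity on baseline variables.
model = smf.logit(
    "inactive ~ smoker + skips_breakfast + vocational_school + poor_health",
    data=df,
).fit(disp=False)

# Odds ratios with 95% confidence intervals.
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```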

Journal ArticleDOI
TL;DR: Although 10RM strength decreased after eight weeks of detraining, the results remained significantly elevated from baseline measures, suggesting that a short, low intensity resistance training programme produces substantial improvements in muscle strength.
Abstract: Objectives: To study the effects of eight weeks of supervised, low intensity resistance training (80% of 10 repetition maximum (10RM)) and eight weeks of detraining on muscle strength and blood lipid profiles in healthy, sedentary postmenopausal women. Subjects: Fifteen postmenopausal women, aged 49–62 years, took part in the study. Subjects were assigned to either a control (n = 7) or training (n = 8) group. The training regimen consisted of three sets of eight repetitions of leg press, bench press, knee extension, knee flexion, and lat pull-down, three days a week at 80% of 10RM. Dynamic leg strength, 10RM, and blood lipid profiles (total cholesterol (TC), low and high density lipoprotein cholesterol (LDL-C, HDL-C), triglycerides, and very low density lipoprotein cholesterol (VLDL-C)) were measured at baseline, after eight weeks of training, and after a further eight weeks of detraining. Results: Eight weeks of resistance training produced significant increases in knee extension (F1,13 = 12.60; p<0.01), bench press (F1,13 = 13.79; p<0.01), leg press (F1,13 = 15.65; p<0.01), and lat pull-down (F1,13 = 16.60; p<0.005) 10RM strength tests. Although 10RM strength decreased after eight weeks of detraining, the results remained significantly elevated from baseline measures. Eight weeks of training did not result in any significant alterations in blood lipid profiles, body composition, or dynamic isokinetic leg strength. There were no significant differences in any of the variables investigated over the 16 week period in the control group. Conclusions: These data suggest that a short, low intensity resistance training programme produces substantial improvements in muscle strength. Training of this intensity and duration was not sufficient to produce significant alterations in blood lipid concentrations.

Journal ArticleDOI
TL;DR: This review analyses rowing by linking the biological and mechanical systems that comprise the rowing system, finding blade force to be the only propulsive force to counter the drag forces, consisting of both air drag and hydrodynamic drag, acting on the system.
Abstract: This review analyses rowing by linking the biological and mechanical systems that comprise the rowing system. Blade force was found to be the only propulsive force to counter the drag forces, consisting of both air drag and hydrodynamic drag, acting on the system. Vertical oscillations of the shell are shown to have minimal impact on system dynamics. The oar acts as the link between the force generated by the rower and the blade force and transmits this force to the rowing shell through the oarlock. Blade dynamics consist of both lift and drag mechanisms. The force on the oar handle is the result of a phased muscular activation of the rower. Oar handle force and movement are affected by the joint strength and torque-velocity characteristics of the rower. Maximising sustainable power requires a matching of the rigging setup and blade design to the rower's joint torque-velocity characteristics. Coordination and synchrony between rowers in a multiple rower shell affects overall system velocity. Force-time profiles should be better understood to identify specific components of a rower's biomechanics that can be modified to achieve greater force generation.
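
The review's central point, that blade force is the only propulsion and is opposed by velocity-dependent drag, can be written as a simple equation of motion. The toy simulation below illustrates that balance; all parameter values are invented and are not taken from the review.

```python
# Toy shell-velocity model: m*dv/dt = F_blade(t) - k*v^2, integrated with Euler steps.
# All parameters are illustrative, not taken from the review.

m = 100.0      # combined mass of shell and crew, kg (assumed)
k = 3.5        # lumped air + hydrodynamic drag coefficient (assumed)
dt = 0.01      # time step, s
v = 0.0        # shell velocity, m/s

def blade_force(t: float) -> float:
    """Crude phased stroke: propulsive force during the drive, none on recovery."""
    return 400.0 if (t % 2.0) < 0.8 else 0.0   # 0.8 s drive in a 2 s stroke cycle

t = 0.0
while t < 60.0:
    v += dt * (blade_force(t) - k * v * v) / m
    t += dt

print(f"Velocity after 60 s: {v:.2f} m/s")
```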

Journal ArticleDOI
TL;DR: The risks associated with minor, moderate, and major acute injuries and osteoarthritis in lower limb joints of professional footballers were found to be unacceptable when evaluated against work based risk criteria used by the Health and Safety Executive.
Abstract: Objectives: To show how epidemiological data can be presented and analysed in frequency based and risk based formats and how risk based information can simplify management decisions on injury prevention strategies in professional football. Methods: The club physiotherapists at four English professional football clubs prospectively recorded players’ injuries over the period November 1994 to May 1997. The nature, location, and mechanism of each injury and the specific numbers of days that players were unavailable to train or play as a result of injuries were recorded. The rates of injury were evaluated on a risk matrix using the number of days and the estimated costs of absence as measures of injury consequences. Results: There was a significant difference in the time lost through injury as a function of injury severity. Conclusions: The risks associated with minor, moderate, and major acute injuries and osteoarthritis in lower limb joints of professional footballers were found to be unacceptable when evaluated against work based risk criteria used by the Health and Safety Executive. All stakeholders within professional football were shown to have an important contribution to make in reducing the overall level of risk to players through the provision of risk prevention strategies.
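
The risk-based format described here rates each injury type by combining how often it occurs with the consequence of an occurrence (days or cost of absence). A minimal sketch of that kind of risk-matrix lookup follows; the band boundaries and labels are invented, and the Health and Safety Executive's actual criteria are not reproduced here.

```python
# Hypothetical risk matrix: likelihood band x consequence band -> risk level.
# Band boundaries and labels are invented for illustration only.

def likelihood_band(injuries_per_1000_hours: float) -> int:
    return 0 if injuries_per_1000_hours < 5 else 1 if injuries_per_1000_hours < 15 else 2

def consequence_band(days_absent_per_injury: float) -> int:
    return 0 if days_absent_per_injury < 7 else 1 if days_absent_per_injury < 28 else 2

RISK_MATRIX = [          # rows: likelihood, columns: consequence
    ["low",      "low",      "moderate"],
    ["low",      "moderate", "high"],
    ["moderate", "high",     "unacceptable"],
]

def risk_level(rate: float, days: float) -> str:
    return RISK_MATRIX[likelihood_band(rate)][consequence_band(days)]

print(risk_level(rate=18.0, days=30.0))   # "unacceptable" under these made-up bands
```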

Journal ArticleDOI
TL;DR: Fatigue and metabolite accumulation do not appear to be critical stimuli for strength gain, and resistance training can be effective without the severe discomfort and acute physical effort associated with fatiguing contractions.
Abstract: BACKGROUND: High resistance training enhances muscular strength, and recent work has suggested an important role for metabolite accumulation in this process. OBJECTIVE: To investigate the role of fatigue and metabolite accumulation in strength gains by comparing highly fatiguing and non-fatiguing isotonic training protocols. METHODS: Twenty three healthy adults (18-29 years of age; eight women) were assigned to either a high fatigue protocol (HF: four sets of 10 repetitions with 30 seconds rest between sets) to maximise metabolic stress or a low fatigue protocol (LF: 40 repetitions with 30 seconds between each repetition) to minimise changes. Subjects lifted on average 73% of their 1 repetition maximum through the full range of knee extension with both legs, three times a week. Quadriceps isometric strength of each leg was measured at a knee joint angle of 1.57 rad (90 degrees), and a Cybex 340 isokinetic dynamometer was used to measure the angle-torque and torque-velocity relations of the non-dominant leg. RESULTS: At the mid-point of the training, the HF group had 50% greater gains in isometric strength, although this was not significant (4.5 weeks: HF, 13.3 (4.4)%; LF, 8.9 (3.6)%). This rate of increase was not sustained by the HF group, and after nine weeks of training all the strength measurements showed similar improvements for both groups (isometric strength: HF, 18.2 (3.9)%; LF, 14.5 (4.0)%). The strength gains were limited to the longer muscle lengths despite training over the full range of movement. CONCLUSIONS: Fatigue and metabolite accumulation do not appear to be critical stimuli for strength gain, and resistance training can be effective without the severe discomfort and acute physical effort associated with fatiguing contractions.

Journal ArticleDOI
TL;DR: A significant circadian variation in the variables measured before exercise is suggested, without showing a significant effect on their acute responses to exercise.
Abstract: Objective: To examine whether time of day significantly affects salivary cortisol and IgA levels before and after submaximal swimming. Methods: Fourteen male competitive swimmers (mean (SD) age 18 (3.2) years) volunteered to participate in the study. In a fully randomised, cross over design, each subject performed 5 × 400 m front crawl at 85 (1.2)% of their seasonal best time (277 (16) seconds), with one minute rest between each 400 m, at 0600 and 1800 hours on two separate days. Timed, unstimulated saliva samples were collected before and after exercise. Saliva samples were analysed for cortisol and IgA by radioimmunoassay and single radial immunodiffusion respectively. Results: Significant time of day effects (am and pm respectively) were observed in IgA concentration (0.396 (0.179) v 0.322 (0.105) mg/ml, p<0.05) but, in comparison with values before exercise, exercise caused significant alterations in cortisol (p<0.05). However, most of the values of the salivary variables before exercise were significantly inversely related to their exercise induced response (p<0.05). Conclusion: These results suggest a significant circadian variation in the variables measured before exercise, without showing a significant effect on their acute responses to exercise.

Journal ArticleDOI
TL;DR: Mouthguards should be mandatory as an effective device for the prevention of dental and orofacial injuries, as well as reducing the incidence and severity of mTBI.
Abstract: The number of minor traumatic brain injuries (mTBI), or cerebral concussions, is increasing and cannot be eliminated by any kind of equipment. Prevention strategies, such as the introduction of “checking from behind” rules, have become effective in decreasing the number of severe spinal injuries. A new “head checking” rule should reduce mTBI in the same way in the following years. Mouthguards should be mandatory as an effective device for the prevention of dental and orofacial injuries, as well as reducing the incidence and severity of mTBI. A new internet database system, the International Sports Injury System (ISIS), should improve epidemiological analysis of head, face, and spinal injuries worldwide. ISIS should provide an internationally compatible system for continuous monitoring of risk factors, protective effects of equipment, and effects of changes in rules through the years.

Journal ArticleDOI
TL;DR: The changes in muscle metabolism produced by CM treatment indicate that CM may promote aerobic energy production, which may be the result of an enhanced malate supply activating ATP production from the tricarboxylic acid cycle through anaplerotic reactions.
Abstract: Background: Previous studies have shown an antiasthenic effect of citrulline/malate (CM) but the mechanism of action at the muscular level remains unknown. Objective: To investigate the effects of CM supplementation on muscle energetics. Methods: Eighteen men complaining of fatigue but with no documented disease were included in the study. A rest-exercise (finger flexions)-recovery protocol was performed twice before (D-7 and D0), three times during (D3, D8, D15), and once after (D22) 15 days of oral supplementation with 6 g/day CM. Metabolism of the flexor digitorum superficialis was analysed by 31P magnetic resonance spectroscopy at 4.7 T. Results: Metabolic variables measured twice before CM ingestion showed no differences, indicating good reproducibility of measurements and no learning effect from repeating the exercise protocol. CM ingestion resulted in a significant reduction in the sensation of fatigue, a 34% increase in the rate of oxidative ATP production during exercise, and a 20% increase in the rate of phosphocreatine recovery after exercise, indicating a larger contribution of oxidative ATP synthesis to energy production. Considering subjects individually and variables characterising aerobic function, extrema were measured after either eight or 15 days of treatment, indicating chronological heterogeneity of treatment induced changes. One way analysis of variance confirmed improved aerobic function, which may be the result of an enhanced malate supply activating ATP production from the tricarboxylic acid cycle through anaplerotic reactions. Conclusion: The changes in muscle metabolism produced by CM treatment indicate that CM may promote aerobic energy production.
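
The "rate of phosphocreatine recovery" reported here is conventionally obtained by fitting a mono-exponential curve to the post-exercise PCr signal, with the rate constant used as an index of oxidative capacity. The sketch below shows that standard fit on simulated data; it illustrates the usual approach rather than the exact procedure used in the study, and all values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def pcr_recovery(t, pcr_end, delta, k):
    """Mono-exponential PCr recovery: PCr(t) = PCr_end + delta * (1 - exp(-k*t))."""
    return pcr_end + delta * (1.0 - np.exp(-k * t))

# Simulated recovery data (arbitrary units); noise and true values are invented.
t = np.linspace(0, 300, 60)                       # seconds after end of exercise
true = pcr_recovery(t, pcr_end=20.0, delta=15.0, k=0.02)
signal = true + np.random.default_rng(1).normal(0, 0.3, t.size)

params, _ = curve_fit(pcr_recovery, t, signal, p0=[15.0, 10.0, 0.01])
pcr_end, delta, k = params
print(f"Recovery rate constant k = {k:.3f} /s (half-time {np.log(2)/k:.0f} s)")
```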

Journal ArticleDOI
TL;DR: Investigating perceptions of physical self in male weightlifters found that those with MD syndrome have poorer body image and are less happy with their bodies, and in addition to a desire for greater muscularity, they are very concerned not to gain fat.
Abstract: Recently more men have reported a desire for larger, more muscular bodies. Muscle dysmorphia (MD) is a new syndrome in which individuals (usually men), although highly muscular, have a pathological belief that they are of very small musculature. As more men are motivated to take up training with weights in order to develop greater musculature, more cases of MD are likely to be encountered. A greater understanding and awareness of the syndrome are therefore needed. Therefore the aim of this study was to investigate perceptions of physical self in male weightlifters, one group with MD (n = 24) and one without (n = 30). Between group comparisons were made using the multidimensional body-self relations questionnaire. The findings confirm the nature of the disorder in that those with MD syndrome have poorer body image and are less happy with their bodies. Moreover, in addition to a desire for greater muscularity, they are very concerned not to gain fat. The results also suggest that future research into perceptions of specific body parts and health is warranted.

Journal ArticleDOI
TL;DR: It seems that visual information is more important to the higher level judoists, and the level of competition influences the sensory channels involved in balance.
Abstract: The aim of this work was to study the posturokinetic capacities and use of visual information by judoists according to their level of competition. Twenty male judoists aged between 16 and 19 took part. They were separated into two groups: those that competed at regional level and those that competed at national and international level. Static balance was measured on a force platform. No difference was seen between the two groups. However, it seems that visual information is more important to the higher level judoists. Perhaps the level of competition influences the sensory channels involved in balance.

Journal ArticleDOI
TL;DR: Walking at moderate intensity (45% to 55% of VO2max), with a total weekly energy expenditure of 1000–1500 kcal, improves VO2max and body composition of previously sedentary, non-obese, postmenopausal women, and this dose of exercise apparently approaches the minimum effective dose.
Abstract: Background: The American College of Sports Medicine recommends 20–60 minutes of aerobic exercise three to five days a week at an intensity of 40/50–85% of maximal aerobic power (VO2MAX) reserve, expending a total of 700–2000 kcal (2.93–8.36 MJ) a week to improve aerobic power and body composition. Objective: To ascertain the minimum effective dose of exercise. Methods: Voluntary, healthy, non-obese, sedentary, postmenopausal women (n = 121), 48–63 years of age, were randomised to four low dose walking groups or a control group; 116 subjects completed the study. The exercise groups walked five days a week for 24 weeks with the following intensity (% of VO2MAX) and energy expenditure (kcal/week): group W1, 55%/1500 kcal; group W2, 45%/1500 kcal; group W3, 55%/1000 kcal; group W4, 45%/1000 kcal. VO2MAX was measured in a direct maximal treadmill test. Submaximal aerobic fitness was estimated as heart rates at submaximal work levels corresponding to 65% and 75% of the baseline VO2MAX. The body mass index (BMI) was calculated and percentage of body fat (F%) estimated from skinfolds. Results: The net change (the differences between changes in each exercise group and the control group) in VO2MAX was 2.9 ml/min/kg (95% confidence interval (CI) 1.5 to 4.2) in group W1, 2.6 ml/min/kg (95% CI 1.3 to 4.0) in group W2, 2.4 ml/min/kg (95% CI 0.9 to 3.8) in group W3, and 2.2 ml/min/kg (95% CI 0.8 to 3.5) in group W4. The heart rates in standard submaximal work decreased 4 to 8 beats/min in all the groups. There was no change in BMI, but the F% decreased by about 1% unit in all the groups. Conclusions: Walking (for 24 weeks) at moderate intensity 45% to 55% of VO2MAX, with a total weekly energy expenditure of 1000–1500 kcal, improves VO2MAX and body composition of previously sedentary, non-obese, postmenopausal women. This dose of exercise apparently approaches the minimum effective dose.
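
"Net change" here means the change in each walking group minus the change in the control group, i.e. a difference in differences. A minimal sketch of that calculation is shown below with invented pre/post group means; the figures are not the study's data.

```python
# Invented pre/post VO2max group means (ml/min/kg), for illustration only.
groups = {
    "W1 (55%, 1500 kcal)": (27.0, 30.5),
    "W2 (45%, 1500 kcal)": (27.4, 30.2),
    "control":             (27.2, 27.6),
}

control_change = groups["control"][1] - groups["control"][0]
for name, (pre, post) in groups.items():
    if name == "control":
        continue
    net = (post - pre) - control_change   # group change relative to control change
    print(f"{name}: net change = {net:.1f} ml/min/kg")
```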

Journal ArticleDOI
TL;DR: The use of a full face shield compared with a half face shield by intercollegiate ice hockey players significantly reduced the playing time lost because of concussion, suggesting that concussion severity may be reduced by the use of a full face shield.
Abstract: Objective: To identify specific risk factors for concussion severity among ice hockey players wearing full face shields compared with half face shields (visors). Methods: A prospective cohort study was conducted during one varsity hockey season (1997–1998) with 642 male ice hockey players (median age 22 years) from 22 teams participating in the Canadian Inter-University Athletics Union. Half of the teams wore full face shields, and half wore half shields (visors) for every practice and game throughout the season. Team therapists and doctors recorded on structured forms daily injury, participation, and information on face shield use for each athlete. The main outcome measure was any traumatic brain injury requiring assessment or treatment by a team therapist or doctor, categorised by time lost from subsequent participation and compared by type of face shield worn. Results: Players who wore half face shields missed significantly more practices and games per concussion (2.4 times) than players who wore full face shields (4.07 sessions (95% confidence interval (CI) 3.48 to 4.74) v 1.71 sessions (95% CI 1.32 to 2.18) respectively). Significantly more playing time was lost by players wearing half shields during practices and games, and this difference did not depend on whether the athletes were forwards or defence, rookies or veterans, or whether the concussions were new or recurrent. In addition, players who wore half face shields and no mouthguards at the time of concussion missed significantly more playing time (5.57 sessions per concussion; 95% CI 4.40 to 6.95) than players who wore half shields and mouthguards (2.76 sessions per concussion; 95% CI 2.14 to 3.55). Players who wore full face shields and mouthguards at the time of concussion lost no playing time compared with 1.80 sessions lost per concussion (95% CI 1.38 to 2.34) for players wearing full face shields and no mouthguards. Conclusions: The use of a full face shield compared with half face shield by intercollegiate ice hockey players significantly reduced the playing time lost because of concussion, suggesting that concussion severity may be reduced by the use of a full face shield.

Journal ArticleDOI
TL;DR: In this paper, a mixture of athletes, their coaches, and academics attending a conference (n = 85) was studied during their flights from the United Kingdom to Australia (two flights with a one hour stopover in Singapore), and for the first six days in Australia.
Abstract: Background: Travelling across multiple time zones disrupts normal circadian rhythms and induces “jet lag”. Possible effects of this on training and performance in athletes were concerns before the Sydney Olympic Games. Objective: To identify some determinants of jet lag and its symptoms. Methods: A mixture of athletes, their coaches, and academics attending a conference (n = 85) was studied during their flights from the United Kingdom to Australia (two flights with a one hour stopover in Singapore), and for the first six days in Australia. Subjects differed in age, sex, chronotype, flexibility of sleeping habits, feelings of languor, fitness, time of arrival in Australia, and whether or not they had previous experience of travel to Australia. These variables and whether the body clock adjusted to new local time by phase advance or delay were tested as predictors for jet lag and some of its symptoms by stepwise multiple regression analyses. Results: The amount of sleep in the first flight was significantly greater in those who had left the United Kingdom in the evening than the morning (medians of 5.5 hours and 1.5 hours respectively; p = 0.0002, Mann-Whitney), whereas there was no significant difference on the second flight (2.5 hours v 2.8 hours; p = 0.72). Only the severity of jet lag and assessments of sleep and fatigue were commonly predicted significantly. Conclusions: These results indicate the importance of an appropriate choice of itinerary and lifestyle for reducing the negative effects of jet lag in athletes and others who wish to perform optimally in the new time zone.

Journal ArticleDOI
TL;DR: Growth factors may be useful in tendon healing, possibly introduced using gene therapy; tendon healing is classically considered to occur through extrinsic and intrinsic mechanisms.
Abstract: Growth factors may be useful in tendon healing, possibly introduced using gene therapy. Tendon disorders are a major problem in sports and occupational medicine. Tendons have the highest tensile strength of all connective tissue because of a high proportion of collagen in the fibres and their closely packed parallel arrangement in the direction of force. The individual collagen fibrils are arranged into fascicles which contain blood vessels and nerve fibres. Specialised fibroblasts, tenocytes, lie within these fascicles and exhibit high structural organisation.1 Histologically, they appear as star shaped cells in cross sections. In longitudinal sections, they are arranged in rows following the direction of the tendon fibres. This specialised arrangement is related to their function, as tenocytes synthesise both fibrillar and non-fibrillar components of the extracellular matrix, and are able to reabsorb collagen fibrils.2 The fascicles themselves are enclosed by epitenon. This is surrounded by the paratenon, and the potential space between them is filled by a thin, lubricating film of fluid which allows gliding of the tendon during motion. Tendon healing is classically considered to occur through extrinsic and intrinsic healing. The intrinsic model produces obliteration of the tendon and its tendon sheath. Healing of the defect involves an exudative and a formative phase which, on the whole, are very similar to those associated with wound healing.3 Extrinsic healing occurs through the chemotaxis of the specialised fibroblasts into the defect from the ends of the tendon sheath.4 The process can be divided into three phases: inflammation, repair, and organisation or remodelling. In the inflammatory phase, occurring three to seven days after the injury, cells migrate from the extrinsic peritendinous tissue such as the tendon sheath, periosteum, subcutaneous tissue, and fascicles, as well as from the epitenon and endotenon.5 Initially, the extrinsic response far …

Journal ArticleDOI
TL;DR: Surgical treatment of chronic Achilles tendinopathy gives good and acceptable short term results; a lower complication rate and a trend to better recovery were observed in patients with peritendinous adhesions only than in those with peritendinous adhesions combined with an intratendinous lesion.
Abstract: Objective: To prospectively assess the early results of surgical treatment of chronic Achilles tendinopathy. Methods: This seven month prospective follow up study assessed the short term results of surgical treatment of chronic Achilles tendinopathy and compared the subjective and functional outcome of patients with Achilles tendinopathy without a local intratendinous lesion (group A) with that of similar patients with such a lesion (group B). Forty two of the initial 50 patients were examined before surgery and after the seven month follow up. Evaluation included an interview, subjective evaluation, clinical tests, and a performance test. Results: At the follow up, physical activity was fully restored in 28 of the 42 patients (67%), and 35 patients (83%) were asymptomatic or had only mild pain during strenuous exercise. In clinical tests, significant improvements were observed in climbing up and down stairs and the rising on the toes test. Surgical treatment also seemed to be successful from the total test score, which was excellent or good in 35 patients, compared with before surgery when it was excellent or good in one patient only. Patients in group A fared better than those in group B, whether evaluated by recovery of physical activity after surgery (88% v 54%) or the complication rate (6% v 27%). Conclusions: Surgical treatment of chronic Achilles tendinopathy gives good and acceptable short term results. A lower complication rate and a trend to better recovery were observed in patients with peritendinous adhesions only than in those with peritendinous adhesions combined with an intratendinous lesion.

Journal ArticleDOI
TL;DR: The changes in sTfr and the variables of iron status can be mainly attributed to exercise induced changes in blood and plasma volume; taking these limitations into account, sTfr can be recommended as a marker of iron deficiency in athletes.
Abstract: Background: Soluble transferrin receptor (sTfr) is a new marker of iron status and erythropoietic activity. It has been included in multivariable blood testing models for the detection of performance enhancing erythropoietin misuse in sport. Objective: To evaluate the effect of different types and volumes of physical activity on sTfr concentration, variables of iron status (ferritin, transferrin, iron, and protein), and haematological indices. Methods: Thirty nine subjects were divided into three groups: 1, untrained (n = 12); 2, moderately trained (n = 14); 3, highly trained (n = 13, seven men, six women). Groups 1 and 2 carried out two exercise tests: an incremental running test until exhaustion (test A) and a 45 minute constant speed running test at 70% VO2max (test B). Group 3 performed three days (women) or four days (men) of prolonged aerobic cycling exercise. The above variables together with haemoglobin and packed cell volume were analysed in venous blood samples before and after exercise. Changes in blood and plasma volume were estimated. Results: sTfr levels were slightly increased in trained and untrained subjects immediately after test A. Test B and aerobic exercise had no significant effect on sTfr. Ferritin levels were increased after the laboratory tests for trained and untrained subjects and after prolonged aerobic exercise in male cyclists. Transferrin was increased significantly in trained and untrained subjects after both laboratory tests, but remained unchanged after prolonged exercise. Plasma and blood volumes were decreased after the laboratory tests but increased after aerobic exercise. No differences in the variables were observed between trained and untrained subjects with respect to response to exercise. Conclusion: The changes in sTfr and the variables of iron status can be mainly attributed to exercise induced changes in volume. Taking these limitations into account, sTfr can be recommended as a marker of iron deficiency in athletes.
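
The blood and plasma volume changes mentioned here are conventionally estimated from pre- and post-exercise haemoglobin and packed cell volume using the Dill and Costill (1974) approach. A sketch of that calculation follows; the values are invented, and the study may have used a variant of this method.

```python
def dill_costill(hb_pre, hb_post, hct_pre, hct_post):
    """Percentage changes in blood and plasma volume from haemoglobin (g/dl) and
    packed cell volume (fraction), after Dill and Costill (1974)."""
    bv_pre = 100.0                           # pre-exercise blood volume set to 100
    bv_post = bv_pre * hb_pre / hb_post
    pv_pre = bv_pre * (1.0 - hct_pre)
    pv_post = bv_post * (1.0 - hct_post)
    d_bv = 100.0 * (bv_post - bv_pre) / bv_pre
    d_pv = 100.0 * (pv_post - pv_pre) / pv_pre
    return d_bv, d_pv

# Invented pre/post values illustrating haemoconcentration after a hard run.
print(dill_costill(hb_pre=15.0, hb_post=15.8, hct_pre=0.44, hct_post=0.46))
```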

Journal ArticleDOI
TL;DR: The overall rate and pattern of injury are similar to those reported previously in comparable studies, and several factors are associated with an increased risk of injury and should be targeted in future injury prevention campaigns.
Abstract: Objectives: To examine the incidence and patterns of snow sports injuries at the three largest commercial ski areas in Scotland and to identify factors associated with injury risk. Methods: A prospective case-control study of all injured people at Cairngorm, Glenshee, and Nevis Range ski areas during the 1999–2000 winter season. Personal details, snow sports related variables, diagnosis, and treatment were recorded. Control data were collected at random from uninjured people at all three areas. Random counts were performed to analyse the composition of the on slope population. Results: A total of 732 injuries were recorded in 674 people. Control data were collected from 336 people. The injury rate for the study was 3.7 injuries per 1000 skier days. Alpine skiers comprised 67% of the on slope population, snowboarders 26%, skiboarders 4%, and telemark skiers 2%. Lower limb injuries and sprains were the commonest injuries in alpine skiers and skiboarders. Snowboarders sustained more injuries to the upper limb and axial areas. Skiboarders and snowboarders had a higher incidence of fractures. After adjustment for other variables, three factors were all independently associated with injury: snowboarding (odds ratio (OR) 4.07, 95% confidence interval (CI) 1.65 to 10.08), alpine skiing (OR 3.82, CI 1.6 to 9.13), and age <16 years (OR 1.9, CI 1.14 to 3.17). More than five days of experience in the current season and at least one week of experience in total had a protective effect against injury. Conclusions: Despite a change in the composition of the alpine population at Scottish ski areas, the overall rate and pattern of injury are similar to those reported previously in comparable studies. Several factors are associated with an increased risk of injury and should be targeted in future injury prevention campaigns.
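
The injury associations here are reported as odds ratios, the natural measure for a case-control design. A minimal sketch of an unadjusted odds ratio and its 95% confidence interval from a 2×2 table is shown below; the counts are invented, and the published odds ratios were additionally adjusted for other variables.

```python
import math

# Hypothetical 2x2 table: cases = injured, controls = uninjured; exposure = snowboarding.
a, b = 200, 474     # cases: exposed, unexposed (invented counts)
c, d = 50, 286      # controls: exposed, unexposed (invented counts)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)            # Woolf's method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```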