
Showing papers in "Optometry and Vision Science in 2011"


Journal ArticleDOI
TL;DR: Font size and viewing distance were measured while subjects used handheld electronic devices; the closer distances adopted while viewing material on smart phones should be considered when examining patients and prescribing refractive corrections for use at near, as well as when treating patients presenting with asthenopia associated with near work.
Abstract: Purpose. The use of handheld smart phones for written communication is becoming ubiquitous in modern society. The relatively small screens found in these devices may necessitate close working distances and small text sizes, which can increase the demands placed on accommodation and vergence. …

203 citations


Journal ArticleDOI
TL;DR: The overall pattern of results suggests that optical treatment strategies for myopia that take into account the effects of peripheral vision are likely to be more successful than strategies that effectively manipulate only central vision.
Abstract: It is well established that refractive development is regulated by visual feedback. However, most optical treatment strategies designed to reduce myopia progression have not produced the desired results, primarily because some of our assumptions concerning the operating characteristics of the vision-dependent mechanisms that regulate refractive development have been incorrect. In particular, because of the prominence of central vision in primates, it has generally been assumed that signals from the fovea determine the effects of vision on refractive development. However, experiments in laboratory animals demonstrate that ocular growth and emmetropization are mediated by local retinal mechanisms and that foveal vision is not essential for many vision-dependent aspects of refractive development. In contrast, the peripheral retina, in isolation, can effectively regulate emmetropization and mediate many of the effects of vision on the eye’s refractive status. Moreover, when there are conflicting visual signals between the fovea and the periphery, peripheral vision can dominate refractive development. The overall pattern of results suggests that optical treatment strategies for myopia that take into account the effects of peripheral vision are likely to be more successful than strategies that effectively manipulate only central vision. (Optom Vis Sci 2011;88:1029–1044)

197 citations


Journal ArticleDOI
TL;DR: OK significantly reduced myopia in the central 20° VF in myopic children, converting relative peripheral hyperopia measured at baseline to relative peripheral myopia, which may provide a potential mechanism for myopia control.
Abstract: Purpose. To investigate changes in peripheral refraction after orthokeratology (OK) and rigid gas-permeable (GP) lens wear in progressing myopic children and to compare these peripheral defocus changes with reported changes in adults wearing OK. Methods. Sixteen myopic children were …

189 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined the visual predictors of falls and injurious falls among older adults with glaucoma and found that more extensive field loss in the inferior region was associated with higher rate of falls (RR 1.57, 95%CI 1.06, 2.32).
Abstract: PURPOSE: To examine the visual predictors of falls and injurious falls among older adults with glaucoma. METHODS: Prospective falls data were collected for 71 community-dwelling adults with primary open-angle glaucoma, mean age 73.9 ± 5.7 years, for one year using monthly falls diaries. Baseline assessment of central visual function included high-contrast visual acuity and Pelli-Robson contrast sensitivity. Binocular integrated visual fields were derived from monocular Humphrey Field Analyser plots. Rate ratios (RR) for falls and injurious falls with 95% confidence intervals (CIs) were based on negative binomial regression models. RESULTS: During the one year follow-up, 31 (44%) participants experienced at least one fall and 22 (31%) experienced falls that resulted in an injury. Greater visual impairment was associated with increased falls rate, independent of age and gender. In a multivariate model, more extensive field loss in the inferior region was associated with higher rate of falls (RR 1.57, 95%CI 1.06, 2.32) and falls with injury (RR 1.80, 95%CI 1.12, 2.98), adjusted for all other vision measures and potential confounding factors. Visual acuity, contrast sensitivity, and superior field loss were not associated with the rate of falls; topical beta-blocker use was also not associated with increased falls risk. CONCLUSIONS: Falls are common among older adults with glaucoma and occur more frequently in those with greater visual impairment, particularly in the inferior field region. This finding highlights the importance of the inferior visual field region in falls risk and assists in identifying older adults with glaucoma at risk of future falls, for whom potential interventions should be targeted. KEY WORDS: glaucoma, visual field, visual impairment, falls, injury

120 citations


Journal ArticleDOI
TL;DR: Longitudinal data suggest that time outdoors may be protective against myopia onset; after adjusting for differences in dietary intake, myopes appear to have lower average blood levels of vitamin D than non-myopes, consistent with this hypothesis.
Abstract: Debates about the causes of myopia have always been classic nature vs. nurture discussions fueled by ample evidence for each side. On the side of nature, myopic parents tend to have myopic children more often than non-myopic parents,1–2 heritabilities are high, on the order of 0.8 to nearly 1.0,3–4 and recent molecular studies have identified numerous genetic loci associated or linked with myopia.5–6 On the side of nurture, excessive near work has been a putative risk factor for myopia for at least 400 years. There is evidence in support of the stereotype; myopic children spend more time in reading and other close work than non-myopic children.7–10 However, recent large, longitudinal studies have shown that the amount of reading or other close work does not increase the risk of becoming myopic.2, 11 The additional close work that myopic children engage in did not precede, and therefore likely did not cause, their myopia. Time spent outdoors has recently become a variable of interest in myopia research. Like near work, many cross-sectional studies find an association between myopia and time outdoors; myopic children spend less time outdoors than non-myopic children.10, 12–14 A recent study indicates that there may be seasonal variation in this effect with smaller differences between refractive error groups in the summer compared to during the school year.15 Unlike near work, this cross-sectional association has been borne out in a longitudinal study, suggesting that more time outdoors might actually be protective and reduce the risk of the onset of myopia.2 The magnitude of this effect may be substantial. For example, the probability of developing myopia by the eighth grade for a third grade child who has two myopic parents and engaged in 0–5 hours per week of sports/outdoor activity was estimated at about 0.60. 
This probability was reduced to about 0.20 if the third grade child with two myopic parents engaged in over 14 hours per week of sports/outdoor activity.2 Physical activity by itself, whether indoors or outdoors, does not carry the same protective effect as simply spending time outdoors.13–14 Therefore, the relevant protective factor appears to be merely being outside rather than some specific activity such as exercise. One might argue that the effects of time outdoors are just the effects of near work in reverse, that time spent outdoors is just time spent not reading. Numerous studies have investigated this question and none has found evidence of this tradeoff behavior; children’s time outdoors is not negatively correlated with reading or other close work. Correlations are either not significant or slightly positive, with children who spend more time outdoors also tending to spend more time reading.2, 13–14 Several theories have been proposed as the physiological basis of a protective effect on myopia of time spent outdoors. Among these is a better-quality retinal image during distance fixation outdoors.2, 13 Ocular growth is sensitive to retinal defocus in animal models of myopia across numerous species.16–20 A smaller pupil size and the absence of accommodative errors may contribute to an improved retinal image during distance viewing. In animal models, the absence of defocus can have a powerful inhibitory effect on growth toward excessive myopic ocular lengths.19, 21 Alternatively, the greater amount of light outdoors may alter retinal levels of dopamine, also shown to inhibit myopic ocular growth.22–23 Another possibility is that the protective effect of time outdoors is from higher levels of cutaneously-derived vitamin D. Several lines of evidence are consistent with this hypothesis.
Chief among these is the finding that time outdoors rather than any specific physical activity carries the protective effect.13–14 There are also seasonal effects on eye growth, resulting in a faster rate of myopia progression in the autumn and winter when there are fewer hours of daylight and a slower rate in the sunnier spring and summer months.24 The purpose of this study was to evaluate whether myopic and non-myopic individuals differ with respect to circulating levels of vitamin D, with appropriate adjustment for activities or dietary factors that might affect vitamin D.

108 citations


Journal ArticleDOI
TL;DR: Although intraocular pressure elevation (or its absence) can no longer be counted on for diagnostic purposes, the role of intraocular pressure in the management of glaucomatous optic neuropathy remains critical.
Abstract: Doctors have not always associated elevated intraocular pressure with the vision loss from glaucoma. Although several individuals appear to have noted firmness of the eye in this condition as far back as the 10th century, elevated intraocular pressure was not routinely assessed until the …

89 citations


Journal ArticleDOI
TL;DR: Vision therapy/orthoptics is effective in improving accommodative amplitude and accommodative facility in school-aged children with symptomatic CI and co-existing accommodative dysfunction.
Abstract: Accommodative disorders are commonly encountered in pediatric eye care practices,1,2 and the two most common accommodative disorders are accommodative insufficiency and accommodative infacility.3–5 Accommodative insufficiency is a condition in which the amplitude of accommodation is less than expected for a nonpresbyopic patient’s age,5 whereas accommodative infacility is a condition in which the latency and speed of the accommodative response are abnormal compared to normative clinical data.5 Associated signs and symptoms are usually related to reading and other close work activities and include: blurred vision at near, intermittent blurred vision when looking up from near work, headaches, watering or burning of the eyes, tired eyes, loss of concentration, and avoidance of near activities.6–9 The most commonly prescribed treatments for accommodative dysfunction are a plus lens addition at near or vision therapy/orthoptics.4,5,10–13 While plus lenses worn for near activities may improve symptoms for some patients, vision therapy/orthoptics has the potential to eliminate the accommodative dysfunction rather than solely providing symptomatic relief.5 Studies have shown that voluntary control of accommodation can be learned and transferred to a variety of conditions,14,15 and objective improvements in the dynamics16 and accuracy of accommodation following vision therapy have been documented.17,18 While clinical studies have reported success rates for the treatment of accommodative dysfunction as high as 96%,10–13,19,20 methodological limitations have prevented definitive conclusions from being drawn. A more rigorous scientific base, ideally a randomized controlled trial, is needed to evaluate the effectiveness of vision therapy/orthoptics for the treatment of accommodative dysfunction in children.
The Convergence Insufficiency Treatment Trial (CITT),21,22 a large-scale, randomized clinical trial evaluating vision therapy/orthoptics modalities for children with symptomatic convergence insufficiency, enrolled 164 children with concomitant deficiencies in accommodative function. Accommodative amplitude and facility measures were prospectively collected using standardized methods. These data provide an opportunity to determine the effectiveness of vision therapy/orthoptics for accommodative dysfunction. Herein, we report the effectiveness of office-based vergence/accommodative therapy (OBVAT), home-based computer vergence/accommodative therapy plus pencil push-ups (HBCVAT+), home-based pencil push-up therapy (HBPP), and office-based placebo therapy (OBPT or placebo therapy) for improving accommodative amplitude and accommodative facility in school-aged children with symptomatic convergence insufficiency and accommodative dysfunction.

84 citations


Journal ArticleDOI
TL;DR: A compact and convenient clinical apparatus for the measurement of suppression, based on a previously reported laboratory-based approach, is described; it provides an ideal platform for measuring the extent to which observers with suppression combine information between their eyes in the way binocularly normal people do.
Abstract: Purpose. We describe a compact and convenient clinical apparatus for the measurement of suppression based on a previously reported laboratory-based approach. In addition, we report and validate a novel, rapid psychophysical method for measuring suppression using this apparatus, which makes the …

79 citations


Journal ArticleDOI
TL;DR: Although most patients consider themselves to be complying with standard practitioner guidelines for lens wear and care practices, essentially all contact lens wearing patients exhibit behavioral non-compliance with resulting increased risk for significant complications.
Abstract: Purpose To compare the effects of existing patient awareness of lens-related complications and underlying risk factors on actual patient behavior during contact lens wear and care practices in two different clinical study populations. Methods Established contact lens wearers (n = 281) completed an anonymous written questionnaire on presenting to their habitual eye care practitioner in the Dallas-Fort Worth metroplex. Data were analyzed and compared against a second study population, which comprised established contact lens wearers (n = 152) who were sequentially evaluated after their routine contact lens examination at the University of Texas Southwestern Medical Center at Dallas, TX (UTSW). All patients were questioned regarding their lens care practices and knowledge of complications and risk factors associated with contact lens wear. Results Fifty-eight percent of patients in the general community could identify by name a complication associated with lens wear compared with 91% within the medical center. The most frequent complications reported were related to comfort and handling (72%, Dallas-Fort Worth) and infection (47%, UTSW). The majority of patients could correctly identify risk factors associated with lens-related complications; awareness of topping-off solutions, tap water exposure, and hygiene varied between groups. Overall, 85% of patients perceived themselves as compliant with their lens wear and care practices. Using a standard scoring model to determine actual compliance, 2% of patients demonstrated good compliance; however, only 0.4% of patients were fully compliant with contact lens wear and care practices. Conclusions The data reveal some study bias in complication and risk awareness between populations; however, despite this limitation, a significant proportion of patients exhibited actual non-compliant behavior despite acknowledged awareness of risk.
Although most patients consider themselves to be complying with standard practitioner guidelines for lens wear and care practices, essentially all contact lens wearing patients exhibit behavioral non-compliance with resulting increased risk for significant complications.

79 citations


Journal ArticleDOI
TL;DR: Kc pathology, at least initially, has a distinct anterior focus involving the epithelium, ALL, and anterior stroma; the activity of the hitherto unreported recruited stromal cells may be to break down and remove ALL and anterior stromal lamellae, leading to the overall thinning that accompanies this disease.
Abstract: Keratoconus (Kc) is a non-inflammatory condition characterized by corneal ectasia, steepening, thinning and scarring.1–3 Reports in the literature have described a number of pathological changes within the Kc cornea. Featured prominently in these reports are observations of “breaks/interruptions/fragmentations/dehiscences/ruptures” of the anterior limiting lamina (ALL).4–19 This observation appears to describe a ‘crack’ or brief discontinuity of the ALL. Although most sources agree that Kc is a disease of the stromal layer, Chi et al. (1956)5 presented some evidence of an epithelial contribution to this ocular pathology. Teng (1963) even postulated that the origin of this disease was to be found in the epithelium,20 and this hypothesis has since received further scientific support.6,7,12,21–23 Stromal changes in the Kc cornea have been noted in the literature but in limited detail. Leibowitz and Morello (1998) stated that “collagen fibers are said to be normal, with thinning attributable to a decrease in the number of collagen lamellae”10. The loss of lamellae as an explanation for corneal thinning is also proposed by other sources.14,24–26 In addition, according to Morishige, the Kc cornea showed “less lamellar interweaving and marked reduction or loss of [anterior] inserting lamellae”.25 Most corneal alterations reported appear to have an anterior location,1,4,14,17–21 which seems to be confirmed by in-vivo corneal confocal microscopy. However, histopathological changes can occur in the posterior cornea in this condition as a result of hydrops.
Hydrops, however, is viewed as an uncommon complication in this disease.27 Rupture of the posterior limiting lamina (PLL), leading to a discontinuity of endothelial cell coverage of the posterior corneal surface, allows aqueous free entry into the corneal stroma.14,24,27 Although the induced corneal edema mostly clears up, resultant scarring may prevent patients with a history of hydrops from regaining acceptable visual acuity.27 The current understanding of the pathophysiology of Kc is incomplete, and eyecare practitioners managing Kc patients would greatly benefit from improvement of our knowledge base. Detailed information on tissue changes within the cornea may provide indications of the ability of the unhealthy cornea to handle the presence of a contact lens or where lens bearing is undesirable. More precise knowledge of the location and focus of the disease may better direct the surgeon to the optimal choice of procedure, which, in transplant surgery, may be lamellar or penetrating keratoplasty. The literature contains only a modest number of histopathological reports on Kc involving tissue samples from a larger case series. However, with few exceptions, they lack morphometry and, in some cases, are limited to light microscopy and a protocol using a high osmolality fixative that further distorts an already distorted cornea.12,23,28–30 The purpose of this study was to systematically investigate and quantify pertinent histopathological changes in a series of Kc corneas, at both the apex of the cone and the adjacent mid-peripheral corneal region, using an established fixative with physiological osmolality.

77 citations


Journal ArticleDOI
TL;DR: Overall, daily dietary supplementation with goji berry for 90 days increases plasma zeaxanthin and antioxidant levels and protects against hypopigmentation and soft drusen accumulation in the macula of elderly subjects.
Abstract: Purpose. Goji berry (Lycium barbarum L.) is purported to benefit vision because of its high antioxidant (especially zeaxanthin) content, although this effect has not been demonstrated in high-quality human studies. The purpose of this study was to evaluate the effects of daily supplementation …

Journal ArticleDOI
TL;DR: Although the majority of surgeries used nowadays were introduced in the 1960s, their roots can be traced to the work of surgeons in the 19th century, and the fundamentals of different glaucoma procedures are described.
Abstract: Many new surgeries have been devised since 1856, when von Graefe discovered that iridectomy is an effective surgical method for acute glaucoma treatment. Two years later, De Wecker presented sclerotomy as a procedure for chronic glaucoma. In 1900, internal filtration (cyclodialysis) was developed. In 1932, ciliodestruction was suggested. The four approaches, relief of pupillary block, external filtration, internal filtration, and ciliodestruction, are still the basic techniques of glaucoma surgeries over 100 years later. There have been two basic approaches to lowering eye pressure surgically: increasing outflow and decreasing inflow of aqueous humor. Although the majority of surgeries used nowadays were introduced in the 1960s, their roots can be traced to the work of surgeons in the 19th century. Trabeculectomy, in use since the mid-1960s, is the most effective glaucoma surgery in terms of intraocular pressure reduction but carries its own limitations. Non-penetrating glaucoma surgeries emerged at the same time trabeculectomy was presented, but they are not used as commonly as trabeculectomy. Molteno introduced the first effective shunt, and others followed. Since 1995, the majority of new surgeries have consisted of new implantable devices including SOLX, iStent, and Ex-PRESS shunt. This article will review the history of glaucoma surgery and describe the fundamentals of different glaucoma procedures.

Journal ArticleDOI
TL;DR: The significant correlation between temporal RPEL and central myopic shift, with the latter being independent of baseline refraction, supports the hypothesis that eye shape at the posterior pole is one of the factors influencing visually guided axial eye growth, possibly through associated peripheral defocus.
Abstract: Purpose. Retinal steepness at the posterior pole was shown to be associated with peripheral refraction, and there exists strong evidence that peripheral refraction influences central refractive development. The purpose of this study was to investigate whether retinal steepness is associated with central myopic shift in children. Methods. Central refraction was measured in OD of 140 children aged 7 to 11 years as central sphere equivalent refraction (CSER) and central sphere refraction at baseline and after ∼30 months. For the estimation of retinal steepness, relative peripheral eye length (RPEL) was determined in OD by measuring length axially with a custom-made optical low coherence interferometer and subtracting it from eye length measured peripherally at 20° in the nasal, inferior, temporal, and superior fields. Association between baseline RPEL at the various locations and shift in central refraction was evaluated with a Structural Equation Modeling analysis. Results. CSER at baseline measured +0.05 ± 0.54 diopters (D) (mean ± SD). Shift in CSER, as standardized over a 30-month interval to account for individual differences in the follow-up period, was −0.21 ± 0.56 D. A weak, but significant, correlation was observed between baseline RPEL in the temporal retina and myopic shift in CSER (r = 0.207, p = 0.049), steeper retinas displaying greater myopic shifts. Myopic shift was correlated with axial elongation but not correlated with baseline refraction. Analyses were performed for both CSER and central sphere refraction with near-identical results. RPEL did not change significantly. Conclusions. The significant correlation between temporal RPEL and central myopic shift, with the latter being independent of baseline refraction, supports the hypothesis that eye shape at the posterior pole is one of the factors influencing visually guided axial eye growth, possibly through associated peripheral defocus. 
Its predictive value for refractive development and limitation to the temporal retina require further investigation.
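The RPEL measure in this abstract is a simple difference: eye length at 20° eccentricity minus the axial (central) eye length, computed separately for each visual field. A minimal sketch of that arithmetic, with entirely hypothetical lengths chosen only to illustrate the sign convention:

```python
# Relative peripheral eye length (RPEL), per the definition in the abstract:
# eye length measured at 20 degrees eccentricity minus axial eye length.
# All numeric values below are hypothetical and for illustration only.

def rpel(peripheral_mm: float, axial_mm: float) -> float:
    """Negative RPEL -> eye is shorter at the periphery (steeper retina)."""
    return peripheral_mm - axial_mm

axial = 24.00  # hypothetical axial eye length, mm
peripheral = {"nasal": 23.85, "inferior": 23.90,
              "temporal": 23.70, "superior": 23.95}  # hypothetical, mm

for field, length in peripheral.items():
    print(f"{field:8s} RPEL = {rpel(length, axial):+.2f} mm")
```

Under this convention, the steeper (more negative) temporal RPEL is the quantity the study correlates with central myopic shift.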

Journal ArticleDOI
TL;DR: The history of glaucoma pharmacology begins in 1862 with the isolation of physostigmine from the calabar bean, and during the 20th century, drug discovery and development accelerated, with the introduction of carbonic anhydrase inhibitors, beta blockers, and prostaglandin analogs.
Abstract: The history of glaucoma pharmacology begins in 1862 with the isolation of physostigmine from the calabar bean. The discovery of epinephrine's intraocular pressure lowering capacity came along some 40 years later. During the 20th century, drug discovery and development accelerated, with the introduction of carbonic anhydrase inhibitors, beta blockers, and prostaglandin analogs. This survey of the history of glaucoma medications reviews some of the pivotal stories behind the development of the drugs that we use daily to manage our patients with glaucoma. In addition, some unmet needs that persist in glaucoma pharmacology are discussed.

Journal ArticleDOI
TL;DR: It is proposed that BDNF in the serum might be a useful biochemical marker for early detection of POAG and a reliable, time-efficient, and cost-effective method for diagnosis, screening, and assessing the progression of POAG.
Abstract: PURPOSE To introduce a novel biomarker for screening of primary open-angle glaucoma (POAG) by detecting and measuring brain-derived neurotrophic factor (BDNF) in the serum of normal subjects and patients with early stage of glaucoma. METHODS Twenty-five glaucoma patients as the case group and 25 age- and sex-matched normal persons as the control group were tested. The control group comprised 19 men and 6 women, with the mean age of 59.32 ± 11.8 years and without any apparent ocular or systemic diseases. The case group comprised 20 men and 5 women, with the mean age of 59.64 ± 11.56 years, who were assessed by routinely performed clinical and paraclinical investigations. BDNF levels in serum were determined by enzyme-linked immunosorbent assay using monoclonal antibodies specific for BDNF. RESULTS The mean BDNF level in the serum was 27.16 ± 5.53 ng/mL in the control subjects and 18.42 ± 4.05 ng/mL in the subjects with early stage glaucoma. A statistically significant difference was evident between the two groups (p < 0.05). Lower serum BDNF concentrations in glaucoma patients showed a significant negative correlation with pattern standard deviation. CONCLUSIONS We conclude that BDNF in the serum might be a useful biochemical marker for early detection of POAG. We also propose that this might be a reliable, time efficient, and cost-effective method for diagnosis, screening, and assessing the progression of POAG. However, more studies and trials are needed to investigate these factors in greater detail.
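The abstract reports only that the group difference is significant, but the published means, SDs, and sample sizes are enough to recompute a test statistic. As a plausibility check (the paper's own statistical test may differ), a Welch's t statistic from the summary statistics:

```python
import math

def welch_t(m1: float, s1: float, n1: int,
            m2: float, s2: float, n2: int) -> float:
    """Welch's t statistic computed from summary statistics
    (group means, standard deviations, and sample sizes)."""
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Published summary stats: controls 27.16 +/- 5.53 ng/mL (n = 25),
# early glaucoma 18.42 +/- 4.05 ng/mL (n = 25).
t = welch_t(27.16, 5.53, 25, 18.42, 4.05, 25)
print(f"Welch t = {t:.2f}")  # a large t, consistent with the reported significance
```

With roughly 44 degrees of freedom, a t this large corresponds to p far below 0.05, in line with the abstract's claim.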

Journal ArticleDOI
TL;DR: When human segmenters are trained, the within- and between-segmenter reliability of manual border segmentation is quite good, suggesting that manual segmentation provides a reliable measure of the thickness of layers typically measured in studies of glaucoma.
Abstract: For more than 15 years, the thickness of the human retinal nerve fiber layer (RNFL) has been routinely measured with time-domain optical coherence tomography (OCT).1,2 With the newer frequency-domain (fd) OCT, other retinal layers can be easily discerned and measured as well. Studies of glaucoma typically focus on layers of the inner retina—the RNFL, the retinal ganglion cell (RGC) layer, the combined thickness of the RNFL, RGC and inner plexiform layers (IPL)—and, in some cases, total retinal thickness as well. A variety of computer algorithms have been developed for segmenting two or more of these layers. Some of these algorithms are commercially available, included with particular fdOCT machines, whereas others are reserved for the use of individual research groups. However, different algorithms can produce different results. For example, we provided evidence that segmentation algorithms, rather than hardware, accounted for differences between the RNFL thickness measured with an fdOCT machine vs. that measured with a time-domain OCT machine.3 Because there is no generally available and accepted algorithm for segmenting different retinal layers, computer-aided manual segmentation procedures have been used by a few groups, e.g., as shown in Refs. 4–9. For example, we have used a computer-aided manual procedure to measure the thickness of the RGC plus IPL in patients with glaucoma.8 In addition to the obvious need to assess the reproducibility of these procedures, there are three other reasons to be concerned with validating manual procedures. First, there is no “gold standard” for assessing the results of segmentation algorithms; thus, it is hard to compare the relative performance of different algorithms. Manual segmentation provides a possible vehicle for validation.5,6 Although we do not mean to imply that the visually determined borders are the “gold standard,” it is true that many errors of automated algorithms can be detected visually.
Second, a systematic attempt to visualize borders supplies information about the problems automated algorithms might confront, as will be illustrated below. Third, even if automated procedures become reliable enough for general use, it is very likely that they will include an option for the user to “correct” the segmented borders. In fact, both commercial and non-commercial programs presently have this capability. This also raises the question of the reliability of manual segmentations. That is, how consistent are different operators in how they “correct” the segmentation? The purpose of this study was to assess the within- and between-segmenter agreement of a computer-aided manual procedure for segmenting fdOCT scans. The procedures here were designed to study the effects of glaucoma. Thus, we concentrate on the layers that were most vulnerable to this disease. We trained four individuals, whom we call “segmenters,” to mark the borders between these layers in fdOCT line scans of the horizontal meridian. After training, we asked: How good is the within-segmenter reliability (i.e., repeatability)? How good is the agreement between segmenters? And what have we learned about the problems confronting computerized algorithms?
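Between-segmenter agreement of this kind ultimately reduces to comparing the border positions two operators marked at the same scan locations. A minimal, hypothetical sketch (the study's actual agreement metrics are not specified in this excerpt, and the positions below are invented):

```python
# Hypothetical sketch: between-segmenter agreement as the mean absolute
# difference in marked border depth (e.g., in pixels) along a line scan.
# The positions below are invented for illustration; the study's actual
# reliability statistics may differ.

def mean_abs_diff(a: list[float], b: list[float]) -> float:
    """Mean absolute difference between two sets of marked border depths."""
    assert len(a) == len(b), "segmenters must mark the same scan positions"
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

segmenter_1 = [102, 104, 103, 105, 104, 106]  # border depth at 6 A-scan positions
segmenter_2 = [103, 104, 102, 106, 104, 105]

print(f"mean |difference| = {mean_abs_diff(segmenter_1, segmenter_2):.2f} px")
```

A small mean difference relative to the layer thickness being measured is what "quite good" reliability would look like in such a comparison.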

Journal ArticleDOI
TL;DR: A correction factor to improve the accuracy of intraocular pressure (IOP) measurements made by the Goldmann applanation tonometer (GAT), which considers the combined effect of variations in corneal thickness, curvature, age, and IOP, is developed.
Abstract: Purpose. To develop a correction factor to improve the accuracy of intraocular pressure (IOP) measurements made by the Goldmann applanation tonometer (GAT), which considers the combined effects of variations in central corneal thickness (CCT), central anterior curvature (R), age, and the IOP level itself. Methods. Nonlinear numerical simulations based on the finite element method were used to represent corneal behavior under the effect of IOP and external tonometric pressure. The simulations considered various biomechanical corneal properties including the cornea’s nonuniform thickness, elliptical topography, weak stromal interlamellar cohesion, low epithelial and endothelial stiffness, and hyperelastic and hysteretic material behavior. The simulations were used to model the GAT procedure on corneas to obtain a correction equation based on the values of CCT, R, age, and IOP measured using GAT (IOPG). The efficiency of the equation in reducing the effects of corneal parameters on IOPG measurements was also assessed using an independent clinical database. Results. The individual effects of variations in CCT, R, and age were estimated at 1.66 mm Hg per 100 μm of CCT, 0.89 mm Hg per mm of R, and 0.12 mm Hg per decade of age. The correction equation reduced the association between clinical IOP measurements and corneal parameters, with r² decreasing from 11.8% to 0.02%. Conclusions. The GAT correction factor can consider the combined effect of variations in corneal thickness, curvature, age, and IOP. The factor could significantly reduce the reliance of IOPG measurements on corneal stiffness parameters. (Optom Vis Sci 2011;88:E102–E112)
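The abstract quotes per-parameter sensitivities but not the correction equation itself. A hypothetical linear sketch built only from those sensitivities, where the reference values (545 μm CCT, 7.8 mm R, age 50) and the signs of the corrections are illustrative assumptions, not the paper's actual equation:

```python
# Hypothetical linear GAT correction built only from the sensitivities quoted
# in the abstract (1.66 mm Hg per 100 um of CCT, 0.89 mm Hg per mm of R,
# 0.12 mm Hg per decade of age). The reference values (545 um, 7.8 mm, age 50)
# and the correction signs are illustrative assumptions; the paper's actual
# equation is nonlinear and also depends on the measured IOP level itself.

def corrected_iop(iop_g: float, cct_um: float, r_mm: float, age_y: float,
                  cct_ref: float = 545.0, r_ref: float = 7.8,
                  age_ref: float = 50.0) -> float:
    """Subtract a linear estimate of the corneal-parameter bias from IOPG."""
    delta = (1.66 * (cct_um - cct_ref) / 100.0   # CCT term, per 100 um
             + 0.89 * (r_mm - r_ref)             # curvature term, per mm
             + 0.12 * (age_y - age_ref) / 10.0)  # age term, per decade
    return iop_g - delta

# A cornea 100 um thicker than reference reads ~1.66 mm Hg high:
print(f"{corrected_iop(16.0, 645.0, 7.8, 50.0):.2f}")  # -> 14.34
```

The point of the sketch is only the magnitude of the terms: at these sensitivities, a ±50 μm CCT deviation shifts the reading by under 1 mm Hg, which is why CCT dominates the published correction.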

Journal ArticleDOI
TL;DR: In this article, a range of ophthalmic markers were used to assess diabetic peripheral neuropathy (DPN) using optical coherence tomography (OCT) and perimetry.
Abstract: Diabetic peripheral neuropathy (DPN) is a debilitating condition that affects about 50% of diabetic patients. The symptoms of DPN include numbness, tingling, or pain in the arms and legs. Patients with numbness may be unaware of foot trauma, which could develop into a foot ulcer. If left untreated, this may ultimately require amputation. Currently, the only method of directly examining peripheral nerves is to conduct skin punch or sural/peroneal nerve biopsies, which are uncomfortable and invasive. Indirect methods include quantitative sensory testing (assessing responses to heat, cold, and vibration) and nerve electrophysiology. Here, I describe research undertaken in my laboratory, investigating the possibility of using a range of ophthalmic markers to assess DPN. Corneal nerve structure and function can be assessed using corneal confocal microscopy and non-contact corneal esthesiometry, respectively. Retinal nerve structure and visual function can be evaluated using optical coherence tomography and perimetry, respectively. These techniques have been used to demonstrate that DPN is associated with morphological degradation of corneal nerves, reduced corneal sensitivity, retinal nerve fiber layer thinning, and peripheral visual field loss. With further validation, these ophthalmic markers could become established as rapid, painless, non-invasive, sensitive, reiterative, cost-effective, and clinically accessible means of screening for early detection, diagnosis, staging severity, and monitoring progression of DPN, as well as assessing the effectiveness of possible therapeutic interventions. Looking to the future, this research may pave the way for an expanded role for the ophthalmic professions in diabetes management.

Journal ArticleDOI
TL;DR: Lysozyme deposited on contact lenses does not possess antibacterial activity against certain bacterial strains, whereas lactoferrin possess an antibacterial effect against strains of P. aeruginosa.
Abstract: Purpose. The aim of the study is to determine the adhesion of Gram-positive and Gram-negative bacteria onto conventional hydrogel (CH) and silicone hydrogel (SH) contact lens materials with and without lysozyme, lactoferrin, and albumin coating. Methods. Four lens types (three SH—balafilcon A, lotrafilcon B, and senofilcon A; one CH—etafilcon A) were coated with lysozyme, lactoferrin, or albumin (uncoated lenses acted as controls) and then incubated in Staphylococcus aureus (Saur 31) or either of two strains of Pseudomonas aeruginosa (Paer 6294 and 6206) for 24 h at 37°C. The total counts of the adhered bacteria were determined using the ³H-thymidine method and viable counts by counting the number of

Journal ArticleDOI
TL;DR: Reduced ability to respond to myopia by slowing axial elongation may contribute to the development of myopia in cases where genetics alone would make the axial length longer than the focal plane.
Abstract: Substantial evidence has emerged over the past decades for a role of genetics in the development of human refractive error. There also is an emmetropization mechanism that uses visual signals to match the axial length to the focal plane. There has been little discussion of how these two important factors might interact. We explore here ways in which genetic factors driving axial growth may interact with the emmetropization mechanism, mostly to produce emmetropic eyes but often to produce myopia. An important factor may be a normal, yet reduced ability of juvenile eyes to use myopia to restrain genetically driven axial elongation. Reduced ability to respond to myopia by slowing axial elongation may contribute to the development of myopia in cases where genetics alone would make the axial length longer than the focal plane.

Journal ArticleDOI
TL;DR: It is proposed that something may be special about the visual processing of real astigmatic and cross-cylinder defocus—because they have less effect on VA than simulations predict.
Abstract: Purpose.To compare the effects of “simulated” and “real” spherical and astigmatic defocus on visual acuity (VA).Methods.VA was determined with letter charts that were blurred by calculated spherical or astigmatic defocus (simulated defocus) or were seen through spherical or astigmatic trial

Journal ArticleDOI
TL;DR: The results from this study suggest that degree of myopia and central corneal radius both have a significant though weak association withCorneal asphericity in Chinese eyes.
Abstract: Purpose.To observe and analyze corneal asphericity and its related factors in Chinese subjects.Methods.The corneal asphericity of 1052 right eyes from a Chinese population was determined using the Wavelight-ALLEGRO Topographer. The corneal asphericity coefficient Q describes the rate of curv

Journal ArticleDOI
TL;DR: The results suggest that the risk of events that interrupt SCL wear peaks in late adolescence and early adulthood and reflects risk factors identified in prospective contact lens studies.
Abstract: Purpose. The purpose of this study was to describe age and other risk factors for ocular events that interrupt soft contact lens (SCL) wear in youth. Methods. A retrospective chart review of SCL wearers aged 8 to 33 years at the first observed visit was conducted at six academic eye care centers in North America. Data were extracted from all visits during the observation period (3 years). Clinical records that documented conditions resulting in an interruption of SCL wear “events” were scanned, masked for age and SCL parameters, and then adjudicated to consensus diagnosis. Generalized estimating equations were used to examine the effect of selected covariates, including age, on the risk of an event. Results. Chart review of 3549 SCL wearers yielded 522 events among 426 wearers (12%). The risk of an event increased from ages 8 to 18 years, showed modest increases between ages 19 and 25 years, and then began to decline after age 25 years. New lens wearers (<1 year) were less likely to experience events (p < 0.001). Lens replacement schedule and material were also predictive of interruptions to SCL wear, with the lowest risk in daily replacement and hydrogel lens wearers (both p < 0.0001). Conclusions. These results suggest that the risk of events that interrupt SCL wear peaks in late adolescence and early adulthood and reflects risk factors identified in prospective contact lens studies. Relative to older teens and young adults, patients younger than 14 years presented with significantly fewer events resulting in interrupted lens wear. (Optom Vis Sci 2011;88:973–980)

Journal ArticleDOI
TL;DR: To determine how high- and low-contrast visual acuities are affected by blur caused by crossed cylinder lenses, single lines of letters based on the Bailey-Lovie chart were used and small levels of crossed cylinder blur produces losses in visual acuity that are dependent on the cylinder axis.
Abstract: Purpose. To determine how high- and low-contrast visual acuities are affected by blur caused by crossed cylinder lenses. Methods. Crossed cylinder lenses of power 0 (no added lens), +0.12 diopter sphere (DS)/-0.25 diopter cylinder (DC), +0.25 DS/-0.50 DC, and +0.37 DS/-0.75 DC were placed over the correcting lenses of the right eyes of eight subjects. Negative cylinder axes used were 15 to 180° in 15° steps for the two higher crossed cylinders and 30 to 180° in 30° steps for the lowest crossed cylinder. Targets were single lines of letters based on the Bailey-Lovie chart. Successively smaller lines were read until the subject could not read any of the letters correctly. Two contrasts were used: high (100%) and low (10%). The screen luminance of 100 cd/m², in combination with the room lighting, gave pupil sizes of 4.5 to 6 mm. Results. High-contrast visual acuities were better than low-contrast visual acuities by 0.1 to 0.2 log unit (1 to 2 chart lines) for the no added lens condition. Based on comparing the average of visual acuities for the 0.75 D crossed cylinder with the best visual acuity for a given contrast and subject, the rates of change of visual acuity per unit blur strength were similar for high contrast (0.34 ± 0.05 logMAR/D) and low contrast (0.37 ± 0.09 logMAR/D). There were considerable asymmetry effects, with the average loss in visual acuity across the two contrasts and the 0.50 D/0.75 D crossed cylinders doubling between the 165° and 60° negative cylinder axes. The loss of visual acuity with 0.75 D crossed cylinders was approximately twice that occurring for defocus of the same blur strength. Conclusions. Small levels of crossed cylinder blur (≤0.75 D) produce losses in visual acuity that are dependent on the cylinder axis. Crossed cylinders of 0.75 D produce losses in visual acuity that are twice those produced by defocus of the same blur strength.
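The "blur strength" used to normalize these acuity losses is conventionally the length of the power vector (M, J0, J45) in Thibos notation. A minimal sketch of that standard conversion (these are textbook formulas, not code from the study):

```python
import math

def power_vector(sph, cyl, axis_deg):
    """Convert a sphero-cylinder prescription (minus-cylinder form)
    to power-vector components (M, J0, J45), Thibos notation."""
    t = math.radians(axis_deg)
    m = sph + cyl / 2.0                   # spherical equivalent
    j0 = -(cyl / 2.0) * math.cos(2 * t)   # with/against-the-rule component
    j45 = -(cyl / 2.0) * math.sin(2 * t)  # oblique component
    return m, j0, j45

def blur_strength(sph, cyl, axis_deg):
    """Blur strength = vector length sqrt(M^2 + J0^2 + J45^2)."""
    m, j0, j45 = power_vector(sph, cyl, axis_deg)
    return math.sqrt(m * m + j0 * j0 + j45 * j45)

# A crossed cylinder such as +0.375 DS / -0.75 DC has M = 0, so its
# blur strength is simply |C|/2 = 0.375 D, regardless of axis.
```

Multiplying a blur strength by the reported slopes (roughly 0.34 to 0.37 logMAR per diopter) gives an estimate of the expected acuity loss.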

Journal ArticleDOI
TL;DR: Both instruments showed similar variability and test-retest variability when results were compared using equivalent units, however, there are important differences in sensitivity values, stimulus parameters, and testing strategies that have to be taken into account when comparisons are made.
Abstract: Purpose. To compare visual fields on the Nidek MP-1 to those obtained on the Humphrey field analyzer (HFA) in healthy volunteers and to assess the effects of differences in stimulus parameters and testing strategies that may influence the interpretation of results in patients. A secondary aim was to establish MP-1 normative data to calculate total deviation analyses and global indices analogous to those used by the HFA. Methods. Fifty healthy volunteers (age 43.5 ± 13.9 years; range, 18 to 68 years) underwent repeat MP-1 and HFA visual field testing using the 10-2 pattern. MP-1 data were converted to HFA-equivalent dB units. Between-instrument comparisons of HFA and MP-1 sensitivities and regressions of sensitivity on age and examination duration were assessed. Test-retest variability was examined between visits. Results. MP-1 (mean = 32.82 dB, SD = 1.92 dB) and HFA sensitivities (mean = 32.84 dB, SD = 1.83 dB) were not significantly different (p = 0.759). SD values for the HFA (range, 1.11 to 3.30 dB) were similar to those for the MP-1 (range, 0.14 to 2.75 dB). However, asymmetry comparisons between instruments showed significantly decreased superior rather than inferior retinal values for the MP-1. There was a small but significant difference (p = 0.004) in mean test duration between the MP-1 (mean = 6:11 min, SD = 1:49 min) and the HFA (mean = 5:14 min, SD = 0:42 min). There was also a difference in the decline of mean sensitivity with age: declines of 0.1 and 0.4 dB per decade were noted in MP-1 and HFA sensitivity, respectively. Test-retest variability was similar between instruments. A small but non-significant increase in mean sensitivity at the second visit was found for both the MP-1 (p = 0.060) and the HFA (p = 0.570). Conclusions. Both instruments showed similar variability and test-retest variability when results were compared using equivalent units.
However, there are important differences in sensitivity values, stimulus parameters, and testing strategies that have to be taken into account when comparisons are made.

Journal ArticleDOI
TL;DR: It is demonstrated that “rub and rinse” is the most effective regimen and should be recommended in conjunction with all multipurpose lens care solutions and all contact lens types, particularly with silicone hydrogel lenses.
Abstract: PURPOSE The introduction of contact lens multipurpose disinfection solution (MPDS) that can be used in conjunction with a "no-rub" regimen has simplified lens care requirements. Once adhered to a surface, microorganisms can become less susceptible to disinfection. The aim of the study was to evaluate the effect of various regimen steps on the efficacy of MPDS when used with silicone hydrogel and conventional lenses. METHODS Commercially available MPDSs containing polyquad or polyhexamethylene biguanide were used in conjunction with two types of silicone hydrogel (lotrafilcon B and galyfilcon A) and one type of conventional soft contact lenses (etafilcon A). Challenge microorganisms included Staphylococcus aureus ATCC 6538, Pseudomonas aeruginosa ATCC 9027, Serratia marcescens ATCC 13880, Fusarium solani ATCC 36031, Candida albicans ATCC 10231, or Acanthamoeba polyphaga Ros. The effect of the regimen steps "rub and rinse," "rinse-only," or "no rub and no rinse" on the disinfection efficacy of test MPDSs was examined using the ISO 14729 Regimen Test procedure. RESULTS Overall, the greatest efficacy of MPDSs was observed when "rub and rinse" was performed before disinfection with each of the microorganisms tested, regardless of lens type. "No rub and no rinse" steps resulted in a greater load of microorganisms remaining on lenses compared with the other regimens (p < 0.05). When "rinse-only" was performed before disinfection, the MPDS containing polyquad generally performed better (p < 0.05) than MPDSs containing polyhexamethylene biguanide against bacteria. Significantly fewer microorganisms were recovered from galyfilcon A than from the other lenses (p < 0.05) when MPDSs were used with the "rinse-only" step. CONCLUSIONS This study has demonstrated that "rub and rinse" is the most effective regimen and should be recommended in conjunction with all multipurpose lens care solutions and all contact lens types, particularly with silicone hydrogel lenses.

Journal ArticleDOI
TL;DR: Increased VA and VF variability was predicted largely by increased RP severity and occurred in subjects with reduced VF who reported less physical activity or increased negative psychosocial states, and should be considered during clinical examinations and trials for RP.
Abstract: Purpose—We explored whether greater amounts of short-term variability in visual acuity (VA), contrast sensitivity (CS), or visual field (VF) in retinitis pigmentosa (RP) were related to disease severity or psychosocial factors. Methods—We obtained spectral domain optical coherence tomography scans in 27 RP subjects and determined the variability (SD) of VA, CS, and VF during a mean of 16 tests self-administered at home on a personal computer (PC) twice a week. Subjects completed the Positive and Negative Affect Schedules at each PC test session, and the SF-36 general health and Beck Depression Inventory questionnaires on one occasion. Results—There was a 0.10 log unit increase in VA variability for every 0.58 logMAR increase (worse mean VA) (p = 0.001). For subjects with reduced foveal thickness, mean VA explained more of the total VA variability than foveal thickness (R² = 0.72 and 0.46, respectively, in simple linear regressions). There was a statistically significant 4.3% increase in log VF area variability for every 50% decrease in mean log VF area (p < 0.001), explaining most of the total variability in log VF area variability (R² = 0.44). When controlling for mean log VF area, there was a statistically significant increase in log VF area variability for subjects with greater than minimal depressive symptoms (p = 0.015), with increased mean irritability scores (p = 0.02), decreased SF-36 physical functioning subscale scores (p = 0.03), or decreased mean scores for feeling active, strong, and proud (p = 0.008) (adjusted R² = 0.62). CS variability was low and not statistically significantly related to mean CS, macular thickness, or psychosocial factors. Conclusions—Increased VA and VF variability was predicted largely by increased RP severity. Greater VF variability occurred in subjects with reduced VF who reported less physical activity or increased negative psychosocial states. These associations should be considered during clinical examinations and trials for RP.

Journal ArticleDOI
TL;DR: Although the basic test procedure has remained similar throughout the ages, there have been many advances in test administration, standardization, statistical evaluation, clinical analysis, interpretation, and prediction of outcome based on visual field findings.
Abstract: Perimetry and visual field testing have been used as clinical ophthalmic diagnostic tools for many years, and this manuscript will provide a brief historical overview of these procedures and the individuals who developed them. Today, we have many different forms of perimetry that are designed to evaluate different locations within the visual pathways and various mechanisms and subsets of mechanisms within the visual system. However, the most widely used method of performing perimetry and visual field testing has not substantially changed for more than 150 years, consisting of detecting a small target superimposed on a uniform background at different locations within the field of view. Although the basic test procedure has remained similar throughout the ages, there have been many advances in test administration, standardization, statistical evaluation, clinical analysis, interpretation, and prediction of outcome based on visual field findings.

Journal ArticleDOI
TL;DR: A semiautomatic algorithm that will allow for repeatable, efficient, and masked ciliary muscle measurements in large datasets is developed and evaluated for segmentation and morphological assessment in Visante Anterior Segment Optical Coherence Tomography images.
Abstract: The traditional method for imaging the ciliary body in clinical practice and research is ultrasound biomicroscopy (UBM). A literature search in early 2010 revealed 341 publications in which an ultrasound biomicroscope was used to image the ciliary body in studies of tumors of the ciliary body,1–4 accommodation,5 accommodating intraocular lenses,6 glaucoma,7 and the relationship between refractive error and ciliary body dimensions.8–9 Despite the fact that over 300 publications have used the UBM to image the ciliary body, there is a relative paucity of literature on the development and function of the ciliary body throughout the human life span. In fact, the ciliary muscle is perhaps the only smooth muscle without an associated disease state; either it is an unusually robust organ, or the discomfort and invasive nature of viewing the ciliary muscle with the UBM has limited detection of ciliary muscle diseases and disorders. Recently, the development of the Zeiss Visante Anterior Segment Optical Coherence Tomographer (OCT, Carl Zeiss Meditec) has allowed for non-contact imaging of the ciliary body. This is especially important for pediatric research.10 Pediatric studies of the relationship between refractive error and ciliary body dimensions,10 and between accommodative microfluctuations and ciliary body dimensions,11 would not have been feasible using the UBM. In the future, the authors plan to use the Visante Anterior Segment OCT in studies of the relationship between refractive error and ciliary body dimensions, in studies of the ciliary body in accommodative dysfunction in children, and in studies of the ciliary body in developing presbyopia.
Although the Visante was not specifically designed for imaging the ciliary body or measuring its dimensions, there are a number of published studies showing that the Visante provides repeatable and valid measurements of central corneal thickness,12–13 crystalline lens thickness,14 and anterior chamber depth.12–13 In addition, Dada and co-workers (2007) reported that Visante Anterior Segment OCT images showed sharper definition of the scleral spur.13 Based on these data, one might expect that the Visante would also provide high-quality images of the ciliary body that would allow for repeatable and valid measurements of ciliary body dimensions; however, this topic has not been fully addressed in the literature. As Westphal and co-workers (2002) have pointed out previously, OCT instruments are becoming valuable tools for imaging human tissue, but a raw image obtained from these instruments may be subject to distortions due to non-linear axial scanning, non-telecentric scanning, and lack of correction for the refractive properties of the tissue that is imaged.15 While the literature cited above suggests the manufacturer has addressed these distortions when the Visante is used to image and measure the anterior segment, the Visante was not designed to image and measure the ciliary body. In measuring the ciliary body with the calipers in the Visante software in previous studies,10–11 we discovered several inadequacies of the calipers when used in ciliary body images that prompted us to begin analyzing a raw format of the images, i.e., binary files, in third-party software. Using a raw format of the images has, of course, necessitated evaluating distortions in the images. The first inadequacy we noted was that it is impossible for the examiner who acquired the images to make measurements in a masked fashion. Second, the calipers in the Visante software are straight lines, and in some patients, the sclera is curved.
When calipers are used to locate the desired measurement distance from the scleral spur, they cut across the ciliary body rather than follow the curvature of the sclera (Figure 1). In our previous studies using the calipers in the Visante software,10–11 we noted that some subjects had a flatter scleral curvature, while other subjects had a steeper scleral curvature. These scleral curvature differences could lead to increased variability in the ciliary body thickness measurements, especially as measurements are made at an increasing distance from the scleral spur. Third, if one wants to make measurements of ciliary body thickness that are in the range of a physiologically accurate measurement, the Visante software is not programmed to apply an appropriate refractive index or scaling factor to the image of the sclera and ciliary body. Finally, measurements of the cross-sectional area cannot be made using tools available within the Visante software, but may be critical to understanding changes in the ciliary body with presbyopia.

Figure 1. Example images of the ciliary body obtained with the Zeiss Visante Anterior Segment OCT. (A) An image where the scleral curvature appears relatively flat, and the caliper used to align a measurement 3 mm posterior to the scleral spur follows the contour …

To address these inadequacies, a semi-automatic extraction algorithm was developed to objectively and accurately measure the dimensions of the ciliary body. The algorithm uses active contour models that have been successfully applied in the segmentation of many types of images.16–21 These models can produce sub-pixel accuracy of object boundaries, incorporate regional information for robust segmentation, and provide smooth and closed contours of the object of interest.
Recently, a new active contour model based on a local binary fitting energy was proposed to segment magnetic resonance images with intensity inhomogeneity.22–23 Here we extend the model to outline the boundary of the ciliary muscle. To avoid any image alteration created by the Visante software when generating a jpeg file, we use the raw images in the form of binary files exported from the Visante. The use of a raw image format, however, required that we also assess the raw images for distortions as described by Westphal and co-workers (2002). In summary, the aims of this study were:

1. To investigate and develop a correction for any image distortion in the binary files exported from the Visante Anterior Segment OCT by determining (a) the appropriate pixel-per-mm conversion factor in Visante images for both air and the sclera/ciliary body, and (b) the general fidelity and level of geometric and refractive distortions present in the raw images obtained with the Visante.

2. To develop a semi-automatic algorithm for outlining and measuring the ciliary body in Visante images and assess its performance by determining (a) the within-examiner and between-examiner variability of caliper measurements made with the Visante analysis software and of the semi-automatic algorithm measurements, (b) the number of images needed per subject to provide acceptably repeatable algorithm measurements in future studies, and (c) the agreement between the semi-automatic algorithm measurements and the caliper measurements (Visante software).

With these study aims, we demonstrate below that the binary files exported from the Visante provide renderings of the structure of the human sclera and ciliary muscle that are free from geometric distortions, and that the semi-automatic algorithm is capable of segmenting the ciliary muscle in Visante images and providing repeatable measurements.
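On the refractive-index point raised above: OCT records optical path length, so axial distances measured inside tissue must be rescaled by the tissue's group refractive index to approximate physical distance. A minimal sketch of that rescaling, where the per-pixel scale and the index value are illustrative assumptions rather than the Visante's actual calibration:

```python
def axial_pixels_to_mm(n_pixels, mm_per_pixel_air, group_index):
    """Convert an axial pixel count from an OCT B-scan to an approximate
    physical distance in mm. Depths inside tissue are optical path
    lengths, so they are divided by the tissue's group refractive index
    (an assumed value here, e.g. ~1.4 for sclera/ciliary body)."""
    optical_mm = n_pixels * mm_per_pixel_air  # optical path length
    return optical_mm / group_index           # physical distance

# 140 axial pixels at an assumed 0.01 mm/pixel (calibrated in air),
# inside tissue with an assumed group index of 1.4, corresponds to
# about 1.0 mm of physical tissue depth.
```

Lateral (transverse) scaling is a separate problem; this sketch covers only the axial direction, which is where the optical-path effect enters.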

Journal ArticleDOI
TL;DR: Oxybuprocaine eye drops do not appear to induce a significant corneal swelling and do not affect the measurements when comparing CCT measured with optical or ultrasound devices.
Abstract: PURPOSE To investigate the effect of oxybuprocaine eye drops on corneal volume (CV) and corneal thickness measurements. METHODS Central corneal thickness (CCT), corneal thinnest point (CTP), and CV of 78 eyes of 78 healthy volunteers were measured with the Pentacam before and 5 min after the administration of oxybuprocaine eye drops. The fellow non-anesthetized eyes were used as controls. RESULTS Before topical anesthesia, the mean CCT was 546.76 ± 35.3 μm; after anesthesia, it was 547.76 ± 36.56 μm (p = 0.86). In the fellow eyes, the first mean CCT was 548.82 ± 35.2 μm and the second was 547.55 ± 35.9 μm (p = 0.82). The mean CTP before anesthesia was 543.99 ± 35.23 μm; after, it was 544.89 ± 36.3 μm (p = 0.88). In the fellow eyes, the first mean CTP was 544.15 ± 35.35 μm and the second was 542.81 ± 36 μm (p = 0.81). Before topical anesthesia, the mean CV was 60.55 ± 3.84 mm³; after, it was 60.66 ± 3.97 mm³ (p = 0.86). In the fellow eyes, the first mean CV was 60.93 ± 3.87 mm³ and the second was 60.73 ± 4 mm³ (p = 0.75). CONCLUSIONS Oxybuprocaine eye drops do not appear to induce significant corneal swelling and do not affect the measurements when comparing CCT measured with optical or ultrasound devices.