
Monday 28 December 2020

AIDS

Progressive phasing out of baseline CD4 cell count testing for people living with HIV in Kinshasa, DRC
No abstract available

Long-term consequences of interpersonal violence experiences on treatment engagement and health status in people living with HIV
Objective: To examine the impact of previous interpersonal violence (IPersV) experiences on long-term healthcare engagement and health outcomes in a large Canadian HIV cohort. Design: People living with HIV (PLHIV) were screened for IPersV, and their healthcare outcomes over the nine subsequent years were analyzed. Methods: A total of 1064 PLHIV were screened for past and present IPersV experiences through semi-structured interviews. Follow-up included core treatment engagement (e.g., clinic visits) and health-status variables (HIV viral load, CD4 T-cell count, mortality, comorbidities), analyzed descriptively and with longitudinal Cox regressions. Results: At intake, 385 (36%) PLHIV reported past or present IPersV, including childhood (n = 224, 21%) or adulthood experiences (n = 161, 15%), and were offered conventional social work support. Over nine years, individuals with any IPersV experiences were 36% more likely to discontinue care, 81% more likely to experience viremia, 47% more likely to experience a drop in CD4 counts below 200/mm3, and 65% more likely to die compared with patients not reporting IPersV (all p < 0.05). Outcomes were similar when adjusted for sociodemographic factors. Childhood IPersV in particular was linked to several of the outcomes, with higher rates of discontinuation of care, viremia, and mortality related to mental health/addiction or HIV-related complications. Conclusions: IPersV is associated with an increased risk over time of healthcare discontinuation, poorer long-term HIV-related health outcomes, and increased mortality, especially for patients victimized in childhood. Apart from targeted IPersV screening to initiate conventional supports (e.g., through social work), increased efforts to engage vulnerable populations in their long-term care seem warranted. Correspondence to Dr. Esther Fujiwara, Department of Psychiatry, University of Alberta, Edmonton, AB, Canada T6G 2V2. 
Tel.: +1 (780) 492-4104; fax: +1 (780) 492-6841; e-mail: efujiwara@ualberta.ca Received 19 October, 2020 Revised 2 December, 2020 Accepted 7 December, 2020 Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's Website (http://www.AIDSonline.com). Copyright © 2020 Wolters Kluwer Health, Inc.

Combining traditional and molecular epidemiology methods to quantify local HIV transmission among foreign-born residents of King County, Washington
Objectives: We evaluated the ability of molecular epidemiology to augment traditional HIV surveillance beyond the detection of clusters for outbreak investigation. To do this, we address a question of interest to Public Health – Seattle and King County: what proportion of HIV diagnoses among people born outside of the United States are acquired locally? Design: King County residents diagnosed with HIV, 2010–2018. Methods: We linked HIV-1 pol gene sequences to demographic information obtained from the National HIV Surveillance System, Public Health – Seattle and King County case investigation and partner services interviews. We determined the likely location of HIV acquisition based on HIV testing, travel histories and cluster-based molecular analyses. Results: Among 2409 people diagnosed with HIV, 798 (33%) were born outside of the United States. We inferred the location of acquisition for 77% of people born outside of the United States: 26% likely acquired HIV locally in King County (of whom 69% were MSM, 16% heterosexual), and 51% likely acquired HIV outside of King County (primarily outside of the United States). Of this 77% of people for whom we inferred the location of HIV acquisition, 45% were determined using traditional epidemiology methods and an additional 32% were inferred using molecular epidemiology methods. Conclusion: We found that the National HIV Surveillance System misclassified most HIV-infected foreign-born residents as 'new' local infections, and that these cases contribute to an overestimate of local incidence. Our findings highlight how molecular epidemiology can augment traditional HIV surveillance activities and provide useful information to local health jurisdictions beyond molecular cluster detection. Correspondence to Joshua T. Herbeck, Department of Global Health, International Clinical Research Center, University of Washington, Seattle, Washington, USA. 
E-mail: herbeck@uw.edu, rkerani@uw.edu This is an open access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal. http://creativecommons.org/licenses/by-nc-nd/4.0

Homonegativity, sexual violence and condom use with women in men who have sex with men and women in West Africa (CohMSM)
Objective: The study aimed to explore longitudinal interactions between homonegativity and sexual behaviors with female partners among HIV-negative West African men who have sex with men and women (MSMW). Design and method: The community-based cohort CohMSM ANRS 12324 - Expertise France enrolled MSM in Togo, Burkina Faso, Côte d'Ivoire and Mali. Socio-behavioral data were collected every 6 months. Using 30-month follow-up data, a multiprobit analysis was performed to investigate the relationship between psychosocial and behavioral variables ex-ante (t-1) and ex-post (t). Results: MSMW (n = 326) accounted for half of all participants in CohMSM. They reported inconsistent condom use with women in 39% of visits. Perceived and internalized homonegativity at t-1 tended to lead to sexual violence towards women at t (p < 0.1), which was associated with inconsistent condom use with them at t (p < 0.05). Conclusion: Given the high HIV prevalence in West African MSM, widespread condomless sex with women among MSMW, and the aggravating effect of social and internalized homonegativity, more research in the MSMW subpopulation is needed to assess the risk of HIV bridging to women and to design support activities. Correspondence to Marion Fiorentino, SESSTIM, Faculté de médecine, 27 bd Jean Moulin, 13005 Marseille, France. Tel.: +33 413 732 283; e-mail: marion.fiorentino@inserm.fr Received 8 January, 2020 Revised 21 February, 2020 Accepted 2 March, 2020

Identifying influential neighbors in social networks and venue affiliations among young MSM: A data science approach to predict HIV infection
Objective: Young men who have sex with men (YMSM) bear a disproportionate burden of HIV infection in the United States, and their risk of acquiring HIV may be shaped by complex multi-layer social networks. These networks are formed not only through direct contact with social/sex partners but also through indirect anonymous contacts encountered when attending social venues. We introduced a new application of a state-of-the-art graph-based deep learning method to predict HIV infection that can identify influential neighbors within these multiple network contexts. Design and Methods: We used empirical network data among YMSM aged 16–29 years collected from Houston and Chicago in the U.S. between 2014 and 2016. A computational framework, GAT-HIV (Graph Attention Networks for HIV), was proposed to predict HIV infections by identifying influential neighbors within social networks. These networks were formed by multiple relations comprising social/sex partners and shared venue attendances, together with individual-level variables. Further, GAT-HIV was extended to combine multiple social networks using multi-graph GAT methods. A visualization tool was also developed to highlight influential network members for each individual within the multiple social networks. Results: The multi-graph GAT-HIV models obtained average AUC values of 0.776 and 0.824 for Chicago and Houston, respectively, performing better than empirical predictive models (e.g., AUCs of random forest: 0.758 and 0.798). GAT-HIV on single networks also delivered promising prediction performance. Conclusions: The proposed methods provide a comprehensive and interpretable framework for graph-based modeling that may inform effective HIV prevention intervention strategies among populations most vulnerable to HIV. Correspondence to Yang Xiang, PhD, Houston, TX, United States. 
e-mail: Yang.Xiang@uth.tmc.edu Received 7 June, 2020 Revised 19 October, 2020 Accepted 19 November, 2020
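For readers unfamiliar with graph attention networks, the core operation that frameworks like GAT-HIV build on can be sketched in a few lines of numpy. This is a generic single-head attention layer in the style of the original GAT formulation, not the authors' GAT-HIV code; the graph, features, and weights below are purely illustrative.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(H, A, W, a):
    """One single-head graph attention layer.
    H: (N, F) node features, A: (N, N) binary adjacency,
    W: (F, F2) projection, a: (2*F2,) attention vector.
    Returns aggregated features and the attention matrix."""
    Z = H @ W                        # project node features, (N, F2)
    F2 = Z.shape[1]
    # e_ij = LeakyReLU(a . [z_i || z_j]) splits into a source and a target term
    e = leaky_relu((Z @ a[:F2])[:, None] + (Z @ a[F2:])[None, :])
    # attend only over neighbours (plus self-loops)
    mask = (A + np.eye(A.shape[0])) > 0
    e = np.where(mask, e, -np.inf)
    e = e - e.max(axis=1, keepdims=True)   # stabilised row-wise softmax
    att = np.exp(e)
    att /= att.sum(axis=1, keepdims=True)
    return att @ Z, att                    # attention-weighted aggregation

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))                # 4 nodes, 3 features each
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)  # a simple path graph
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))
H_out, att = gat_layer(H, A, W, a)
```

Inspecting `att` shows which neighbors most influence each node's prediction, which is the interpretability property the abstract highlights.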

ApoA-I mimetics attenuate macrophage activation in chronic treated HIV
Objective(s): Despite antiretroviral therapy (ART), there is an unmet need for therapies to mitigate immune activation in HIV infection. The goal of this study is to determine whether the apoA-I mimetics 6F and 4F attenuate macrophage activation in chronic HIV. Design: Preclinical assessment of the in vivo impact of Tg6F and the ex vivo impact of apoA-I mimetics on biomarkers of immune activation and gut barrier dysfunction in treated HIV. Methods: We used two humanized murine models of HIV infection to determine the impact of oral Tg6F with ART (HIV+ART+Tg6F+) on innate immune activation (plasma human sCD14, sCD163) and gut barrier dysfunction [murine I-FABP, endotoxin (LPS), LPS binding protein (LBP), murine sCD14]. We also used gut explants from 10 uninfected and 10 HIV-infected men on potent ART and with no morbidity, to determine the impact of ex vivo treatment with 4F for 72 hours on secretion of sCD14, sCD163 and I-FABP from gut explants. Results: When compared with mice treated with ART alone (HIV+ART+), HIV+ART+Tg6F+ mice showed attenuated (i) macrophage activation (h-sCD14, h-sCD163), (ii) gut barrier dysfunction (m-IFABP, LPS, LBP and m-sCD14), and (iii) plasma and gut tissue oxidized lipoproteins. The results were consistent across independent mouse models and ART regimens. Both 4F and 6F attenuated shedding of I-FABP and sCD14 from gut explants from HIV-infected and uninfected participants. Conclusions: Given that gut barrier dysfunction and macrophage activation are contributors to comorbidities like cardiovascular disease in HIV, apoA-I mimetics should be tested as therapy for morbidity in chronic treated HIV. Correspondence to Theodoros Kelesidis, MD, PhD, Department of Medicine, Division of Infectious Diseases, David Geffen School of Medicine at UCLA, Los Angeles, California, USA. 10833 Le Conte Ave. 
CHS 37-121 Los Angeles, CA 90095, USA; e-mail: tkelesidis@mednet.ucla.edu Received 11 June, 2020 Revised 26 October, 2020 Accepted 18 November, 2020

Minimal detection of cerebrospinal fluid escape after initiation of antiretroviral therapy in acute HIV-1 infection
Objective: Despite suppression of HIV-1 replication in the periphery by antiretroviral therapy (ART), up to 10% of treated individuals have quantifiable HIV-1 in the CSF, termed CSF escape. CSF escape may be asymptomatic but has also been linked to progressive neurological disease, and may indicate persistence of HIV in the central nervous system (CNS). CSF escape has not yet been assessed after initiation of ART during acute HIV-1 infection (AHI). Design: Prospective cohort study. Setting: Major voluntary counseling and testing site in Bangkok, Thailand. Subjects: Participants identified and initiated on ART during AHI who received an optional study lumbar puncture at pre-ART baseline or after 24 or 96 weeks of ART. Main outcome measures: Paired levels of CSF and plasma HIV-1 RNA, with CSF > plasma HIV-1 RNA defined as CSF escape. Results: 204 participants had paired blood and CSF sampling at one or more visits at baseline, week 24, or week 96; 29 participants had CSF sampling at all three visits. CSF escape was detected in 1/90 at week 24 (CSF HIV-1 RNA 2.50 log10 copies/mL, plasma HIV-1 RNA < 50 copies/mL) and 0/55 at week 96. Conclusions: While levels of CSF HIV-1 RNA in untreated AHI are high, initiating treatment during AHI results in a very low rate of CSF escape in the first two years of treatment. Early treatment may improve control of HIV-1 within the CNS compared with treatment during chronic infection, which may have implications for long-term neurological outcomes and CNS HIV-1 persistence. Correspondence to Serena Spudich, MD, Professor, Department of Neurology, Yale University School of Medicine, 300 George Street, Room 8300c, New Haven, Connecticut, 06511 USA. Tel: +1 650 400 1222; e-mail: serena.spudich@yale.edu Received 28 May, 2020 Revised 8 July, 2020 Accepted 1 September, 2020

Circulating extracellular vesicles as a new inflammation marker in HIV infection
Background: Extracellular vesicles (EVs), released by cells, are surrounded by a phospholipid bilayer and carry proteins as well as genetic material. It has been shown that EVs mediate intercellular communication in several conditions, such as inflammation, immunodeficiency, tumor growth and viral infections. Here, we analyzed circulating levels of EVs in order to clarify their role in the chronic inflammation mechanisms that characterize HIV patients. Methods: We analyzed and subtyped circulating levels of EVs through a recently developed flow cytometry method. In detail, endothelial-derived EVs (CD31+/CD41a-/CD45-, EMVs), EVs stemming from leukocytes (CD45+, LMVs) and platelets (CD41a+/CD31+) were identified and enumerated. Moreover, we analyzed the EV protein cargo with proteomic analysis. Results: Circulating levels of total EVs, EMVs and LMVs were significantly lower in HIV+ patients than in healthy subjects, while platelet-derived EVs were higher in patients than in the healthy population. Proteomic analysis showed upregulation of IFN-γ and IL-1α and downregulation of OSM; NF-κB, LIF and RXRA signaling were activated in these patients. Conclusion: These data demonstrate, for the first time, that HIV infection induces the production of EVs containing mediators that may feed chronic inflammation and viral replication. These two effects are connected, since inflammation itself induces viral replication. We therefore hypothesize that HIV infection inhibits the production of EVs that carry anti-inflammatory molecules. Correspondence to Prof. Katia Falasca, Clinic of Infectious Diseases, Dept. of Medicine and Science of Aging, University "G. D'Annunzio" School of Medicine, Via dei Vestini, 66100 Chieti, Italy. Tel.: +39-0871-357562; fax: +39-0871-358372; e-mail: k.falasca@unich.it Received 7 August, 2020 Revised 14 October, 2020 Accepted 17 November, 2020

Promoting antiretroviral therapy adherence habits: a novel approach based on a synthesis of economic and psychological theories of habit formation
No abstract available

The relationship between smoking, current CD4, viral load and cancer in persons living with HIV
Background: It is unknown whether the carcinogenic effect of smoking is influenced by CD4 count and viral load (VL) in persons living with HIV. Material and Methods: RESPOND participants with known smoking status were included. Poisson regression adjusting for baseline confounders investigated the interaction between current CD4/VL strata (good [CD4 ≥ 500/mm3 and VL < 200 copies/mL], poor [CD4 ≤ 350/mm3 and VL > 200 copies/mL] and intermediate [all other combinations]), smoking status, and all cancers, non-AIDS-defining cancers (NADC), smoking-related cancers (SRC), and infection-related cancers (IRC). Results: Of 19602 persons, 41.3% were never smokers, 44.4% current smokers and 14.4% previous smokers at baseline. CD4/VL strata were poor in 3.4%, intermediate in 44.8% and good in 51.8%. There were 513 incident cancers; incidence rate 6.9/1000 person-years of follow-up (PYFU; 95% CI 6.3–7.5). Current smokers had a higher incidence of all cancers (adjusted incidence rate ratio 1.45; 1.17–1.79), NADC (1.65; 1.31–2.09), SRC (2.21; 1.53–3.20), and IRC (1.38; 0.97–1.96) versus never smokers. Those with poor CD4/VL had increased incidence of all cancers (5.36; 95% CI 3.71–7.75), NADC (3.14; 1.92–5.14), SRC (1.82; 0.76–4.41) and IRC (10.21; 6.06–17.20) versus those with good CD4/VL. There was no evidence that the association between smoking and cancer subtypes differed depending on the CD4/VL strata (p > 0.1, test for interaction). Conclusions: In the large RESPOND consortium, the impact of smoking on cancer was clear, and reducing smoking rates should remain a priority. The association between current immune deficiency, virological control and cancer was similar for never, current and previous smokers, suggesting similar carcinogenic effects of smoking regardless of CD4 count and VL. 
Correspondence to Amanda Mocroft, Centre for Clinical Research, Epidemiology, Modelling and Evaluation (CREME), Institute for Global Health, UCL, Rowland Hill St, London, NW3 2PF; e-mail: a.mocroft@ucl.ac.uk Received 1 September, 2020 Revised 30 November, 2020 Accepted 2 December, 2020
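The adjusted incidence rate ratios above come from Poisson regression; the quantity being estimated can be illustrated with a crude (unadjusted) version of the same calculation. The counts below are hypothetical, not RESPOND data.

```python
import math

def crude_irr(cases_exp, py_exp, cases_ref, py_ref, z=1.96):
    """Crude incidence rate ratio with a Wald 95% CI on the log scale.
    cases_*: event counts, py_*: person-years at risk."""
    ratio = (cases_exp / py_exp) / (cases_ref / py_ref)
    se_log = math.sqrt(1 / cases_exp + 1 / cases_ref)  # SE of log(IRR)
    lo = math.exp(math.log(ratio) - z * se_log)
    hi = math.exp(math.log(ratio) + z * se_log)
    return ratio, lo, hi

# hypothetical: 100 cancers in 10,000 PYFU among smokers
# vs 50 cancers in 10,000 PYFU among never smokers
ratio, lo, hi = crude_irr(100, 10_000, 50, 10_000)
```

In practice the adjustment for baseline confounders (and the interaction test reported in the abstract) is done inside the regression model rather than by hand; this sketch only shows what an IRR of, say, 1.45 means as a ratio of event rates.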



Cardiovascular Nursing

Naturalistic Decision Making in Everyday Self-care Among Older Adults With Heart Failure
Background Every day, older adults living with heart failure make decisions regarding their health that may ultimately affect their disease trajectory. Experts describe these decisions as instances of naturalistic decision making influenced by the surrounding social and physical environment and involving shifting goals, high stakes, and the involvement of others. Objective This study applied a naturalistic decision-making approach to better understand everyday decision making by older adults with heart failure. Methods We present a cross-sectional qualitative field research study using a naturalistic decision-making conceptual model and critical incident technique to study health-related decision making. The study recruited 24 older adults with heart failure and 14 of their accompanying support persons from an ambulatory cardiology center. Critical incident interviews were performed and qualitatively analyzed to understand in depth how individuals made everyday health-related decisions. Results The older adults (White; 66.7% male) made decisions in accordance with a preliminary conceptual model of naturalistic decision making occurring in phases of monitoring, interpreting, and acting, both independently and in sequence, for various decisions. Analyses also uncovered that there are barriers and strategies affecting the performance of these phases, that other actors can play important roles, and that health decisions are made in the context of personal priorities, values, and emotions. Conclusions Study findings lead to an expanded conceptual model of naturalistic decision making by older adults with heart failure. In turn, the model bears implications for future research and the design of interventions grounded in the realities of everyday decision making. This work was supported by the Agency for Healthcare Research & Quality (R21 HS025232). 
Dr Holden reports consultant payments from federal research grants awarded to the University of Wisconsin, Clemson University, Oregon Health & Science University, and Kent State University. He receives an annual honorarium for editorial duties from Taylor & Francis publisher. Dr Mirro reports grants from Biotronik Inc, the Agency for Healthcare Research and Quality, Medtronic plc, and Janssen Scientific Affairs; consulting fees/honoraria from iRhythm Technologies Inc and Zoll Medical Corporation; and nonpublic equity/stock interest in Murj, Inc/Viscardia. Dr Mirro's relationships with academia include serving as a trustee of Indiana University. Dr Toscos reports grants from Biotronik Inc, Medtronic plc, Janssen Scientific Affairs, and iRhythm Technologies, Inc. All other authors have no conflicts to disclose. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research & Quality or Parkview Health. Correspondence Richard J. Holden, PhD, Regenstrief Institute, 1101 W 10th St #421, Indianapolis, IN 46202 (rjholden@iu.edu). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved

A Pilot Study to Evaluate a Computer-Based Intervention to Improve Self-care in Patients With Heart Failure
Background Cognitive dysfunction contributes to poor learning and impaired self-care (SC) for patients with heart failure. Objectives The aims of this study were to (1) evaluate the feasibility and acceptability of a nurse-led, virtual home-based cognitive training and SC education intervention to support SC and (2) evaluate the relationship between improvements in SC and cognitive change and examine 30-day readmission rates. Methods In this 2-phase pilot study, we used a prospective, exploratory design. In phase 1, recruitment criteria and retention issues threatened feasibility and acceptance. Significant modifications were made and evaluated in phase 2. Results In phase 2, 12 participants were recruited (7 women and 5 men). Feasibility was supported. All participants and the study nurse positively evaluated acceptability of the intervention. Median SC scores improved over time. Thirty-day hospital readmission rates were 25%. Conclusion Phase 1 indicates the intervention as originally designed was not feasible or acceptable. Phase 2 supports the feasibility and acceptability of the modified intervention. Further testing is warranted. This study was supported by NIH/NINR P20 NR015331-02 (funded as part of the University of Michigan School of Nursing, P20 Center for Complexity and Self-management of Chronic Disease [PIs: Debra Barton, PhD, RN, FAAN; Ivo Dinov, PhD], and Donald and Karin Allen Faculty Fund, Department of Health Behavior and Biological Sciences, School of Nursing, University of Michigan). The authors have no conflicts of interest to disclose. Correspondence Cynthia Arslanian-Engoren, PhD, RN, ACNS-BC, FAHA, FAAN, University of Michigan School of Nursing, 400 N Ingalls, Room 2176, Ann Arbor, MI 48109 (cmae@umich.edu). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved

Translation and Psychometric Evaluation of the German Version of the Thirst Distress Scale for Patients With Heart Failure
Background In patients with chronic heart failure, thirst can be perceived as an intensive and burdensome symptom, which may have a negative impact on patients' quality of life. To initiate thirst-relieving interventions, assessment of thirst and its related distress is essential. At the time of this study, no instrument was available to evaluate thirst distress in patients with heart failure in Germany. Objective The aims of this study were to translate the "Thirst Distress Scale for patients with Heart Failure" (TDS-HF) from English into German and to test validity and reliability of the scale. Methods The English version of the TDS-HF was translated into German. A linguistically and culturally sensitive forward-and-backward translation was performed. Psychometric evaluation included confirmatory factor analysis, reliability in terms of internal consistency, and concurrent validity. Results Eighty-four hospitalized patients (mean age, 72 ± 10 years; 29% female; mean left ventricular ejection fraction, 36% ± 12%; 62% New York Heart Association functional classes III–IV, 45% on fluid restriction) from an acute care hospital were involved in the study. The item-total correlation ranged from 0.58 to 0.78. Interitem correlations varied between 0.37 and 0.79. Internal consistency was high, with a Cronbach α of 0.89. There was a high correlation between the total score of the TDS-HF and the visual analog scale to assess thirst intensity (r = 0.72, P ≤ .001), and a low correlation with fluid restriction (r = 0.35, P = .002). Conclusions The evaluation of the German TDS-HF showed satisfactory psychometric properties in this sample. The instrument is usable for further research and additional psychometric testing. The authors have no funding or conflicts of interest to disclose. Correspondence Christiane Kugler, PhD, RN, FAAN, Faculty of Medicine, Institute of Nursing Science, Albert-Ludwigs-University Freiburg, Elsässer Str. 
2-o, 79106 Freiburg, Germany (christiane.kugler@uniklinik-freiburg.de). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved
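The internal-consistency figure reported above (Cronbach α = 0.89) follows from a standard formula that is straightforward to compute directly. A minimal sketch, using invented item scores rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale total
    return k / (k - 1) * (1 - item_vars / total_var)

# toy example: two items answered by three respondents
alpha_perfect = cronbach_alpha([[1, 1], [2, 2], [3, 3]])   # identical items -> 1.0
alpha_mixed = cronbach_alpha([[1, 2], [2, 1], [3, 3]])
```

When items covary strongly (respondents who score high on one item score high on the others), the total-score variance dominates and alpha approaches 1; values around 0.89 as reported for the German TDS-HF indicate high internal consistency.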

The Influence of Preparedness, Mutuality, and Self-efficacy on Home Care Workers' Contribution to Self-care in Heart Failure: A Structural Equation Modeling Analysis
Background Home care workers (HCWs) are increasingly caring for patients with heart failure (HF). Previous studies have shown that they contribute to HF patients' care, but how their preparedness and their relationship with patients (mutuality) influence caregiving is unknown, as is the role of HCWs' self-efficacy. Objective Guided by the Situation-Specific Theory of Caregiver Contribution to HF Self-Care, we investigated the influence of HCWs' preparedness and mutuality on HCWs' contribution to HF self-care and the mediating effect of HCWs' self-efficacy in the process. Methods We conducted a cross-sectional survey of HCWs who cared for patients with HF. The survey included the Caregiver Preparedness Scale, Mutuality Scale, Caregiver Contribution to Self-Care of HF Index, and Caregiver Self-Efficacy in Contributing to Self-Care Scale. We performed structural equation modeling and a mediation analysis. Results A total of 317 HCWs employed by 22 unique home care agencies across New York, NY, completed the survey. They had a median age of 50 years, 94% were women, and 44% were non-Hispanic Black. Results demonstrated that mutuality had a direct influence on HCWs' contribution to self-care, and that preparedness influenced their contribution to self-care, but only through the mediation of self-efficacy. Conclusion Home care workers' preparedness, mutuality, and self-efficacy have important roles in influencing their contribution to HF self-care. As a workforce increasingly involved in the care of patients with HF, knowing the mechanisms underpinning HCWs' contribution to self-care may illuminate future interventions aimed at improving their contributions and HF patient outcomes. This research was made possible, in part, through a generous donation by Douglas Wigdor, Esq. Dr Sterling and this research are supported by the National Heart, Lung, and Blood Institute (K23HL150160). 
Dr Riegel is supported by the National Institute of Nursing Research of the National Institutes of Health (R01NR018196). REDCap at Weill Cornell Medicine is supported by Clinical and Translational Science Center grant UL1 TR002384. The authors have no conflicts of interest to disclose. Correspondence Madeline R. Sterling, MD, MPH, MS, Weill Cornell Medicine, 420 E 70th St, Box 331, New York, NY 10021 (mrs9012@med.cornell.edu). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved
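The key mediation finding above (preparedness acting on contribution only through self-efficacy) corresponds to an indirect effect in the product-of-coefficients sense. The authors used structural equation modeling; the sketch below illustrates the same idea with ordinary least squares on synthetic data, so every number here is invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# synthetic scores: preparedness acts on contribution only via self-efficacy
preparedness = rng.normal(size=n)
self_efficacy = 0.6 * preparedness + rng.normal(size=n)
contribution = 0.5 * self_efficacy + rng.normal(size=n)

def ols(y, *predictors):
    """Least-squares slopes for y ~ intercept + predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a_path, = ols(self_efficacy, preparedness)            # preparedness -> mediator
b_path, c_prime = ols(contribution, self_efficacy, preparedness)
indirect = a_path * b_path   # mediated effect, ~0.6 * 0.5 = 0.3 by construction
```

Full mediation shows up as a sizeable `indirect` effect alongside a direct path `c_prime` near zero, which mirrors the pattern the study reports for preparedness; real SEM analyses additionally model measurement error and test the indirect effect (e.g., with bootstrapped confidence intervals).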

Health-Related Quality of Life in Patients With a Left Ventricular Assist Device (QOLVAD) Questionnaire: Initial Psychometrics of a New Instrument
Background Patients with a left ventricular assist device are a unique and growing population who deserve their own valid, reliable instrument for health-related quality of life. Objective We developed and tested the Health-Related Quality of Life with a Left Ventricular Assist Device (QOLVAD) questionnaire. Methods In a prospective, descriptive study, patients from 7 sites completed the QOLVAD and comparator questionnaires. Construct validity was tested using confirmatory factor analysis. Convergent validity was tested using correlations of QOLVAD scores to well-established measures of subjective health status, depression, anxiety, and meaning/faith. Reliability and test-retest reliability were quantified. Results Patients (n = 213) were 58.7 ± 13.9 years old; 81.0% were male, 73.7% were White, and 48.0% had bridge to transplant. Questionnaires were completed at a median time of 44 weeks post ventricular assist device. The 5 QOLVAD domains had acceptable construct validity (root mean square error of approximation = 0.064, comparative and Tucker-Lewis fit indices > 0.90, weighted root mean square residual = 0.95). The total score and domain-specific scores were significantly correlated with the instruments to which they were compared. Internal consistency reliability was acceptable for all subscales (α = .79–.83) except the cognitive domain (α = .66). Unidimensional reliability for the total score was acceptable (α = .93), as was factor determinacy for multidimensional reliability (0.95). Total test-retest reliability was 0.875 (P < .001). Conclusion Our analysis provided initial support for validity and reliability of the QOLVAD for total score, physical, emotional, social, and meaning/spiritual domains. The QOLVAD has potential in research and clinical settings to guide decision making and referrals; further studies are needed. 
This study was supported by grants from the following sources: Abbott-Northwestern Hospital Foundation, Minneapolis Heart Institute Foundation, and Minnesota Nurses Association Foundation. The funding sources had no role in the collection, analysis, or interpretation of the data. Additional support for K.E.S. was received from Bethel University (sabbatical). P.E. received honoraria from or is a consultant for Abbott Laboratories and Medtronic (paid to the institution, not the individual). S.M.J. receives speaking honoraria from Abbott (modest) and consulting fees from Medtronic (minimal). S.H. is a consultant for Abbott Laboratories. E.Y.B. received grants from Impulse Dynamics, personal fees from American Regent, and grants from Medtronic Inc. B.H. received a grant (shared with K.E.S.) from Minnesota Nurses Association Foundation (completed). All were paid to Minneapolis Heart Institute Foundation to be managed (not paid to the individual). J.A.C. received personal fees from Abbott and Medtronic and is a member of the scientific advisory committee for Medtronic and Procyrion, all outside the submitted work. K.E.S. and B.H. are coauthors of the QOLVAD questionnaire and hold it as intellectual property. B.H. does contract work with Lippincott/WK as a Nurse Educator Consultant (JCVN is owned/published by Lippincott/Wolters Kluwer, which is the owner of a Lippincott Clinical Experiences product from which B.H. receives royalties). The QOLVAD questionnaire has been granted without charge to others who request permission in advance for nonprofit use. Correspondence Kristin E. Sandau, PhD, RN, FAHA, FAAN, Bethel University, 3900 Bethel Drive, St Paul, MN 55112 (k-sandau@bethel.edu). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved

Health-Related Quality of Life Declines Over 3 Years for Congenital Heart Disease Survivors
Background Because of medical advancements, many congenital heart disease (CHD) survivors are relatively symptom-free until adulthood, at which time complications may occur. Worsening health status likely drives a change in patient-reported outcomes, such as health-related quality of life (HRQoL), although change in HRQoL has not been investigated among adolescent and young adult CHD survivors. Objective The aims of the current mixed cross-sectional and longitudinal study were to (1) examine changes in HRQoL over 3 years and (2) identify any demographic (age, sex, estimated family income, and distance from medical center) and medical predictors (functional status and number of cardiac-related medications) of that change. Methods Baseline and 3-year follow-up data were obtained via an online survey of 172 CHD survivors (15–39 years old at baseline; 25% simple, 45% moderate, 30% complex) recruited from a pediatric hospital and an adult hospital. Medical predictors were abstracted from electronic medical records. Results After controlling for New York Heart Association functional class, mixed-effects models identified significant declines in all subscales of the Research and Development Corporation 36-Item Health Survey 1.0 across the 3-year timeframe. A lower estimated family income (≤$35 000) predicted more decline in physical functioning (b = 0.5, 95% confidence interval, 0.2–0.8; P = .001) and emotional functioning (b = 0.3, 95% confidence interval, 0.1–0.5; P = .017). No other significant demographic or medical predictors were identified. Conclusions Study findings highlight the importance of tracking patient-reported outcomes over time, suggesting that medical staff should discuss HRQoL with CHD survivors during late adolescence and early adulthood before decline. The authors have no conflicts of interest to disclose. This work was supported by the National Institutes of Health (grant number T32HL-098039) to J.L. 
Jackson, The Heart Center at Nationwide Children's Hospital, and the Clinical and Translational Science Award (grant number UL1TR001070) to The Ohio State University and Nationwide Children's Hospital. All authors take responsibility for all aspects of the reliability and freedom from bias of the data presented and their discussed interpretation. Correspondence Jamie L. Jackson, PhD, 700 Children's Drive, Columbus, OH 43205 (Jamie.jackson2@nationwidechildrens.org). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved

Psychometric Properties and Factorial Structure of Vietnamese Version of the Hypertension Self-Care Profile Behavior Scale
Background The Hypertension Self-Care Profile Behavior (HTN-SCPB) scale is a self-report instrument with which a patient's self-care behavior can be assessed. However, its psychometric properties for adult patients with hypertension in Vietnam require clarification. Objective The aim of this study was to translate the HTN-SCPB scale into Vietnamese and to assess its psychometric properties. Methods The study included 220 adult patients with hypertension. To evaluate test-retest reliability, 133 participants were tested twice with a 3-week interval between tests. For construct validity, exploratory factor analysis was used to assess factor structure, and confirmatory factor analysis was used to evaluate the structural model fit of the scale. Results Reliability was confirmed by internal consistency (Cronbach α = 0.79) and test-retest reliability (intraclass correlation coefficient, 0.88). The Kaiser-Meyer-Olkin value was 0.75, and Bartlett's test of sphericity was significant (P < .001) and adequate for exploratory factor analysis. A 5-factor structure was obtained, and the factors were named as follows: "advanced self-management skills," "adverse health behaviors," "medication adherence," "diet-related knowledge regarding hypertension," and "information skills." Confirmatory factor analysis revealed that the model fit indices were acceptable (root-mean-square error of approximation, 0.07) or slightly less than the good fit values (comparative fit index, 0.85; incremental fit index, 0.85; goodness-of-fit index, 0.88; adjusted goodness-of-fit index, 0.84; and Tucker-Lewis index, 0.82). Conclusions The Vietnamese HTN-SCPB scale had satisfactory validity and reliability for assessing self-care behaviors in patients with hypertension in Vietnam. The authors have no funding or conflicts of interest to disclose. Correspondence Pei-Shan Tsai, PhD, RN, 250 Wuxing St, Taipei, 110 Taiwan, ROC (ptsai@tmu.edu.tw). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved
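The internal-consistency figure reported above (Cronbach α = 0.79) comes from the standard formula α = k/(k−1) · (1 − Σσᵢ²/σₜ²), where k is the number of items, σᵢ² are the item variances, and σₜ² is the variance of the total scores. A minimal sketch of the computation (the item scores below are invented for illustration, not study data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-level responses.

    items: list of equal-length lists, one list per questionnaire item,
    each holding one score per respondent.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    n = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(pvar(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / pvar(totals))

# Illustrative 3-item, 5-respondent example (invented data):
scores = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
]
print(round(cronbach_alpha(scores), 2))  # → 0.87
```

Real psychometric work would use dedicated packages, but the formula itself is this simple; test-retest reliability (the intraclass correlation reported above) requires the two administrations' paired scores instead.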

Home-Based Versus Outpatient-Based Cardiac Rehabilitation Post–Coronary Artery Bypass Graft Surgery: A Randomized Controlled Trial
Background The prevalence of coronary heart disease continues to increase in the Kingdom of Saudi Arabia (KSA). Despite advances in cardiac surgery, there are no established outpatient cardiac rehabilitation programs in the KSA. Objective The aim of this study was to investigate the effectiveness of home-based cardiac rehabilitation compared with outpatient-based cardiac rehabilitation and usual care for patients who are post–coronary artery bypass graft surgery. Method This 3-arm, single-blind, randomized controlled trial was carried out at the King Faisal Specialist Hospital, Riyadh, KSA. A total of 82 patients post–coronary artery bypass graft surgery were randomized, and 73 patients completed the study. Completing patients were allocated to home-based cardiac rehabilitation (n = 24), outpatient-based cardiac rehabilitation (n = 25), or usual care (control group) (n = 24). Participants in the intervention groups completed an individualized exercise program for 2 hours, 3 times a week for 8 weeks. The control group followed usual care (no intervention). The incremental shuttle walk test (ISWT), metabolic equivalence task, Short Form-36, and Hospital Anxiety and Depression Scale (HADS) were measured at baseline, postintervention, and after a 4-week follow-up period. Results Postintervention, there was an increase in mean ISWT score from baseline in both the home-based and outpatient-based cardiac rehabilitation groups (66 [0.58] m and 71 [9.19] m, respectively). No difference was observed in the control group. At the 4-week follow-up, both intervention groups showed statistically significant improvements in all outcome measures (ISWT, metabolic equivalence tasks, HADS-A, HADS-D, and Short Form-36) compared with baseline (all P < .001). The home-based cardiac rehabilitation group showed statistically significant continued improvement compared with the outpatient-based cardiac rehabilitation group.
The control group did not show any significant changes across time in outcome measures. Conclusion Home-based cardiac rehabilitation is as effective as outpatient-based cardiac rehabilitation and appears to be more effective at maintaining improvements following the end of the intervention. Clinical messages: An 8-week home-based cardiac rehabilitation program is as effective as an outpatient-based program for improving functional capacity, physiological and psychological well-being, and quality of life for patients with coronary heart disease after coronary artery bypass graft surgery. The authors have no funding or conflicts of interest to disclose. Correspondence Mohammed A. Takroni, PhD, Department of Physiotherapy, King Faisal Specialist Hospital & Research Centre, PO Box 3354, Riyadh, Kingdom of Saudi Arabia 11211 (mtakroni@kfshrc.edu.sa). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved

Persistent Heart Failure Symptoms at Hospital Discharge Predicts 30-Day Clinical Events
Background The relationship between heart failure (HF) symptoms at hospital discharge and 30-day clinical events is unknown. Variability in HF symptom assessment may affect the ability to predict readmission risk. Objective The aim of this study was to describe HF symptom profiles and burden at hospital discharge. A secondary aim was to examine the relationship between symptom burden at discharge and 30-day clinical events. Methods An exploratory descriptive design was used. Patients with HF (n = 186) were enrolled 24 to 48 hours before hospital discharge. The HF Somatic Perception Scale quantified 18 HF physical signs and symptoms. Scores were divided into tertiles (0–10, 11–19, and 20 and higher). The Patient Health Questionnaire-9 quantified depressive symptoms. Self-assessed health, comorbid illnesses, and 30-day clinical events were documented. Chi-square and logistic regression were used to examine clinical events. Results The sample (n = 186) was predominantly White (87.6%), male (59.1%), older (mean [SD] age, 74.2 [12.5] years), and symptomatic (92.5%) at discharge. Heart Failure Somatic Perception Scale scores ranged from 0 to 53, with a mean (SD) of 13.7 (10.1). Symptoms reported most frequently were fatigue (67%), nocturia (62%), need to rest (53%), and inability to do usual activities due to shortness of breath (52%). The 30-day event rate was 28%, with significant differences between Heart Failure Somatic Perception Scale tertiles (9.4% vs 37.7% in the second and third tertiles, respectively; χ²(2, N = 186) = 16.73, P < .001). Heart Failure Somatic Perception Scale tertile 2 or 3 (odds ratio [OR], 5.7; P = .003; and OR, 4.3; P = .021), self-assessed health (OR, 2.6; P = .029), and being in a relationship predicted clinical events. Conclusions Heart failure symptom burden at discharge predicted 30-day clinical events. Comprehensive symptom assessment is important when determining readmission risk. The authors have no funding or conflicts of interest to disclose.
Correspondence Laura E. Senecal, DNP, RN, AGACNP-BC, Saint Francis Hospital, 100 Port Washington Blvd, Ste 105, Roslyn, NY 11576 (laurasenecal.dnp@gmail.com). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved
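The score banding used in the study above is a fixed three-way split of the HF Somatic Perception Scale total. A minimal sketch of that banding (function name and the range check are illustrative assumptions; the cut points 0–10, 11–19, and 20+ are from the abstract):

```python
def hfsps_tertile(total_score):
    """Band an HF Somatic Perception Scale total (0-53) into the three
    tertiles reported in the study: 1 (0-10), 2 (11-19), 3 (20 and higher)."""
    if not 0 <= total_score <= 53:
        raise ValueError("HFSPS total must be between 0 and 53")
    if total_score <= 10:
        return 1
    if total_score <= 19:
        return 2
    return 3

# The sample mean of 13.7 falls in the second tertile:
print(hfsps_tertile(14))  # → 2
```

Note these are score-band "tertiles" by convention of the study, not data-driven 33rd/67th percentile splits; a percentile-based split would need the observed score distribution.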

Depressive Symptom Trajectories in Family Caregivers of Stroke Survivors During First Year of Caregiving
Background The purpose of this study was to identify patterns of depressive symptom trajectory and examine the associations of the symptom trajectory with caregiving burden, family function, social support, and perceived health status in caregivers of stroke survivors during the first year of caregiving after discharge from a rehabilitation center. Methods Caregivers of stroke survivors completed a survey of depressive symptoms, caregiving burden, family function, perceived availability of social support, and perceived health status at postdischarge and at 1 year. Patterns of depressive symptom trajectory (ie, symptom-free, symptom relieved, symptom developed, and persistent symptom groups) were identified by grouping depressive symptoms based on the 2 assessments using the Center for Epidemiologic Studies-Depression scale. Repeated-measures analysis of variance and multinomial logistic regression were used to examine the associations. Results Of the 102 caregivers, 57.8% were symptom-free, 20.6% experienced persistent depressive symptoms, 11.8% had symptoms that were relieved, and 9.8% developed symptoms. There were significant changes in family function (Wilks λ = 0.914, P = .038) and perceived health status (Wilks λ = 0.914, P = .033) among the groups during the first year of caregiving. The persistent symptom group reported the highest level of burden and the lowest levels of family function and perceived availability of social support at both assessment times. Compared with symptom-free caregivers, caregivers with persistent depressive symptoms were 7 times more likely to have fair/poor health rather than excellent/very good health at 1 year (odds ratio, 7.149; P = .012). Conclusion Caregivers with persistent depressive symptoms are the most vulnerable to negative psychosocial outcomes and poor perceived health status during the first year of caregiving after the stroke survivor's discharge.
This study was supported by NIH R01NR02416 (King, PI) and the Chung-Ang University research grants (2018). The authors have no conflicts of interest to disclose. Correspondence Suk Jeong Lee, PhD, RN, Red Cross College of Nursing, Chung-Ang University, Seoul 06974, South Korea (lsj1109@cau.ac.kr). Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved


Medicine by Alexandros G. Sfakianakis, Anapafseos 5, Agios Nikolaos 72100, Crete, Greece (alsfakia@gmail.com)

Ear and Hearing

The Effect of Hearing Loss and Hearing Device Fitting on Fatigue in Adults: A Systematic Review
Objectives: To conduct a systematic review to address two research questions: (Q1) Does hearing loss have an effect on fatigue? (Q2) Does hearing device fitting have an effect on fatigue? It was hypothesized that hearing loss would increase fatigue (H1) and that hearing device fitting would reduce fatigue (H2). Design: Systematic searches were undertaken of five bibliographic databases: Embase, MEDLINE, Web of Science, PsycINFO, and the Cochrane Library. English-language peer-reviewed research articles were included from inception until present. Inclusion and exclusion criteria were formulated using the Population, Intervention, Comparison, Outcomes and Study design (PICOS) strategy. Results: Initial searches for both research questions produced 1,227 unique articles after removal of duplicates. After screening, the full text of 61 studies was checked, resulting in 12 articles with content relevant to the research questions. The reference lists of these studies were examined, and a final updated search was conducted on October 16, 2019. This resulted in a final total of 20 studies being selected for the review. For each study, the information relating to the PICOS criteria and the statistical outcomes relating to both questions (Q1 and Q2) were extracted. Evidence relating to Q1 was provided by 15 studies, reporting 24 findings. Evidence relating to Q2 was provided by six studies, reporting eight findings. One study provided evidence for both. Using the Grading of Recommendations Assessment, Development and Evaluation guidelines, the quality of evidence on both research questions was deemed to be "very low." It was impossible to perform a meta-analysis of the results due to a lack of homogeneity.
Conclusions: As the studies were too heterogeneous to support a meta-analysis, it was not possible to provide statistically significant evidence to support the hypotheses that hearing loss results in increased fatigue (H1) or that hearing device fitting results in decreased fatigue (H2). Despite this, the comparative volume of positive results and the lack of any negative findings are promising for future research (particularly in respect of Q1). There was a very small number of studies deemed eligible for the review, and there was large variability between studies in terms of population, and quantification of hearing loss and fatigue. The review highlights the need for consistency when measuring fatigue, particularly when using self-report questionnaires, where the majority of the current evidence was generated.

Dorsomedial Prefrontal Cortex Repetitive Transcranial Magnetic Stimulation for Tinnitus: Promising Results of a Blinded, Randomized, Sham-Controlled Study
Objectives: Tinnitus is the perception of sound in the ears or head without a corresponding external stimulus. Despite the large literature on tinnitus treatment, there are still no established evidence-based treatments for curing tinnitus or effectively reducing its intensity. Sham-controlled studies have revealed beneficial effects of repetitive transcranial magnetic stimulation (rTMS). Still, results show moderate, temporary improvement and high individual variability. The subcallosal area (ventral and dorsomedial prefrontal and anterior cingulate cortices) has been implicated in tinnitus pathophysiology. Our objective was to evaluate the use of bilateral, high-frequency, dorsomedial prefrontal cortex (DMPFC) rTMS in the treatment of chronic subjective tinnitus. Design: Randomized, placebo-controlled, single-blinded clinical trial. Twenty sessions of bilateral 10 Hz rTMS at 120% of the resting motor threshold of the extensor hallucis longus were applied over the DMPFC. Fourteen patients underwent sham rTMS and 15 received active stimulation. The Tinnitus Handicap Inventory (THI), a visual analog scale, and tinnitus loudness matching were obtained at baseline and on follow-up visits. The impact of the intervention on outcome measures was evaluated using a mixed-effects restricted maximum likelihood regression model for longitudinal data. Results: A difference of 11.53 points in the THI score was found, favoring the intervention group (p = 0.05). The difference in tinnitus loudness matching was 4.46 dB, also favoring the intervention group (p = 0.09). Conclusions: Tinnitus treatment with high-frequency, bilateral DMPFC rTMS was effective in reducing tinnitus severity as measured by the THI and matched tinnitus loudness when compared to sham stimulation.

The Influence of Forced Social Isolation on the Auditory Ecology and Psychosocial Functions of Listeners With Cochlear Implants During COVID-19 Mitigation Efforts
Objectives: To examine the impact of social distancing on communication and psychosocial variables among individuals with hearing impairment during the COVID-19 pandemic. Our concern was that patients who already found themselves socially isolated as a result of their hearing loss (Wie et al. 2010) would be more susceptible to changes in their communication habits, resulting in further social isolation, anxiety, and depression. We wanted to better understand how forced social isolation (as part of COVID-19 mitigation) affected a group of individuals with hearing impairment from an auditory ecology and psychosocial perspective. We hypothesized that listening environments would differ as a result of social isolation when comparing subjects' responses regarding activities and participation before and during the COVID-19 pandemic, and that this change would lead to an increase in experienced and perceived social isolation, anxiety, and depression. Design: A total of 48 adults with at least 12 months of cochlear implant (CI) experience reported their listening contexts and experiences pre-COVID and during-COVID using Ecological Momentary Assessment (EMA; a methodology collecting a respondent's self-reports in their natural environments) through a smartphone-based app, and six paper-and-pencil questionnaires. The smartphone app and paper-and-pencil questionnaires addressed topics related to the listening environment, social isolation, depression, anxiety, lifestyle and demand, loneliness, and satisfaction with amplification. Data from these two time points were compared to better understand the effects of social distancing on the CI recipients' communication abilities. Results: EMA demonstrated that during-COVID, CI recipients were more likely to stay home or be outdoors, and less likely to be indoors outside of their home, relative to the pre-COVID condition.
Social distancing also had a significant effect on the overall signal-to-noise ratios of the environments, indicating that the listening environments had better signal-to-noise ratios. CI recipients also reported better speech understanding, less listening effort, less activity limitation due to hearing loss, less social isolation due to hearing loss, and less anxiety due to hearing loss. Retrospective questionnaires indicated that social distancing had a significant effect on social network size, participants' personal image of themselves, and overall loneliness. Conclusions: Overall, EMA provided us with a glimpse of the effect that forced social isolation has had on the listening environments and psychosocial perspectives of a select number of CI listeners. CI participants in this study reported that they were spending more time at home in quieter environments during-COVID. Contrary to our hypothesis, CI recipients overall felt less socially isolated and reported less anxiety resulting from their hearing difficulties during-COVID in comparison to pre-COVID. This, perhaps, implies that having a more controlled environment with fewer speakers provided a more relaxing listening experience.

Peripheral Auditory Involvement in Childhood Listening Difficulty
Objectives: This study tested the hypothesis that undetected peripheral hearing impairment occurs in children with idiopathic listening difficulties (LiDs), as reported by caregivers using the validated Evaluation of Children's Listening and Processing Skills (ECLiPS) questionnaire, compared with children with typically developed (TD) listening abilities. Design: Children with LiD aged 6 to 14 years (n = 60, mean age = 9.9 yr) and 54 age-matched TD children were recruited from audiology clinical records and from IRB-approved advertisements at hospital locations and in the local and regional areas. Both groups completed standard and extended high-frequency (EHF) pure-tone audiometry, wideband absorbance tympanometry and middle ear muscle reflexes, and distortion product and chirp transient evoked otoacoustic emissions. Univariate and multivariate mixed models and multiple regression analysis were used to examine group differences and continuous performance, as well as the influence of demographic factors and pressure equalization (PE) tube history. Results: There were no significant differences between the LiD and TD groups for any of the auditory measures tested. However, analyses across all children showed that EHF hearing thresholds, wideband tympanometry, contralateral middle ear muscle reflexes, and distortion product and transient-evoked otoacoustic emissions were related to a history of PE tube surgery. The physiologic measures were also associated with EHF hearing loss, secondary to PE tube history. Conclusions: Overall, the results of this study in a sample of children with validated LiD compared with a TD group matched for age and sex showed no significant differences in peripheral function using highly sensitive auditory measures. Histories of PE tube surgery were significantly related to EHF hearing and to a range of physiologic measures in the combined sample.

Better Hearing in Norway: A Comparison of Two HUNT Cohorts 20 Years Apart
Objective: To obtain updated robust data on the age-specific prevalence of hearing loss in Norway and determine whether more recent birth cohorts have better hearing compared with earlier birth cohorts. Design: Cross-sectional analyses of Norwegian representative demographic and audiometric data from the Nord-Trøndelag Health Study (HUNT)—HUNT2 Hearing (1996–1998) and HUNT4 Hearing (2017–2019), with the following distribution: HUNT2 Hearing (N=50,277, 53% women, aged 20 to 101 years, mean = 50.1, standard deviation = 16.9); HUNT4 Hearing (N=28,339, 56% women, aged 19 to 100 years, mean = 53.2, standard deviation = 16.9). Pure-tone hearing thresholds were estimated using linear and quantile regressions with age and cohort as explanatory variables. Prevalences were estimated using logistic regression models for different severities of hearing loss, defined by thresholds averaged over 0.5, 1, 2, and 4 kHz in the better ear (BE PTA4). We also estimated prevalences at the population level of Norway in 1997 and 2018. Results: Disabling hearing loss (BE PTA4 ≥ 35 dB) was less prevalent in the more recently born cohort at all ages in both men and women (p < 0.0001), with the largest absolute decrease at age 75 in men and at age 85 in women. The age- and sex-adjusted prevalence of disabling hearing loss was 7.7% (95% confidence interval [CI] 7.5 to 7.9) and 5.3% (95% CI 5.0 to 5.5) in HUNT2 and HUNT4, respectively. Hearing thresholds were better in the more recently born cohorts at all frequencies for both men and women (p < 0.0001), with the largest improvement at high frequencies in more recently born 60- to 70-year-old men (10 to 11 dB at 3 to 4 kHz), and at low frequencies among the oldest. Conclusions: The age- and sex-specific prevalence of hearing impairment has decreased in Norway from 1996–1998 to 2017–2019.
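The hearing-loss metric used above (BE PTA4) is the pure-tone average over 0.5, 1, 2, and 4 kHz in the better (lower-threshold) ear, with disabling loss defined as BE PTA4 ≥ 35 dB. A minimal sketch of the computation (function names and the example thresholds are invented for illustration):

```python
FREQS_KHZ = (0.5, 1, 2, 4)  # the four frequencies averaged for PTA4

def be_pta4(right_db, left_db):
    """Better-ear four-frequency pure-tone average.

    right_db/left_db: dicts mapping frequency in kHz -> threshold (dB HL).
    Averages the PTA4 frequencies per ear and returns the better
    (i.e., lower) ear's average.
    """
    def pta(ear):
        return sum(ear[f] for f in FREQS_KHZ) / len(FREQS_KHZ)
    return min(pta(right_db), pta(left_db))

def disabling_hearing_loss(right_db, left_db, cutoff=35.0):
    """Study definition: disabling loss when BE PTA4 >= 35 dB HL."""
    return be_pta4(right_db, left_db) >= cutoff

# Invented example audiogram (thresholds in dB HL):
right = {0.5: 30, 1: 35, 2: 40, 4: 55}  # PTA4 = 40.0
left = {0.5: 25, 1: 30, 2: 35, 4: 50}   # PTA4 = 35.0 (better ear)
print(be_pta4(right, left))              # → 35.0
print(disabling_hearing_loss(right, left))  # → True
```

Note that the 35 dB cutoff and the better-ear convention are those stated in this abstract; other studies use different frequency sets and cutoffs, so the definition should always be checked before comparing prevalences.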

Search for Electrophysiological Indices of Hidden Hearing Loss in Humans: Click Auditory Brainstem Response Across Sound Levels and in Background Noise
Objectives: Recent studies in animals indicate that even moderate levels of noise exposure can damage synaptic ribbons between the inner hair cells and auditory nerve fibers without affecting audiometric thresholds, giving rise to the term "hidden hearing loss" (HHL). Despite evidence across several animal species, there is little consistent evidence for HHL in humans. The aim of the study was to evaluate potential electrophysiological changes specific to individuals at risk for HHL. Design: Participants forming the high-risk experimental group were 28 young normal-hearing adults who had participated in marching band for at least 5 years. Twenty-eight age-matched normal-hearing adults who were not part of the marching band and had little or no history of recreational or occupational exposure to loud sounds formed the low-risk control group. Measurements included pure-tone audiometry at conventional and high frequencies, distortion product otoacoustic emissions, and electrophysiological measures of auditory nerve and brainstem function as reflected in the click-evoked auditory brainstem response (ABR). In experiment 1, ABRs were recorded in a quiet background across stimulus levels (30–90 dB nHL) presented in 10 dB steps. In experiment 2, the ABR was elicited by a 70 dB nHL click stimulus presented in a quiet background and in the presence of simultaneous ipsilateral continuous broadband noise presented at 50, 60, and 70 dB SPL using an insert earphone (Etymotic, ER2). Results: There were no differences between the low- and high-risk groups in audiometric thresholds or distortion product otoacoustic emission amplitude. Experiment 1 demonstrated smaller wave I amplitudes at moderate and high sound levels for the high-risk compared with the low-risk group, with similar wave III and wave V amplitudes.
The V/I amplitude ratio was enhanced, particularly at a moderate sound level (60 dB nHL), suggesting central compensation for reduced input from the periphery in the high-risk group. The results of experiment 2 show that the decrease in wave I amplitude with increasing background noise level was relatively smaller for the high-risk compared with the low-risk group. However, wave V amplitude reduction was essentially similar for both groups. These results suggest that masking-induced wave I amplitude reduction is smaller in individuals at high risk for cochlear synaptopathy. Unlike previous studies, we did not observe a difference in the noise-induced wave V latency shift between the low- and high-risk groups. Conclusions: Results of experiment 1 are consistent with findings in both animal studies (which suggest cochlear synaptopathy involving selective damage of low- and medium-spontaneous-rate fibers) and several human studies that show changes in a range of ABR metrics suggesting the presence of cochlear synaptopathy. However, without postmortem examination of human temporal bones (the gold standard for identifying synaptopathy) across different noise exposure backgrounds, no direct inferences can be drawn about the presence or extent of cochlear synaptopathy in the high-risk group with its history of high sound overexposure. Results of experiment 2 demonstrate that, to the extent that response amplitude reflects both the number of neural elements responding and the neural synchrony of the responding elements, the relatively smaller change in response amplitude for the high-risk group would suggest a reduced susceptibility to masking. One plausible mechanism would be that suppressive effects arising at moderate to high levels differ between these two groups, particularly at moderate levels of the masking noise.
Altogether, a larger-scale dataset with different noise exposure backgrounds, longitudinal measurements (e.g., tracking changes due to recreational overexposure in middle- and high-school students enrolled in marching band), and an array of behavioral and electrophysiological tests are needed to understand the complex pathogenesis of sound-overexposure damage in normal-hearing individuals.

Selection Criteria for Cochlear Implantation in the United Kingdom and Flanders: Toward a Less Restrictive Standard
Objectives: The impact of the newly introduced cochlear implantation criteria of the United Kingdom and Flanders (the Dutch-speaking part of Belgium) was examined in the patient population of a tertiary referral center in the Netherlands. We compared the patients who would be included or excluded under the new versus the old criteria in relation to the actual improvement in speech understanding after implantation in our center. We also performed a sensitivity analysis to examine the effectiveness of the different preoperative assessment approaches used in the United Kingdom and Flanders. Design: The selection criteria were based on preoperative pure-tone audiometry at 0.5, 1, 2, and 4 kHz and a speech perception test (SPT) with and without best-aided hearing aids. Postoperatively, the same SPT was conducted to assess the benefit in speech understanding. Results: The newly introduced criteria in Flanders and the United Kingdom were less restrictive, resulting in a greater percentage of patients implanted with a CI (an increase of 30%) and an increase in sensitivity of 31%. The preoperative best-aided SPT, used by both countries, had the highest diagnostic ability to indicate a postoperative improvement in speech understanding. We observed that patient selection was previously dominated by the pure-tone audiometry criteria in both countries, whereas speech understanding became more important in the new criteria. Among patients excluded by the new criteria, seven of eight (the United Kingdom and Flanders) did exhibit improved postoperative speech understanding. Conclusions: The new selection criteria of the United Kingdom and Flanders led to increased numbers of postlingually deafened adults benefitting from CI. The new British and Flemish criteria depend on the best-aided SPT, which had the highest diagnostic ability. Notably, the new criteria still led to the rejection of candidates who would be expected to gain considerably in speech understanding after implantation.

Vestibular Function in Children With a Congenital Cytomegalovirus Infection: 3 Years of Follow-Up
Objectives: Congenital cytomegalovirus (cCMV) infection is the most common nongenetic cause of sensorineural hearing loss in children. Due to the close anatomical relationship between the auditory and the vestibular sensory organs, cCMV can also be an important cause of vestibular loss. However, the prevalence and nature of cCMV-induced vestibular impairment is still underexplored. The aim of this study was to investigate the occurrence and characteristics of vestibular loss in a large group of cCMV-infected children, representative of the overall cCMV population. Design: Ninety-three children (41 boys, 52 girls) with a confirmed diagnosis of cCMV were enrolled in this prospective longitudinal study. They were born at the Ghent University Hospital or referred from another hospital for multidisciplinary follow-up in the context of cCMV. The test protocol consisted of regular vestibular follow-up around the ages of 6 months, 1 year, 2 years, and 3 years with the video Head Impulse Test, the rotatory test, and the cervical Vestibular Evoked Myogenic Potential test. Results: On average, the 93 patients (52 asymptomatic, 41 symptomatic) were followed for 10.2 months (SD: 10.1 mo) and had 2.2 examinations (SD: 1.1). Seventeen (18%) patients had sensorineural hearing loss (7 unilateral, 10 bilateral). Vestibular loss was detected in 13 (14%) patients (7 unilateral, 6 bilateral). There was a significant association between the occurrence of hearing loss and the presence of vestibular loss (p < 0.001), with 59% (10/17) vestibular losses in the group of hearing-impaired children compared to 4% (3/76) in the group of normal-hearing subjects. In the majority of the cases with a vestibular dysfunction (85%, 11/13), both the semicircular canal system and the otolith system were affected. The remaining subjects (15%, 2/13) had an isolated semicircular canal dysfunction. Sixty-one patients already had at least one follow-up examination.
Deterioration of the vestibular function was detected in 6 of them (10%, 6/61). Conclusions: cCMV can impair not only the auditory but also the vestibular function. Similar to the hearing loss, vestibular loss in cCMV can be highly variable. It can be unilateral or bilateral, limited or extensive, stable or progressive, and early or delayed in onset. As the vestibular function can deteriorate over time and even normal-hearing subjects can be affected, vestibular evaluation should be part of the standard otolaryngology follow-up in all children with cCMV.

Human Frequency Following Responses to Filtered Speech
Objectives: There is increasing interest in using the frequency following response (FFR) to describe the effects of varying different aspects of hearing aid signal processing on brainstem neural representation of speech. To this end, recent studies have examined the effects of filtering on brainstem neural representation of the speech fundamental frequency (f0) in listeners with normal hearing sensitivity by measuring FFRs to low- and high-pass filtered signals. However, the stimuli used in these studies do not reflect the entire range of typical cutoff frequencies used in frequency-specific gain adjustments during hearing aid fitting. Further, there has been limited discussion on the effect of filtering on brainstem neural representation of formant-related harmonics. Here, the effects of filtering on brainstem neural representation of the speech fundamental frequency (f0) and harmonics related to the first formant frequency (F1) were assessed by recording envelope and spectral FFRs to a vowel low-, high-, and band-pass filtered at cutoff frequencies ranging from 0.125 to 8 kHz. Design: FFRs were measured to a synthetically generated vowel stimulus /u/ presented in a full bandwidth condition and in low-pass (experiment 1), high-pass (experiment 2), and band-pass (experiment 3) filtered conditions. In experiment 1, FFRs were measured to the vowel presented in a full bandwidth condition as well as 11 low-pass filtered conditions (low-pass cutoff frequencies: 0.125, 0.25, 0.5, 0.75, 1, 1.5, 2, 3, 4, 6, and 8 kHz) in 19 adult listeners with normal hearing sensitivity. In experiment 2, FFRs were measured to the same vowel presented in a full bandwidth condition as well as 10 high-pass filtered conditions (high-pass cutoff frequencies: 0.125, 0.25, 0.5, 0.75, 1, 1.5, 2, 3, 4, and 6 kHz) in 7 adult listeners with normal hearing sensitivity.
In experiment 3, in addition to the full bandwidth condition, FFRs were measured to vowel /u/ low-pass filtered at 2 kHz, band-pass filtered between 2–4 kHz and 4–6 kHz in 10 adult listeners with normal hearing sensitivity. A Fast Fourier Transform analysis was conducted to measure the strength of f0 and the F1-related harmonic relative to the noise floor in the brainstem neural responses obtained to the full bandwidth and filtered stimulus conditions. Results: Brainstem neural representation of f0 was reduced when the low-pass filter cutoff frequency was between 0.25 and 0.5 kHz; no differences in f0 strength were noted between conditions when the low-pass filter cutoff condition was at or greater than 0.75 kHz. While envelope FFR f0 strength was reduced when the stimulus was high-pass filtered at 6 kHz, there was no effect of high-pass filtering on brainstem neural representation of f0 when the high-pass filter cutoff frequency ranged from 0.125 to 4 kHz. There was a weakly significant global effect of band-pass filtering on brainstem neural phase-locking to f0. A trends analysis indicated that mean f0 magnitude in the brainstem neural response was greater when the stimulus was band-pass filtered between 2 and 4 kHz as compared to when the stimulus was band-pass filtered between 4 and 6 kHz, low-pass filtered at 2 kHz or presented in the full bandwidth condition. Last, neural phase-locking to f0 was reduced or absent in envelope FFRs measured to filtered stimuli that lacked spectral energy above 0.125 kHz or below 6 kHz. Similarly, little to no energy was seen at F1 in spectral FFRs obtained to low-, high-, or band-pass filtered stimuli that did not contain energy in the F1 region. For stimulus conditions that contained energy at F1, the strength of the peak at F1 in the spectral FFR varied little with low-, high-, or band-pass filtering. 
Conclusions: Energy at f0 in envelope FFRs may arise due to neural phase-locking to low-, mid-, or high-frequency stimulus components, provided the stimulus envelope is modulated by at least two interacting harmonics. Stronger neural responses at f0 are measured when filtering results in stimulus bandwidths that preserve stimulus energy at F1 and F2. In addition, results suggest that unresolved harmonics may favorably influence f0 strength in the neural response. Lastly, brainstem neural representation of the F1-related harmonic measured in spectral FFRs obtained to filtered stimuli is related to the presence or absence of stimulus energy at F1. These findings add to the existing literature exploring the viability of the FFR as an objective technique to evaluate hearing aid fitting where stimulus bandwidth is altered by design due to frequency-specific gain applied by amplification algorithms.
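The core measurement in these experiments is extracting the magnitude of the f0 component from an FFT of the averaged response and comparing it against the surrounding noise floor. A minimal sketch of that computation follows; the sampling rate, f0, amplitudes, and noise level are illustrative assumptions, not the study's recording parameters.

```python
import numpy as np

# Hypothetical sketch: estimate the strength of the fundamental (f0)
# in an averaged FFR-like waveform relative to the nearby noise floor.
fs = 10000          # sampling rate in Hz (assumed)
f0 = 100.0          # fundamental frequency in Hz (assumed)
t = np.arange(0, 0.25, 1 / fs)

# Simulated response: phase-locked energy at f0 plus broadband noise
rng = np.random.default_rng(0)
ffr = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(ffr)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak_bin = int(np.argmin(np.abs(freqs - f0)))
peak = spectrum[peak_bin]

# Noise floor: mean magnitude in neighboring bins, excluding the peak itself
side = np.r_[spectrum[peak_bin - 10:peak_bin - 2], spectrum[peak_bin + 3:peak_bin + 11]]
snr_db = 20 * np.log10(peak / side.mean())
print(round(snr_db, 1))  # f0 strength above the noise floor, in dB
```

The same peak-versus-floor logic applies when the target bin is an F1-related harmonic rather than f0.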

Effects of Signal Type and Noise Background on Auditory Evoked Potential N1, P2, and P3 Measurements in Blast-Exposed Veterans
Objectives: Veterans who have been exposed to high-intensity blast waves frequently report persistent auditory difficulties such as problems with speech-in-noise (SIN) understanding, even when hearing sensitivity remains normal. However, these subjective reports have proven challenging to corroborate objectively. Here, we sought to determine whether use of complex stimuli and challenging signal contrasts in auditory evoked potential (AEP) paradigms rather than traditional use of simple stimuli and easy signal contrasts improved the ability of these measures to (1) distinguish between blast-exposed Veterans with auditory complaints and neurologically normal control participants, and (2) predict behavioral measures of SIN perception. Design: A total of 33 adults (aged 19–56 years) took part in this study, including 17 Veterans exposed to high-intensity blast waves within the past 10 years and 16 neurologically normal control participants matched for age and hearing status with the Veteran participants. All participants completed the following test measures: (1) a questionnaire probing perceived hearing abilities; (2) behavioral measures of SIN understanding including the BKB-SIN, the AzBio presented at 0 and +5 dB signal-to-noise ratios (SNRs), and a word-level consonant-vowel-consonant test presented at +5 dB SNR; and (3) electrophysiological tasks involving oddball paradigms in response to simple tones (500 Hz standard, 1000 Hz deviant) and complex speech syllables (/ba/ standard, /da/ deviant) presented in quiet and in four-talker speech babble at an SNR of +5 dB. Results: Blast-exposed Veterans reported significantly greater auditory difficulties compared to control participants. Behavioral performance on tests of SIN perception was generally, but not significantly, poorer in the blast-exposed group than in the control group.
Latencies of P3 responses to tone signals were significantly longer among blast-exposed participants compared to control participants regardless of background condition, though responses to speech signals were similar across groups. For cortical AEPs, no significant interactions were found between group membership and either stimulus type or background. P3 amplitudes measured in response to signals in background babble accounted for 30.9% of the variance in subjective auditory reports. Behavioral SIN performance was best predicted by a combination of N1 and P2 responses to signals in quiet which accounted for 69.6% and 57.4% of the variance on the AzBio at 0 dB SNR and the BKB-SIN, respectively. Conclusions: Although blast-exposed participants reported far more auditory difficulties compared to controls, use of complex stimuli and challenging signal contrasts in cortical and cognitive AEP measures failed to reveal larger group differences than responses to simple stimuli and easy signal contrasts. Despite this, only P3 responses to signals presented in background babble were predictive of subjective auditory complaints. In contrast, cortical N1 and P2 responses were predictive of behavioral SIN performance but not subjective auditory complaints, and use of challenging background babble generally did not improve performance predictions. These results suggest that challenging stimulus protocols are more likely to tap into perceived auditory deficits, but may not be beneficial for predicting performance on clinical measures of SIN understanding. Finally, these results should be interpreted with caution since blast-exposed participants did not perform significantly poorer on tests of SIN perception.
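The "variance accounted for" figures above come from regressing a behavioral score on AEP measures and reporting R². A compact sketch of that computation, with fabricated illustrative values rather than the study's data, looks like this:

```python
import numpy as np

# Hedged sketch: regress a behavioral SIN score on cortical AEP
# amplitudes (N1, P2) and report the variance accounted for (R^2).
# All values below are invented for illustration only.
rng = np.random.default_rng(1)
n = 33
n1 = rng.normal(-2.0, 0.5, n)        # hypothetical N1 amplitudes (uV)
p2 = rng.normal(3.0, 0.8, n)         # hypothetical P2 amplitudes (uV)
sin_score = 0.6 * p2 - 0.8 * n1 + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), n1, p2])          # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, sin_score, rcond=None)
pred = X @ beta

ss_res = np.sum((sin_score - pred) ** 2)
ss_tot = np.sum((sin_score - sin_score.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                           # proportion of variance explained
print(round(r2, 3))
```

A reported value such as 69.6% corresponds to R² = 0.696 from exactly this kind of fit.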


#
Medicine by Alexandros G. Sfakianakis,Anapafseos 5 Agios Nikolaos 72100 Crete Greece,00302841026182,00306932607174,alsfakia@gmail.com,
Telephone consultation 11855 int 1193,

Retina

THE RAP STUDY, REPORT TWO: The Regional Distribution of Macular Neovascularization Type 3, a Novel Insight Into Its Etiology
Purpose: To explore the regional distribution of macular neovascularization type 3 (MNV3). Methods: Seventy-eight eyes of 78 patients were reviewed. We defined the location of each lesion after applying a modified ETDRS grid and recorded the incidence of simultaneous MNV1 or MNV2. We also investigated the distribution of MNV3 at the outline of the foveal avascular zone and in eyes in which the diameter of the foveal avascular zone was less than 325 µm. Results: The distribution of MNV3 was 4 lesions (5%) from the center to 500 µm, 72 (92%) from 500 µm to 1,500 µm, and 2 (3%) from 1,500 µm to 3,000 µm. The distribution with respect to the ETDRS fields was 7 (9%) nasal, 16 (20%) superior, 32 (40%) temporal, and 23 (31%) inferior. No additional MNV1 or MNV2 were found elsewhere. Most lesions tended to distribute along straight bands radiating from the perifoveal area, mainly in the temporal half (72%). None of the cases had MNV3 at the boundary of the foveal avascular zone. Only five cases had a foveal avascular zone diameter of less than 325 µm; in these, the closest lesion was 425 µm away from the center. Conclusion: MNV3 lesions are most likely neither symmetrical nor uniformly distributed. They have a higher affinity to distribute radially in the temporal perifoveal area.

OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY CAN CATEGORIZE DIFFERENT SUBGROUPS OF CHOROIDAL NEOVASCULARIZATION SECONDARY TO AGE-RELATED MACULAR DEGENERATION
Purpose: Choroidal neovascularization (CNV) is a common complication of patients affected by age-related macular degeneration, showing a highly variable visual outcome. The main aim of the study was to perform a baseline quantitative optical coherence tomography angiography assessment of CNV secondary to age-related macular degeneration and to assess posttreatment outcomes. Methods: Seventy-eight naïve age-related macular degeneration-related CNV patients (39 men, mean age 78 ± 8 years) were recruited and underwent complete ophthalmologic evaluation and multimodal imaging. Several OCT and optical coherence tomography angiography parameters were collected, including vessel tortuosity and vessel dispersion (VDisp), measured for each segmented CNV. All patients underwent anti–vascular endothelial growth factor PRN treatment. Vessel tortuosity and VDisp values of CNVs were tested at baseline to establish a cutoff able to distinguish clinically different patient subgroups. Results: Mean best-corrected visual acuity was 0.49 ± 0.57 (20/62) at baseline, improving to 0.31 ± 0.29 (20/41) at the 1-year follow-up (P < 0.01), with a mean number of 6.4 ± 1.9 injections. Our cohort included the following CNV types: occult (45 eyes; 58%), classic (14 eyes; 18%), and mixed (19 eyes; 24%). Among the optical coherence tomography angiography parameters, classic, mixed, and occult CNV revealed significantly different values of VDisp, with classic forms showing the highest values and occult CNVs the lowest (P < 0.01); mixed forms displayed intermediate VDisp values. The ROC analysis revealed that a CNV vessel tortuosity cutoff of 8.40, calculated at baseline, distinguished two patient subgroups whose visual outcomes after anti–vascular endothelial growth factor treatment differed significantly.
Conclusion: A baseline quantitative optical coherence tomography angiography-based parameter could provide information regarding both clinical and functional outcomes after anti–vascular endothelial growth factor treatment in age-related macular degeneration-related CNV.
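A baseline cutoff like the vessel-tortuosity value of 8.40 above is typically read off an ROC curve, for example at the point maximizing Youden's J (sensitivity + specificity − 1). A minimal sketch of that selection, using invented tortuosity values and outcome labels rather than the study's data:

```python
import numpy as np

# Hypothetical illustration: pick the ROC cutoff that maximizes
# Youden's J for predicting a poor treatment outcome from baseline
# vessel tortuosity. Values and labels are fabricated.
tortuosity = np.array([5.1, 6.0, 7.2, 7.9, 8.1, 8.6, 9.0, 9.8, 10.5, 11.2])
good_outcome = np.array([1, 1, 1, 1, 1, 0, 0, 0, 1, 0])   # 1 = responder

best_cut, best_j = None, -1.0
for cut in np.unique(tortuosity):
    pred_poor = tortuosity >= cut                 # above cutoff -> predicted poor outcome
    tp = np.sum(pred_poor & (good_outcome == 0))  # poor outcome, flagged
    fn = np.sum(~pred_poor & (good_outcome == 0))
    fp = np.sum(pred_poor & (good_outcome == 1))
    tn = np.sum(~pred_poor & (good_outcome == 1))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    j = sens + spec - 1                           # Youden's J
    if j > best_j:
        best_j, best_cut = j, cut
print(float(best_cut), round(best_j, 2))
```

In practice a library routine (e.g. scikit-learn's `roc_curve`) produces the same sensitivity/specificity sweep.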

FEATURES OF THE MACULAR AND PERIPAPILLARY CHOROID AND CHORIOCAPILLARIS IN EYES WITH NONEXUDATIVE AGE-RELATED MACULAR DEGENERATION
Purpose: We investigated macular and peripapillary choroidal thickness (CT) and flow voids in the choriocapillaris in eyes with nonexudative age-related macular degeneration. Methods: We retrospectively reviewed the medical records of patients with nonexudative age-related macular degeneration and classified their eyes into three categories: pachydrusen, drusen, and subretinal drusenoid deposit. Mean macular and peripapillary CT and choriocapillaris flow void area were compared among the three groups. Results: The three groups included 29, 33, and 33 patients, respectively. The mean macular and peripapillary CT findings were 260.64 ± 75.85 µm and 134.47 ± 46.28 µm for the pachydrusen group; 163.63 ± 64.08 µm and 93.47 ± 39.07 µm for the drusen group; and 95.33 ± 28.87 µm and 56.06 ± 11.64 µm for the subretinal drusenoid deposit group (all, P < 0.001). Mean macular and peripapillary flow void area also differed among the subretinal drusenoid deposit group (57.07 ± 6.16% and 55.38 ± 6.65%), the drusen group (58.30 ± 6.98% and 49.11 ± 9.11%), and the pachydrusen group (50.09 ± 5.77% and 45.47 ± 8.06%) (all P < 0.001). Conclusion: The peripapillary CT and flow voids in the choriocapillaris varied according to the features of drusen in nonexudative age-related macular degeneration eyes. Greater flow voids and thinner CT in eyes with subretinal drusenoid deposits may suggest that these eyes have diffuse choroidal abnormalities both in and outside the macula.

REAL-COLOR VERSUS PSEUDO-COLOR IMAGING OF FIBROTIC SCARS IN EXUDATIVE AGE-RELATED MACULAR DEGENERATION
Purpose: To compare the morphological characteristics of subretinal fibrosis in late age-related macular degeneration using multicolor (MC) imaging, color fundus photography (CFP), and ultra-widefield CFP (UWFCFP). Methods: Thirty-two eyes of 31 patients diagnosed with subretinal fibrosis complicating exudative age-related macular degeneration were included. Included eyes were imaged by MC, CFP, and UWFCFP. The overall ability to visualize the fibrosis, its margins, and its dissimilarity from surrounding atrophy was graded by two readers using a score (0: not visible, 1: barely visible, 2: mostly visible, and 3: fully visible). The area of fibrosis was calculated. Scaling, lesion colocalization on all three imaging techniques, and area measurements were performed using ImageJ. Results: Ninety-six images of 32 eyes were graded. The average area of fibrosis was 14.59 ± 8.94 mm2 for MC, 13.84 ± 8.56 mm2 for CFP, and 13.76 ± 8.79 mm2 for UWFCFP. Fibrosis was fully visible in 87.5% of cases using MC and 50% using CFP and UWFCFP. The margins of the fibrosis were sharply defined in 40.6% of eyes with MC, and in 15.6% and 9.4% with CFP and UWFCFP, respectively. Multicolor imaging provided superior distinction between fibrosis and atrophy (100% for MC vs. 13.4% for CFP and 33.3% for UWFCFP). The inter- and intrareader agreement was high for all measurements (P < 0.0001). Conclusion: Multicolor technology allows for improved visualization and analysis of subretinal fibrosis when compared with CFP and UWFCFP, especially when surrounding atrophy is present.

PREVALENCE AND RISK FACTORS FOR THE DEVELOPMENT OF PHYSICIAN-GRADED SUBRETINAL FIBROSIS IN EYES TREATED FOR NEOVASCULAR AGE-RELATED MACULAR DEGENERATION
Purpose: To assess the prevalence and incidence of, and risk factors for, subretinal fibrosis (SRFi) in eyes with neovascular age-related macular degeneration (nAMD) that underwent vascular endothelial growth factor inhibitor treatment for up to 10 years. Methods: A cross-sectional and longitudinal analysis was performed on data from a neovascular age-related macular degeneration registry. The presence and location of SRFi were graded by the treating practitioner. Visual acuity, lesion characteristics (type, morphology, and activity), and treatment administered at each visit were recorded. Results: The prevalence of SRFi in 2,914 eyes rose from 20.4% at year interval 0 to 1 to 40.7% at year interval 9 to 10. The incidence in 1,950 eyes was 14.3% at baseline and 26.3% at 24 months. Independent characteristics associated with SRFi included poorer baseline vision (adjusted odds ratio 5.33 [95% confidence interval 4.66–7.61] for visual acuity ≤35 letters vs. visual acuity ≥70 letters, P < 0.01), baseline lesion size (adjusted odds ratio 1.08 [95% confidence interval 1.08–1.14] per 1,000 µm, P = 0.03), lesion type (adjusted odds ratio 1.42 [95% confidence interval 1.17–1.72] for predominantly classic vs. occult lesions, P = 0.02), and proportion of active visits (adjusted odds ratio 1.58 [95% confidence interval 1.25–2.01] for the group with the highest level of activity vs. the lowest level of activity, P < 0.01). Conclusion: Subretinal fibrosis was found in 40% of eyes after 10 years of treatment. High rates of lesion activity, predominantly classic lesions, poor baseline vision, and larger lesion size appear to be independent risk factors for SRFi.
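Adjusted odds ratios and their confidence intervals like those quoted above are derived from logistic-regression coefficients by exponentiation. The arithmetic can be sketched as follows; the coefficient and standard error are invented for illustration, not refit from the registry data.

```python
import math

# Hypothetical sketch: convert a logistic-regression coefficient (log
# odds) and its standard error into an odds ratio with a 95% CI.
beta = 0.35     # invented log-odds coefficient (e.g., classic vs. occult lesions)
se = 0.10       # invented standard error of the coefficient

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)     # 1.96 = z-value for a 95% interval
ci_high = math.exp(beta + 1.96 * se)
print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
```

Because the interval is computed on the log-odds scale and then exponentiated, it is asymmetric around the odds ratio, as in the registry's reported intervals.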

DIAGNOSTIC CHARACTERISTICS OF POLYPOIDAL CHOROIDAL VASCULOPATHY BASED ON B-SCAN SWEPT-SOURCE OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY AND ITS INTERRATER AGREEMENT COMPARED WITH INDOCYANINE GREEN ANGIOGRAPHY
Purpose: To examine the characteristics of polypoidal choroidal vasculopathy using B-scan optical coherence tomography angiography (OCTA), and to determine diagnostic criteria for polypoidal choroidal vasculopathy based on OCTA. Methods: This retrospective case series included patients diagnosed with treatment-naïve polypoidal choroidal vasculopathy who underwent indocyanine green angiography (ICGA) and swept-source OCTA at baseline. We compared the characteristics of the polyps detected using B-scan OCTA and ICGA. Then, the diagnostic concordance of each polypoidal lesion between ICGA and OCTA was evaluated. Results: Among 54 eyes of 52 patients, all 54 eyes showed flow signals indicating polyps on both ICGA and B-scan OCTA. All polyps on B-scan OCTA were detected as round/ring-like flow signals inside pigment epithelial detachments, incomplete round/ring-like flow signals overlaid with round/ring-like OCT structures inside pigment epithelial detachments, or flow signals adjacent to a pigment epithelial detachment notch. Using B-scan OCTA, 94.7% of the polypoidal lesions were detected by an independent evaluator, with an overall accuracy of 92.6% for counting the polypoidal lesions per eye relative to ICGA and a kappa value of 0.82. Conclusion: Polyp detection on B-scan OCTA demonstrates high accuracy comparable to that obtained with ICGA. B-scan OCTA could replace ICGA for the diagnosis of polypoidal choroidal vasculopathy.
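The interrater agreement statistic quoted above (kappa = 0.82) corrects raw agreement for the agreement expected by chance. A small self-contained sketch of Cohen's kappa, using an invented confusion table rather than the study's counts:

```python
# Hedged sketch of Cohen's kappa for two raters (e.g., lesion counts on
# OCTA vs. ICGA). The confusion counts below are illustrative only.
def cohens_kappa(confusion):
    """confusion[i][j]: rater A chose category i, rater B chose category j."""
    total = sum(sum(row) for row in confusion)
    # observed agreement: proportion on the diagonal
    po = sum(confusion[i][i] for i in range(len(confusion))) / total
    # chance agreement: product of each rater's marginal proportions
    pe = sum(
        (sum(confusion[i]) / total) * (sum(row[i] for row in confusion) / total)
        for i in range(len(confusion))
    )
    return (po - pe) / (1 - pe)

confusion = [
    [20, 2],   # hypothetical: both agree "polyp present" 20 times, etc.
    [1, 31],
]
print(round(cohens_kappa(confusion), 2))
```

Values above roughly 0.8 are conventionally read as almost perfect agreement, which is how the reported 0.82 is usually interpreted.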

VISUAL PROGNOSIS AFTER PNEUMATIC DISPLACEMENT OF SUBMACULAR HEMORRHAGE ACCORDING TO AGE-RELATED MACULAR DEGENERATION SUBTYPES
Purpose: This study compared the visual outcome after pneumatic displacement of submacular hemorrhage among patients with different subtypes of age-related macular degeneration (AMD). Methods: We retrospectively reviewed the medical records of 67 patients (67 eyes) who underwent treatment for submacular hemorrhage associated with AMD. All the patients underwent pneumatic displacement. Demographic parameters, visual acuity, and anatomical features were analyzed among AMD subtypes: typical AMD, polypoidal choroidal vasculopathy (PCV), and retinal angiomatous proliferation (RAP). Results: Among the eyes with submacular hemorrhage, 24, 30, and 13 eyes had typical AMD, PCV, and RAP, respectively. Post-treatment best-corrected visual acuity was best in the PCV group and worst in the RAP group (P < 0.001). The proportion of eyes with improved visual acuity was highest in the PCV subtype and lowest in the RAP subtype (P = 0.044). Logistic regression analysis showed that AMD subtype (P = 0.016) and time to treatment (<7 days) (P = 0.037) were associated with the final visual outcome. Conclusion: The final post-treatment visual outcome after the occurrence of submacular hemorrhage was best in the PCV group and worst in the RAP group. Age-related macular degeneration subtype is a significant factor associated with the visual prognosis of submacular hemorrhage.

RISK OF AGE-RELATED MACULAR DEGENERATION IN PATIENTS WITH PERIODONTITIS: A Nationwide Population-Based Cohort Study
Purpose: Periodontitis is an inflammatory disease that results in loss of connective tissue and bone support. Evidence shows a possible relationship between periodontitis and age-related macular degeneration (AMD). Methods: This population-based cohort study was conducted using data from the National Health Insurance Research Database in Taiwan, with a 13-year follow-up, to investigate the risk of AMD in patients with periodontitis. The periodontitis cohort included patients with newly diagnosed periodontitis between 2000 and 2012. The nonperiodontitis cohort was frequency-matched with the periodontitis cohort by age and sex, with a sample size of 41,661 in each cohort. Results: Patients with periodontitis had an increased risk of developing AMD compared with individuals without periodontitis (5.95 vs. 3.41 per 1,000 person-years, adjusted hazard ratio = 1.58 [95% confidence interval, 1.46–1.70]). The risk of developing AMD remained significant after stratification by age (adjusted hazard ratio = 1.48 [1.34–1.64] for age <65 years and 1.76 [1.57–1.97] for age ≥65 years), sex (adjusted hazard ratio = 1.40 [1.26–1.55] for women and 1.82 [1.63–2.04] for men), and presence of comorbidity (adjusted hazard ratio = 1.52 [1.40–1.66] for those with comorbidity and 1.92 [1.63–2.26] for those without comorbidity). In addition, patients with periodontitis showed an increased incidence of both nonexudative AMD (5.43 vs. 3.13 per 1,000 person-years) and exudative AMD (0.52 vs. 0.28 per 1,000 person-years). Conclusion: People with periodontitis could be at a greater risk of developing AMD than those without periodontitis. However, more evidence is needed to support this association.
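The incidence rates above are events per 1,000 person-years, i.e. event counts divided by total follow-up time rather than by cohort size. A small arithmetic sketch, using hypothetical counts chosen only to reproduce rates of the same magnitude as those reported:

```python
# Hedged sketch: events per 1,000 person-years and the crude rate ratio.
# The event counts and person-year totals are invented stand-ins, not
# figures from the Taiwanese database.
def rate_per_1000_py(events, person_years):
    return 1000 * events / person_years

perio_rate = rate_per_1000_py(2380, 400000)     # hypothetical exposed cohort
control_rate = rate_per_1000_py(1364, 400000)   # hypothetical matched cohort
crude_ratio = perio_rate / control_rate
print(round(perio_rate, 2), round(control_rate, 2), round(crude_ratio, 2))
```

Note the crude ratio differs from the reported adjusted hazard ratio (1.58), which additionally controls for covariates in a Cox model.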

OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY FEATURES OF FOCAL CHOROIDAL EXCAVATION AND THE CHOROIDAL STROMA VARIATIONS WITH OCCURRENCE OF EXCAVATION
Purpose: To describe the retinal and choroidal vascular changes and the choroidal stroma variations occurring in focal choroidal excavation (FCE). Methods: The study design was a cross-sectional case series. Consecutive patients affected by FCE and healthy controls were recruited. All patients underwent complete ophthalmologic assessment and multimodal imaging, including structural optical coherence tomography and optical coherence tomography angiography. Choroidal thickness and stromal index were calculated from structural optical coherence tomography images. Moreover, we measured vessel density values of the superficial capillary plexus, deep capillary plexus, and choriocapillaris at the level of the macula. Results: Twenty-two patients (28 eyes; mean age 57.2 ± 16.4 years) and 28 control eyes (mean age 56.5 ± 9.8 years) were included. Five patients (23%) were asymptomatic, whereas 17 patients (77%) complained of visual symptoms. FCE was associated with choroidal neovascularization in 10 eyes (35%). The choroidal stromal component was lower in FCE patients than in controls, whereas choroidal thickness was unremarkable. Stromal index values calculated in the region proximal to the FCE were significantly lower than the values obtained from the external region. Deep capillary plexus vessel density was lower in FCE than in controls. The choriocapillaris was altered in the region surrounding the FCE, whereas it was normal in the external region. Conclusion: The deep capillary plexus and choriocapillaris were significantly altered in FCE patients. Moreover, the choroidal stroma was significantly reduced in the areas closer to the FCE compared with the surrounding choroid and with healthy controls, suggesting a weakening of the architectural support that creates a more friable point and can favor the development of FCE.

IDIOPATHIC FOVEAL HYPOPLASIA: Quantitative Analysis Using Optical Coherence Tomography Angiography
Purpose: To evaluate vascular density (VD), fractal dimension, and skeletal density on optical coherence tomography angiography in eyes with idiopathic foveal hypoplasia (IFH). Methods: Patients presenting with IFH to the Creteil University Eye Clinic between January 2015 and October 2018 and age-matched healthy controls were retrospectively evaluated. Vascular density, skeletal density, and fractal dimension analyses were computed on optical coherence tomography angiography superficial capillary plexus (SCP) and deep capillary plexus (DCP) images on the whole image using a custom algorithm. Vascular density was also measured on the central 1 mm2 and the peripheral 8 mm2 for the two groups. Results: Thirty-six eyes of 21 patients (18 eyes with IFH and 18 control eyes) were included. A decrease of VD at the level of the SCP and DCP was found in eyes with IFH compared with healthy control eyes (P = 0.005 for VD at the level of the SCP and P = 0.003 for VD at the level of the DCP, respectively). On the central 1 mm2, VD at the level of the SCP was lower in healthy eyes (32.3 ± 4.8%) than in IFH eyes (55.6 ± 46.3%) (P < 0.001), consistent with the reduced or absent foveal avascular zone in IFH. Skeletal density was decreased in IFH eyes in both the SCP and DCP (P < 0.001). Fractal dimension was lower in IFH eyes in both the SCP and DCP (P < 0.001). Conclusion: Vascular density, skeletal density, and fractal dimension are reduced at the level of the SCP and DCP in patients with IFH compared with controls, reflecting a particular anatomical and vascular organization. Quantitative analysis using optical coherence tomography angiography could help to evaluate the severity of IFH.
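Fractal dimension on binarized OCTA vessel maps is commonly estimated by box counting: cover the image with grids of shrinking box size and fit the slope of log(occupied boxes) against log(1/box size). A minimal sketch on a synthetic binary image (the study used a custom algorithm, which may differ in detail):

```python
import numpy as np

# Hedged sketch: box-counting fractal dimension of a binary vessel map.
# The input here is a synthetic image, not OCTA data.
def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s   # crop to multiple of s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.sum(blocks.any(axis=(1, 3))))        # boxes containing vessel pixels
    # dimension = slope of log(count) versus log(1/size)
    slope, _ = np.polyfit(np.log(1 / np.array(sizes, dtype=float)), np.log(counts), 1)
    return slope

img = np.zeros((64, 64), dtype=bool)
img[32, :] = True            # a straight line has dimension ~1
d = box_count_dimension(img)
print(round(d, 2))
```

A sparser, less space-filling vascular network yields a lower dimension, which is the sense in which the IFH eyes' fractal dimension is "reduced."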

