20 similar records found (search time: 15 ms)
2.
Kerry Dhakal 《Journal of the Medical Library Association》2022,110(2):270
NVivo. A qualitative data analysis software tool, QSR International, https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home; $1,249/user (one-time cost for using a specific version indefinitely; upgrades are additional).
3.
Bethany S. McGowan 《Journal of the Medical Library Association》2022,110(2):273
World Health Organization's Early AI-supported Response with Social Listening Platform (WHO EARS). WHO HQ, Avenue Appia 20, 1211, Geneva 27, Switzerland; https://www.who-ears.com/; free.
5.
Kathryn Vela 《Journal of the Medical Library Association》2022,110(1):109
Background: The COVID-19 pandemic has sparked a wave of SARS-CoV-2 and COVID-19 research that organizations around the world have been synthesizing in evidence reviews to provide high-quality evidence to support policymakers and clinicians. The urgency of these efforts opens these organizations to the risk of duplicated efforts, which could waste valuable time and resources.
Case Presentation: The VA Evidence Synthesis Program (VA ESP) formed a COVID Response Team that launched an online catalog of COVID-19 evidence reviews in March 2020 (https://www.covid19reviews.org/). The goal of this website is to capture the work of evidence synthesis groups in the US and around the world to maximize their collective contributions to patients, frontline clinicians, researchers, and policymakers during the COVID-19 pandemic and avoid duplicating efforts.
Conclusions: This ongoing and evolving project provides a helpful catalog of evidence reviews at various stages of production; in addition, the website provides further value with informational icons, review collections, and links to similar resources. The VA ESP will maintain this website to continue to support the needs of policymakers, clinicians, and researchers both within the VA and around the world throughout the COVID-19 pandemic.
6.
Stacy Brody 《Journal of the Medical Library Association》2021,109(4):707
Scite. Scite Inc., 334 Leonard St., Brooklyn, NY 11211; https://scite.ai/; tiered pricing model with free, basic ($7.99/month), premium ($19.99/month or $100/year), premium+ ($59.99/month), and enterprise plans.
Scite (https://scite.ai/) was founded by Josh Nicholson and Yuri Lazebnik and previously funded by the National Science Foundation (NSF) and National Institute on Drug Abuse (NIDA) [1, 2, 3]. The Scite database contains over 800 million citation statements [4] tagged by a machine learning algorithm as supporting, mentioning, or contrasting the findings of cited articles [5] and by their locations in the citing articles (introduction, results, methods, discussion, or other). Scite also provides a count of editorial notices for each article. Users can search the website and install plug-ins for the Chrome and Firefox browsers and for reference management tools such as Zotero. Additional tools include reports and dashboards, badges, and automated reference checks. Scite can be used by researchers to locate evidence and evaluate references; by librarians to enhance research impact projects; by publishers and editors to check reference lists of submissions [6, 7]; and by journals, publishers, and databases to create context and showcase impact [4, 8, 9].
8.
The University of Baltimore Library was recently renovated. Part of the design of the newly renovated space was a single service point, which we call the Information Desk. This desk serves as the public-facing portal for what were previously three separate service points staffed by three separate departments: Academic Success, Access Services, and Reference & Instruction.
This article describes in depth how our Information Desk Leadership Team encouraged and used our library student employees' expertise and knowledge in the creation of a policy and procedure manual, training documents, reference tools, and resume lines for use at a single service point.
10.
Objective: This study was intended to (1) provide clinical trial data-sharing platform designers with insight into users' experiences when attempting to evaluate and access datasets, (2) spark conversations about improving the transparency and discoverability of clinical trial data, and (3) provide a partial view of the current information-sharing landscape for clinical trials.
Methods: We evaluated preview information provided for 10 datasets in each of 7 clinical trial data-sharing platforms between February and April 2019. Specifically, we evaluated the platforms in terms of the extent to which we found (1) preview information about the dataset, (2) trial information on ClinicalTrials.gov and other external websites, and (3) evidence of the existence of trial protocols and data dictionaries.
Results: All seven platforms provided data previews. Three platforms provided information on data file format (e.g., CSV, SAS file). Three allowed batch downloads of datasets (i.e., downloading multiple datasets with a single request), whereas four required separate requests for each dataset. All but one platform linked to ClinicalTrials.gov records, but only one platform had ClinicalTrials.gov records that linked back to the platform. Three platforms consistently linked to external websites and primary publications. Four platforms provided evidence of the presence of a protocol, and six platforms provided evidence of the presence of data dictionaries.
Conclusions: More work is needed to improve the discoverability, transparency, and utility of information on clinical trial data-sharing platforms. Increasing the amount of dataset preview information available to users could considerably improve the discoverability and utility of clinical trial data.
13.
Public Services Quarterly 2013,9(2-3):191-200
Abstract. AT THE DESK OR ONLINE: REFERENCE TRAINING, MEASUREMENTS, AND GUIDELINES.
RUSA Professional Tools: Reference Guidelines http://www.ala.org/ala/rusa/rusaprotools/referenceguide/referenceguidelines.htm. Reviewed by Penny Scott.
RUSA Professional Tools: Guidelines for Implementing and Maintaining Virtual Reference Services http://www.ala.org/ala/rusa/rusaprotools/referenceguide/virtrefguidelines.htm. Reviewed by Susanne Markgren.
Association of Research Libraries Statistics and Measurement Program http://www.arl.org/stats/. Reviewed by Dawn Lowe-Wincentsen.
Digital Reference Services Bibliography http://www.lis.uiuc.edu/~b-sloan/digiref.html. Reviewed by Susanne Markgren.
Digital Reference Education Initiative http://drei.syr.edu/index.cfm. Reviewed by Lydia Eato Harris.
Ohio Reference Excellence (ORE on the Web) http://www.olc.org/Ore/index.html. Reviewed by Dawn Eckenrode.
Library Staff Competencies http://www.librarysupportstaff.com/4competency.html. Reviewed by Beth Thomsett-Scott.
Research Methods Knowledge Base http://www.socialresearchmethods.net/kb/. Reviewed by Barbara Burd.
14.
Bert Avau Hans Van Remoortel Emmy De Buck 《Journal of the Medical Library Association》2021,109(4):599
Objective: The aim of this project was to validate search filters for systematic reviews, intervention studies, and observational studies translated from Ovid MEDLINE and Embase syntax and used for searches in PubMed and Embase.com during the development of evidence summaries supporting first aid guidelines. We aimed to achieve a balance among recall, specificity, precision, and number needed to read (NNR).
Methods: Reference gold standards were constructed per study type, derived from existing evidence summaries. Search filter performance was assessed through retrospective searches and measurement of relative recall, specificity, precision, and NNR when using the translated search filters. Where necessary, search filters were optimized. Adapted filters were validated against separate validation gold standards.
Results: Search filters for systematic reviews and observational studies reached recall of ≥85% in both PubMed and Embase. Corresponding specificities for systematic review filters were ≥96% in both databases, with a precision of 9.7% (NNR 10) in PubMed and 5.4% (NNR 19) in Embase. For observational study filters, specificity, precision, and NNR were 68%, 2%, and 51 in PubMed and 47%, 0.8%, and 123 in Embase, respectively. These filters were considered sufficiently effective. Search filters for intervention studies reached a recall of 85% and 83% in PubMed and Embase, respectively. Optimization led to recall of ≥95% with specificity, precision, and NNR of 49%, 1.3%, and 79 in PubMed and 56%, 0.74%, and 136 in Embase, respectively.
Conclusions: We report validated filters to search for systematic reviews, observational studies, and intervention studies in guideline projects in PubMed and Embase.com.
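The abstract above reports filter performance as recall, specificity, precision, and number needed to read (NNR). A minimal sketch of how these measures relate, computed from a retrieval confusion matrix; the counts below are hypothetical illustrations, not data from the study, and only the final precision-to-NNR check uses figures reported in the abstract:

```python
def filter_metrics(tp, fp, fn, tn):
    """Search-filter performance from a retrieval confusion matrix.

    tp: relevant records retrieved      fp: irrelevant records retrieved
    fn: relevant records missed         tn: irrelevant records excluded
    """
    recall = tp / (tp + fn)        # fraction of relevant records retrieved
    specificity = tn / (tn + fp)   # fraction of irrelevant records excluded
    precision = tp / (tp + fp)     # fraction of retrieved records that are relevant
    nnr = 1 / precision            # records screened per relevant record found
    return recall, specificity, precision, nnr

# Hypothetical example: 85 of 100 relevant records retrieved, 790 irrelevant hits
recall, specificity, precision, nnr = filter_metrics(tp=85, fp=790, fn=15, tn=9210)
print(f"recall={recall:.0%} specificity={specificity:.1%} NNR={nnr:.0f}")

# Sanity check against the reported PubMed systematic-review filter:
# precision 9.7% corresponds to NNR ~10, as the abstract states.
print(round(1 / 0.097))  # 10
```

NNR is simply the reciprocal of precision, which is why the reported pairs (precision 9.7%, NNR 10; precision 5.4%, NNR 19) are consistent.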
15.
The Global Postcards column is pleased to publish two contributions from Joshua Finnell and his colleagues. The first contribution, with Brian Cain, documents the themes and conversations of the Research Data Access and Preservation Summit (RDAP) in April 2017. The second contribution, with Stacy Konkiel, documents the creation and sustainment of the Library Pipeline, a grassroots library organization. Finally, coeditor Robin Kear provides a personal synopsis of her attendance at the IFLA World Library & Information Congress (WLIC) in Wrocław, Poland, in August 2017.
We always welcome contributions. If you would like to send a submission, please contact either of the column's coeditors: Jacqueline Solis (jsolis@email.unc.edu) or Robin Kear (rlk25@pitt.edu).
16.
Katherine G. Akers 《Journal of the Medical Library Association》2021,109(2):163
The Journal of the Medical Library Association (JMLA) sincerely thanks the 214 peer reviewers in 2020 who helped vet and improve the quality of work published in our journal. JMLA is always looking to expand our pool of reviewers with expertise in specific domains in health sciences librarianship research and practice. If you would like to serve as a peer reviewer for JMLA, please indicate your interest to an assistant editor, an associate editor, or the editor-in-chief.
18.
Strategic Communication during Marital Relationship Dissolution: Disengagement Resistance Strategies
Merry C. Buchanan H. Dan O'Hair Jennifer A. H. Becker 《Communication Research Reports》2013,30(3):139-147
This study investigated the communication strategies used by divorced individuals who did not wish their marriages to end (non-initiators). Participants were 270 divorced persons drawn from divorce recovery and support groups as well as network sampling. An adaptation of Buss's (1988) taxonomy of partner retention tactics served to capture the communication strategies of non-initiators during marital dissolution. A factor analysis revealed that four disengagement resistance strategies—commitment, alignment, negativity, and harm—are used by non-initiators during the process of marital dissolution.