Results for 'Research Evaluation'

971 found
  1. Short-term incentives of research evaluations: Evidence from the UK Research Excellence Framework.Moqi Groen-Xu, Gregor E. Bös, Pedro Teixeira, Thomas Voigt & Bernhard Knapp - 2023 - Research Policy 52 (6).
  2. Research evaluation: From power to empowerment.Fred Carden - 1998 - Knowledge, Technology & Policy 10 (4):67-76.
    This article explores issues in the evaluation of research through an examination of the situation in the field of international development. It is increasingly recognized that traditional evaluation, which served largely a policing function, is not useful in assessing the impact of the development research process. It is argued that the role and perception of evaluation must change if it is to provide a reflection of the learning which takes place in research. The field (...)
  3. Epistemic Injustice in Research Evaluation: A Cultural Analysis of the Humanities and Physics in Estonia.Endla Lõhkivi, Katrin Velbaum & Jaana Eigi - 2012 - Studia Philosophica Estonica 5 (2):108-132.
    This paper explores the issue of epistemic injustice in research evaluation. Through an analysis of the disciplinary cultures of physics and humanities, we attempt to identify some aims and values specific to the disciplinary areas. We suggest that credibility is at stake when the cultural values and goals of a discipline contradict those presupposed by official evaluation standards. Disciplines that are better aligned with the epistemic assumptions of evaluation standards appear to produce more "scientific" findings. To (...)
    4 citations
  4. Tactics of Scientific Research: Evaluating Experimental Data in Psychology. Murray Sidman.Chester R. Wasson - 1962 - Philosophy of Science 29 (4):439-441.
  5. Citation counts for research evaluation: standards of good practice for analyzing bibliometric data and presenting and interpreting results.Lutz Bornmann, Rüdiger Mutz, Christoph Neuhaus & Hans-Dieter Daniel - 2008 - Ethics in Science and Environmental Politics 8 (1):93-102.
  6. Use and misuse of metrics in research evaluation.Ronald N. Kostoff - 1997 - Science and Engineering Ethics 3 (2):109-120.
    This paper addresses some critical issues in the applicability of quantitative performance measures (including bibliometric, economic, and co-occurrence measures) to the assessment of basic research. The strengths and weaknesses of metrics applied as research performance measures are examined. It is concluded that metrics have a useful role to play in the evaluation of research. Each metric employed, whether bibliometric, economic, co-occurrence, or others, brings a new dimension of potential insight to the complex problem of research (...)
    2 citations
  7. An Assessment of Research-Doctorate Programs in the United States: Biological Sciences.Lyle V. Jones, Gardner Lindzey, Porter E. Coggeshall & Conference Board of the Associated Research Councils - 1982 - National Academies Press.
    The quality of doctoral-level biochemistry (N=139), botany (N=83), cellular/molecular biology (N=89), microbiology (N=134), physiology (N=101), and zoology (N=70) programs at United States universities was assessed, using 16 measures. These measures focused on variables related to: (1) program size; (2) characteristics of graduates; (3) reputational factors (scholarly quality of faculty, effectiveness of programs in educating research scholars/scientists, improvement in program quality during the last 5 years); (4) university library size; (5) research support; and (6) publication records. Chapter I discusses (...)
    1 citation
  8. China’s Research Evaluation Reform: What are the Consequences for Global Science?Fei Shu, Sichen Liu & Vincent Larivière - 2022 - Minerva 60 (3):329-347.
    In the 1990s, China created a research evaluation system based on publications indexed in the Science Citation Index (SCI) and on the Journal Impact Factor. Such a system helped the country become the largest contributor to the scientific literature and raised the position of Chinese universities in international rankings. Although the system had been criticized by many because of its adverse effects, the policy reform for research evaluation crawled until the outbreak of the COVID-19 pandemic, which accidentally (...)
  9. Research on research evaluation.Sven Hemlin - 1996 - Social Epistemology 10 (2):209–250.
  10. Evaluating research institutions: Lessons from the CGIAR.Selçuk Özgediz - 1999 - Knowledge, Technology & Policy 11 (4):97-113.
    Investing in research is a long-term, risky proposition. In agriculture, it could take fifteen years or more for a research finding to show an improvement in a farmer’s field. Yet research institutions, like other organizations, need to be evaluated. For more than twenty years, independent panels of outside experts have evaluated each of the international research centers that the Consultative Group on International Agricultural Research (CGIAR) supports. This paper examines the evolution of this review (...)
  11. A Kantian critique of scientific essentialism, Robert Hanna.Evaluational Illusions - 1998 - Philosophy and Phenomenological Research 58 (3).
  12. Evaluating Scientific Research Projects: The Units of Science in the Making.Mario Bunge - 2017 - Foundations of Science 22 (3):455-469.
    Original research is of course what scientists are expected to do. Therefore the research project is in many ways the unit of science in the making: it is the center of the professional life of the individual scientist and his coworkers. It is also the means towards the culmination of their specific activities: the original publication they hope to contribute to the scientific literature. The scientific project should therefore be of central interest to all the students of science, (...)
    3 citations
  13. Establishing a research and evaluation capability for the joint medical education and training campus.Sheila Nataraj Kirby - 2011 - Santa Monica, CA: RAND Center for Military Policy Research. Edited by Julie A. Marsh & Harry Thie.
    In calling for the transformation of military medical education and training, the 2005 Base Realignment and Closure Commission recommended relocating basic and specialty enlisted medical training to a single site to take advantage of economies of scale and the opportunity for joint training. As a result, a joint medical education and training campus (METC) has been established at Fort Sam Houston, Texas. Two of METC's primary long-term goals are to become a high-performing learning organization and to seek accreditation as a (...)
  14. Evaluating the science and ethics of research on humans: a guide for IRB members.Dennis John Mazur - 2007 - Baltimore: Johns Hopkins University Press.
    Biomedical research on humans is an important part of medical progress. But, when lives are at risk, safety and ethical practices need to be the top priority. The need for the committees that regulate and oversee such research -- institutional review boards, or IRBs -- is growing. IRB members face difficult decisions every day. Evaluating the Science and Ethics of Research on Humans is a guide for new and veteran members of IRBs that will help them better (...)
    3 citations
  15. Evaluating Benefits and Harms in Clinical Research.Paul B. Miller & Charles Weijer - unknown
    3 citations
  16. Evaluating interdisciplinary research.Katri Huutoniemi - 2010 - In Robert Frodeman, Julie Thompson Klein & Carl Mitcham (eds.), The Oxford Handbook of Interdisciplinarity. Oxford, United Kingdom: Oxford University Press. pp. 309--320.
     
    6 citations
  17. Richard Whitley, Jochen Gläser, The Changing Governance of the Sciences. The Advent of Research Evaluation Systems. Sociology of the Sciences Yearbook. [REVIEW]Jürgen Enders - 2009 - Minerva 47 (4):465-468.
    DOI 10.1007/s11024-009-9132-4.
  18. Evaluation of Research(ers) and its Threat to Epistemic Pluralisms.Marco Viola - 2017 - European Journal of Analytic Philosophy 13 (2):55-78.
    While some form of evaluation has always been employed in science (e.g. peer review, hiring), formal systems of evaluation of research and researchers have recently come to play a more prominent role in many countries because of the adoption of new models of governance. According to such models, the quality of the output of both researchers and their institutions is measured, and issues such as eligibility for tenure or the allocation of public funding to research institutions (...)
    1 citation
  19. Evaluating Research and Development.I. R. Weschler & Paula Brown - 1954 - Philosophy of Science 21 (1):76-76.
     
  20. Evaluating Philosophy as Exploratory Research.Rogier De Langhe & Eric Schliesser - 2017 - Metaphilosophy 48 (3):227-244.
    This article addresses the question how philosophy should be evaluated in a research-grant funding environment. It offers a new conception of philosophy that is inclusive and builds on familiar elements of professional, philosophical practice. Philosophy systematically questions the questions we ask, the concepts we use, and the values we hold. Its product is therefore rarely conclusive but can be embodied in everything we do. This is typical of explorative research and differentiates it from exploitative research, which constitutes (...)
    1 citation
  21. Evaluating the Capacity of Theories of Justice to Serve as a Justice Framework for International Clinical Research.Bridget Pratt, Deborah Zion & Bebe Loff - 2012 - American Journal of Bioethics 12 (11):30-41.
    This article investigates whether or not theories of justice from political philosophy, first, support the position that health research should contribute to justice in global health, and second, provide guidance about what is owed by international clinical research (ICR) actors to parties in low- and middle-income countries. Four theories—John Rawls's theory of justice, the rights-based cosmopolitan theories of Thomas Pogge and Henry Shue, and Jennifer Ruger's health capability paradigm—are evaluated. The article shows that three of the four theories (...)
    14 citations
  22. Research student and supervisor evaluation of intertextuality practices.Jean Crocker & Philip Shaw - 2002 - Hermes 28:39-58.
  23. Problems in Argument Analysis and Evaluation.Trudy Govier - 2018 - Windsor: University of Windsor.
    We are pleased to publish this WSIA edition of Trudy Govier’s seminal volume, Problems in Argument Analysis and Evaluation. Originally published in 1987 by Foris Publications, this was a pioneering work that played a major role in establishing argumentation theory as a discipline. Today, it is as relevant to the field as when it first appeared, with discussions of questions and issues that remain central to the study of argument. It has defined the main approaches to many of those (...)
    129 citations
  24. Research Proposal: Design & Evaluation of Social Software.Cédric Mesnage - 2008 - Social Research: An International Quarterly 7:8.
     
  25. Research and evaluation in music therapy.Barbara Wheeler - 2008 - In Susan Hallam, Ian Cross & Michael Thaut (eds.), Oxford Handbook of Music Psychology. Oxford University Press.
     
  26. Service evaluation: A grey area of research?Lu-Yen A. Chen & Tonks N. Fawcett - 2019 - Nursing Ethics 26 (4):1172-1185.
    The National Health Service in the United Kingdom categorises research and research-like activities into five types: ‘service evaluation’, ‘clinical audit’, ‘surveillance’, ‘usual practice’ and ‘research’. Only activities classified as ‘research’ require review by the Research Ethics Committees. This position paper argues that the current governance of research and research-like activities does not provide sufficient ethical oversight for projects classified as ‘service evaluation’. The distinction between the categories (...)
    1 citation
  27. Realist evaluation and its role in the stages of explanatory research based on critical realism.Juan David Parra - 2023 - Journal of Critical Realism 22 (5):859-881.
    This article advocates for the validity of Realist Evaluation (RE) as a manifestation of Critical Realism in evaluation research despite criticisms suggesting that the former disregards principles from Bhaskarian ontology. Specifically, I argue that critics overstate RE's philosophical actualism in their argument that its inclination towards technocratic knowledge impedes its scrutiny of stratified social systems. Notwithstanding its limitations in fully elucidating causal structural mechanisms in social inquiry, I argue that RE's research rationale can contribute to the (...)
  28. Evaluating Animal Research.Lilly-Marlene Russow - 1986 - Between the Species 2 (4):11.
  29. Evaluating teaching and students' learning of academic research ethics.Deni Elliott & Judy E. Stern - 1996 - Science and Engineering Ethics 2 (3):345-366.
    A team of philosophers and scientists at Dartmouth College worked for three years to create, train faculty and pilot test an adequate and exportable class in research methods for graduate students of science and engineering. Developing and testing methods for evaluating students’ progress in learning research ethics were part of the project goals. Failure of methods tried in the first year led to the refinement of methods for the second year. These were used successfully in the pilot course (...)
    5 citations
  30. Evaluating Risks of Non-therapeutic Research in Children.Paul B. Miller & Charles Weijer - unknown
  31. Evaluating Public-Participation Exercises: A Research Agenda.Lynn J. Frewer & Gene Rowe - 2004 - Science, Technology, and Human Values 29 (4):512-556.
    The concept of public participation is one of growing interest in the UK and elsewhere, with a commensurate growth in mechanisms to enable this. The merits of participation, however, are difficult to ascertain, as there are relatively few cases in which the effectiveness of participation exercises has been studied in a structured manner. This seems to stem largely from uncertainty in the research community as to how to conduct evaluations. In this article, one agenda for conducting evaluation (...) that might lead to the systematic acquisition of knowledge is presented. This agenda identifies the importance of defining effectiveness and of operationalizing one’s definition. The article includes analysis of the nature of past evaluations, discussion of potential difficulties in the enactment of the proposed agenda, and discussion of some potential solutions.
    27 citations
  32. Using Evaluation Research as a Means for Policy Analysis in a ‘New’ Mission-Oriented Policy Context.Effie Amanatidou, Paul Cunningham, Abdullah Gök & Ioanna Garefi - 2014 - Minerva 52 (4):419-438.
    Grand challenges stress the importance of multi-disciplinary research, a multi-actor approach in examining the current state of affairs and exploring possible solutions, multi-level governance and policy coordination across geographical boundaries and policy areas, and a policy environment for enabling change both in science and technology and in society. The special nature of grand challenges poses certain needs in evaluation practice: the need for learning at the operational, policy and, especially, system level; and the importance of a wider set (...)
    1 citation
  33. Evaluating community engagement in global health research: the need for metrics.Kathleen M. MacQueen, Anant Bhan, Janet Frohlich, Jessica Holzer & Jeremy Sugarman - 2015 - BMC Medical Ethics 16 (1):1-9.
    Background: Community engagement in research has gained momentum as an approach to improving research, to helping ensure that community concerns are taken into account, and to informing ethical decision-making when research is conducted in contexts of vulnerability. However, guidelines and scholarship regarding community engagement are arguably unsettled, making it difficult to implement and evaluate. Discussion: We describe normative guidelines on community engagement that have been offered by national and international bodies in the context of HIV-related research, which set the (...)
    18 citations
  34. Evaluating International Research Ethics Capacity Development: An Empirical Approach.Nancy E. Kass & Joseph Ali - 2014 - Journal of Empirical Research on Human Research Ethics: An International Journal 9 (2):41-51.
  35. Educational Assessment, Evaluation and Research: The Selected Works of Mary E. James.Mary E. James - 2016 - Routledge.
    In the _World Library of Educationalists_, international experts themselves compile career-long collections of what they judge to be their finest pieces – extracts from books, key articles, salient research findings, major theoretical and practical contributions – so the world can read them in a single manageable volume, allowing readers to follow the themes of their work and see how it contributes to the development of the field. Mary James has researched and written on a range of educational subjects which (...)
  36. Predatory publishing and Beall’s list: Lessons for the countries adapting novel research evaluation criteria.Wadim Strielkowski, Inna Gryshova & Maryna Shcherbata - 2017 - Science and Education: Academic Journal of Ushynsky University 23 (8):39-43.
  37. The Evaluation of Research in Social Sciences and Humanities: Lessons From the Italian Experience.Andrea Bonaccorsi (ed.) - 2018 - Springer Verlag.
    This book examines important issues in research evaluation in the Social Sciences and Humanities. It is based on recent experiences in Italy in the fields of research assessment, peer review, journal classification, and construction of indicators, and presents a systematic review of theoretical issues influencing the evaluation of Social Sciences and Humanities. Several chapters analyse original data made available through research assessment exercises. Other chapters are the result of dedicated and independent (...) carried out in 2014-2015 aimed at addressing some of the debated and open issues, for example in the evaluation of books, the use of Library Catalog Analysis or Google Scholar, the definition of research quality criteria on internationalization, as well as opening the way to innovative indicators. The book is therefore a timely and important contribution to the international debate.
  38. Meta-Research Evidence for Evaluating Therapies.Jonathan Fuller - 2018 - Philosophy of Science 85 (5):767-780.
    The new field of meta-research investigates industry bias, publication bias, contradictions between studies, and other trends in medical research. I argue that its findings should be used as meta-evidence for evaluating therapies. ‘Meta-evidence’ is evidence about the support that direct ‘first-order evidence’ provides the hypothesis. I consider three objections to my proposal: the irrelevance objection, the screening-off objection, and the underdetermination objection. I argue that meta-research evidence works by rationally revising our confidence in first-order evidence and, consequently, (...)
    4 citations
  39. An Evaluation of Journal Quality: The Perspective of Business Ethics Researchers.Robbin Derry - 1996 - Business Ethics Quarterly 6 (3):359-371.
    The subject of journal quality has received little attention in the business ethics literature. While there are reasons for this past neglect, there are important new considerations which make it vital that researchers now address this topic. First, virtually all business school departments use evaluations of journal quality as an important indicator of scholarly achievement, yet business ethics has no such studies. Second, as many schools are beginning to ask ethicists to publish in the wider management literature, it is important (...)
    6 citations
  40. Evaluating evidential pluralism in epidemiology: mechanistic evidence in exposome research.Stefano Canali - 2019 - History and Philosophy of the Life Sciences 41 (1):4.
    In current philosophical discussions on evidence in the medical sciences, epidemiology has been used to exemplify a specific version of evidential pluralism. According to this view, known as the Russo–Williamson Thesis, evidence of both difference-making and mechanisms is produced to make causal claims in the health sciences. In this paper, I present an analysis of data and evidence in epidemiological practice, with a special focus on research on the exposome, and I cast doubt on the extent to which evidential (...)
    10 citations
  41. Framework for evaluation research on clinical ethical case interventions: the role of ethics consultants.Joschka Haltaufderheide, Stephan Nadolny, Jochen Vollmann & Jan Schildmann - 2022 - Journal of Medical Ethics 48 (6):401-406.
    Evaluation of clinical ethical case consultations has been discussed as an important research task in recent decades. A rigid framework of evaluation is essential to improve quality of consultations and, thus, quality of patient care. Different approaches to evaluate those services appropriately and to determine adequate empirical endpoints have been proposed. A key challenge is to provide an answer to the question as to which empirical endpoints—and for what reasons—should be considered when evaluating the quality of a (...)
    9 citations
  42. Public Participation Methods: A Framework for Evaluation.Lynn J. Frewer & Gene Rowe - 2000 - Science, Technology, and Human Values 25 (1):3-29.
    There is a growing call for greater public involvement in establishing science and technology policy, in line with democratic ideals. A variety of public participation procedures exist that aim to consult and involve the public, ranging from the public hearing to the consensus conference. Unfortunately, a general lack of empirical consideration of the quality of these methods arises from confusion as to the appropriate benchmarks for evaluation. Given that the quality of the output of any participation exercise is difficult (...)
    63 citations
  43. Preregistration Does Not Improve the Transparent Evaluation of Severity in Popper’s Philosophy of Science or When Deviations are Allowed.Mark Rubin - manuscript
    One justification for preregistering research hypotheses, methods, and analyses is that it improves the transparent evaluation of the severity of hypothesis tests. In this article, I consider two cases in which preregistration does not improve this evaluation. First, I argue that, although preregistration can facilitate the transparent evaluation of severity in Mayo’s error statistical philosophy of science, it does not facilitate this evaluation in Popper’s theory-centric approach. To illustrate, I show that associated concerns about Type (...)
  44. Research, education, ethics consultation: evaluating a Bioethics Unit in an Oncological Research Hospital.Marta Perin, Elena Turola, Giovanna Artioli, Luca Ghirotto, Massimo Costantini, Morten Magelssen & Ludovica De Panfilis - 2022 - BMC Medical Ethics 23 (1):1-15.
    Background: This study aims to quantitatively and qualitatively evaluate the activities of a Bioethics Unit (BU) 5 years since its implementation (2016–2020). The BU is a research unit providing empirical research on ethical issues related to clinical practice, clinical ethics consultation, and ethical education for health care professionals (HPS). Methods: We performed an explanatory, sequential, mixed-method, observational study, using the subsequent qualitative data to explain the initial quantitative findings. Quantitative data were collected from an internal database and analyzed by descriptive analysis. (...)
    2 citations
  45. Current Emotion Research in Psychophysiology: The Neurobiology of Evaluative Bivalence.Greg J. Norman, Catherine J. Norris, Jackie Gollan, Tiffany A. Ito, Louise C. Hawkley, Jeff T. Larsen, John T. Cacioppo & Gary G. Berntson - 2011 - Emotion Review 3 (3):349-359.
    Evaluative processes have their roots in early evolutionary history, as survival is dependent on an organism’s ability to identify and respond appropriately to positive, rewarding or otherwise salubrious stimuli as well as to negative, noxious, or injurious stimuli. Consequently, evaluative processes are ubiquitous in the animal kingdom and are represented at multiple levels of the nervous system, including the lowest levels of the neuraxis. While evolution has sculpted higher level evaluative systems into complex and sophisticated information-processing networks, they do not (...)
    7 citations
  46. (1 other version)Scientific Evidence: Creating and Evaluating Experimental Instruments and Research Techniques.William Bechtel - 1990 - PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association 1990:559-572.
    The production of evidence for scientific hypotheses and theories often depends upon complex instruments and techniques for employing them. An important epistemological question arises as to how the reliability of these instruments and techniques is assessed. To address that question, this paper examines the introduction of electron microscopy and cell fractionation in cell biology. One important claim is that scientists often arrive at their techniques for employing instruments like the electron microscope and the ultracentrifuge by tinkering and that they evaluate (...)
    1 citation
  47. Critical evaluation of the guidelines of the Finnish Advisory Board on Research Integrity and of their application.Erja Moore & Liisa Räsänen - 2016 - Research Integrity and Peer Review 1 (1).
    We have national guidelines for the responsible conduct of research (RCR) and procedures for handling allegations of misconduct in Finland. The guidelines have been formulated and updated by the Finnish Advisory Board on Research Integrity (TENK). In this article, we introduce and evaluate the national RCR guidelines. We also present statistics of alleged and proven RCR violation cases and frequency of appeals to TENK on the decisions or procedures of the primary institutions. In addition, we analyze the available (...)
    5 citations
  48. The evaluation of scientific research in democratic societies: Kitcher, Rawls and the approach of scientific significant truths.Ignacio Mastroleo - 2011 - Revista Redbioética/UNESCO 2 (4):43-60.
    This paper critically assesses the model of evaluation of scientific research for democratic societies defended by Philip Kitcher. The “significant truth” approach proposes a viable alternative to two classic images of science: that of the “critics”, who believe that science always serves the interests of the powerful and that of the “faithful”, who argue that the pursuit of scientific knowledge is always valuable and necessary. However, the democratic justification of Kitcher’s proposal is not compatible with the ethical problems (...)
  49. A literature review of the research on students’ evaluation of teaching in higher education.Luying Zhao, Pei Xu, Yusi Chen & Shuangsheng Yan - 2022 - Frontiers in Psychology 13.
    Students’ evaluation of teaching is a method of evaluating teaching quality and a tool for evaluating teacher performance commonly used in Chinese and foreign universities, and it remains a contested issue in the field of teaching evaluation. At present, research on students’ evaluation of teaching in higher education is relatively rich, focusing mainly on reliability, validity and their influencing factors, the construction of index systems, and problems in practical application and improvement strategies. The purpose of (...)
  50. Research on the Influencing Factors of Problem-Driven Children’s Deep Learning.Xiao-Hong Zhang & Chun-Yan Li - 2022 - Frontiers in Psychology 13.
    Deep learning is widely used in the fields of information technology and education innovation, but there are few studies of young children at the preschool stage. Therefore, we aimed to explore factors that affect children’s learning ability by collecting relevant information from kindergarten teachers. Literature review, interview, and questionnaire survey methods were used to determine the influencing factors of deep learning. These factors fell into five dimensions: the level of difficulty of academic, communication skills, level of active (...)
Showing 1–50 of 971