The Replication Network

Furthering the Practice of Replication in Economics

Journal of Economic Psychology Calls for Papers for Special Replication Issue

  Posted on 9th April 2018 by replicationnetwork


[From the website of the Journal of Economic Psychology announcing a special issue on “Replications in Economic Psychology and Behavioral Economics”]
“In this special issue, we aim to contribute to ongoing efforts in both disciplines to test the replicability of important findings, but also to tackle theoretical questions such as how to improve replicability, how to conduct proper replications, and how to decide which studies should be replicated in the first place. As such, we invite both empirical and theoretical contributions. Regarding empirical contributions to the special issue, we invite replications of previous findings relevant for economic psychology and/or behavioral or experimental economics. We invite two formats of replication studies: classic submissions based on already existing data and submissions of registered reports. Likewise, we invite two types of theoretical contributions: classic submissions of full manuscripts as well as brief proposals of planned theoretical contributions.”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Journal of Economic Psychology, Journal policies, Psychology, replication

IN THE NEWS: Buzzfeed (April 4, 2018)

  Posted on 9th April 2018 by replicationnetwork


[From the article, “Hundreds of Researchers Are Trying to Replicate High-Profile Psychology Studies” by Stephanie M. Lee in Buzzfeed]
“More than 400 psychologists worldwide are teaming up to fight a looming problem in their field: headline-making research that doesn’t hold up.”
“As part of a new network called the Psychological Science Accelerator, the researchers are trying to fix the so-called replication crisis that’s punctured splashy findings, from Diederik Stapel’s fabricated claims that messy environments lead to discrimination, to Brian Wansink’s retracted studies about eating behavior. …”
“So at the Accelerator, scientists will select a handful of influential studies, attempt to redo them, and share their results with the public, whether or not they’re able to reproduce the original finding.”
“The Accelerator grew out of a blog post that Chartier, an associate psychology professor at Ashland University in Ohio, penned last year. It isn’t the first effort of its kind. In 2015, the Center for Open Science’s Reproducibility Project sought to replicate 100 psychology experiments — and reproduced less than half of the original findings.”
“But while the Reproducibility Project sought to provide an overview of the field, the Accelerator is assigning many researchers to verify just a couple studies.”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Brian Wansink, Christopher Chartier, Diederik Stapel, Media, Psychological Science Accelerator, Psychology, Stephanie M. Lee

Should We Talk About It?

  Posted on 6th April 2018 by replicationnetwork


[From the article “How (and Whether) to Teach Undergraduates About the Replication Crisis in Psychological Science”, recently published by William Chopik, Ryan Bremner, Andrew Defever, and Victor Keller in Teaching of Psychology]
“Over the past 10 years, crises surrounding replication, fraud, and best practices in research methods have dominated discussions in the field of psychology. However, no research exists examining how to communicate these issues to undergraduates and what effect this has on their attitudes toward the field. We developed and validated a 1-hr lecture communicating issues surrounding the replication crisis and current recommendations to increase reproducibility. Pre- and post-lecture surveys suggest that the lecture serves as an excellent pedagogical tool. Following the lecture, students trusted psychological studies slightly less but saw greater similarities between psychology and natural science fields. We discuss challenges for instructors taking the initiative to communicate these issues to undergraduates in an evenhanded way.”

 Category: NEWS & EVENTS      Tags: Psychological Science, Teaching, Undergraduates

ROYNE: Building and Enhancing the Advertising Discipline Through Replication

  Posted on 6th April 2018 by replicationnetwork


[This blog is taken from a recent editorial that appeared in the Journal of Advertising Research entitled “Why We Need More Replication Studies to Keep Empirical Knowledge in Check” by Marla B. Royne. The full-length editorial can be found here]
Advertising research in top advertising journals goes through a rigorous peer-review process to ensure that published articles are of the highest quality. Yet such research is generally published in these top journals only when something novel is reported, because previously reported results are believed to be less interesting and less important. Despite the rarity of replication research in advertising, Nosek et al. (2012) note that replications can help keep existing knowledge in check, yet they argue that academia rewards only novel, positive results. Nonetheless, controversy about conducting and publishing advertising replications remains.
Replications might be best viewed as a process of conducting similar but consecutive studies that increasingly consider alternative explanations, critical contingencies, and real-world relevance. This view is in line with my own work (Royne 2016) and supports the role of replications as a way to reach an ultimate understanding of a particular theory or construct.
The Journal of Advertising’s 2016 special issue on “re-inquiries” in advertising research reinforced this notion, publishing a range of articles that reinvestigated advertising questions. The issue included articles that replicated existing studies either empirically or conceptually; in some cases the replications supported the original work, and in others they produced different results.
For example, Kwon, Ratneshwar, and Kim (2016) attempted an empirical replication of Gwinner and Eaton (1999), which examined the effects of brand sponsorship on image congruence between sponsoring brands and sponsored sporting events. To address potential methodological flaws in the original, the authors enhanced the statistical analyses. The replication supported some of the original findings, including that brand sponsorship increases image congruence between sponsoring brands and sponsored sporting events. However, it found only mild support for the matchup hypothesis and no support for a moderating influence of image-based similarity on the extent of image congruence.
Another article in the same issue, by Bellman, Wooley, and Varan (2016), applied facial-tracking technology in a conceptual replication of Kamins, Marks, and Skinner’s (1991) program–ad matching study. Although this research examined the original study’s program–ad matching effect of informational advertisements on cognitive recall, the replication differed in that it used a mixed experimental design, different genres of television shows, and a biometric process measure (computer-detected smiling). The replication both corroborated and extended the original study, demonstrating that replications need not be limited to just the original results.
A third study, by Davis and Burton (2016), was a hybrid attempt to empirically replicate the findings of Kees et al. (2006), who originally reported that more graphic pictorial cigarette warnings positively affect smoking-cessation intentions and that evoked fear underlies this relationship. This replication also differed from the original: it used cigarette advertisements (rather than packaging and warning statements), FDA-mandated pictures (rather than self-selected pictures), and different samples. The replication partially corroborated Kees et al. (2006), including additional support that more graphic pictorials positively influence perceptions of warning effectiveness and smoking-cessation intentions, and it confirmed evoked fear as the primary underlying mediating mechanism.
These are just three examples of studies that added knowledge and understanding to the advertising literature through “replications” that were not 100% pure repeats of what had been done previously. In short, replication is about much more than just redoing an earlier study; it has vast potential for building and enhancing the advertising discipline.
Marla B. Royne (Stafford) is the Great Oaks Foundation Professor of Marketing at the University of Memphis, USA. She is past President of the American Academy of Advertising and past Editor-in-Chief of the Journal of Advertising, the leading journal in the advertising discipline. Professor Royne Stafford can be contacted at mstaffrd@memphis.edu.
REFERENCES
Bellman, S., B. Wooley, and D. Varan. “Program–Ad Matching and Television Ad Effectiveness: A Reinquiry Using Facial Tracking Software.” Journal of Advertising 45, 1 (2016): 72–77.
Davis, C., and S. Burton. “Understanding Graphic Pictorial Warnings in Advertising: A Replication and Extension.” Journal of Advertising 45, 1 (2016): 33–42.
Gwinner, K. P., and J. Eaton. “Building Brand Image through Event Sponsorship: The Role of Image Transfer.” Journal of Advertising 28, 4 (1999): 47–57.
Kamins, M. A., L. J. Marks, and D. Skinner. “Television Commercial Evaluation in the Context of Program-Induced Mood: Congruency versus Consistency Effects.” Journal of Advertising 20, 2 (1991): 1–14.
Kees, J., S. Burton, J. C. Andrews, and J. Kozup. “Tests of Graphic Visuals and Cigarette Package Warning Combinations: Implications for the Framework Convention on Tobacco Control.” Journal of Public Policy and Marketing 25, 2 (2006): 212–23.
Kwon, E., S. Ratneshwar, and E. Kim. “Brand Image Congruence Through Sponsorship of Sporting Events: A Reinquiry of Gwinner and Eaton (1999).” Journal of Advertising 45, 1 (2016): 130–38.
Nosek, B. A., J. R. Spies, and M. Motyl. “Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth over Publishability.” Perspectives on Psychological Science 7, 6 (2012): 615–31.
Park, J. H., O. Venger, D. Y. Park, and L. N. Reid. “Replication in Advertising Research, 1980–2012: A Longitudinal Analysis of Leading Advertising Journals.” Journal of Current Issues and Research in Advertising 36 (2015): 115–35.
Royne (Stafford), M. “Research and Publishing in the Journal of Advertising: Making Theory Relevant.” Journal of Advertising 45, 2 (2016): 269–73.

 Category: GUEST BLOGS      Tags: Advertising, Journal of Advertising, Journal policies, replication

Things Aren’t Looking That Great in Ecology and Evolution Either

  Posted on 30th March 2018 by replicationnetwork


[From a recent working paper entitled “Questionable Research Practices in Ecology and Evolution” by Hannah Fraser, Tim Parker, Shinichi Nakagawa, Ashley Barnett, and Fiona Fidler]
“We surveyed 807 researchers (494 ecologists and 313 evolutionary biologists) about their use of Questionable Research Practices (QRPs), including cherry picking statistically significant results, p hacking, and hypothesising after the results are known (HARKing). We also asked them to estimate the proportion of their colleagues that use each of these QRPs. Several of the QRPs were prevalent within the ecology and evolution research community. Across the two groups, we found 64% of surveyed researchers reported they had at least once failed to report results because they were not statistically significant (cherry picking); 42% had collected more data after inspecting whether results were statistically significant (a form of p hacking) and 51% had reported an unexpected finding as though it had been hypothesised from the start (HARKing).”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Cherry-Picking, Ecology, Evolution, HARKing, p-hacking, Questionable Research Practices

A Summary of Proposals to Improve Statistical Inference

  Posted on 30th March 2018 by replicationnetwork


In a recent comment published in the Journal of the American Medical Association, John Ioannidis provided the following summary of proposals (see table below). The summary, and his brief commentary, may be of interest to readers of TRN. 

[Table from Ioannidis (2018) summarizing proposals to improve statistical inference — image not reproduced]

Source: Ioannidis JPA. The Proposal to Lower P Value Thresholds to .005. JAMA. Published online March 22, 2018. doi:10.1001/jama.2018.1536

 Category: NEWS & EVENTS      Tags: JAMA, John Ioannidis, p-values, Statistical practice

Review of Development Finance Calling for Replication Studies

  Posted on 24th March 2018 by replicationnetwork


[From the website of the journal, Review of Development Finance]
“In addition to its primary scope of publishing original research articles, Review of Development Finance would like to make it worth your time to submit replication studies. First, we’re issuing calls for papers like this one. We are also publishing special issues that highlight replication studies to emphasize that we want to publish these types of papers and to correct for the natural advantage a highly novel claim has in capturing attention. We are also tagging replication studies on Science Direct so that they can more easily be found.”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Journal policies, publishing replications, replications, Review of Development Finance

Reproducibility crisis? What reproducibility crisis?

  Posted on 18th March 2018 by replicationnetwork


[From the opinion article, “Is science really facing a reproducibility crisis, and do we need it to?” by Daniele Fanelli, published in Proceedings of the National Academy of Sciences (PNAS)]
“Efforts to improve the reproducibility and integrity of science are typically justified by a narrative of crisis, according to which most published results are unreliable due to growing problems with research and publication practices. This article provides an overview of recent evidence suggesting that this narrative is mistaken, and argues that a narrative of epochal changes and empowerment of scientists would be more accurate, inspiring, and compelling.”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Daniele Fanelli, PNAS, Reproducibility crisis

National Science Foundation Now Funding Replication Studies. In Neuroscience.

  Posted on 17th March 2018 by replicationnetwork


[From the post “Dear Colleague Letter: Achieving New Insights through Replicability and Reproducibility” published at nsf.gov]
“The National Science Foundation’s (NSF) Directorate for Social, Behavioral and Economic Sciences (SBE) encourages submission of proposals that target reproducibility and replicability efforts in data-intensive domains and that specifically rely on analysis of neuroimaging or neuroelectric data … Research questions should fall within the content domain of any of the following NSF programs: Cognitive Neuroscience, Perception, Action, and Cognition, and Science of Learning. NSF expects that these activities will aid in verification of prior findings, disambiguate among alternative hypotheses and serve to build a community of practice that engages in thoughtful reproducibility and replicability efforts. The suggested research must demonstrate clear potential for generating new scientific advances and discoveries, beyond simply rejecting or corroborating prior findings.”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Grants, National Science Foundation, Neuroscience, NSF, replication, United States

Postdiction, Prediction, and Preregistration

  Posted on 16th March 2018 by replicationnetwork


[From the article, “The preregistration revolution” by Brian Nosek, Charles Ebersole, Alexander DeHaven, and David Mellor, published in Proceedings of the National Academy of Sciences (PNAS)]
“Sometimes researchers use existing observations of nature to generate ideas about how the world works. This is called post-diction. Other times, researchers have an idea about how the world works and make new observations to test whether that idea is a reasonable explanation. This is called prediction. To make confident inferences, it is important to know which is which. Preregistration solves this challenge by requiring researchers to state how they will analyze the data before they observe it, allowing them to confront a prediction with the possibility of being wrong. Preregistration improves the interpretability and credibility of research findings.”
To read more, click here.

 Category: NEWS & EVENTS      Tags: Brian Nosek, PNAS, Postdiction, Pre-registration, Prediction
