Defaults versus framing: Revisiting the default effect and the framing effect with a replication and extension of Johnson and Goldstein (2003) and Johnson, Bellman, and Lohse (2002)

People tend to stick with a default option instead of switching to another option. For instance, Johnson and Goldstein (2003) found a default effect in an organ donation scenario: when organ donation is the default option, people are more inclined to consent to it. Johnson et al. (2002) found a similar default effect in health-survey scenarios: when receiving more information about one's health is the default, people are more inclined to consent to it. Much of the highly cited, impactful work on these default effects, however, has not been replicated in well-powered samples. In two well-powered samples (N = 1920), we conducted close replications of the default effect in Johnson and Goldstein (2003) and in Johnson et al. (2002). We successfully replicated Johnson and Goldstein (2003). In an extension of the original findings, we also show that default effects are unaffected by the permanence of these selections. We failed, however, to replicate the findings of Johnson et al. (2002): we did not find evidence for a default effect. We did find a framing effect: participants who read a positively framed scenario consented to receive health-related information at a higher rate than participants who read a negatively framed scenario. We also conducted a conceptual replication of Johnson et al. (2002) based on an organ-donation scenario; in this scenario we found a default effect, together with a framing effect in the direction opposite to that of the direct replication. Our results suggest that default effects depend on framing and context. Materials, data, and code are available at https://osf.io/8wd2b/.

Suppose that people receive a health survey after a doctor's appointment asking whether they would like to receive health updates from their doctors. If the option to participate is preselected, people would probably not change their response, instead sticking with the default option and participating in the service. This is an example of the default effect: given a default option, people stick with it rather than changing it (Johnson and Goldstein, 2003; Johnson et al., 1993).
The framing of the options may also affect people's choices. In this example, people would be more inclined to select an option if it is framed positively, as in answering "Yes" to "I will participate," as opposed to negatively, as in selecting "No" to "I will not participate." This is an example of a framing effect: people consent to participate at a higher rate when a choice is positively framed than when it is negatively framed (Johnson and Goldstein, 2003).
Default effects and framing effects have been highly influential across many academic disciplines and in public policy (Araña and León, 2013; Evans et al., 2011; Johnson and Goldstein, 2003; Mintz and Redd, 2003; Tversky and Kahneman, 1981). Defaults are a well-known, effective example of leveraging behavioral insights to influence people, or to nudge them toward specific socially desirable choices. Governments and public-policy organizations worldwide have set up Nudge Units that implement interventions using default effects to encourage desired behaviors such as organ donation and pension savings (Halpern, 2015).
There is, however, some evidence that the size of nudge effects has been overestimated. For instance, DellaVigna and Linos (2022) recently found that effect sizes for nudge interventions reported in the published literature were larger than those reported by Nudge Units in the United States. This finding suggests that selective reporting may lead to inflated meta-analytic effect sizes (Kvarven et al., 2020). Moreover, in some cases, nudge effects did not replicate with larger samples (Bohner and Schlüter, 2014; Kettle et al., 2017; Kristal et al., 2020).
Given these recent findings, there is reason to reexamine default effects and framing effects. Despite a substantial number of experimental studies on default effects, for instance, very few employed preregistered analysis plans with well-powered samples (Szaszi et al., 2018). Together, these issues may lead to misplaced optimism about easy-to-implement nudging interventions while much more complex solutions involving structural reforms are ignored (Schmidt and Engelen, 2020). As such, researchers have called for more preregistered replications using well-powered samples (Ferguson and Heene, 2012; Franco et al., 2014).
In the current research, we sought to revisit and reassess classic findings on default and framing effects through preregistered, high-powered replications and extensions of two impactful studies: Johnson and Goldstein (2003) and Johnson et al. (2002). Johnson and Goldstein (2003) provided an early demonstration of default effects, finding that people were more likely to register as organ donors when the default option was to register. Johnson et al. (2002) contrasted default effects against framing effects and found that default effects prevailed: framing did not change the participants' tendency to select the default over alternatives.

Default Effect
Early demonstrations of the default effect came from auto-insurance choices in New Jersey and Pennsylvania, when each state had a different policy regarding the right to sue for damages in auto accidents (Johnson et al., 1993). New Jersey residents had low insurance premiums yet could acquire an additional right to sue at an additional cost. Pennsylvania residents by default had the right to sue, but they could opt out of this right and pay a lower insurance premium. Johnson et al. (1993) found that 75% of Pennsylvania auto-insurance consumers paid the higher premium and retained their right to sue, whereas only 20% of New Jersey auto-insurance consumers actively chose to pay the additional premium and obtain the right to sue. Researchers have since found support for the default effect in a variety of contexts related to health, retirement saving, organ donation, sustainability, insurance coverage, electricity consumption, charitable giving, and many other decision-making domains (Abadie and Gay, 2006; Benartzi and Thaler, 1999; Cronqvist and Thaler, 2004; Ebeling, 2013; Jachimowicz et al., 2019; Madrian and Shea, 2001; Shealy and Klotz, 2015). While a few studies failed to support default effects (Abhyankar et al., 2014; Everett et al., 2015; Keller et al., 2011; Reiter et al., 2012), a recent meta-analysis noted substantial variation in the efficacy of default effects: for instance, defaults were more effective in consumer domains and less effective in environmental domains (Jachimowicz et al., 2019).

Framing Effects
People's decisions are also influenced by the way a decision scenario is framed, whether through different wordings, settings, or situations (Brewer and Kramer, 1986; De Martino et al., 2006; Fagley and Miller, 1987; Gamliel and Kreiner, 2013; Huber et al., 1987; Kramer, 1989; Kühberger, 1998; Levin and Gaeth, 1988; Piñon and Gambara, 2005; Puto, 1987; Rothman and Salovey, 1997). Johnson et al. (2002) tested the action framing of a decision by manipulating whether participants were asked to select (positive frame) or reject (negative frame) an option. Participation rates in the positively framed condition were higher than in the negatively framed condition; in this case, the positive or negative framing greatly influenced people's decisions. These findings are consistent with the view that the positive dimensions of a choice are weighted more when selecting an option, whereas the negative dimensions are weighted more when rejecting an option (Shafir et al., 1993).

Present research
We selected Johnson and Goldstein (2003) and Johnson et al. (2002) as our replication targets for three reasons: each is foundational, highly influential in academia (Kahneman, 2003; Kruglanski and Gigerenzer, 2011; Weber and Johnson, 2009), and highly influential in policy making. Johnson and Goldstein (2003) were the first to demonstrate the use of defaults in an organ donation scenario, and at the time of writing this article the paper has been cited more than 2000 times. In the original study, the experimenters varied whether donor or non-donor status was the default option. Organ donation rates were higher when the default option was to donate (82%) than when the default option was to not donate (42%). These findings have influenced public policy decisions; Argentina (Nacion, 2005), Uruguay (Trujillo, 2013), Chile (Zúñiga-Fajuri, 2015), England (English et al., 2019), and Wales (Griffiths, 2013; Madden et al., 2020) have adopted default organ-donor status policies. Organ donation statistics from Organization for Economic Cooperation and Development (OECD) countries show that, on average, organ donation rates are higher in countries where the default option is to donate (opt-out systems) than in countries where the default option is not to donate (opt-in systems) (Li and Nikolka, 2016).
To the best of our knowledge, Johnson et al. (2002) were the first to investigate the interaction of the framing of action (which we refer to here as an action framing effect) and default effects in people's decisions. In the original study, the researchers asked participants whether they would like to be notified about future health surveys after they completed an online health questionnaire (Johnson et al., 2002). The experimenters varied whether the default selection was to receive these future notifications, not to receive them, or neither. They also varied whether the options were framed positively ("Notify Me") or negatively ("Do Not Notify Me"). Consistent with the default effect, participants were more inclined to be notified when participation was the default. Although the framing manipulation was not a significant predictor of participants' decisions to receive future notifications, the pattern of responses showed that participants in the positive framing conditions consented to receive health-related information at a higher rate than participants in the negative framing conditions (Johnson et al., 2002).
We embarked on direct, well-powered replications of these two classic findings with two primary goals. First, we sought to revisit and reexamine the robustness of the basic default effect reported in the well-known organ-donation decision scenario of Johnson and Goldstein (2003). Second, to build on these findings, we sought to contrast default and framing effects, replicating and extending the design used in Johnson et al. (2002).

Effect Sizes in target articles
The chosen target studies did not report effect sizes. We reanalyzed the data and conducted logistic regression analyses to calculate odds ratios, with 95% confidence intervals for the regression coefficients, as a measure of effect size. The effect sizes of the original studies are summarized in Table 9 (for detailed results, see Table S7 and Table S8 in the supplementary materials).
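As an illustration of this kind of reanalysis, an odds ratio and its Wald 95% confidence interval can be computed directly from a 2 × 2 table of consent counts; exponentiating the coefficient of a binary predictor in a simple logistic regression gives the same quantity. This is a minimal Python sketch; the cell counts below are illustrative assumptions (the rates mirror those reported by Johnson and Goldstein, 2003), not the original data.

```python
import math

def odds_ratio_ci(yes_t, no_t, yes_c, no_c, z=1.96):
    """Odds ratio (treatment vs. control) with a Wald 95% CI.
    Equivalent to exponentiating the slope of a logistic
    regression with a single binary predictor."""
    or_ = (yes_t / no_t) / (yes_c / no_c)
    se = math.sqrt(1/yes_t + 1/no_t + 1/yes_c + 1/no_c)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 82 of 100 consent under Opt-Out, 42 of 100 under Opt-In.
or_, lo, hi = odds_ratio_ci(82, 18, 42, 58)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI that excludes 1 indicates a detectable difference in odds between the two conditions.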

Extensions
In addition to the direct replications, we performed two extensions. First, we investigated whether the permanence of the decision affects default effects. In particular, half the study participants were told that their organ-donation decision was valid for three years, and the other half were given no additional information about the permanence of their decision. We based this extension on van Dalen and Henkens (2014), who found that organ donation rates were higher when the default to donate was temporary and would have to be renewed than when it was permanent. Based on these results, we investigated the presumed permanence (temporary vs. permanent) of consent in the Johnson and Goldstein (2003) scenario. In line with previous work, we predicted a higher organ-donation participation rate when the choice was framed as temporary (i.e., the decision would be revisited after three years) rather than permanent. Second, we added a conceptual replication of Johnson et al. (2002): we applied their experimental design involving framing and default effects to the organ-donation scenario in Johnson and Goldstein (2003). This replication was meant to further test the generalizability of their findings regarding the interaction of default and framing effects.

Process
We crowdsourced the replication and extension effort using two teams of two authors each, supervised by two additional experienced authors. Each team worked independently to conduct an in-depth analysis of the target articles and wrote a detailed preregistration aiming for a very close replication with added extensions. Data collection was then conducted separately for each team using a different sample. Thus, the two data collections tested two independent extensions: the effect of choice permanence (Sample 1) and the conceptual replication of Experiment 2 of Johnson et al. (2002) (Sample 2).

Pre-registrations and open data/code
In both data collections, we first preregistered the experiment on the Open Science Framework (OSF), and data collection was launched after registration. Preregistrations, disclosures, power analyses, and all materials are available in the Supplementary Materials. These, together with datasets and analysis code, were made available on the OSF at https://osf.io/8wd2b/. All measures, manipulations, and exclusions for this investigation are reported, and data collection was completed before analyses. Pre-registrations are available on the OSF: Group A - https://osf.io/mhwbe/, Group B - https://osf.io/j4rpc/.

Participants and power analysis
The present investigation includes two simultaneously collected samples. For both samples, we recruited participants from the United States via the CloudResearch platform running on Amazon Mechanical Turk. Participants could take part in only one of the two samples.
Power analyses across Group A and Group B suggested a sample size of 232. However, we note inconsistencies in the power-analysis details reported in the preregistrations of Groups A and B. A rectified power analysis based on the original studies' results indicated that a total sample of 156 participants was sufficient to obtain 95% power (at α = .05) to detect the smallest effect reported among the original studies (OR = 1.86). Please refer to the supplementary materials for more details on the power analysis.
Since our replication study also involved additional extension hypotheses across two samples, we recruited 1004 and 1007 participants across the two replication teams, respectively. Additionally, a post-hoc power sensitivity analysis showed that an intended sample size of 2000 participants achieves 96.93% power (at α = .05) to detect a small effect size (OR = 1.50). We therefore combined the two samples for the data analysis, for a total of 2011 participants. Following the preregistered exclusion criteria, we excluded 91 participants based on English proficiency, self-reported seriousness, knowledge of the hypothesis, survey completion, and place of residence (see supplementary materials for details). Data were analyzed from the remaining 1920 participants (N1 = 954; N2 = 966; Mage = 38, SD = 11.85; 52% female).
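For readers who wish to probe such a sensitivity analysis, the power of a two-sided two-proportion z-test to detect a given odds ratio can be approximated as follows. The baseline consent rate (50% here) and equal group sizes are assumptions for illustration, so the resulting figure will not exactly match the 96.93% reported above, which depends on the paper's own inputs.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_two_proportions(p1, odds_ratio, n_per_group, z_alpha=1.96):
    """Approximate power to detect a given odds ratio against
    an assumed baseline rate p1 (two-sided alpha = .05)."""
    odds2 = odds_ratio * p1 / (1 - p1)
    p2 = odds2 / (1 + odds2)  # rate implied by the target odds ratio
    se = math.sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    return norm_cdf(abs(p2 - p1) / se - z_alpha)

# OR = 1.50, 1000 participants per group, assumed 50% baseline consent rate
print(round(power_two_proportions(0.5, 1.5, 1000), 3))
```

With these assumptions the approximation also lands above 95% power, consistent in spirit with the sensitivity analysis reported here.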

Materials and Procedure
The procedure involved two parts. In the first part, participants read an organ donation scenario from Johnson and Goldstein (2003). In the second part, participants responded to the scenario from Experiment 2 of Johnson et al. (2002). After completing both parts of the survey, participants provided their demographic information and were debriefed. We provide a comparison of the target article samples and the replication samples in Table S2. Participants in Sample 1 were part of the choice-permanence extension; accordingly, Sample 1 participants in the first part of the experiment were randomly assigned to one of two between-participants conditions: the direct replication of Johnson and Goldstein (2003) or the temporary organ-donation extension condition.

Part 1: Organ Donation
In Part 1, participants were randomly assigned to 1 of 3 between-participants conditions (default: Opt-In vs. Opt-Out vs. No-Default). Table 1 documents the display format of the choices across experimental conditions. For example, participants in the Opt-Out condition read: "Imagine that you have just moved to a new state and are currently filling out the required online registration forms when you are asked to indicate your organ donor status. The default in this state is that you ARE automatically enrolled to be an organ donor. You are given the choice of whether to confirm or to change this status. Please select an option." After reading the passage, participants had to choose either "Yes - I want to be an organ donor" or "No - I do not want to be an organ donor." In the Opt-Out condition, the "Yes" option was pre-selected, so participants who consented to organ donation just had to click "Next" at the bottom of the page, whereas participants who did not wish to be an organ donor had to select "No" before clicking "Next." In the Opt-In condition, the "No" option was pre-selected, so participants who consented to organ donation had to select "Yes" before proceeding, whereas participants who did not wish to be an organ donor just had to click "Next." In the No-Default condition, participants read: "Assume you moved to a new state, therefore, you need to select enrollment as an organ donor. Please choose your preferred organ donor status:" Participants in this condition saw the same binary response options without a pre-selected default, so they had to actively select "Yes" or "No" before clicking "Next" to proceed. After completing Part 1, participants moved on to Part 2.

Part 2: Survey Subscription
In Part 2, participants were randomly assigned to 1 of 6 conditions in a 2 (framing: Positive vs. Negative) × 3 (default option: Opt-In vs. Opt-Out vs. No-Default) between-participants design (see Table 2 for details). At the beginning of Part 2, participants read the following instruction: "Typically, regardless of your organ donor decision, the state online systems ask you to answer a number of health questions. Please answer the following. All the data will be kept completely confidential."

Table 1
Study stimuli for the direct replication of Johnson and Goldstein (2003)

[Introduction for participants in Opt-Out/Opt-In conditions]: Imagine that you have just moved to a new state and are currently filling out the required online registration forms when you are asked to indicate your organ donor status. The default in this state is that you ARE automatically enrolled to be an organ donor. You are given the choice of whether to confirm or to change this status. Please select an option

[Opt-out]:
Assume you moved to a new state in which the default is that you are an organ donor, you are therefore by default enrolled as an organ donor. Please choose your preferred organ donor status:

Yes-I want to be an organ donor
No-I do not want to be an organ donor

[Opt-in]:
Assume you moved to a new state in which the default is that you are not an organ donor, you are therefore by default not enrolled as an organ donor. Please choose your preferred organ donor status:

Yes-I want to be an organ donor
No-I do not want to be an organ donor

[No-default]:
Assume you moved to a new state, therefore, you need to select enrollment as an organ donor. Please choose your preferred organ donor status:

Yes-I want to be an organ donor
No-I do not want to be an organ donor

Participants then answered four generic questions about their general health (for details, see Table S4 in the supplementary section). Participants then read: "You are almost at the end of the survey. Thank you for taking part. Would you be interested in being notified about other policy/health-related surveys? (If yes, we will contact you through MTurk using your MTurk worker ID)" Participants answered by selecting "Yes" or "No." Each condition had a positive ("Notify me about more health surveys.") or negative ("Do NOT notify me about more health surveys.") framing. In positively framed Opt-Out conditions, the "Yes" response was pre-selected; in positively framed Opt-In conditions, the "No" response was pre-selected; in negatively framed Opt-Out conditions, the "No" response was pre-selected; and in negatively framed Opt-In conditions, the "Yes" response was pre-selected.

Extensions
Extension 1: The effect of choice permanence. Participants in Sample 1 were part of the choice-permanence extension. As such, participants in Sample 1 were randomly assigned to one of two between-participants conditions (temporary or permanent). Participants assigned to the temporary conditions took the same survey as those in the permanent conditions, except that they received the following additional instruction at the beginning of Part 1 of the study: "Please note: Your organ donor authorization, if granted, would be for 3 years only, meaning that after 3 years you will be asked to reconfirm your organ donor decision." Participants in the permanent conditions received no additional instructions.
All the participants in Sample 2 took part in a different extension. Immediately after completing Part 1 of the study, but just before Part 2, participants read the following instructions (see Table 3 for details): "Would you like to receive further information about organ donation through MTurk? If you indicate your approval, we'll contact you through MTurk using your worker ID with further information about organ donation." These participants were randomly assigned to 1 of 6 conditions in a 2 (framing: Positive vs. Negative) × 3 (default option: Opt-Out vs. Opt-In vs. No-Default) between-participants design (for details, see Table S6 in the supplementary section). After reading the above instruction, participants selected "Yes" or "No" in response to a question asking for consent to receive further information on organ donation. Each of the default conditions involved either a positive ("Send me more information about organ donation") or a negative ("Do NOT send me more information about organ donation") framing. The responses were pre-selected in the Opt-In and Opt-Out default conditions, mirroring the experimental design of Experiment 2 of Johnson et al. (2002). In positively framed Opt-Out conditions, the "Yes" response was pre-selected; in positively framed Opt-In conditions, the "No" response was pre-selected; in negatively framed Opt-Out conditions, the "No" response was pre-selected; and in negatively framed Opt-In conditions, the "Yes" response was pre-selected.

Table 2
Study stimuli for the direct replication of Johnson et al. (2002)

[Introduction]: Typically, regardless of your organ donor decision, the state online systems ask you to answer a number of health questions. Please answer the following. All the data will be kept completely confidential.
You are almost at the end of the survey. Thank you for taking part. Would you be interested in being notified about other policy/health-related surveys? (If yes, we will contact you through MTurk using your MTurk worker ID)

Data Transformations
Both Part 1 and Part 2 of the replication study collected participants' responses in a binary format (Yes/No). In the organ donation scenario, we coded the answer "Yes" (i.e., consenting to donate organs) as 1 and the answer "No" (i.e., declining to donate organs) as 0. In Part 2, responses indicating consent to participate were coded as 1, and non-participation as 0. Coding of the answers for the choice-permanence extension and the conceptual replication of Johnson et al. (2002) followed the same response-coding procedures as Part 1 and Part 2, respectively. Consistent with previous literature, we define the default effect as the difference in participation rates between the Opt-Out and Opt-In conditions. We then calculated the odds ratio for the regression coefficients, with a 95% confidence interval, as a measure of effect size.

Analysis
Data were analyzed using R. Data were fit to logistic regression models using the glm function (with binomial("logit") as the family). For Part 1, we report the results of a two-sample test for the equality of proportions (comparing participation rates across defaults). We analyzed the data for Part 2 with a 2 × 3 binomial logistic regression, with framing (Positive vs. Negative), default (Opt-In vs. Opt-Out vs. No-Default), and their interaction as predictors of the respondent's decision to participate (1 = Yes; 0 = No). To test the effect of choice permanence on organ donation rates, we conducted chi-square tests comparing participation rates across the temporary and permanent conditions. For the second extension, we conducted the same analysis as for Part 2 of the study.
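The original analyses were run in R with glm; as a language-agnostic illustration of the same logit model, the coefficients can be obtained with a short Newton-Raphson (IRLS) loop. This sketch uses toy data with a single binary default predictor (an assumption for illustration, not the study data) and shows that the exponentiated slope recovers the sample odds ratio.

```python
import math

def fit_logistic(x, y, iters=25):
    """Fit y ~ 1 + x by Newton-Raphson; a minimal stand-in for
    R's glm(y ~ x, family = binomial("logit")). Returns (b0, b1)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(b0 + b1 * xi)))  # predicted probability
            w = p * (1 - p)                           # IRLS weight
            g0 += yi - p                              # gradient, intercept
            g1 += (yi - p) * xi                       # gradient, slope
            h00 += w
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01                   # Hessian determinant
        b0 += (h11 * g0 - h01 * g1) / det             # Newton step
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Toy data: 60/100 consent when opted out (x = 1), 40/100 when opted in (x = 0).
x = [1] * 100 + [0] * 100
y = [1] * 60 + [0] * 40 + [1] * 40 + [0] * 60
b0, b1 = fit_logistic(x, y)
print(round(math.exp(b1), 3))  # odds ratio implied by the model: 2.25
```

With a single binary predictor, exp(b1) equals the empirical odds ratio, (60/40)/(40/60) = 2.25, which is why the regression coefficients can serve directly as effect-size measures.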

Replication evaluation
We evaluated the findings of our replications using the criteria set by LeBel et al. (2019) (see Table S15 and Figure S2 in the supplementary materials). Table 4 provides a classification of the replications using these criteria. We summarize the present replications as very close.

Results
We provide a summary of the findings in Table 9 and present complete descriptive statistics across the two samples in Table 5 (also see Table S10 in the supplementary materials).

Table 3
Study stimuli for the conceptual replication of Johnson et al. (2002)

[Introduction]: Would you like to receive further information about organ donation through MTurk? If you indicate your approval, we'll contact you through MTurk using your worker ID with further information about organ donation.

[Positive frame, Opt-out]:
Send me more information about organ donation.

[Positive frame, Opt-in]:
Send me more information about organ donation.

[Positive frame, No-default]:
Send me more information about organ donation.

[Negative frame, Opt-out]:
Do NOT send me more information about organ donation.

[Negative frame, Opt-in]:
Do NOT send me more information about organ donation.

[Negative frame, No-default]:
Do NOT send me more information about organ donation.

Part 1: Replication of Johnson and Goldstein (2003)
Consistent with the original findings, participants in the Opt-Out condition consented to organ donation at a higher rate than participants in the Opt-In condition (see Figure 1). This result was consistent across both samples (see Table S11 for complete results). Participants in the No-Default condition also consented to organ donation at a higher rate (69.7%) than participants in the Opt-In condition (62.5%) (χ²(1) = 5.31, p = .021, OR = 1.38, 95% CI [1.06, 1.80]), with slight deviations between the two samples (see Table S11 for complete results). See also Table 6 for the results based on logistic regression.
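The reported odds ratio can be checked directly from the two consent rates; a quick arithmetic sketch:

```python
# No-Default (69.7%) vs. Opt-In (62.5%): odds ratio from the reported rates
no_default, opt_in = 0.697, 0.625
odds_nd = no_default / (1 - no_default)   # ≈ 2.30
odds_oi = opt_in / (1 - opt_in)           # = 5/3 ≈ 1.67
print(round(odds_nd / odds_oi, 2))        # ≈ 1.38, matching the reported OR
```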

Part 2: Replication of Johnson, Bellman, and Lohse (2002)
We present the results of the regression analysis in Table 7 (Figure 2), and descriptive statistics in Table S9 in the supplementary section.

Default effects
We failed to find support for differences in rates of consent to receive health-related information between the Opt-Out condition (60.5%) and the Opt-In condition (61.1%) (b = -0.29, p = .095, OR = 0.75, 95% CI [0.53, 1.05]); that is, we found no support for a default effect. This result was consistent across both samples (see Table S11 for complete results). Participants in the No-Default condition (59.8%), moreover, consented to receive health-related information at a lower rate than participants in the Opt-In condition (61.1%) (b = -0.41, p = .021, OR = 0.67, 95% CI [0.47, 0.94]).

Figure 1
Results of the direct replication of Johnson and Goldstein (2003). Percentage of participants consenting to organ donation by condition across both samples.

Exploratory: Default effects as a function of frames
We proceeded to conduct additional exploratory (not preregistered) analyses examining the interaction between framing and defaults. We found support for an interaction (see Table 7). We considered two interaction contrasts: (1) (Positive - Negative) × (No-Default - Opt-In); (2) (Positive - Negative) × (Opt-Out - Opt-In).
For the (Positive - Negative) × (No-Default - Opt-In) interaction, we examined consent rates in the No-Default and Opt-In conditions across the positive and negative frames (b = 1.01, p = .003, OR = 2.76, 95% CI [1.43, 5.32]). Within the positive framing conditions, participants in the No-Default condition (93.4%) consented to receive health-related information at a higher rate than participants in the Opt-In condition (88.6%) (see Table S12 and Table S13 in the supplementary materials). The pattern was reversed in the negative framing conditions: participants in the No-Default condition (25.3%) consented to receive health-related information at a lower rate than participants in the Opt-In condition (33.6%).
The results were similar for the (Positive - Negative) × (Opt-Out - Opt-In) interaction (b = 0.85, p = .010, OR = 2.35, 95% CI [1.23, 4.49]). Within the positive framing conditions, participants in the Opt-Out condition (93.1%) consented to receive health-related information at a higher rate than participants in the Opt-In condition (88.6%). The pattern was reversed in the negative framing conditions: participants in the Opt-Out condition (27.6%) consented to receive health-related information at a lower rate than participants in the Opt-In condition (33.6%).

Extension 2: Conceptual replication of Experiment 2 of Johnson et al. (2002)
We summarize the regression analysis in Table 8 and Figure 4; descriptive statistics are provided in Table S14 in the supplementary materials.

Summary of replication findings
We replicated the default effect from Johnson and Goldstein (2003): in Part 1 of our study, participants consented to donate their organs at a higher rate when they had to opt out than when they had to opt in. We failed, however, to replicate the default effect in Johnson et al. (2002): in Part 2 of our study, we found no evidence that consent to be notified about health-related surveys differed between the Opt-Out and Opt-In conditions. Furthermore, we found that participants in positively framed scenarios consented to receive health-related information at a higher rate than participants in negatively framed scenarios. This result deviated from the findings of Johnson et al. (2002), who reported no framing effects.
We followed LeBel et al.'s (2019) framework for the evaluation of our replication using three factors: (a) whether a signal was detected (i.e., the confidence interval for the replication Effect Size (ES) excludes one), (b) consistency of the replication ES with the original study's ES, and (c) precision of the replication's ES estimate (see Figure S2 in the supplementary material). We summarized our evaluations of the replications' findings based on LeBel et al.'s (2019) replication evaluation framework in Table 9 (see Figure 5).
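The first of these criteria is mechanical; a minimal sketch covering only the signal criterion (not the consistency or precision criteria), applied to confidence intervals reported above:

```python
def signal_detected(or_ci_low, or_ci_high):
    """LeBel et al. (2019) 'signal' criterion for an odds-ratio effect:
    the replication's 95% CI excludes the null value of 1."""
    return not (or_ci_low <= 1.0 <= or_ci_high)

# CIs reported in this paper:
print(signal_detected(1.06, 1.80))  # No-Default vs. Opt-In, Part 1: True
print(signal_detected(0.53, 1.05))  # Opt-Out vs. Opt-In, Part 2: False
```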

Extensions: Summary of findings
In the first extension, we predicted that people would be more inclined to become donors when consent to organ donation is temporary. We found no evidence that consent varied between the temporary and permanent conditions.
In the second extension, we conducted a conceptual replication of Experiment 2 of Johnson et al. (2002) using the scenario from Part 1, in which participants consented to receive additional information about organ donation. We found support for the default effect: participants who had to opt out consented at higher rates than those who had to opt in. Deviating from the original study, which found no support for framing effects, we found that participants in positively framed scenarios consented to receive health-related information at a lower rate than participants in negatively framed scenarios. The framing effect in this extension is opposite in direction to the one found in our direct replication of the original study scenario (Johnson et al., 2002).

Summary of findings of Johnson et al. (2002) across original, direct replication, and conceptual replication studies
The findings across the direct and conceptual replications of Johnson et al. (2002) provide mixed support for the original study's assertions. We summarize the comparison of the findings in Table 10. Both the direct and the conceptual replication failed to find differences in consent rates between the No-Default condition and the Opt-In condition. Only the conceptual replication found that consent rates were higher in the Opt-Out condition than in the Opt-In condition. Regarding framing effects, we expected participants in the positive framing condition to consent at higher rates than participants in the negative framing condition. While the original study did not find this, our direct replication found that consent rates were higher in the positive-frame condition than in the negative-frame condition. In our conceptual replication, however, we found a framing effect in the opposite direction.

Table 7
Summary of the replication results of Part 1 (Johnson and Goldstein, 2003)
Note. Estimates represent the odds of the dependent variable = "1" vs. "0". Standard errors are reported in brackets.
Note. Estimates represent the odds of the dependent variable = "1" vs. "0"; N = 966. Standard errors are reported in brackets.

Table 10
Summary of the findings of Johnson et al. (2002)
Note. The Directionality dimension summarizes the directional consistency of results across default effects and framing effects. Predicted directionality of framing effects: participants' consent rates are higher in the positive-frame condition than in the negative-frame condition. Predicted directionality of default effects: consent rates are higher in the Opt-Out and No-Default conditions than in the Opt-In condition. Signal indicates support for the hypothesis using null hypothesis significance testing (p < .05).

Figure 2
Results of the direct replication of Johnson et al. (2002).

Figure 3
Results of Extension 1. Percentage of participants who consented to organ donation in permanent vs. temporary choice scenarios.
Note. * p < .05, ** p < .01, *** p < .001.

General discussion
We conducted direct, close replications of Johnson and Goldstein (2003) and Johnson et al. (2002). In Part 1 of our study, we successfully replicated Johnson and Goldstein (2003): participants consented to be organ donors at higher rates when they had to opt out of consent relative to participants who had to opt in. We also found that participants in the No-Default condition, where no response was pre-selected, consented to organ donation at higher rates relative to participants who had to opt in. Additionally, we found that the permanence of these decisions did not affect people's choices.
Our replication results are consistent with Johnson and Goldstein (2003), though the effects were smaller than those reported in the original study. The weaker effect is in line with recent work which found that effect sizes in large-scale studies were smaller than the estimates forecasted by academic experts and practitioners with relevant knowledge of nudge effects (DellaVigna and Linos, 2022). Our well-powered study provides a more precise estimate of the effect size (OR = 1.67, 95% CI [1.27, 2.19]) that may be useful for future meta-analyses and for policy applications.
In Part 2 of our study, our replication results for Johnson et al. (2002) were inconsistent with the original findings. Unlike the original study, we found framing effects, yet we found no evidence for default effects: participants in the positive framing conditions consented to receive health-related information at a higher rate than participants in the negative framing condition. However, in our conceptual replication of Johnson et al. (2002), which we report as Extension 2, participants in the positive framing condition consented to receive organ-donation information at a lower rate than participants in the negative framing condition.
Our results on default effects were inconsistent with the original findings in Johnson et al. (2002): we found no evidence for default effects overall. Nonetheless, we did find some indication of default effects when scenarios were framed positively. For instance, within the positive framing conditions, participants in the No-Default and Opt-Out conditions consented to receive health-related information at a higher rate than participants in the Opt-In condition. The pattern of results was in the opposite direction within the negative framing conditions: participants in the No-Default and Opt-Out conditions consented to receive health-related information at a lower rate than participants in the Opt-In condition. Interestingly, we found a pattern consistent across positive and negative frames in the conceptual replication: although the differences were not significant, participants in the No-Default and Opt-Out conditions consented to receive organ-donation-related information at a higher rate than participants in the Opt-In condition. As such, our results suggest that the stability of default effects can vary depending on the framing of the decision scenario.
There are several possible explanations for the inconsistent findings in our replications of Johnson et al. (2002). First, the failure to replicate the default effects may have been due to the insufficient sample size in Johnson et al. (2002), which involved only 235 participants, about 39 per experimental condition. This small sample may have led to false-positive results and an inflated effect size. Moreover, the small sample size in the original article may explain its failure to detect the framing effects and the interaction that we found.
Second, the differences could be a result of changing preferences toward participating in online surveys. The original study was published in 2002, and the experimental scenario involved consenting to be notified about health-related surveys in the near future. People's preferences for taking part in online surveys may have changed in the last two decades, so the differences in the results could reflect this change in preferences. Given the other successful replication in Part 1 of our study, we think this explanation is unlikely, yet we cannot rule out this possibility.

Figure
Effect sizes in Johnson and Goldstein (2003), Johnson et al. (2002), and the current replication. Estimates and confidence intervals are plotted on a natural logarithmic scale.
A third, related explanation may be carry-over effects resulting from the order of the replications. The failed replication of Johnson et al. (2002) was in Part 2 and followed the unrelated organ-donation scenario in Part 1. We acknowledge the slight possibility that the order of execution somehow affected the findings in Part 2. We consider this unlikely; the findings were not noise, as they reflected a clear pattern of framing effects over default effects, so it seems improbable that the slight manipulation in Part 1 triggered such a major shift from default effects to framing effects in Part 2. In our study design, we also took measures to mitigate carry-over effects. In Part 1, participants responded to the organ donation scenarios of Johnson and Goldstein (2003) and were assigned to one of three between-participants conditions: Opt-Out, Opt-In, or No-Default. After completing Part 1 of the experiment, participants were randomly assigned to one of six between-participants conditions related to Johnson et al. (2002) in Part 2. So, we find it unlikely that a carry-over effect occurred in such a complex between-participants design. Furthermore, Samples 1 and 2 had slightly different procedures, and despite these differences we report similar results across the samples (see Table S11 in the supplementary materials). Therefore, the possibility of carry-over effects is unlikely.
Finally, the lack of support for the default effects in the negatively framed scenarios of Johnson et al. (2002) may have been due to double-negatively framed questions (i.e., negatively framed in the Opt-In scenario) being more confusing to participants than the other conditions. However, this possibility, too, seems an unlikely explanation for the lack of default effects. First, the original study used the same double-negatively framed questions yet found support for default effects. While we recognize that double-negative questions may have been taxing to follow, the relatively consistent effects within the negatively framed default conditions suggest otherwise. Across the three conditions with negatively framed descriptions, the participation rates were similar: Opt-Out (28%), Opt-In (34%), and No-Default (25%). These similar participation rates across default conditions within the negative frame suggest that difficulty comprehending double-negatively framed questions does not explain our pattern of findings.
There are also some potential explanations for other inconsistencies we found in our replications. Interestingly, the direction of framing effects in our conceptual replication of Johnson et al. (2002) was opposite to that found in our direct replication of Johnson et al. (2002). Although this result is inconsistent with the original study, it may not be entirely surprising; previous work suggests that framing effects vary across task contexts. For example, Zhen and Yu (2016) showed that framing effects vary between vignette-based and reward-based decision tasks. Furthermore, previous work found that the direction of framing effects may differ based on the relative attractiveness of the alternatives (Chandrashekar et al., 2021; Chen and Proctor, 2017; Wedell, 1997) or the degree to which a decision has personal relevance to participants (Krishnamurthy et al., 2001). Future work on framing effects may further investigate whether different task contexts modulate the direction of framing effects.
At a more theoretical level, Wedell's (1997) accentuation hypothesis perhaps best describes the current pattern of framing-effect results. Wedell (1997) argues that people have a higher need for justification in a positively framed choice than in a negatively framed choice, and this higher need for justification highlights the differences between alternatives. On this account, when the overall attractiveness or benefit of participating in a health survey is high, people given a positively framed choice will choose to participate at a higher rate. Alternatively, when the overall attractiveness of participating is low, participants given a positively framed choice will choose to participate at a lower rate. Our results across the direct and conceptual replications of Johnson et al. (2002) support this account. In the direct replication of Johnson et al. (2002), using a healthcare survey scenario, we found an overall high participation rate of 60.4% across conditions, and participation rates were higher in the positive-frame condition. In the conceptual replication of Johnson et al. (2002), which involved an organ donation scenario, we found an overall lower participation rate of 44.6% across experimental conditions, suggesting that the overall attractiveness of consenting to receive additional information on organ donation is lower; there, participation rates were lower in the positively framed condition. Our findings suggest that future work on the default effect may benefit from paying closer attention to the accentuation hypothesis.

Conclusion
Overall, our effort to replicate Johnson et al. (2002) contributes to the extant literature by testing the stability of default effects. Since the publication of Johnson et al. (2002), there has not been much interest in studying framing effects (positive vs. negative frames) together with default effects. We believe that our findings indicate that this is a promising area for future research.
The current findings underline the importance of well-powered, preregistered replications and extensions of notable findings in the judgment and decision-making literature. Our results suggest that the stability of default effects depends on the framing and context of the decision scenario and therefore hold valuable implications for the study of default effects. Although work on default effects has deservedly garnered attention from both scholars and public policy practitioners in the last two decades, our work suggests that we need a more refined and contextualized understanding of defaults' effectiveness.
We propose two main assertions. First, the effect size of default effects is likely smaller than those documented in original studies (DellaVigna and Linos, 2022). Therefore, we need well-powered samples to study default effects and achieve greater precision in our effect size estimates. Second, framing seems to influence the direction of default effects, and future work on default effects should be aware that people's decision frame can influence defaults' effectiveness. We hope the current replication opens up a range of theoretical and empirical work that can advance future research on default effects.

Conflict of Interest and Funding
This research was supported by the European Association for Social Psychology seedcorn grant.

Author Contributions
Nadia Adelina, Shiyuan Zeng, Yan Ying Esther Chiu, and Grace Yat-Sum Leung analyzed the original articles, wrote the pre-registrations, designed the replications and the extensions, and conducted an initial analysis of the results and write-up of the first draft. Boley Cheng guided and assisted the replication effort. Subramanya Prasad Chandrashekar and Paul Henne verified and extended analyses, integrated the studies, and wrote the final manuscript for submission. Gilad Feldman led the replication efforts, supervised each step, conducted the pre-registrations, ran data collections, provided feedback throughout, and edited the final manuscript for submission.

Open Science Practices
This article earned the Preregistration+, Open Data, and Open Materials badges for preregistering the hypotheses and analyses before data collection and for making the data and materials openly available. It has been verified that the analysis reproduced the results presented in the article. The entire editorial process, including the open reviews, is published in the online supplement.