By Stephanie Walsh, Ph.D.



Interactions with government processes, whether to renew a driver’s license or apply for public assistance, commonly involve frictions referred to as administrative burdens. Administrative burdens arise when individuals experience difficulty navigating a particular policy process, and they can hinder an individual’s ability to access public benefits and services (Moynihan et al., 2014). These frictions are experienced as learning costs required to obtain information and understand a program or process, compliance costs associated with completing applications or other documentation, and psychological costs associated with the stigma of participating in a public program (Moynihan et al., 2014). Administrative burdens are often greatest for those most likely to need public benefits, including older adults and those experiencing poverty.

Prior work has shown how overwhelming each stage of public program participation can be (Hatke et al., 2022). When program information is presented in complicated ways, it limits one’s ability to process and use the information needed to learn about and apply to programs. This extraneous cognitive load increases the learning costs associated with the initial stage of learning about a program for which one may be eligible. This report explores how reducing the extraneous cognitive load associated with learning about a public program affects information recall, used here as a proxy for learning costs.

The case study for this report is the Supplemental Nutrition Assistance Program (SNAP), chosen because its eligibility and compliance requirements are relatively straightforward, yet its take-up rate varies across groups. Using a survey experiment with a representative sample of 1,677 New Jersey residents, this study tests objective information recall across three methods of communicating SNAP eligibility requirements and examines the impact for policy-relevant subgroups. The three modes of communication were: 1) a screening questionnaire, similar to those used by state governments to help residents understand eligibility requirements and apply for benefits; 2) a PDF flyer, the conventional format used by state governments; and 3) a video that provides users with a worked example of how to evaluate program eligibility. A fourth group served as the control and received only a brief explanation of SNAP.

The results of the survey experiment show that participants in each of the three treatment groups had significantly higher information recall about the SNAP program than those in the control group, and that the video was more effective than either the flyer or the screener. However, treatment effects varied by individual characteristics, at times with larger gains accruing to more privileged groups. For example, treatment effects differed significantly between participants with lower and higher levels of digital literacy: the treatments were more effective at increasing information recall among those with higher digital literacy. These findings inform policy recommendations for more targeted and evidence-based program outreach.