Survey Practice
September 18, 2025 EDT

“I Prefer Not to Respond” Survey Responses: Red Flag or Red Herring?

Angie Miller, Ph.D., and Antonia Bacigalupa Albaum
Keywords: survey research, student engagement, prefer not to respond
https://doi.org/10.29115/SP-2025-0011
Miller, Angie, and Antonia Bacigalupa Albaum. 2025. “‘I Prefer Not to Respond’ Survey Responses: Red Flag or Red Herring?” Survey Practice 18 (September). https://doi.org/10.29115/SP-2025-0011.

Abstract

One increasing trend in demographic survey items is to include a “prefer not to respond” (PNR) option, particularly for items that might be considered sensitive information. This mixed-methods study uses data from the 2023 administration of the National Survey of Student Engagement (NSSE) to explore whether the frequency of selecting this option is related to substantive survey content, in this case other aspects of the student experience. Quantitative results suggest small but statistically significant negative relationships between PNR frequency and all Engagement Indicators. There are also negative relationships for belonging, satisfaction, self-reported grades, and intent to return, as well as a positive relationship for age. To explore potential motivations for PNR responses, we conducted a qualitative thematic analysis of the final open-ended responses from students with high PNR counts. This analysis revealed several themes: disapproval of the demographic items themselves; hostility toward Diversity, Equity, and Inclusion (DEI); institutional distrust; and, conversely, praise for professors and certain institutional experiences. Potential reasons for these patterns, along with implications for institutional and survey research professionals, are also discussed.

Demographic items have long been an important element of survey research. They provide information about the sample, verify its representativeness, and allow for disaggregation of data by subgroups. In terms of design, one longstanding recommendation is to place these items at the end of the survey, so as not to distract respondents from key topics of interest (Dillman 2007). One more recent trend in the design of demographic items is to include a “prefer not to respond” (PNR) option, particularly for items that might be considered sensitive information (Santos et al. 2023). This allows the researcher to distinguish intentional from unintentional missing item-level information (Sischka et al. 2022), as respondents can provide data points even for topics that elicit discomfort (Montagni et al. 2019). Including this response option is now recommended as a “best practice” in survey design across multiple sources: publicly available materials from leading professional organizations (AAPOR 2022), resources for institutional research professionals (Clayton 2021), and guidelines from the user experience industry (Brown 2023). However, systematic searches yield very little scholarly research on how this option impacts data quality, either in the form of rigorous controlled experiments or descriptive explorations of how and why this option is used, and by whom. This study examines use of the PNR option on a survey of postsecondary student experiences to determine whether it is related to other quantitative and qualitative survey content, and what that might reveal about respondents’ motivations for selecting PNR.

Example Project: NSSE

Student engagement, defined as student involvement in educationally purposeful activities (Kuh 2001), has long been acknowledged to have positive impacts on key outcomes in higher education, such as learning, retention, persistence, and satisfaction (Astin 1993; Kuh 2003; Pascarella and Terenzini 2005). Engagement encompasses a range of experiences and perceptions (Kuh 2001; McCormick et al. 2013) and is related to courses and academic behaviors, thus making it a valuable element of institutional research and assessment activities. The National Survey of Student Engagement (NSSE), first launched in 2000 and then updated in 2013, was created to assess “activities and experiences that have been empirically linked to desired college outcomes” for students and uses student engagement as an umbrella term to capture numerous content areas including academic challenge, learning with peers, experiences with faculty, and campus environment (NSSE 2018, 1).

Over the past two decades, NSSE has been expanding the practice of including a PNR option. A relatively early adopter of the practice, NSSE first included a PNR option in 2005 for the race/ethnicity item. For many years, race/ethnicity was the sole item with this option. However, in 2013 during a major overhaul of the survey instrument, NSSE took this opportunity to expand the number of items offering a PNR option. Specifically, the 2013 update added a PNR option for the disability and sexual orientation questions in addition to the race/ethnicity question. The following year, a PNR option was added for gender identity. These four PNR items were featured on the survey from 2014 to 2022. In the 2023 administration, a fifth PNR option was added to an existing item about parent education level, thus resulting in the five current items with a PNR option: race/ethnicity, disability, gender identity, sexual orientation, and parent education.

The Current Study

There is increasing use of PNR as a response option within survey design overall and within NSSE specifically. Although PNR is recommended as a best practice, there is little systematic research on how this demographic response option relates to other survey content. Therefore, the goal of this study was to explore whether selecting PNR is indicative of any other aspects of the student experience. The research questions were:

  1. What is the prevalence of PNR selection in this sample of postsecondary students?

  2. What types of students are more likely to select PNR?

  3. Is there a relationship between student engagement and the frequency of PNR selection?

  4. For students who have a high frequency of PNR selection, are there any patterns in their qualitative perceptions of their institutions that might explain this survey-taking behavior?

Method

Data Source

This study uses data from the 2023 NSSE administration. NSSE is an annual survey administered in the spring semester to first-year and senior students at four-year colleges and universities across the United States (for the full survey instrument, see www.nsse.indiana.edu). We examined responses from 114,750 first-year students and 125,003 seniors attending 456 baccalaureate-granting institutions.

Participating institutions provide NSSE staff with contact emails for all eligible students. These students are then invited to respond via an email request, which includes a link to the online survey instrument. Students receive a maximum of five contact emails. Because large institutions have disproportionately lower response rates, NSSE reports the average institutional response rate, which was 26% for the 2023 administration (NSSE 2023).

Sample

Approximately 63% of the respondents identified as women, 32% identified as men, and between .3% and 2.6% selected one of the nonbinary gender categories (e.g., genderqueer, genderfluid, or transgender; students were allowed to select more than one gender category). A majority (90%) were enrolled full-time, and 81% were younger than 24 years old. More than half (57%) of respondents were White, 8% Asian/Pacific-Islander, 9% African American/Black, 11% Hispanic/Latino, 10% multiracial, and 5% identified with another racial/ethnic group (e.g., Native American, “Another race/ethnicity” category) or preferred not to respond. Respondents self-reported a variety of academic major fields.

Measures

All five items from the NSSE demographics section that include a PNR option were the main focus of this study (see Appendix A for the exact wording and layout of each). A “PNR frequency” count variable was also created by summing the number of PNR selections across these five items, so each student could range from a score of 0 (selected no PNR) to a score of 5 (selected all PNRs possible).
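The count variable described above can be sketched as a simple column sum over indicator comparisons. This is a hypothetical illustration on toy data: the column names and the "PNR" code are assumptions for demonstration, not the actual NSSE variable names or response codes.

```python
import pandas as pd

# Hypothetical column names for the five NSSE items offering a PNR option
pnr_items = ["race_ethnicity", "disability", "gender_identity",
             "sexual_orientation", "parent_education"]

# Toy responses; "PNR" stands in for the "prefer not to respond" code
df = pd.DataFrame({
    "race_ethnicity":     ["White", "PNR", "Asian", "PNR"],
    "disability":         ["No", "PNR", "No", "PNR"],
    "gender_identity":    ["Woman", "Man", "PNR", "PNR"],
    "sexual_orientation": ["Straight", "PNR", "Straight", "PNR"],
    "parent_education":   ["Bachelor's", "PNR", "PNR", "PNR"],
})

# PNR frequency: how many of the five items were answered with PNR,
# ranging from 0 (no PNR selected) to 5 (PNR selected on every item)
df["pnr_count"] = (df[pnr_items] == "PNR").sum(axis=1)
print(df["pnr_count"].tolist())  # [0, 4, 2, 5]
```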

Other variables of interest for the study included 10 scales, which NSSE terms “Engagement Indicators” (EIs): higher-order learning, reflective and integrative learning, quantitative reasoning, learning strategies, collaborative learning, discussions with diverse others, student-faculty interactions, effective teaching practices, quality of interactions, and supportive environment. These have various stems and response options, with 3–8 items per scale. Each engagement indicator is scored on a 60-point scale. For more detailed information on the scale construction process see the NSSE website at https://nsse.indiana.edu/nsse/survey-instruments/engagement-indicators.html. Similarly, NSSE has also developed scales for sense of belonging and perceived gains (with parallel scoring construction to the EIs; see Table 1 for all Cronbach’s alphas and number of items per scale).

Table 1.Engagement Indicator and other scale alpha coefficients and PNR correlations
Scale or Other Variable # of Items FY: Cronbach’s α FY: PNR r SR: Cronbach’s α SR: PNR r
Reflective & Integrative Learning 7 0.85 -0.047 0.87 -0.063
Higher-Order Learning 4 0.84 -0.041 0.85 -0.052
Learning Strategies 3 0.77 -0.039 0.78 -0.035
Quantitative Reasoning 3 0.83 -0.029 0.85 -0.028
Collaborative Learning 4 0.84 -0.054 0.84 -0.053
Discussions with Diverse Others 4 0.88 -0.033 0.89 -0.033
Student-Faculty Interaction 4 0.83 -0.023 0.85 -0.04
Effective Teaching Practices 4 0.84 -0.046 0.86 -0.077
Quality of Interactions 5 0.85 -0.056 0.84 -0.084
Supportive Environment 8 0.88 -0.068 0.89 -0.086
Perceived Gains 10 0.91 -0.072 0.92 -0.106
Sense of Belonging 3 0.85 -0.084 0.85 -0.116
Institutional Satisfaction 1 -- -0.075 -- -0.116
Self-Reported Grades 1 -- -0.033 -- -0.029
Age 1 -- 0.031 -- 0.062

Notes: FY = first-year; SR = senior; PNR = Prefer Not to Respond count measure. All correlations are significant at p<.001

Other variables included in this study were individual survey items for overall institutional satisfaction, intention to return the following year (for first-year students only), self-reported grades, self-reported course type (mostly in-person, mostly remote, mostly hybrid, or a balanced mix of course types), and institution-reported age. Finally, an open-ended final comment asking students to make an overall assessment of their institutional experience and provide any additional feedback formed the basis of the qualitative component of the study.

Analyses

We explored the research questions using a mixed-methods approach that included descriptive statistics, inferential statistics, and a qualitative thematic analysis. First, we looked at frequencies for each item with a PNR option, as well as the frequencies of the created PNR count variable. Quantitative analyses, conducted separately for first-year students and seniors, examined correlations between the PNR count and the continuous variables (EIs, sense of belonging, perceived gains, overall institutional satisfaction, self-reported grades, and age). An independent-samples t-test compared mean PNR counts between first-year students who intended to return and those who did not. Chi-square analyses examined the number of PNR responses by the categorical variable of course type, as well as by collapsed categories of age and grades. For the qualitative analyses, a subgroup of students with the highest PNR counts (4 or 5) was selected, and their open-ended final comments were analyzed for themes.
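The quantitative steps above (correlation, independent-samples t-test, and chi-square with an effect size) can be sketched with scipy. Since the NSSE data set is proprietary, this runs on simulated data; all variable names and simulated values are assumptions for illustration, not actual NSSE results.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated respondent-level data (assumed names/values, not NSSE data)
rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "pnr_count": rng.choice([0, 1, 2, 3, 4, 5], size=n,
                            p=[0.886, 0.083, 0.015, 0.009, 0.005, 0.002]),
    "engagement": rng.normal(40, 12, size=n),   # e.g., one 60-point EI score
    "intends_return": rng.choice(["Yes", "No"], size=n, p=[0.9, 0.1]),
    "course_type": rng.choice(["in-person", "remote", "hybrid", "mix"], size=n),
})

# Correlation between PNR count and a continuous engagement measure (RQ3)
r, p_r = stats.pearsonr(df["pnr_count"], df["engagement"])

# Independent-samples t-test on PNR count by intent to return
returners = df.loc[df["intends_return"] == "Yes", "pnr_count"]
leavers = df.loc[df["intends_return"] == "No", "pnr_count"]
t, p_t = stats.ttest_ind(leavers, returners)

# Chi-square on the course-type-by-PNR-count contingency table,
# with Cramer's V as the effect size (as reported in Table 2)
table = pd.crosstab(df["course_type"], df["pnr_count"])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))

print(f"r = {r:.3f}, t = {t:.2f}, chi2 = {chi2:.1f}, V = {cramers_v:.3f}")
```

With simulated independent variables these tests should be non-significant; on the real data they produced the small but significant effects reported below.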

Findings

Quantitative

Overall, there are low percentages for all five PNR options: gender identity (1.9%), sexual orientation (4.3%), race/ethnicity (3.0%), disability status (4.8%), and parent education level (3.2%). This is also reflected in the summed PNR count variable, with a frequency distribution of: no PNR responses (88.6%), one PNR response (8.3%), two PNR responses (1.5%), three PNR responses (.9%), four PNR responses (.5%), and five PNR responses (.2%). Overall, most respondents do not provide any PNR responses, but among those who do, there is variation in their choices. Respondents who select PNR for one question do not automatically select it every time it is available, indicating that PNR use is not a response style (Van Vaerenbergh and Thomas 2013).

Basic correlations (Table 1) showed small but statistically significant negative relationships between PNR count and all ten NSSE Engagement Indicator scores. There were also significant negative relationships for perceived gains, sense of belonging, overall institutional satisfaction, and self-reported grades, as well as a positive relationship for age (as a continuous variable). First-year students who did not intend to return to the institution the following year also had significantly more PNR responses compared to those who did intend to return; t(10361) = 10.25, p < .001, d = .138. Chi-square analyses suggested that students taking most of their courses online had significantly more PNR responses (Table 2). However, the chi-square analyses for collapsed categories of age and grades across PNR count also suggest that the distributions for age and grades may be non-linear, with the biggest differences for age and grades showing up for only one or two PNR options selected.

Table 2.Chi-square analyses for course type, grades, and age categories
Demographic category 0 PNR 1 PNR 2 PNR 3 PNR 4 PNR 5 PNR χ2 Cramer’s V
FY: Mostly in-person courses 88.20% 9.00% 1.50% 0.70% 0.40% 0.20% 96.46 0.02
FY: Mostly remote courses 85.10% 10.90% 2.20% 1.10% 0.60% 0.10%
FY: Mostly hybrid courses 85.40% 11.10% 2.10% 0.80% 0.40% 0.20%
FY: Balanced mix of course types 86.60% 10.60% 1.50% 0.70% 0.30% 0.20%
SR: Mostly in-person courses 90.40% 7.00% 1.20% 0.90% 0.40% 0.10% 197.11 0.045
SR: Mostly remote courses 87.70% 8.00% 1.70% 1.50% 0.80% 0.20%
SR: Mostly hybrid courses 87.80% 8.90% 1.50% 1.10% 0.50% 0.20%
SR: Balanced mix of course types 89.00% 7.90% 1.50% 0.90% 0.50% 0.20%
FY: Mostly A grades 88.90% 8.30% 1.40% 0.80% 0.40% 0.20% 219.35 0.052
FY: Mostly B grades 86.70% 10.30% 1.80% 0.60% 0.40% 0.20%
FY: Mostly C grades or lower 83.70% 12.80% 1.90% 1.00% 0.30% 0.20%
SR: Mostly A grades 90.10% 6.90% 1.30% 1.00% 0.50% 0.10% 137 0.037
SR: Mostly B grades 88.50% 8.30% 1.50% 1.00% 0.50% 0.20%
SR: Mostly C grades or lower 85.30% 10.40% 2.10% 1.20% 0.80% 0.30%
FY: 19 or younger 88.20% 9.10% 1.50% 0.70% 0.40% 0.20% 178.6 0.047
FY: 20-23 84.80% 11.10% 2.50% 0.90% 0.40% 0.20%
FY: 24-29 83.10% 12.30% 2.50% 1.10% 0.70% 0.40%
FY: 30-39 85.60% 10.10% 2.10% 1.40% 0.70% 0.10%
FY: 40-55 83.30% 11.80% 2.30% 1.80% 0.70% 0.20%
FY: Over 55 82.80% 12.40% 2.90% 1.00% 1.00% 0.00%
SR: 19 or younger 88.40% 9.20% 0.90% 0.80% 0.50% 0.30% 599.53 0.078
SR: 20-23 90.80% 6.70% 1.20% 0.70% 0.40% 0.10%
SR: 24-29 87.40% 8.80% 1.70% 1.30% 0.70% 0.20%
SR: 30-39 86.00% 8.90% 2.10% 1.90% 0.90% 0.20%
SR: 40-55 85.80% 9.30% 2.00% 1.70% 1.00% 0.20%
SR: Over 55 84.00% 10.40% 2.80% 2.40% 0.40% 0.10%

Note: Chi-squared analyses are all statistically significant at p < .001 level

Qualitative

Of the students who had four or five PNR responses, 42% also provided a comment in the final question of the survey (n=466). The qualitative thematic analysis aimed to identify common themes across comments, including why students may have chosen PNR or exhibited other survey-taking behaviors. A very small subset of students (n=4) directly expressed disapproval of the inclusion of demographic questions, noting they did not believe demographics were crucial to their experiences. Another subset (n=17) expressed hostility toward Diversity, Equity, and Inclusion (DEI) work on campus, which potentially influenced their reluctance to provide demographic information, as DEI work often targets specific demographic groups. One such comment read: “I don’t give a s–t about pronouns, or your sexuality or other f—ing nonsense” [expletives redacted]. The expletives in this quote indicate a level of hostility toward demographic information that may have influenced the student’s decision to choose PNR. Beyond these subsets, students’ comments did not directly indicate reasons for a PNR response. However, the comment data did have some other unifying themes, explored below.
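The selection of this qualitative subsample (high-PNR respondents who also left a final comment) amounts to a two-condition filter. A minimal sketch on toy data, with assumed column names:

```python
import pandas as pd

# Toy data; "pnr_count" and "final_comment" are assumed column names
df = pd.DataFrame({
    "pnr_count": [0, 4, 5, 5, 1],
    "final_comment": ["Great school", None, "Felt unheard", "", "Fine"],
})

# Keep respondents with the highest PNR counts (4 or 5)...
high_pnr = df[df["pnr_count"] >= 4]

# ...who also provided a non-empty final comment
commenters = high_pnr[high_pnr["final_comment"].notna()
                      & (high_pnr["final_comment"].str.strip() != "")]
print(len(commenters))  # number of high-PNR respondents with a usable comment
```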

Overall, the comments indicated a divide between students who expressed a loss of institutional trust or feelings of alienation and high-achieving students who praised their professors and the academic experience. In particular, the students’ praise often centered around interactions with professors. Some exemplar quotes can be found in Table 3.

Table 3.Exemplar quotes from high count PNR students praising academics
“Wow. […] Of all my classes or courses taken I can admit honestly that I think the universe has blessed, conspired, or aligned me with an elite cadre of the most awesome, unfathomable, brilliant, empathetic, success-pushing, personable and everything or of a word that can epitomize of what a wonderful HUMAN BEING someone should be. […] I might not have attended the likes of [Ivy League Institutions]. However, I felt like I got a world class education to match all the aforementioned schools, and I wouldn't change it for the world”
“The education here is the most satisfying. The professors in my major are great, and care about students. Honestly, the academics are the only reason that I stay here at [Institution].”
“The most significant learning experience was learning that school is not inherently miserable. I struggled greatly in grade school with both academics and socializing, but I found that attending university has allowed me to flourish in both of those aspects.”
“I have learned that it is never too late to complete your degree. All my professors supported me to the fullest extent.”
“Being able to attend office hours and developing a personal relationship with my professors.”

Note: Emphasis added

Though seemingly contradictory, the positive correlation between age and PNR count may reflect older students’ tendency to view relationships with professors as part of their support network. Self-exclusion from the demographic items may be a cultural or generational preference rather than a reflection of their institutional experiences, which may explain why some high-PNR-count students left very positive comments.

Other comments reflected a loss of trust in their institution. These students’ comments indicate feeling disconnected from their community. For students whose comments showed a loss of institutional trust, refusal to provide demographic information may reflect overall withdrawal from engagement with the institution or distrust of how the institution would use the information. Exemplar comments from this category can be found in Table 4. Taken together, the qualitative responses indicate that students select PNR for a variety of reasons, not all of which should be interpreted as a red flag for student engagement or experience on campus.

Table 4.Exemplar quotes from high count PNR students reflecting loss of institutional trust
“As someone who hopes to pursue a graduate program, I feel disheartened that [Institution]’s divestment from educators (in favor of such superficial things as architecture and sports) negatively impacts not only my education but also what would be my future career connections within academia. In this way, I feel robbed by [Institution]’s decision to, time and time again, ascribe greater monetary value to everything except education and educators.”
“I had to go out, start my own business while doing internships to get the lessons of the real world that I had originally hoped would be shared in college. All of this taught me the lesson that college is basically a business, and at that, a scam based on the rate at which the tuition is valued versus the actual value of the education given.”
“better organization and communication with everything - from student organizations to professors to food on campus. It's been infuriating sometimes. I even had a [Food service] employee cuss at me. That really shook me. Times like those make me want to transfer. I get that feeling at least twice a week.”
“This administration has given up on the classes of 2021-2024 so that the classes of 2025 and beyond might thrive. Even with the renovations (and a lot of generosity from donors) enrollment will continue to fall. I actively anticipate the failure of this institution; the administration is fully out of touch with student needs, and their worry is justified. […] I'm painfully aware, however, of how little my voice matters to the administration.”

Note: Emphasis added

Implications

For the NSSE survey, students who selected PNR responses were more likely to be outside the 18–24 age range. Students outside this age range tend to represent a small percentage of students on most campuses across the country (with the exception of online-only institutions), and their experiences may vary from those of students within the 18–24-year-old range. These students’ emphasis on praising relationships with staff and professors indicates they may view these relationships as important parts of their campus support systems, despite the quantitative trend that PNR students are overall less engaged and less satisfied with their educational experience. For students outside the 18–24 age range, selecting PNR may be motivated by cultural difference rather than being a reflection of their institutional experience. While PNR responses can limit the ability to disaggregate data by demographics, these results provide an important reminder that this group may have various unknown background influences on their response choices.

Including PNR response options on surveys can provide more precision in data by reducing missingness (Sischka et al. 2022), but it is also important to understand the potential reasons behind this response preference and how they might reflect the respondent’s experience. Overall, these results indicate that students who selected PNR responses are less engaged and potentially disenchanted with various aspects of their undergraduate experience. However, these relationships were small in magnitude, indicating that dissatisfaction or a lack of engagement is not the only explanation. The further qualitative analysis of student comment data also suggests that this is not a unidimensional group when it comes to engagement and satisfaction. Their motivation extends beyond mere dissatisfaction with their education or institutions, so survey researchers should not assume that a PNR response is only indicative of dissatisfaction or negative affect. Our varied results indicate that researchers including a PNR response on their surveys should include qualitative analysis, when possible, to contextualize PNR responses. It may be just as important for these students to “feel seen” through their PNR selection as it is to have other inclusive identity options on the demographic items.

Future research should also address various comparisons between respondents who do and do not utilize PNR options, from both quantitative and qualitative perspectives. Although it was beyond the scope of this particular study, a sample of comments from the students who did not select PNR could provide some additional comparisons, especially if it focused on older students. Deeper investigations of other components of data quality might also shed light on PNR responses, such as looking at relationships between PNR count and straight-lining patterns or looking at previous years of data (before the PNR option was added) for comparisons of the missingness on those specific items. Experimental designs could also be implemented to directly test whether a PNR option can reduce missingness for survey items. This work should also be replicated with other surveys on topics outside of postsecondary education. Surveys on varied topics such as drug use, sexual behaviors, or even consumer studies might elicit different distributions of PNR selection or connections to other survey content.

Limitations

Although this study has several strengths, some limitations should be noted as well. Given the data collection procedures and response rates, the sample may not be representative of all students in the population, and thus generalizations should be made with caution. Additionally, this study relied on self-reported data. While the self-reported nature of PNR responses was part of the research question, the reported engagement behaviors and satisfaction items may not always be completely objective. However, most studies of self-reports in higher education suggest that self-reports and objective measures of constructs such as abilities are positively related (Anaya 1999; Hayek et al. 2002; Pike 1995; Zilvinskis, Masseria, and Pike 2017). Furthermore, the approach of converting the PNR responses into a count-based measure is innovative but may lose nuance in interpretation. The qualitative component of the study did, on the other hand, attempt to address distinctions among PNR responses, albeit this thematic analysis was based only on respondents with four or five PNR selections and may not fully generalize to those who selected fewer.

Conclusion

This study offers insight into the rationale and development of demographic items that include “I prefer not to respond” as a response option, using a well-known national research and assessment project as an exemplar. It is useful to explore engagement and experience patterns in the context of PNR responses for first-year and senior students, with the benefit of a large sample including hundreds of four-year colleges and universities. Investigating the quantitative and qualitative patterns of PNR responses, engagement items, experience items, and open-ended comment themes, along with some potential reasons for these findings, can prompt institutional and survey research professionals to reflect on their own experience with PNR survey items. Knowledge of this topic can better inform best practices in survey design and interpretation, advance the conversation about further research on the effectiveness of this practice for the field, and help researchers better understand the motivations of respondents who select this option.


Funding statement

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Data availability statement

The NSSE data set is proprietary and not currently available for free public use. Syntax for all analyses included in this paper is available from the authors upon request.

Declarations of interest

None.

Human subjects statement

The data collection and analysis adhere to all human subject guidelines, as specified by the Indiana University Institutional Review Board (Protocol # 709000079), as well as with the Institutional Review Boards of all institutions participating in the study.

Corresponding author contact information

Angie L. Miller, Ph.D., Indiana University Bloomington, anglmill@iu.edu

Appendix A
Appendix A.Wording and layout of five PNR items on NSSE 2023

Submitted: April 06, 2025 EDT

Accepted: August 18, 2025 EDT

References

American Association for Public Opinion Research. 2022. “Best Practices for Survey Research.” https://aapor.org/standards-and-ethics/best-practices/.
Anaya, Guadalupe. 1999. “College Impact on Student Learning: Comparing the Use of Self-Reported Gains, Standardized Test Scores, and College Grades.” Research in Higher Education 40 (5): 499–526. https://doi.org/10.1023/a:1018744326915.
Astin, Alexander W. 1993. What Matters in College? Four Critical Years Revisited. San Francisco, CA: Jossey-Bass Publishers.
Brown, Maddie. 2023. “Writing Good Survey Questions: 10 Best Practices.” Nielsen Norman Group. August 2023. https://www.nngroup.com/articles/survey-best-practices/.
Clayton, Jess. 2021. “Best Practices in Survey Design.” Whitewater, WI: Office of Institutional Research & Planning, University of Wisconsin Whitewater. https://www.uww.edu/a/89815.
Dillman, Don A. 2007. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. Hoboken, NJ: John Wiley and Sons, Inc.
Hayek, John C., Robert M. Carini, Patrick T. O’Day, and George D. Kuh. 2002. “Triumph or Tragedy: Comparing Student Engagement Levels of Members of Greek-Letter Organizations and Other Students.” Journal of College Student Development 43 (5): 643–63.
Kuh, George D. 2001. “Assessing What Really Matters to Student Learning: Inside The National Survey of Student Engagement.” Change: The Magazine of Higher Learning 33 (3): 10–17. https://doi.org/10.1080/00091380109601795.
———. 2003. “What We’re Learning About Student Engagement From NSSE: Benchmarks for Effective Educational Practices.” Change: The Magazine of Higher Learning 35 (2): 24–32. https://doi.org/10.1080/00091380309604090.
McCormick, A. C., J. Kinzie, R. M. Gonyea, and M. B. Paulsen. 2013. “Student Engagement: Bridging Research and Practice to Improve the Quality of Undergraduate Education.” In Higher Education: Handbook of Theory and Research, 28:47–92. New York: Springer. https://doi.org/10.1007/978-94-007-5836-0_2.
Montagni, Ilaria, Tanguy Cariou, Christophe Tzourio, and Juan-Luis González-Caballero. 2019. “‘I Don’t Know’, ‘I’m Not Sure’, ‘I Don’t Want to Answer’: A Latent Class Analysis Explaining the Informative Value of Nonresponse Options in an Online Survey on Youth Health.” International Journal of Social Research Methodology 22 (6): 651–67. https://doi.org/10.1080/13645579.2019.1632026.
National Survey of Student Engagement. 2018. “NSSE Conceptual Framework (2013) (NSSE Psychometric Portfolio Report).” Bloomington, IN: Center for Postsecondary Research, Indiana University, School of Education. http://nsse.indiana.edu/html/psychometric_portfolio.cfm.
NSSE. 2023. “National Survey of Student Engagement 2023 Overview.” Bloomington, IN: Center for Postsecondary Research, Indiana University, School of Education. https://scholarworks.iu.edu/dspace/items/78ee6de1-95a4-4098-a484-dde761beced0.
Pascarella, Ernest T., and Patrick T. Terenzini. 2005. How College Affects Students: Volume 2. San Francisco, CA: Jossey-Bass.
Pike, Gary R. 1995. “The Relationship between Self Reports of College Experiences and Achievement Test Scores.” Research in Higher Education 36 (1): 1–21. https://doi.org/10.1007/bf02207764.
Santos, Henri C., Rebecca M. Mestechkin, Randi L. Vogt, Daniel Rosica, Patrick Ryan Heck, Pedram Heydari, Matt Brown, et al. 2023. “To Respond or Not to Respond: The Effect of Forced and Prefer Not to Answer Options in Demographic Questionnaires.” OSF. https://doi.org/10.31234/osf.io/ga6b2.
Sischka, Philipp E., Jean Philippe Décieux, Alexandra Mergener, Kristina M. Neufang, and Alexander F. Schmidt. 2022. “The Impact of Forced Answering and Reactance on Answering Behavior in Online Surveys.” Social Science Computer Review 40 (2): 405–25. https://doi.org/10.1177/0894439320907067.
Van Vaerenbergh, Y., and T. D. Thomas. 2013. “Response Styles in Survey Research: A Literature Review of Antecedents, Consequences, and Remedies.” International Journal of Public Opinion Research 25 (2): 195–217. https://doi.org/10.1093/ijpor/eds021.
Zilvinskis, John, Anthony A. Masseria, and Gary R. Pike. 2017. “Student Engagement and Student Learning: Examining the Convergent and Discriminant Validity of the Revised National Survey of Student Engagement.” Research in Higher Education 58 (8): 880–903. https://doi.org/10.1007/s11162-017-9450-6.
