Frequently, surveys and screeners contain questions that some respondents may find sensitive or be reluctant to answer. Even questions that might seem innocuous to a researcher, like age, gender, or income level, could cause an emotional response from your respondents.
This article guides you through handling sensitive questions in surveys and screeners and provides some example wording for you to use in the future.
Researchers should always attempt to avoid any possible discomfort experienced by their research participants. In addition to the obvious ethical implications, sensitive questions could cause participants to either abandon the survey or, worse, provide unreliable answers.
A sensitive question is one that respondents might find embarrassing or invasive.
There are multiple categories of sensitive questions to be aware of, but two are particularly worthy of consideration for user researchers: demographic questions and questions about socially undesirable behaviors.
Survey researchers often add demographic questions to a questionnaire without considering how they might be construed. While many survey takers might breeze through demographic questions, they can be triggering or offensive to some respondents.
Demographic questions that have the potential to be perceived as sensitive include those asking about income, sex and gender, age, race and ethnicity, and disability.
Income level can be especially problematic and lead to high rates of nonresponse. One study by Jeffrey Moore and colleagues found that questions about income are 10 times more likely to be left blank than other demographic questions.
Sex and gender are other categories of questions that can be sensitive depending on one’s gender identity. Take this question from a recent Nielsen Radio Ratings survey, which asked respondents to identify themselves as either male or female.
Many people would easily complete this question without a second thought. However, consider this account from genderqueer essayist s.e. smith:
“For some trans* folk, it is a place of endless heartbreak. Every. Single. Time. I fill out a form, I stop here. There is a long pause. A hesitation. A sigh. I am not male. I am not female. On paper forms, I often leave it blank [...]
Imagine dreading the filling out of forms not because it’s a hassle and it’s repetitive and it’s not very fun. Imagine dreading it because you know that you are going to have to lie and erase yourself every time you fill out a form.”
User researchers like to refer to themselves as user advocates. This lofty designation extends not only to designing delightful user experiences and interfaces but also to protecting the human experiences of our research participants. This includes all participants, not just those who fall into a perceived norm.
Social-desirability bias is a cognitive bias whereby people are less likely to disclose behaviors or preferences that society deems undesirable. Which behaviors are considered socially undesirable varies from person to person, so these questions can easily slip undetected through the survey-creation process.
Common behaviors known to lead to underreporting in surveys include:
However, there are less obvious behaviors as well. For example, voting is also considered a socially desirable behavior, and in many surveys, the reported voting rate is higher than the real one.
Beyond these categories, additional topics that might include sensitive questions include the following:
While the strategy for addressing sensitive survey questions will vary based on multiple factors, some general guidelines are always helpful.
Researchers frequently include demographic questions in a survey just out of habit, without considering whether asking is truly necessary. To optimize response rate, surveys should be kept as short as possible, and therefore any question should be included only if necessary for the research at hand.
Here are two questions you should ask yourself before including any question in a questionnaire, regardless of sensitivity.
First, will you make a decision based on the responses? Don’t ask questions because you’re merely curious, or assume all questionnaires should include basic demographic questions. If you have no plans to make decisions based on the outcome, you probably shouldn’t include the question.
Second, can you get this information another way? Are you using a recruiting panel like UserTesting.com or User Interviews? If so, a lot of demographic data about respondents is already included in their profiles and, therefore, doesn’t need to be asked.
Are you asking about browsing behavior on your site, data that could be gathered more accurately from analytics? Don’t ask users to provide you with information that you can access in another, less intrusive way.
Whether to keep participant data confidential, anonymous, or both is an important consideration in survey methodology. While often conflated, these two terms describe two different concepts.
Anonymity means a participant's data cannot be traced back to their identity. Ensuring anonymity may mean using a pseudonym or code number instead of the participant’s name in reporting.
Confidentiality limits the people who are permitted to view the response. Ensuring confidentiality might mean that no one beyond the immediate research team will ever view the survey data.
Anonymity is required in most cases of user research, whereas strict confidentiality is often overkill. That said, adding a promise of confidentiality can sometimes increase the likelihood of complete and honest responses to sensitive questions.
Respondents should never have to guess whether a survey is anonymous, confidential, both, or neither. This information should be readily available to respondents when they are deciding whether to take the survey. For example:
Your privacy is important to us. All responses to this survey will be kept strictly confidential. Individual answers will be anonymized and aggregated for analysis. Your personal information will not be shared with any third parties, and will only be used for the purpose of this research. Thank you for your participation.
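To make the distinction concrete, here is a minimal, hypothetical sketch in Python (the participant names and fields are invented for illustration): responses are anonymized by swapping each participant’s name for a code number before reporting, while the name-to-code map stays with the immediate research team to preserve confidentiality.

# Hypothetical sketch: replace participant names with code numbers (anonymity)
# and keep the identity map restricted to the research team (confidentiality).

def anonymize(responses):
    """Swap the 'name' field for a generated participant code like P001, P002."""
    codes = {}
    anonymized = []
    for response in responses:
        record = dict(response)  # copy so the original data is left untouched
        name = record.pop("name")
        code = codes.setdefault(name, f"P{len(codes) + 1:03d}")
        anonymized.append({"participant": code, **record})
    return anonymized, codes

responses = [
    {"name": "Alex Rivera", "income": "$50,000 to $74,999"},
    {"name": "Sam Lee", "income": "Prefer not to answer"},
]

# Report only the anonymized records; keep identity_map within the research team.
shareable, identity_map = anonymize(responses)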
Sensitive questions placed too early in a questionnaire or screener are more likely to lead to dropoffs. Build respondent trust with nonsensitive questions first, allowing them to invest effort in the process.
This guideline also applies to demographic questions, which should be optional and placed at the end of a questionnaire, not at the beginning, as is often the case. Demographic questions at the beginning are much more likely to lead to dropoffs than demographic questions at the end, even when they are not deemed to be sensitive.
Also, avoid placing particularly sensitive questions at the very end of a survey. If you inadvertently offend a respondent, you don’t want that to be their lasting impression of the survey.
UXers love to remove unnecessary text from interfaces and forms. Sometimes, however, adding more information is necessary.
First, you may use question loading (not to be confused with a leading question). Question loading is the inclusion of additional context that may assuage respondent guilt or shame around questionable behavior.
Original question: Do you save a portion of your income each month?
With question loading: Given the rising cost of living and various financial commitments, many people find it challenging to save regularly. How often are you able to save a portion of your income each month?
Additionally, if a question or topic may raise eyebrows, consider explaining the purpose of asking it and the benefit that could come from answering it honestly. For example, if a survey asks a question about abortion, the drafter may include language such as the following:
We are conducting a survey to understand people's experiences with abortion. Please know that your responses are completely anonymous and will be used solely to improve access to abortion care for those who need it. We approach this topic with sensitivity and without judgment.
Imagine you want to ask about someone’s income — something they may reasonably feel sensitive about sharing. Imagine someone’s reaction to reading the following question:
What is your annual household income: __________
A respondent would likely feel very apprehensive about sharing their specific income for several reasons. They may:
Now consider the following format instead, which asks the respondent to select one of several broad income ranges:
What is your total annual household income?
This tweak addresses all the previous concerns. With this question format, a respondent would:
Ranges feel less sensitive than asking for specific values. The larger the ranges provided, the less sensitive the request feels.
Consider the following question from the 2019 Youth Risk Behavior Survey.
During your life, how many times have you used marijuana?
Rather than simply asking Have you ever used marijuana? (which, admittedly, would probably have been easier for respondents to answer correctly if they had been willing to share that information), the researchers asked about the frequency of use. While the Yes/No formulation may imply a moral judgment and encourage lying, the frequency question makes a slight assumption that the respondent has indeed used marijuana in the past, which encourages honesty.
Additionally, you may wish to provide extra response options at the end of the scale that may be perceived as undesirable, to counteract the tendency to avoid those answers. For example, consider the following question about exercise frequency.
How often do you engage in physical exercise each month?
Someone who only works out once or twice per month may feel shame around selecting the bottommost option, and dishonestly select a response towards the middle of the range (see central-tendency bias).
The following formulation, with additional options at the low end of the scale, would likely encourage more honest responses from infrequent exercisers.
How often do you engage in physical exercise each month?
For particularly sensitive topics, researchers may wish to employ an indirect surveying technique, such as the item-count method.
In this approach, the respondent population is divided into two groups. Each group is asked an identical question about how many behaviors from a list they have engaged in, with the addition of the behavior in question for only one of the groups.
For example, consider this question from a 2014 study by Jouni Kuha and Jonathan Jackson at the London School of Economics:
I am now going to read you a list of five [six] things that people may do or that may happen to them. Please listen to them and tell me how many of them you have done or have happened to you in the last 12 months. Do not tell me which ones are and are not true for you. Just tell me how many you have done at least once.
Attended a religious service, except for a special occasion like a wedding or funeral
Went to a sporting event
Attended an opera
Visited a country outside [your country]
Had personal belongings such as money or a mobile phone stolen from you or from your house
Treatment group only: Bought something you thought might have been stolen
During the analysis, the researcher then compares the mean counts of the two groups in order to estimate the frequency of the behavior in question.
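To make the analysis step concrete, here is a minimal Python sketch using made-up numbers (purely illustrative, not data from the Kuha and Jackson study): because the only difference between the two lists is the sensitive item, the estimated prevalence of the behavior is simply the difference between the two groups’ mean counts.

from statistics import mean

# Each value is one respondent's answer to "how many of the listed items apply to you?"
# The control group saw the five-item list; the treatment group saw the same list plus
# the sensitive item (buying something you thought might have been stolen).
control_counts = [2, 1, 3, 2, 0, 2, 1, 3]    # five-item list
treatment_counts = [2, 1, 3, 2, 1, 2, 1, 3]  # six-item list

# The gap between the mean counts estimates the share of respondents
# who endorsed the sensitive item without ever admitting to it directly.
estimated_prevalence = mean(treatment_counts) - mean(control_counts)
print(f"Estimated prevalence of the sensitive behavior: {estimated_prevalence:.1%}")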
A single sensitive question in an otherwise mundane questionnaire can stand out and be jarring. Researchers may attempt to disarm the respondent with other slightly sensitive questions to mask the question of interest.
Some questions (e.g., demographic questions) are very common and potentially sensitive. The following section includes recommendations for specific wording for these questions. You are welcome to use these wordings in your own surveys and screeners.
Sex and gender, while frequently conflated and confused, are different (sex refers to biological characteristics, whereas gender refers to one’s social identity). The first question you need to ask yourself when asking about sex or gender (after asking, “Do I really need to be asking at all?”) is, “Which one do I care about?” Most of the time, UX researchers care about gender, not sex.
The most important thing to remember when asking about gender is to use inclusive and gender-expansive language that captures the full breadth of gender identity that society now recognizes.
First, consider the question wording. Please select your gender implies that one of the options provided will be a perfect match for the respondent’s gender identity and, therefore, requires the inclusion of an Other:_______ option.
On the other hand, Which of the following best represents your gender? implies only a best fit, so the use of a fill-in-the-blank Other option, while still advisable, is not required.
Next, consider the options provided. It is no longer acceptable to limit gender identity options to simply man and woman. Society’s understanding of gender identity has expanded, and research methods must reflect that.
In most instances, it is sufficient to limit options to the following categories:
If you additionally have a need to know whether someone is transgender (e.g., in order to differentiate between cisgender men and transgender men, both of whom would select Man from the above options), ask a followup question:
Do you identify as transgender?
If you are conducting research for which a granular and inclusive capturing of one’s gender identity is necessary, consider using the following options for the first question and omitting the followup transgender question.
Age is a common question in user-research surveys, as age may correlate to distinct persona traits. Adhere to the following best practices when asking about age:
Provide options with ranges, rather than an open text field.
❌ How old are you? [text field]
✅ How old are you? [18 to 24, 25 to 34, 35 to 44, etc.]

Ensure ranges are all-inclusive (i.e., include all possible ages within the options provided).
❌ 18 to 24, 26 to 34 (a 25-year-old has no option to select)
✅ 18 to 24, 25 to 34

Ensure ranges are nonoverlapping (see the sketch after this list for a programmatic check of these two rules).
❌ 18 to 25, 25 to 34 (a 25-year-old fits two options)
✅ 18 to 24, 25 to 34

Frontload the number.
❌ Under 18
✅ 18 or under

Avoid language that may cause shame or judgment.
❌ 60 or older
✅ 60 or over

Use to rather than dashes for better readability.
❌ 18–20
✅ 18 to 20

Use conversational questions rather than more “accurate” but confusing ones.
❌ Which age range best describes you?
✅ How old are you?
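If you generate or review response options programmatically, a small check along the lines of the following hypothetical Python sketch (the function name and age bounds are assumptions for illustration, not from the article) can catch violations of the all-inclusive and nonoverlapping rules above.

# Hypothetical sketch: verify that a set of age ranges covers every possible age
# exactly once. Assumes an adult-only survey (18 up to a nominal cap of 120).

def check_age_ranges(ranges, youngest=18, oldest=120):
    """ranges: a list of (low, high) tuples, e.g. [(18, 24), (25, 34), ...]."""
    ordered = sorted(ranges)
    problems = []
    if ordered[0][0] > youngest:
        problems.append(f"No option for ages below {ordered[0][0]}.")
    for (lo_a, hi_a), (lo_b, hi_b) in zip(ordered, ordered[1:]):
        if lo_b > hi_a + 1:
            problems.append(f"Gap: no option for ages between {hi_a} and {lo_b}.")
        if lo_b <= hi_a:
            problems.append(f"Overlap: {lo_a} to {hi_a} and {lo_b} to {hi_b} both cover age {lo_b}.")
    if ordered[-1][1] < oldest:
        problems.append(f"No option for ages above {ordered[-1][1]}.")
    return problems

# Example: a gap (no option for a 25-year-old) and an overlap (age 34 appears twice).
print(check_age_ranges([(18, 24), (26, 34), (34, 44), (45, 120)]))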
One recommended wording is:
How old are you? (single select)
Like sex and gender, race and ethnicity are often confused but are different (race refers to physical traits; ethnicity refers to cultural identity and heritage).
Determine first which you care about. It is possible to formulate questions that ask about race only, about ethnicity only, or about both.
Additionally, be aware that many people identify with more than one racial and ethnic group and will be excluded if you force them to pick one. For this reason, use multiselect question formats for these questions.
Just Race
How do you describe yourself? (multiselect)
Just Ethnicity
Are you of Hispanic, Latinx, or Spanish origin? (single select)
Race and Ethnicity
You can ask about both race and ethnicity separately, using the two questions above, or you can combine them into a single question:
How do you describe yourself? (multiselect)
When asking about accessibility needs or disability, do so directly, using language used by the communities in question.
Do you have any of the following accessibility needs? (Select all that apply)
Make sure to avoid language that implies negativity (e.g., Do you have diabetes? instead of Do you suffer from diabetes?; Do you use a wheelchair? instead of Are you confined to a wheelchair?).
If you need respondents’ qualitative assessment of their own language proficiency, it is helpful to provide a brief description of each answer option to ensure understanding, particularly for respondents who are less proficient in the language.
How would you describe your English language level (or proficiency)?
Groves, R.M., Fowler Jr, F.J., Couper, M.P., Lepkowski, J.M., Singer, E., and Tourangeau, R. 2009. Survey Methodology. 2nd ed. Wiley.
Jarrett, C. 2021. Surveys That Work. Rosenfeld Media.
Kuha, J. and Jackson, J. 2014. The item count method for sensitive survey questions: modelling criminal behaviour. Journal of the Royal Statistical Society: Series C (Applied Statistics) 63, 2 (Feb. 2014), 321-341.
Moore, J., Stinson, L., and Welniak, E. 1997. Income Measurement Error in Surveys: A Review. In Cognition and Survey Research, Sirken, M., Herrmann, D., Schechter, S., Schwarz, N., Tanur, J., and Tourangeau, R. (Eds.). Wiley, New York, 155-174.
smith, s.e. 2009. Beyond the Binary: Forms. Retrieved March 31, 2021, from meloukhia.net/2009/12/beyond_the_binary_forms/.