MODE SUITABILITY FRAMEWORK
Part A: Introduction
While aspects of these standards will be of interest to those outside the ABS, they were developed for internal use. As such, some information contained in these standards will not be applicable to an external audience. ABS staff should refer to the Corporate Manuals database for the most recent version of these documents, as some details (names, phone numbers etc.) have been removed from the online version.
There are numerous data collection modes that can be used when conducting a survey. The success of the survey will depend to a large extent on the suitability of the data collection mode/s chosen. "No one data collection method is best for all circumstances. The choice of a particular approach must be made within the context of the particular objectives of the survey and the resources available" (Groves et al., 2004, p. 146).
This document begins by outlining general issues to consider when determining appropriate modes of data collection for ABS surveys. It then discusses combining multiple modes and the issues that can arise from this. An overview of the advantages and disadvantages of the main modes of data collection used in survey organisations is then given, along with issues to consider when assessing the suitability of the modes for particular types of surveys.
The separate Mode Suitability Model can be used in conjunction with this Framework document, to assist in determining a survey's overall suitability for a particular mode, and to easily compare the suitability of different modes for a survey. As at June 2008, the modes included in the model are CATI, mail surveys, fax surveys, and offline electronic forms. The model will be developed and expanded to include other modes over time.
It is intended that both the Mode Suitability Framework and the Mode Suitability Model will be updated over time as ABS policies and procedures with regard to data collection modes change.
Part B: Factors influencing choice of mode
This part outlines in general terms some key issues that are important to consider when determining an appropriate data collection mode.
1. Complexity of topic
The depth and complexity of the topic to be covered in the survey, and the nature and form of the questions, will often dictate the collection mode to be employed. The likely quality or length of the response sought may also influence the choice of mode. For instance, it is often difficult to obtain answers to complex questions using telephone surveys, whereas self-administered surveys can include detailed explanatory notes, and face-to-face interviews, with the aid of features such as show-cards, can often yield a more detailed response. For surveys with complicated sequencing, an electronic version that performs this sequencing automatically is preferable to a paper version.

2. Types of topics/data
The types of topics that can be properly covered in a survey vary across modes. For sensitive topics (e.g. income, drug use for household surveys; business practices, competitors etc. for business surveys), self-administered modes are generally preferred to interviews as they are seen as more private, allowing the respondent to feel more comfortable providing honest answers. Sometimes a face-to-face interview will include a self-administered module for this reason. Surveys which require respondents to check records before they can answer, or check with other people who may not be immediately available, should also use self-administered modes. For this reason a telephone survey might involve mailing a form out for respondents to complete, and then asking them to read out their answers over the phone.

3. Non-response
The quality and reliability of survey data can be significantly affected by the degree of response to a survey. Although it is very rare to achieve a survey response rate of 100%, the choice of collection mode can significantly influence the response rate obtained. Face-to-face interviews can achieve a better response rate than surveys administered by mail or telephone, because it is more difficult to refuse participation to an interviewer in person. However, other factors such as whether or not the survey is compulsory, whether or not it is repeated, and how the follow-up is conducted can mediate the effect of mode on response.

4. Respondent preference and burden
How the collection mode fits with the lifestyle or working style of the respondent should influence the choice of collection method, as a good fit will increase respondent motivation. Some respondents have a strong preference for completing their forms electronically, and under the Electronic Transactions Act (July 2001), respondents can insist on their right to do this for ABS surveys (within certain guidelines; see section 7, "Legislative and strategic drivers").

The issue of respondent burden is also linked to their preference for particular modes. Some modes reduce reporting time for the respondent - for example, offering spreadsheets for a regular survey allows some respondents to program auto-completion of the form from their systems. However, this is dependent on the respondent - a mode that reduces reporting time for some respondents can increase it for others. Providing a choice of mode (see Part C) allows respondents to select the reporting method that is most suitable for them.
5. Resources available
When human resources, money or time are limited, a less appropriate collection mode may be chosen in order to boost the number of responses or to get the responses more quickly. Such an option is often in conflict with the quality requirements of the survey. To compromise, a planned reduction in data quality, or a reduction in coverage, might be alternatives considered. The likely fitness for purpose of the data collected should be kept in mind. It may not be possible to achieve the required outcomes with the resources available - in which case it may be more appropriate to make no attempt.

6. Sampling frame and population
The type and quality of the sampling frame may influence the choice of mode. For example, if the target population includes the homeless, a mail survey is not suitable for that part of the population. The characteristics of the population may also influence the mode chosen. If we are interested in running a survey that includes, for example, shift workers or farmers who are unlikely to be available during the times an interviewer could reach them, then a mail survey would be more appropriate than an interview.

7. Legislative and strategic drivers
Legislative and strategic drivers, particularly with regard to electronic data reporting, will also affect the mode/s offered. It is ABS policy that the electronic submission of data by respondents is not to be encouraged, due to security and cost issues. However, under the Electronic Transactions Act, ABS survey respondents can insist on electronic data submission. If the respondent can comply with ABS-specified IT requirements, this request is generally accepted. The option to submit data via an Excel form is provided for a range of business surveys (see section 3.8 for information on Excel forms). Standards for the design of ABS web forms are currently being drafted. Web forms are also becoming of interest to household survey areas.

Part C - Mixed-mode design
Multiple modes of data collection can be used in the same survey - this is known as a "mixed-mode" design. Mixed-mode designs attempt to "...exploit the advantages of one mode... while neutralizing the disadvantages... through a combination of methods. Mixed-mode design thus involves explicit trade-offs of one source of error for another" (Groves et al., 2004, p. 164).
Allowing the respondent to choose the mode that suits them best can help maximise the response rate to the survey. Mixed-mode designs can also reduce costs - e.g. you might initially attempt to collect data via a mail survey (which is generally less expensive than interviewer-administered methods). You might then attempt to collect data from those who do not respond to the initial request for data, via interviewer-administered methods. Similarly, in longitudinal surveys, face-to-face interviewing might be used to collect data in the first wave of collection, to maximise response rates, establish rapport etc. Data in subsequent waves might then be collected by less expensive modes (Groves et al., 2004).

1. Mode effects
"Mode effects" refer to the effects that characteristics of different data collection modes have on the way respondents answer the survey questions. Mode effects can be an issue in surveys where respondents provide their data via different methods, leading to differences in the data collected based on the mode used. For example, a web form that contains numerous edit checks may yield higher quality data than a paper form, where respondents are not made aware of illogical or inconsistent data they provide. Alternatively, the use of edits may lead to higher non-response and drop-off rates due to respondent frustration. Furthermore, when choice of mode is offered, there might be certain types of respondents who are more likely to choose a particular mode, which can influence the results obtained. It is therefore "... critical to ensure that any effects of the mode can be disentangled from other characteristics of the sample" (Groves et al., 2004, p. 164).
As another example, in "select all that apply" question formats, primacy and recency effects have been found to be influenced by mode, with primacy effects (the tendency for respondents to select more options from the start of a list) being more prevalent in self-administered forms, and recency effects (the tendency to select more options from the end of the list) being more prevalent in interviewer-administered forms. "Select all that apply" lists are often converted into separate questions with "yes" or "no" answers for interviewer-administered surveys ("forced choice" questions); however, in an investigation of self-administered forms, Smyth et al. (2006) found that these formats "... are not functional equivalents" (p. 76) - respondents spent more time responding to the forced choice questions, and were more likely to endorse more options in this format compared with the "select all that apply" format.
Steps can be taken to attempt to minimise differences between modes. Groves et al. (2004) state that it is more important to ensure that instruments and procedures are similar across modes, rather than "... optimizing the design features for one particular mode" (p. 164). For example, they recommend that electronic surveys resemble the paper form equivalent as closely as possible, and functions such as edit checks, complex skips etc. should be avoided. To minimise mode effects due to design, layout etc., ABS best practice is to develop electronic forms that (within the constraints of the software) are as visually similar to their paper equivalents as possible. Electronic forms should also offer at least the same level of help, information and instructions as their paper equivalent.
One exception to a consistent look and feel between paper and electronic forms is where the respondent would have an expectation of certain functionality in an electronic form, which, if missing would have a negative impact on their experience, potentially compromising data quality. For example, some minimal edit checking (e.g. alpha responses not being permitted in numeric fields), and the presence of certain levels of functionality (e.g. automatic totalling of numeric data) would be expected by many respondents, and are permitted in ABS electronic forms where appropriate.
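The minimal edit checking and automatic totalling described above can be sketched as follows. This is an illustrative outline only - the function names, validation rule and error message are invented for this example and do not reflect any actual ABS form software:

```python
def validate_numeric_field(raw: str) -> tuple[bool, str]:
    """Reject alphabetic input in a numeric field - the kind of minimal
    edit check a respondent would expect of an electronic form."""
    cleaned = raw.replace(",", "").strip()
    # Allow at most one decimal point; everything else must be a digit.
    if not cleaned.replace(".", "", 1).isdigit():
        return False, "Please enter a number (e.g. 1250)."
    return True, ""

def auto_total(line_items: list[float]) -> float:
    """Automatically total numeric data so the respondent does not
    have to sum the line items manually."""
    return round(sum(line_items), 2)

ok, message = validate_numeric_field("12a0")  # rejected: contains a letter
total = auto_total([100.0, 250.5, 49.5])      # 400.0
```

The design point is the one made above: checks like these are permitted because respondents expect them, whereas more aggressive edits and complex skips would make the electronic form diverge from its paper equivalent.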
2. Combining modes
Some modes of data collection should always be offered to respondents in conjunction with other modes, rather than being the only mode offered. This is particularly relevant for self-administered business forms, where the respondent is required to complete the form without the assistance of an interviewer. Fax forms (section 3.3), electronic forms (sections 3.7 & 3.8) and Interactive Voice Response (IVR) (section 4.4) should not be offered as the only method of providing data. This is because not all respondents will have access to the technology required (e.g. telephones, computers, internet, fax machines), and those who do have access may not feel comfortable providing data this way. Another method of providing data (generally a self-administered mail-out paper form for business surveys [section 3.2]) should generally be offered in conjunction with these modes. However, there may be some instances where a single mode may be supported.
Modes that are commonly used together in the ABS include:
Part D: Modes of data collection
In this part, data collection modes are described in terms of their main advantages and disadvantages, and the types of surveys they may be suitable for. For some modes, links to detailed suitability models are provided. The suitability models describe a range of measures or factors, which particular surveys can be assessed against to determine their relative suitability for conversion to the mode. A score is given for each factor, and scores are summed to yield a total "mode suitability" score. Further explanation on using the models is provided in each separate suitability model document.
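The scoring approach used by the suitability models can be sketched in outline as below. The factor names and scores here are hypothetical placeholders - each actual suitability model defines its own factors, scoring rules and interpretation of the total:

```python
def mode_suitability_score(factor_scores: dict[str, int]) -> int:
    """Sum the per-factor scores to yield the total "mode suitability"
    score, as described for the suitability models."""
    return sum(factor_scores.values())

# Hypothetical assessment of one survey against one mode (e.g. CATI).
cati_assessment = {
    "question_complexity": 2,  # simple questions suit telephone interviewing
    "form_length": 1,          # short form
    "topic_sensitivity": 0,    # sensitive topics score poorly for interviews
}
total_score = mode_suitability_score(cati_assessment)  # 3
```

A higher total would indicate greater relative suitability of the mode for that survey, and computing totals for several modes supports the side-by-side comparison described above.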
This Part is divided into three broad categories:
- face-to-face (personal) interviews;
- self-administered forms; and
- telephone interviews.
The advantages, disadvantages, and suitability discussed at the start of each section apply to all of the modes within the section. In the discussion of each mode, additional advantages, disadvantages, and suitability considerations specific to the mode are discussed.

2. Face-to-face (personal) interviews
In a face-to-face interview, an interviewer generally travels to the respondent's home or business, and asks the respondent the interview questions in person. In the ABS, face-to-face interviews are used more commonly for household surveys than business surveys, although there are some business surveys that also make use of this mode of collection.
The two types of face-to-face interviews described in this section are:
- Paper and Pencil Interviews (PAPIs)
- Computer Assisted Personal Interviews (CAPIs)

2.1 Overview of advantages, disadvantages and suitability of face-to-face interviews
- Face-to-face interviews generally yield the highest response rate of the modes described in this document. Respondents are less likely to refuse to complete the survey when they must refuse an interviewer in person.
- The interviewer is able to clarify the meaning of questions during the interview if the respondent is unsure of a question, and can probe them for more detail in their answers. This can lead to more complete and accurate data, and lower item non-response.
- Face-to-face interviews are an opportunity to facilitate good provider relations, presenting the ABS as professional and responsive.
- Face-to-face interviews are very expensive to conduct. There are costs involved in personnel, time, and other resources required to obtain, train, and manage an interviewer workforce. Field interviewers may need to be more highly trained compared to telephone interviewers in a centralised location, where "the task is less complex and centralized supervision is possible" (Groves et al., 2004, p. 161). Travelling from one respondent to another is another cost component in face-to-face interviews and is especially problematic when repeat visits are necessary to catch the respondent while they are available.
- Data can be subject to bias caused by the interviewer's appearance and attitude, as well as the way they phrase and rephrase questions and explanations (these are known as "interviewer effects").
- Social desirability bias, "the tendency to present oneself in a favorable light" (Groves et al., 2004, p. 155) can be a particular problem in face-to-face interviews. Respondents may overreport socially desirable behaviours and underreport socially undesirable behaviours, reducing the accuracy of their responses.
- Recency effects (the tendency to select more options from the end of a list of options) may be an issue, and affect the answers respondents provide.
- Face-to-face interviews can be suitable for surveys containing complex questions and concepts, which would be difficult for the respondent to complete unaided.
- Compared with self-administered forms, face-to-face interviews can be suitable for forms with many open-ended questions requiring detailed information (Rosenfeld et al., 1993). However, recording these types of responses adds to the burden on interviewers, and may frustrate respondents while they wait for the interviewer to finish recording their response, so they should not be used excessively.
- Face-to-face interviews are suitable for populations who have poor reading abilities.
- Face-to-face interviewing is unsuitable for obtaining sensitive or private information, as respondents may be uncomfortable disclosing this type of information to a stranger. However, for business surveys, commercially sensitive information (e.g. regarding market share, competitors, prices etc.) may be more likely to be disclosed in a personal interview situation rather than via other modes, because of the confidentiality assurance that an ABS Officer can give in person.

2.2 Paper and Pencil Interview (PAPI)

In a "Paper and Pencil Interview" (PAPI), the interviewer asks the respondent the survey questions and records the responses on a paper form. PAPIs are used rarely for primary data capture in the ABS, with most face-to-face interviews conducted using CAPI (see section 2.3). Some post enumeration studies for business surveys are conducted using PAPIs.
- See section 2.1.2.
- Compared with Computer Assisted Personal Interviews (CAPIs, see section 2.3), a disadvantage of PAPIs is that there must be a separate data entry phase after the collection of the data, increasing processing time and cost.
- Interviewers must follow sequencing instructions manually, which can increase the length of the interview.
- See section 2.1.3.
- Forms that require the interviewer to follow complex sequencing would be better converted to a CAPI (described in section 2.3), which automates this process.
- PAPIs may be more suitable for one-off surveys than ongoing surveys, as the costs associated with setting up and maintaining software for electronic data collection may not be warranted.

2.3 Computer Assisted Personal Interview (CAPI)

When performing a "Computer Assisted Personal Interview" (CAPI), the interviewer takes a notebook computer to the interview and enters the data into the computer as the respondent provides it.
- See section 2.1.1.
- Administering the survey via a computer is useful as it generally allows more flexibility to move around the form and automatically skip questions that don't apply to the respondent. It can make conducting surveys which require complex sequencing easier and quicker, as sequencing can be built into the instrument, reducing the likelihood of interview error in following sequencing instructions.
- The order of response categories can be randomised to minimise effects due to the order of items.
- Data is entered during the interview, meaning that there is no separate data entry stage, saving time and processing costs.
- Compared with a PAPI, edit checks can be carried out much more easily at the time of the interview, as the interviewer is automatically alerted to errors and inconsistencies in the data. This can improve data quality, as well as improve the overall timeliness of the survey, as the need to follow-up respondents to resolve anomalies in the data is reduced.
- Interviewers sometimes show a preference for CAPI compared with PAPI (Martin & Manners, 1995), and respondents may take the survey more seriously because a computer is involved.
- See section 2.1.2.
- There are increased costs involved in the set-up, maintenance and technical support for the computer equipment/software, and in training the interviewers to use the CAPI system.
- The actual interview time may increase as responses are coded and edited at the time of the interview, which can increase the burden on respondents.
- See section 2.1.3.
- CAPIs can be suitable for surveys that contain complex sequencing, as this can be automatically built into the system. It is important that the sequences are thoroughly tested prior to the CAPI being used in the field, as any programming errors or bugs in the sequencing may mean that respondents are not asked questions they should be asked.
- Ongoing surveys may be suitable for CAPI, where the costs of implementing and maintaining the computer equipment and software are more justified compared to a one-off survey.

3. Self-administered forms

Self-administered forms are those in which respondents complete the survey questionnaires themselves, without the involvement of an interviewer. In the ABS, self-administered forms are mostly used in surveys of businesses. Self-administered forms may be either paper forms or electronic forms. The types of forms discussed in this section are:
- Paper forms
- Computerised Self-Administered Questionnaires (CSAQs)
- Computer Assisted Self-Administered Questionnaires (CASIs)
- Web forms
- Offline electronic forms

3.1 Overview of advantages, disadvantages and suitability of self-administered forms
- The social desirability effects that can be a problem in interviewer-administered surveys are reduced in self-administered surveys, meaning that more honest answers may be provided in responses to sensitive questions.
- Detailed instructions, explanations, and examples can be included on the form, which can help respondents understand the questions, and lead to better quality data.
- The respondent can complete the form in their own time, rather than having to do it in a structured session with an interviewer.
- The costs associated with employing interviewers for the survey (recruitment, training, travel etc.) are removed.
- The interviewer effects described in section 2.1.2 are eliminated.
- Self-administered surveys generally have lower response rates (unit and item) compared with interviewer-administered surveys.
- With the exception of CASIs (see section 3.6), an interviewer is not present while the respondent completes the form. This means they cannot encourage the respondent to complete the survey (or particular questions within the survey), and if the respondent encounters difficulties with the questions or other aspects of the form, they cannot ask for clarification or additional information. This may lead to confusion, errors and non-response (Groves et al., 2004). For this reason, it is important that respondents are provided with instructions on who to contact to obtain additional assistance if they require it.
- Some of the costs associated with administration of the survey are passed from the ABS to the respondent. For example, there are costs incurred by the respondent in faxing back forms (see Fax-out forms - section 3.3) and downloading web forms (see Web forms - section 3.7).
- Primacy effects (the tendency to select more options from the start of a list of options) may be an issue in self-administered forms.
- Self-administered forms are suitable for surveys that contain questions requiring respondents to consult records or other people in order to obtain the relevant information, as the respondent can do this in their own time.
- Self-administered forms are more appropriate for asking questions about sensitive topics compared with interviewer-administered surveys.
- Self-administered forms are not appropriate for potential respondents with limited ability to read or write English (unless questionnaires are provided in the appropriate language).

3.2 Mail-out forms

Mail-out forms are sent via the post to respondents with a return-paid envelope so that the respondent can mail back the completed form.
- See section 3.1.1.
- It is less expensive than face-to-face and telephone interviewing.
- A wider range of geographic areas can be covered for far lower cost than face-to-face interviews.
- It allows access to "difficult to contact" respondents; for example, those people who only provide post office boxes as addresses, residents of apartment buildings, and respondents who are away from home or the workplace a great deal.
- See section 3.1.2.
- The major disadvantage with mail surveys is that they can have relatively low response rates, due in part to the difficulty in maintaining accurate address and contact details. Note that some survey support areas call respondents prior to dispatch to confirm all contact details, including mailing addresses. This helps reduce the number of Return to Sender forms and may help increase timeliness and response rates.
- The reliance on postal services, and on respondents finding the time to complete the form, means data can be slow to become available due to the lag between when the questionnaire is mailed out and when it is returned (although this effect is reduced when surveys are repeated regularly).
- See section 3.1.3.
- Mail-out forms are not suitable for forms with complex sequencing instructions, as this can lead to errors (e.g. respondents skipping questions they are supposed to answer, or answering questions that do not apply to them).

3.3 Fax-out forms

During the process of following up non-response of business surveys (Intensive Follow-Up - IFU), the ABS regularly redispatches survey forms via fax to business survey respondents, when the respondent requests this method of dispatch, or when the survey due date is near and timeliness is an issue. Black and white versions of the standard paper form are used, to increase the readability of the questions. Modern fax technology provides the ability to send bulk faxes electronically, rather than one by one manually.
Note that this section covers the use of fax forms in the ABS as at June 2008. Changes will soon be made to the way fax forms are produced, and this will affect the information in this section (particularly some of the current disadvantages, which will no longer apply due to improved procedures).
- See section 3.1.1.
- Unlike mail-out forms, the speed of delivery and return of fax forms is a matter of minutes rather than days. For this reason, form due dates can be set up to a week later, as there is no need to allow for possible mail delays (this is particularly relevant for remote respondents).
- The sender is notified immediately if a fax transmission fails. With mail, there can be a significant time lapse between a form being posted and its return to the sender if the address is incorrect; the immediate notification of a failed fax allows a greater period of time to find an alternative fax number.
- Timing of delivery can be programmed so that the form arrives at a predetermined time. For example, business surveys can be programmed to be faxed on a Sunday afternoon so that they are more likely to be on the top of the pile on Monday morning.
- See section 3.1.2.
- The quality of the form image is dependent on the receiving machine, and fax transmissions are limited to black and white. For these reasons, fax forms must be simplified and contain minimal shading and bold text. These modifications may result in a poorer quality form compared with a mail-out form.
- Transmission speed is dependent on the amount of ink on the original document, and the length of the form. Long forms may fail to send, particularly if the receiving machine is an older model.
- The receiving machine must be turned on and have paper and ink or delivery will fail.
- Many survey respondents prefer email to fax. While many businesses have a fax machine, fax machines are very rare in households, making email far more widely accessible than fax.
- While a business has a registered address, obtaining a valid fax number may require contacting the business by phone or mail, increasing costs and processing time.
- ABS fax numbers are not free-call numbers, meaning that the cost of submitting the data is passed on to the respondent.
- Current ABS practice means that unit IDs are not included on every page of the form. This can lead to problems when the form is faxed back, for example, pages of a faxed-back form may be interspersed with other fax forms, making identification of the relevant unit difficult.
- See section 3.1.3.
- Fax forms are best suited to business survey respondents.
- They are only suitable for black and white forms with minimal bold and shading ("fax friendly forms"), and should be limited to short forms (under 20 pages).
- Fax forms are most appropriate as an alternate redispatch or return method for mail-out forms near the end of the survey cycle, when timeliness is an issue.

3.4 Drop-off forms

In a drop-off survey, the form is delivered to respondents by an interviewer who explains the aims of the survey and how to complete the form. The form is left with the respondent to be completed, and is either mailed back by the respondent ("drop-off/mail-back" method) or collected by the interviewer at a later date ("drop-off/pick-up" method).
- See section 3.1.1.
- Response rates to drop-off surveys are usually higher than for mail surveys. This is because respondents tend to feel a higher level of accountability to return the form, as they have had personal interaction with an interviewer, and may have agreed to return the form. The respondent may also feel obligated to make the effort to respond to reciprocate the effort made by the interviewer to drop the form off.
- Respondents can complete the form at a time that suits them.
- See section 3.1.2.
- The costs involved in using interviewers make this technique more expensive than mail surveys (although less expensive than face-to-face interviews).
- As with a face-to-face interview, the respondent must be available when the interviewer visits; if they are not, this adds to the cost and time involved in collecting their data at another time.
- See section 3.1.3.
- The drop-off method is appropriate for surveys in which the interviewer needs to screen the household as to whether or not they are in scope of the survey, and/or for surveys in which the interviewer needs to select the correct person in the household to complete the survey.
- Drop-off surveys are suitable when some information best collected using self-administered methods needs to be collected along with other types of information which may be too complex or sensitive for the respondent to provide in a self-administered form - this information can be collected by the interviewer instead.
- Drop-off surveys can be used to ensure that basic information (e.g. about the dwelling) is collected initially by the interviewer, in case the respondent does not return the form.

Computerised self-administered questionnaires (CSAQs)

"Computerised self-administered questionnaires" (CSAQs - Groves et al., 2004) are self-administered forms that respondents complete themselves electronically, using a computer.
3.5 Overview of advantages, disadvantages and suitability of CSAQs
- Allowing forms to be completed and submitted electronically produces a faster response than other self-administered methods.
- Automatic sequencing can be built into the form so that only the questions relevant to the respondent are visible, reducing the burden on respondents in following sequencing instructions.
- Edit checks can be incorporated into the form, helping to reduce respondent errors, and increase data quality.
- The order of the questions and response options can be randomised across respondents to reduce effects caused by the order that the questions and responses appear in (Groves et al. 2004).
- Respondents enter their own information into the system, reducing data entry and processing costs. Costs associated with printing and mailing paper forms are also eliminated.
- Compared with interviewer-administered surveys, social desirability effects can be reduced, because respondents are likely to be more comfortable providing honest answers to a computer than a person.
- Once created, the form can generally be modified for future use with relatively little effort or expense.
- Compared with mail forms, web forms or offline electronic forms can be useful for surveying large businesses where the form must be completed by different people within the organisation (e.g. the payroll and R&D areas, which may be located in different States). Web or offline forms allow the relevant parts of a form to be easily accessed and completed by the appropriate person in the organisation. The final data can then be lodged by the primary ABS respondent.
- The costs involved in developing the forms, maintaining the systems, and ensuring the security of data can be high. However, while the set-up costs are higher than for mail surveys, the "per case" costs of electronic forms are very low (Groves et al., 2004).
- If the form is to be completed on the respondent's own computer, this method requires respondents to have compatible computer software, and help-desk type staff may be necessary to support the use of the form.
- Electronic forms with a large number of edits can be frustrating for respondents (potentially affecting response rates). This is where good form design, minimising the number of mandatory questions, and targeting edits to where they add the most value to the data quality, become important.
- Technical problems may occur with the system, leading to loss of data and respondent frustration.
- There can be increased burden/cost faced by the respondent, e.g. download costs.
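The automatic sequencing and edit checks described above can be sketched in a few lines of code. The sketch below is purely illustrative - the question names, skip rules and edit limits are invented for the example, not taken from any ABS instrument:

```python
# Illustrative sketch of automatic sequencing and edit checks in a CSAQ.
# Question names, skip rules and limits are hypothetical examples.

def next_question(current, answers):
    """Return the next question the respondent should see, applying skips."""
    if current == "employed":
        # Sequencing: only employed respondents see the hours question.
        return "hours_worked" if answers.get("employed") == "yes" else "income"
    if current == "hours_worked":
        return "income"
    return None  # end of form

def edit_check(question, value):
    """Return a list of edit messages; an empty list means the value passes."""
    errors = []
    if question == "hours_worked" and not 0 <= value <= 168:
        errors.append("Hours worked must be between 0 and 168 per week.")
    return errors

# A respondent who is not employed is routed straight past hours_worked:
assert next_question("employed", {"employed": "no"}) == "income"

# An implausible value trips the edit before the data is accepted:
assert edit_check("hours_worked", 200) != []
```

In a real instrument the routing and edit rules would be far more numerous, but the principle is the same: the respondent never sees a sequencing instruction, and implausible values are queried at the point of entry rather than in later processing.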
- Respondents must be computer literate.
- For web forms and offline forms, respondents must have access to a computer and the Internet, and/or be prepared to use an ABS storage device. They may also be required to have specific software.
- The costs associated with setting up the computer software etc. mean that CSAQs are more suitable for ongoing, rather than one-off surveys.
- CSAQs are not suitable for questions requiring respondents to provide detailed open-ended responses, as typing in this information can become too burdensome.
3.6 Computer Assisted Self Administered Interview (CASI)
In a "Computer Assisted Self Administered Interview" (CASI) an interviewer visits the respondent with a notebook computer containing an electronic version of the questionnaire. The respondent uses the computer to complete the form themselves, rather than being asked the questions by the interviewer. CASIs are often used as part of a CAPI (see section 2.3
), with the interviewer handing over the computer for part of the interview, e.g. so that particularly sensitive sections of the questionnaire can be completed by the respondent privately. In this type of interview, the interviewer remains present to provide assistance to the respondent when required.
- See section 3.5.1.
- Compared with a CSAQ in which an interviewer is not present, CASIs allow the respondent to ask the interviewer for further information/clarification when required, which may lead to better quality data.
- Compared with a paper self-administered form, CASIs allow the use of edits and automatic sequencing, which can help reduce errors.
- See section 3.5.2.
- If the respondent's computer literacy is low, they are likely to find it difficult to use the CASI system.
3.7 Web forms
Web forms are forms that are completed by respondents online, over the Internet. Respondents are generally sent an email or letter containing information about the survey, and a link (or URL) to an online form. An authentication process is usually undertaken before the respondent can begin the survey, in which they must enter unique information provided by the survey organisation (e.g. a username and password). Web forms may be either resumable or non-resumable. Resumable forms can be saved and re-entered at a later time to complete, while non-resumable forms must be completed in one session for the information provided to be able to be submitted.
All Australian government departments and agencies should follow usability and accessibility guidelines when designing websites and web forms. The guidelines aim to ensure that web content is accessible and usable for all users, regardless of disabilities they may have, or the type of hardware/software they use to access the web.
- See section 3.5.1.
- The form can be sent to, and returned by, the respondent much more quickly than is the case for other self-administered surveys (e.g. mail forms), meaning that the data collection process is much quicker.
- The form can be sent to anyone with internet access, even in geographic locations where other methods (e.g. mail forms, face-to-face interviewing) would be difficult or impossible.
- See section 3.5.2.
- Not every person or business has access to the web. Respondents without internet access are automatically excluded from the sample if web forms are the only mode of collection being offered, which could lead to biased results.
- Even if completing the form online is optional (e.g. respondents can choose to complete a paper alternative), there are likely to be differences between respondents who choose to do the online form compared to those who opt to do the paper form. For example, younger respondents, and those who are more highly educated (Groves et al., 2004) may be more comfortable using the online option, which may affect results. Conversely, these respondents are becoming harder to engage through traditional collection methods, so some bias in data may be the trade-off for maintaining overall response rates and ensuring that certain demographic groups are represented in the results.
- Respondents may be reluctant to use web forms due to concerns about privacy or security issues, compromising response rates.
- For non-resumable forms, the form must be completed in one session. Data will be lost if the window is accidentally closed, or if a technical problem occurs while the respondent is completing the form.
- It is easy for a frustrated respondent to terminate the session by simply closing the web browser.
- There may be variation in respondents' computer settings (e.g. browser specifications or operating systems), that make it difficult to standardise the appearance of the form. For example, if a high level of functionality, sophisticated graphics, etc. are used, then some users may not be able to use the form due to the limitations of their operating system. This risk can be mitigated through good design (e.g. caching static content, especially images) and having a "lowest common denominator" approach (e.g. a form that works reasonably well on a 56k dialup connection will work well for those with broadband).
- See section 3.5.3.
- If this is the only mode offered, the sample should be limited to respondents who are likely to have internet access, to be computer literate, and to be comfortable using the internet to complete forms. It should be remembered that this will limit the extent to which the results can be generalised.
- Forms with complex sequencing can be suitable for conversion to web-administration, as automating the sequencing removes the burden of following sequencing instructions from the respondent or interviewer.
3.8 Offline electronic forms
Offline electronic forms differ from web forms in that the file containing the form is saved by the respondent, and completed offline, before being submitted. The electronic form is usually sent to the respondent via email. In a "disk-by-mail" survey, however, the respondent is sent a disk through the mail, containing an electronic version of the form. The respondent answers the survey questions on their own computer, and the responses are recorded on the disk. The respondent then mails the disk back to the ABS. This mode has been used for ABS business surveys such as the Survey of International Investment.
Some business survey respondents submit their data using Microsoft Excel forms. The respondent is sent the form as an email attachment. They then save the form, complete it offline, and submit the completed form via the ABS "Secure Deposit Box", which allows secure, authenticated lodgement of files from respondents via the ABS Web site. ABS Excel forms currently contain basic functions such as formulae to automatically calculate totals, and macros that add additional rows if extra space is required. ABS Excel forms are generally designed to look quite similar to the paper equivalent.
Adobe LiveCycle can be used to create PDF forms that can be distributed to respondents via email, or published on a website. There are two types of PDF forms according to the Adobe website: "fill-and-print" forms, and "interactive" forms. Fill-and-print forms must be printed, filled out by hand, and returned via mail or fax. Interactive forms can be completed electronically, and can then be either printed and submitted by mail/fax, or submitted electronically by pressing a "Submit" button that appears on the form. Depending on the settings, submitting the form electronically either uploads the data entered on the form, or exports the data to a file that the respondent returns through email. Built-in security features safeguard the privacy of electronically submitted data. Respondents may be given the option to save a copy of the completed form, showing all the information they entered. At present the ABS does not send respondents PDF forms to complete.
- Offline forms share many of the same advantages as web forms (see Web-forms - section 3.7.1), as well as the general advantages of CSAQs (section 3.5.1).
- An advantage of forms created in spreadsheets such as Excel is that respondents can write programs for autocompletion of the form, which allows the relevant parts of the spreadsheet to be automatically populated with data from their electronic records.
- Respondents may feel more comfortable providing data in offline forms, compared with online web forms, due to greater concerns about security with web forms.
- Offline forms can usually be saved easily, meaning that data may be less likely to be lost compared with web forms.
- See section 3.5.2.
- Compared with web forms, programs used to create offline forms may not have the same flexibility in terms of automatic sequencing etc.
- Writing programs for autocompletion is an additional burden for the respondent, particularly if the form questions change and they have to modify their program.
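The respondent-written autocompletion described above might, in its simplest form, be a small script that maps fields in the respondent's electronic records to the cells or labels on the form. The record fields and form labels below are hypothetical examples, not from any actual ABS form:

```python
# Hypothetical sketch: auto-populating an offline form from a business's
# own electronic records. Field names and form labels are illustrative.

payroll_records = {"total_wages": 125000, "headcount": 14}

# Mapping from record fields to the labels of the cells on the form.
form_layout = {"total_wages": "Q3a", "headcount": "Q3b"}

def autofill(records, layout):
    """Return {form_label: value} for every record field the form asks for."""
    return {label: records[field] for field, label in layout.items()
            if field in records}

filled = autofill(payroll_records, form_layout)
assert filled == {"Q3a": 125000, "Q3b": 14}
```

This also illustrates the maintenance burden noted above: if the form's questions or labels change between cycles, the respondent's mapping must be updated to match.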
4. Telephone interviews
In the ABS, telephone surveys are used for both household and business surveys, for data collection, intensive follow-up, and post enumeration studies. The modes of data collection described in this section are:
- Telephone PAPI
- Computer Assisted Telephone Interview (CATI)
- Interactive Voice Response (IVR)
Most of the telephone interviewing methods involve potential respondents being telephoned and asked the survey questions over the phone by an interviewer. In an IVR survey, however, there may be no interviewer involvement at all (see Interactive Voice Response, section 4.4).
4.1 Overview of advantages, disadvantages and suitability of telephone interviews
- Compared with face-to-face interviews, costs are reduced in telephone interviews, as fewer interviewers are required, and there are no travel costs involved.
- Telephone surveys produce more timely results compared with face-to-face interviews and mail surveys.
- Call-backs for people not answering, and follow-ups for additional information are relatively quick and inexpensive.
- Monitoring of interviewing staff is possible, so that consistency of performance can be higher.
- "Call scheduling" is possible, where respondents can be called at convenient times or when information is available. Also, if the phone is engaged, the system will reschedule the call, improving interviewer efficiency.
- Telephone interviews are very limited in the number and complexity of the questions that can be asked. Respondents can retain only a limited amount of information when questions are administered aurally, compared with when the survey questions are presented in written form.
- Non-response and partial non-response can be high compared with face-to-face interviews, due to the ease with which the respondent can terminate the interview.
- Particular types of participants can be difficult to reach via telephone, for example, farmers, and people who are on the "Do not call" register.
- A bias is introduced because people without phones, or who are rarely available, are excluded. It is also difficult to produce a reliable sampling frame because of unlisted numbers and changes of address (and therefore changes of number).
- There can be problems with convincing respondents of the legitimacy of the survey (Tourangeau et al., 2000) and the confidentiality of results over the phone.
- Like any interviewer-administered questionnaire, answers to sensitive questions may be affected by social desirability effects, leading to under-reporting of socially undesirable behaviours.
- Recency effects (the tendency to select more options from the end of a list of options) may be an issue.
- Telephone surveys should be short - the longer the survey, the higher the burden on respondents, and the less likely they are to provide considered answers. Of all the data collection modes, telephone surveys are particularly prone to time pressure and respondent fatigue.
- Telephone surveys are unsuitable for forms with conceptually complex questions, questions with long lists of response options to be read out, and those that require a large amount of record checking.
- The target population should be able to be reached by telephone relatively easily.
- Like face-to-face interviews, telephone interviews are suitable for populations who have poor reading abilities.
4.2 Telephone PAPI
As the name suggests, a "Telephone PAPI" is similar to a face-to-face PAPI (see section 2.2
) in that an interviewer reads and records the survey questions using a paper form; however, the interview is conducted via telephone rather than in person. The form should contain proper scripting, rather than being a duplicate of a self-administered form. This is to ensure that the way questions are asked is standardised across interviewers, and increases the likelihood of the interviewer conveying all relevant information.
- See section 4.1.2.
- Compared with telephone interviews in which a computer is used to collect the data (e.g. CATIs - section 4.3), in PAPIs, there must be a separate data entry phase after the collection of the data, increasing processing time and cost.
4.3 Computer Assisted Telephone Interview (CATI)
In a "Computer Assisted Telephone Interview" (CATI), the interviewer reads the survey questions, and records the respondent's answers, using a computer interface (instead of a paper script/questionnaire). The computer interface contains scripted questions, and the same notes and definitions as the paper form, for the interviewer to refer to if necessary. At the ABS, CATI interfaces are generally built using Blaise.
CATI is used for both business and household surveys at the ABS. For business surveys, the use of CATIs is centralised. CATI is decentralised for household surveys, with interviewers conducting the interviews from their own homes. Centralisation is recommended, as it allows standardised interviewing procedures and better quality control, through making recruitment, training, supervision and monitoring of interviewers easier and more standardised. This can help avoid variation in response rates and estimates that are due to different procedures being used. It also allows for better access to technical support for the CATI systems.
- See section 4.1.1
- Data is entered during the interview itself, so there is no separate data entry phase (saving time and resources).
- Higher quality data may be collected compared to in a normal telephone interview. This is because:
- interviewers are likely to adhere to the questionnaire more strictly. This can reduce interviewer bias, and lead to interviews that are likely to have less item non-response and missing data (Groves & Nicholls, 1986), and are more standardised across interviewers (Tourangeau et al., 2000).
- automatic edits and checks help reduce data entry errors and alert the interviewer to inconsistent or unlikely answers, improving data quality and decreasing processing time.
- sequencing can be automatically built into the system, meaning that interviewers do not have to manually follow complex sequencing instructions, reducing errors.
- See section 4.1.2.
- The set-up costs of the interface can be high.
- Interviewers must have computing skills and be trained to properly use the computer system.
- Compared with other telephone interviews, interview time is increased because editing is done while the interview is being performed - this can increase non-response.
- See section 4.1.3
- Surveys that are run frequently, and that have large sample sizes are more suitable for conversion to CATI from a cost-benefit perspective.
- Also from a cost perspective, surveys that have many questions in common with other surveys (e.g. those based largely on Standard Question Wording) are more suitable for CATI, as the instrument will then be easier to adapt to other surveys in the future.
4.4 Interactive Voice Response (IVR)
IVR is a computerised telephony system that has the ability to interact on a basic level with telephone users. A prerecorded voice lists a series of options, prompts and questions. The way in which the user provides their response depends on whether it is an Telephone Data Entry (TDE) or Automated Speech Recognition (ASR) system. In TDE, users enter their responses by pressing numbers on a telephone keypad; the corresponding tone then inputs the response to the system. In ASR, users enter their responses verbally, speaking a simple answer that the system is programmed to recognise. The system may repeat the user's responses so that the user can confirm or amend them. A single IVR system may allow both ASR and TDE input (e.g. users can either say "yes", or press 1, in response to a prompt).
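The combined TDE/ASR input handling described above (e.g. "say yes, or press 1") can be sketched as a single prompt handler that accepts either a keypad tone or a recognised spoken word. The vocabularies and codes below are illustrative examples only:

```python
# Illustrative sketch of a combined TDE/ASR prompt: a response is accepted
# either as a keypad digit (TDE) or as a recognised spoken word (ASR).

TDE_MAP = {"1": "yes", "2": "no"}        # keypad tones the system accepts
ASR_VOCAB = {"yes": "yes", "no": "no"}   # spoken words the system recognises

def interpret(response):
    """Map a raw TDE tone or ASR utterance to a coded answer, or None."""
    response = response.strip().lower()
    if response in TDE_MAP:          # TDE: digit keyed on the handset
        return TDE_MAP[response]
    return ASR_VOCAB.get(response)   # ASR: spoken word; None if unrecognised

assert interpret("1") == "yes"       # respondent pressed 1
assert interpret("No") == "no"       # respondent said "no"
assert interpret("maybe") is None    # unrecognised: system would re-prompt
```

The small, fixed vocabulary in this sketch is the key constraint: as the disadvantages below note, anything outside the programmed set of tones and words is invalid and forces a re-prompt.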
See section 4.1
for general advantages, disadvantages and suitability considerations that apply to IVR along with other telephone survey modes. Below are some considerations that apply specifically to IVR.
4.4.1.1 TDE and ASR
- The interviewer is replaced by a computer, which is cost-effective in the long-term.
- Responses are coded and automatically entered immediately into the data collection system. This enables the removal of transcription errors.
- Questions are easily standardised, because there are no interviewer effects or errors that are not experienced by all respondents.
- The survey can be administered in the comfort of the respondent's home or business.
- Respondents with poor written communication skills can use this system.
- Respondents may feel more comfortable answering sensitive questions when they can submit their responses silently, and when they know that their responses are not being considered by a human interviewer.
- Respondents with poor oral communication skills can easily use the system.
- Respondents do not need a touchtone telephone to complete the interview.
- Response options are not as restricted compared with TDE, meaning that more detailed responses are allowed.
4.4.2.1 TDE and ASR
- IVR systems are expensive to set up compared with traditional methods of data collection such as mail or telephone.
- Response rates may be compromised when the burden is on the respondent to initiate contact before submitting data (i.e. in inbound IVR).
- Respondents are more likely to hang-up on a machine than a human if they feel frustrated, which may compromise response rates.
- The computer may not be able to reiterate the questions, or provide more information to help the respondent understand the question.
- Respondents without a touchtone keypad on their phone are immediately excluded from this data collection method. This may lead to exclusion of respondents with older models of telephone (e.g. older and poorer respondents).
- Respondents who report information via a mobile or cordless phone might have trouble keying in responses because they have to take the phone away from their ear to do so.
- Respondents may have trouble navigating their way through the telephone survey, if there are lots of options to remember etc. Ensuring that the system follows sound design standards is crucial to minimise this problem.
- Typing errors are possible, and having the system repeat answers to ensure they are correct can become tedious and frustrating.
- There may be issues with language and accent variation. Respondents might use obscure language to answer a question or a term or phrase that the computer is not familiar with, and the response will be invalid. Background noise can also interfere with the system's ability to recognise respondents' input.
- Compared with TDE (where respondents can respond to the questions 'silently' using the keypad), sensitive topics are more likely to elicit non-response in ASR. The respondent is required to say their responses out loud, which may lead to them reporting socially desirable responses when answering sensitive questions in the company of family members, colleagues etc.
Surveys that may be suitable for administration via IVR are those in which:
- the target respondent population is likely to have touchtone telephones (for TDE) - however, note that IVR should never be the only method offered to respondents.
- the same respondents participate on a regular, ongoing basis (e.g. monthly). This increases the likelihood that respondents will be familiar with the system, questions, and expected responses. Respondents in less frequent or one-off surveys will be more likely to require assistance in using the system and answering the questions. The high costs involved in setting up an IVR system also make it inappropriate for one-off or infrequent surveys.
- the survey is short. ONS experience shows that TDE is only suitable for surveys that collect nine or fewer data items (Lewis, 2006). As a general "rule of thumb", an IVR transaction should be able to be completed in ten minutes or less (Electronic Commerce Best Practices, 2000).
- the questions are simple, and do not require detailed responses. Questions with complex explanatory notes, definitions or long lists of items to select from are unsuitable for IVR. It is unrealistic to expect respondents to listen to a long list of instructions or explanations, and to remember all of this information when it comes time to give their response - this is likely to lead to frustration, errors, and the provision of poor quality data. Surveys in which respondents report a high number of nil values may be suitable for IVR, as entering this information is relatively simple.
- the sample size is large, and the survey has many questions in common with other surveys. These surveys are more suitable for conversion to IVR from a cost-benefit perspective.
- record checking is not required. The time a respondent would need to spend looking up information in their records in order to answer a question would increase the overall length of the call. Also, long periods of silence without respondent input may cause the system technical difficulties (as IVR systems generally interpret prolonged silence as the respondent requiring assistance).
Part E: Conclusions
This document describes the advantages, disadvantages and suitability of a range of data collection modes. The Mode Suitability Model
can be used in conjunction with this document to easily compare and contrast the suitability of different modes for a particular survey.
With the popularity of electronic data reporting increasing, more ABS self-administered paper surveys will need to have an electronic alternative. With the increase of electronic forms, the use of fax forms should decrease. In terms of interviewer-administered methods, computerised versions of the form are now used almost exclusively in face-to-face interviews. Telephone PAPIs are likely to become less common as more surveys are converted to CATI and IVR. Current CATI surveys may also be converted to IVR.
Part F: References
- Groves RM, Fowler Jr. FJ, Couper MP, Lepkowski JM, Singer E & Tourangeau R (2004). Survey Methodology. John Wiley & Sons, Inc., Hoboken, New Jersey.
- Groves RM & Nicholls II WL (1986). The status of computer-assisted telephone interviewing: Part II - data quality issues. Journal of Official Statistics, 2(2), pp. 117-134.
- Martin J & Manners T (1995). Computer assisted personal interviewing in survey research. In Lee RM (ed.) Information technology for the social scientist, UCL Press.
- Rosenfeld P, Booth-Kewley S & Edwards JE (1993). Computer-administered surveys in organizational settings. American Behavioral Scientist, 36(4), pp. 485-511.
- Smyth JD, Dillman DA, Christian LM & Stern MJ (2006). Comparing check-all and forced-choice question formats in web surveys. Public Opinion Quarterly, 70(1), pp. 66-77.
- Tourangeau R, Rips LJ & Rasinski K (2000). The psychology of survey response. Cambridge University Press, Cambridge.
This page last updated 22 January 2010