EXPLANATORY NOTES
INTRODUCTION
1 This publication analyses the relationship between literacy skills and a range of socio-demographic factors, as collected in the Adult Literacy and Life Skills Survey (ALLS). The ALLS is the second survey of its type to be conducted in Australia. Its predecessor, the International Adult Literacy Survey (IALS), was conducted in Australia in 1996 as the Survey of Aspects of Literacy (SAL). The IALS was the world's first internationally comparative survey of adult skills, undertaken in three rounds of data collection between 1994 and 1998. Similarly, the ALLS is an international survey that has been completed by participating countries in successive waves. The first wave of the Adult Literacy and Life Skills Survey (referred to as 'ALL'), conducted in 2003, comprised Bermuda, Canada, Italy, Mexico, Norway, Switzerland and the United States. The second wave countries were Australia, New Zealand, the Netherlands, Hungary and South Korea.
2 The survey provides information on knowledge and skills in five domains: prose literacy, document literacy, numeracy, problem solving and health literacy. The 1996 SAL assessed only three domains: prose literacy, document literacy and quantitative literacy. The addition of problem solving and health literacy, together with the expansion of quantitative literacy into the broader numeracy domain, adds an extra dimension to the assessment of adult skills. A number of prose and document literacy tasks administered in the 1996 SAL were retained for the 2006 ALLS to allow comparison of levels of literacy over time. Both SAL and ALLS measured skills in the official language, which in Australia is English.
3 Information on the following topics was collected:
- General demographic information
- Linguistic information
- first language learned and most used at home
- self-assessed proficiency in spoken English
- reading and writing proficiency in non-English language
- Parental information
- occupation
- educational attainment
- country of birth
- Labour force activities
- current labour force status
- labour force activities in the last 12 months
- hours worked
- occupation and industry of main job
- job tenure
- Literacy and numeracy practices at work and in daily life
- frequency of reading and writing activities
- Participation in education and learning
- involvement in different types of learning
- incidence of education and learning
- reasons for pursuing education and learning
- volume of education undertaken
- Social capital and well-being
- volunteering, civic participation
- physical and psychological well-being
- Information and communications technology
- access, types and usage
- purposes and intensity of computer use
- purposes and intensity of Internet use
- self-assessment of skills
- Personal and household income
4 The statistics included in this publication present a broad overview of data items collected. Emphasis has been given to providing information on key measures of literacy.
5 More detail on the survey content, and how literacy was measured, is available in the Adult Literacy and Life Skills, Australia: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.
SCOPE OF THE SURVEY
6 Only people who were usual residents of private dwellings in Australia were covered by the survey. Private dwellings are houses, flats, home units and any other structures used as private places of residence at the time of the survey. People usually resident in non-private dwellings such as hotels, motels, hostels, hospitals and short-stay caravan parks were not included in the survey. Usual residents are those who usually live in a particular dwelling and regard it as their own or main home. Visitors to private dwellings are not included in the interview for that dwelling. However, if they are a usual resident of another dwelling that is in the scope of the survey they have a chance of being selected in the survey or, if not selected, they will be represented by similar persons who are selected in the survey.
7 The ALLS was conducted in both urban and rural areas in all states and territories, except for very remote parts of Australia. Queensland, South Australia, Western Australia and the Northern Territory have very remote areas. With the exception of the Northern Territory, the population living in very remote areas represents only a small proportion of the total population (approximately 2%). For this and other practical reasons, no adjustment was made to state population benchmarks (discussed below) when deriving survey results. This exclusion is unlikely to affect national estimates, and will have only a minor impact on aggregate estimates produced for individual states and territories, except the Northern Territory, where the excluded population accounts for over 20% of persons.
8 Persons aged 15 to 74 years were included in the survey. The estimated Australian resident population at December 2006, after the exclusion of people living in non-private dwellings and in very remote areas of Australia, was 20,182,511, of which 15,105,435 were aged 15 to 74 years.
9 The following non-residents were excluded from resident population estimates used to benchmark the survey results, and were not interviewed:
- diplomatic personnel of overseas governments
- members of non-Australian defence forces (and their dependants) stationed in Australia
- persons whose usual place of residence was outside Australia.
SAMPLE DESIGN
10 The ALLS was designed to provide reliable estimates at the national level and for each state and territory.
11 Dwellings included in the survey in each state and territory were selected at random using a multi-stage area sample. This sample included only private dwellings from the geographic areas covered by the survey. The initial sample for the survey consisted of 14,311 private dwellings. This number was reduced to 11,139 dwellings after the removal of households with no residents in scope for the survey and of dwellings that proved to be vacant, under construction or derelict. Of the eligible dwellings, 80.7% responded fully (or adequately), yielding a total sample of 8,988 dwellings, with one responding person per dwelling.
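The sample arithmetic in the paragraph above can be verified directly. The following Python sketch uses only the published figures; the variable names are illustrative:

```python
# Reproduce the ALLS 2006 sample arithmetic from paragraph 11.
# All figures are as published; names are illustrative only.

initial_sample = 14_311    # private dwellings initially selected
eligible = 11_139          # after removing out-of-scope, vacant, etc.
fully_responding = 8_988   # fully (or adequately) responding dwellings

sample_loss = initial_sample - eligible
response_rate = fully_responding / eligible * 100

print(f"Sample loss: {sample_loss} dwellings")   # 3172
print(f"Response rate: {response_rate:.1f}%")    # 80.7%
```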
International response rates
12 The number of adequately responding persons and response rates for selected countries, as included in tables 5 to 7, are presented in the following table.
SELECTED COUNTRIES, Sample size and Response rate summary(a)

                   Fully responding    Response rate(b)
                   no.                 %

Bermuda            2 696               82
Canada             20 059              66
Italy              6 853               44
Norway             5 411               56
Switzerland        5 120               40
United States      3 420               66

(a) People aged 16-65 years.
(b) Of eligible dwellings, excluding sample loss.
DATA COLLECTION
13 ABS interviewers conducted personal interviews at selected dwellings during the period of June 2006 to January 2007. Interviews were conducted using a Computer Assisted Interviewing (CAI) questionnaire. CAI involves the use of a notebook computer to record, store, manipulate and transmit the data collected during interviews.
14 Much of the detail obtained from the ALLS was provided by one person aged 15 to 74 years randomly selected from each participating household. This person was randomly selected after basic information had been obtained about all household members.
15 Respondents were asked a series of questions to obtain background information of a socio-demographic nature, and information about their perceptions of their literacy and numeracy abilities, their literacy-related practices in daily life and at work, and about their use of different languages.
16 After the interview was completed, the respondent was asked to participate in an objective assessment of their literacy skills. The first assessment, a core task booklet (CTB), contained six relatively simple literacy-related tasks. Respondents who completed three or more of these correctly, as assessed by the interviewer, were then given a separate main task booklet (MTB). The tasks in the MTB, more numerous and more diverse in complexity and subject matter, are designed to provide an understanding of the literacy skills of the general adult population. Respondents were asked to use the textual materials provided in the booklet to complete a series of tasks.
17 Those who did not correctly complete at least three of the tasks from the CTB were not asked to attempt the more difficult tasks in the MTB. These respondents were assigned skill level 1 for each domain.
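The routing rule described in paragraphs 16 and 17 can be sketched as follows. The function name and return strings are our own illustration; the pass mark of three of the six core tasks is as stated in the text:

```python
def route_respondent(core_tasks_correct: int) -> str:
    """Decide the assessment path after the core task booklet (CTB).

    Respondents who answer 3 or more of the 6 CTB tasks correctly
    proceed to the main task booklet (MTB); the rest are assigned
    skill level 1 for each domain, as described in paragraph 17.
    """
    PASS_MARK = 3  # of 6 CTB tasks
    if core_tasks_correct >= PASS_MARK:
        return "proceed to MTB"
    return "assigned skill level 1"

print(route_respondent(5))  # proceed to MTB
print(route_respondent(2))  # assigned skill level 1
```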
18 There were 28 different types of booklets used in the assessed part of the interview. One of these booklets was randomly allocated to each selected dwelling.
19 Each booklet consisted of two, of a possible eight, blocks of questions. The blocks of questions measure different skill domains:
- Blocks 1 - 4 measure Prose and Document Literacy
- Blocks 5 - 6 measure Numeracy
- Blocks 7 - 8 measure Problem-solving
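The count of 28 booklets in paragraph 18 is consistent with each booklet containing a distinct unordered pair of the eight blocks (8 choose 2 = 28). This pairing interpretation is our inference from the counts given, not stated explicitly in the text; the block labels below follow paragraph 19:

```python
from itertools import combinations

# Skill domain measured by each of the eight question blocks (paragraph 19).
blocks = {
    1: "Prose/Document", 2: "Prose/Document",
    3: "Prose/Document", 4: "Prose/Document",
    5: "Numeracy", 6: "Numeracy",
    7: "Problem solving", 8: "Problem solving",
}

# Each unordered pair of blocks corresponds to one booklet design.
booklets = list(combinations(blocks, 2))
print(len(booklets))  # 28
```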
20 A fifth scale, measuring health literacy proficiency, was produced as a by-product of the above testing process. The survey contained 191 tasks, drawn from daily life, that were judged to measure health-related activities in five areas: health promotion (60 items), health protection (65 items), disease prevention (18 items), health-care and disease management (16 items), and navigation (32 items). The production of the health literacy scale was an additional service provided to participating countries.
21 Interviewers administered the objective assessment component of the survey in a neutral and non-threatening manner, encouraging respondents to attempt as many tasks as possible. There were no time limits, and no assistance was allowed. Tasks covered each of the types of literacy, and were at varying levels of difficulty.
22 More information on the scoring of the MTBs is available in the Adult Literacy and Life Skills, Australia: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.
WEIGHTING, BENCHMARKING AND ESTIMATION
Weighting
23 Weighting is the process of adjusting results from a sample survey to infer results for the total in-scope population. To do this, a 'weight' is allocated to each person. The weight is a value which indicates how many population units are represented by the sample unit.
24 The first step in calculating weights for each person is to assign an initial weight, which is equal to the inverse of the probability of being selected in the survey. For example, if the probability of a person being selected in the survey was 1 in 600, then the person would have an initial weight of 600 (that is, they represent 600 people).
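The worked example in paragraph 24 can be expressed directly; the 1-in-600 selection probability is the example from the text:

```python
# Initial weight = inverse of the probability of selection (paragraph 24).
selection_probability = 1 / 600
initial_weight = 1 / selection_probability

# A person selected with probability 1 in 600 represents 600 people.
print(initial_weight)  # 600.0
```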
Benchmarking
25 The initial weights were then calibrated to align with independent estimates of the population by sex, age, state and section of state, referred to as 'benchmarks'. Weights calibrated against population benchmarks ensure that the survey estimates conform to the independently estimated distribution of the population rather than to the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over- or under-enumeration of particular categories of persons which may occur due to either the random nature of sampling or non-response.
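The calibration described in paragraph 25 can be illustrated with a minimal post-stratification sketch. The real ALLS weighting calibrates simultaneously to sex, age, state and section-of-state benchmarks; here a single benchmark category is adjusted, and all figures are invented for illustration:

```python
# Minimal post-stratification sketch of benchmarking (paragraph 25).
# One benchmark category only; the production process calibrates to
# several benchmark dimensions at once. Figures are invented.

benchmark_population = 1_200_000          # independent ERP for the category
initial_weights = [600.0, 580.0, 650.0]   # sampled persons in the category

adjustment = benchmark_population / sum(initial_weights)
final_weights = [w * adjustment for w in initial_weights]

# The calibrated weights now sum to the benchmark population.
print(round(sum(final_weights)))  # 1200000
```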
26 The 2006 ALLS was benchmarked to the estimated resident population aged 15 to 74 years (ERP) living in private dwellings in each state and territory, excluding the ERP living in very remote areas of Australia, at 31 December 2006. The ERP for 2006 ALLS were based on results from the 2001 Census of Population and Housing. The ALLS estimates do not (and are not intended to) match estimates for the total Australian resident population obtained from other sources (which include persons and households living in non-private dwellings, such as hotels and boarding houses, and in very remote parts of Australia).
Estimation
27 In order to minimise respondent burden, not all literacy domains were directly assessed for each respondent. ALLS used a matrix-sampling design to assign assessment booklets to individuals, so that a comprehensive picture of literacy achievement across the country could be assembled from the components completed by each individual. ALLS relied on Item Response Theory scaling to combine the individual responses into accurate estimates of literacy achievement in the population. With this approach, however, aggregations of individuals' scores can lead to biased estimates of population characteristics. To address this, the ALLS scaling procedures also used a multiple imputation or "plausible scores" methodology to obtain proficiency scores in literacy for all individuals, even though each individual responded to only part of the assessment item pool. Using all available data, five "plausible scores" were generated for each respondent in each of the five domains measured.
28 All literacy estimates presented in this publication are obtained by taking the average of the five weighted estimates from each of the "plausible" scores. More information on measuring literacy is covered in the Adult Literacy and Life Skills Survey: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.
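The averaging step in paragraph 28 can be sketched as follows. The five figures are invented for illustration; in practice each would be a weighted estimate computed from one set of plausible scores:

```python
# Averaging the five weighted estimates from the plausible scores
# (paragraphs 27-28). Figures are invented for illustration, e.g. the
# estimated number of people at a given prose literacy skill level,
# computed once per plausible score.

plausible_estimates = [5.10e6, 5.14e6, 5.08e6, 5.12e6, 5.11e6]

published_estimate = sum(plausible_estimates) / len(plausible_estimates)
print(f"{published_estimate:.2e}")  # 5.11e+06
```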
RELIABILITY OF ESTIMATES
29 All sample surveys are subject to error, which can be broadly categorised as either sampling error or non-sampling error. Sampling error occurs because only a small proportion of the total population is used to produce estimates that represent the whole population. Sampling error can be reliably measured because it can be calculated from the methods used to design the survey. Non-sampling errors occur when survey processes work less effectively than intended. For example, some persons selected for the survey may not respond (non-response); some survey questions may not be clearly understood by the respondent; and occasionally errors can be made in processing data from the survey.
Non-Sampling error
30 One of the main sources of non-sampling error is non-response by persons selected in the survey. Non-response can affect the reliability of results and can introduce bias. The magnitude of any bias depends upon the level of non-response and the extent of the difference between the characteristics of those people who responded to the survey and those who did not.
31 To reduce the level and impact of non-response, the following methods were adopted in this survey:
- face-to-face interviews with respondents
- the use of interviewers who could speak languages other than English where necessary
- follow-up of respondents if there was initially no response
- ensuring the weighted file is representative of the population by aligning the estimates with population benchmarks.
32 Of the eligible dwellings selected in the 2006 ALLS, 19.3% did not respond fully or adequately.
33 Every effort was made to minimise other non-sampling error by careful design and testing of questionnaires, intensive training and supervision of interviewers, and extensive editing and quality control procedures at all stages of data processing.
34 An advantage of the CAI technology used in conducting interviews for this survey is that it potentially reduces non-sampling errors by enabling edits to be applied as the data are being collected. The interviewer is alerted immediately if information entered into the computer is either outside the permitted range for that question, or contradictory to information previously recorded during the interview. These edits allow the interviewer to query respondents and resolve issues during the interview. CAI sequencing of questions is also automated such that respondents are asked only relevant questions and only in the appropriate sequence, eliminating interviewer sequencing errors.
35 In contrast to most other ABS surveys, the ALLS estimates also include significant imputation variability, due to the use of multiple possible MTB questionnaires and the complex literacy scaling procedures. The effect of the plausible scoring methodology on the estimation can be reliably estimated and is included in the calculated RSEs. This is covered in more detail in the Technical notes.
Sampling error
36 Sampling error is the difference between the published estimates, derived from a sample of persons, and the value that would have been produced if all persons in scope of the survey had been included. Sampling error is measured for this survey by relative standard errors (RSEs). In this publication, estimates with RSEs of 25% to 50% are preceded by an asterisk (e.g. *3.4) to indicate that the estimate should be used with caution. Estimates with RSEs over 50% are indicated by a double asterisk (e.g. **0.6) and should be considered unreliable for most purposes. For more information refer to the 'Technical notes'.
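The annotation convention in paragraph 36 amounts to a simple flagging rule, sketched below (the function name is ours; the thresholds and markers are those used in this publication):

```python
def flag_estimate(value: float, rse_percent: float) -> str:
    """Format an estimate with the RSE annotations used in this
    publication: '*' for RSEs of 25% to 50% (use with caution),
    '**' for RSEs over 50% (unreliable for most purposes)."""
    if rse_percent > 50:
        return f"**{value}"
    if rse_percent >= 25:
        return f"*{value}"
    return str(value)

print(flag_estimate(3.4, 30))   # *3.4
print(flag_estimate(0.6, 65))   # **0.6
print(flag_estimate(12.8, 4))   # 12.8
```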
SEASONAL EFFECTS
37 The estimates in this publication are based on information collected from June 2006 to January 2007, and due to seasonal effects they may not be fully representative of other time periods in the year. For example, the ALLS asked standard ABS questions on labour force status to determine whether a person was employed. Employment is subject to seasonal variation through the year. Therefore, the ALLS results for employment could have differed if the ALLS had been conducted over the whole year or in a different part of the year.
INTERPRETATION OF RESULTS
38 Care has been taken to ensure that the results of this survey are as accurate as possible. All interviews were conducted by trained ABS officers. Extensive reference material was developed for use in the field enumeration and intensive training was provided to interviewers in both classroom and on-the-job environments. There remain, however, other factors which may have affected the reliability of results, and for which no specific adjustments can be made. The following factors should be considered when interpreting these estimates:
- Information recorded in this survey is essentially 'as reported' by respondents, and hence may differ from information available from other sources or collected using different methodologies. Responses may be affected by imperfect recall or individual interpretation of survey questions.
- Some respondents may have provided responses that they felt were expected, rather than those that accurately reflected their own situation. Every effort has been made to minimise such bias through the development and use of culturally appropriate survey methodology.
39 Further information on the interpretation of results is contained in the Adult Literacy and Life Skills Survey: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.
40 For a number of ALLS data items, some respondents were unwilling or unable to provide the required information. Where responses for a particular data item were missing for a person or household they were recorded in a 'not known' or 'not stated' category for that data item. These 'not known' or 'not stated' categories are not explicitly shown in the publication tables, but have been included in the totals. Publication tables presenting proportions have included any 'not known' or 'not stated' categories in the calculation of these proportions.
41 For the personal gross weekly income data item, 11.6% of people (1.8 million) did not provide an income amount, either because they did not know their income or they refused to answer. There were a number of other data items included in the publication that had missing values. The proportions of these missing values did not exceed 2% for any data item.
CLASSIFICATIONS AND STANDARDS
42 The ALLS used the following classifications:
- Coding of country of birth. The survey questionnaire listed the 10 most frequently reported countries. Interviewers were instructed to mark the appropriate box, or if the reported country was not among those listed, to record the name of the country for subsequent coding. All responses for country of birth were coded according to the Standard Australian Classification of Countries (SACC), 1998 (cat. no. 1269.0).
- Coding of geographical data. Geography data (Capital city, Balance of state/territory; Remoteness areas) were classified according to the Australian Standard Geographical Classification (ASGC) (cat. no. 1216.0).
- Coding of language. The survey questionnaire listed the 10 most frequently reported languages first spoken at home. Interviewers were instructed to mark the appropriate box, or if the reported language was not among those listed, to record the name of the language for subsequent coding. All responses for language spoken were coded to the Australian Standard Classification of Languages (ASCL) (cat. no. 1267.0).
- Coding of education. Level and field of education were coded to the Australian Standard Classification of Education (ASCED) (cat. no. 1272.0). Coding was based on the level and field of education as reported by respondents and recorded by interviewers. In addition to this, level of education was coded to the International Standard Classification of Education (ISCED), 1997.
- Coding of occupation. Occupation data were dual classified according to the ASCO - Australian Standard Classification of Occupations, Second Edition, 1997 (cat. no. 1220.0.30.001) - and the newly released ANZSCO - Australian and New Zealand Standard Classification of Occupations, First Edition, 2006 (cat. no. 1220.0). In addition, occupation was coded to the International Standard Classification of Occupations (ISCO), 1988.
- Coding of industry. Industry data were dual classified according to the newly released 2006 Australian and New Zealand Standard Industrial Classification (ANZSIC) (cat. no. 1292.0) and the previous 1993 Australian and New Zealand Standard Industrial Classification (ANZSIC) (cat. no. 1292.0.15.001). In addition, industry was coded to the International Standard Industrial Classification of All Economic Activities (ISIC), 1989.
43 More information on the concepts collected in ALLS and related international resources is available in the Adult Literacy and Life Skills, Australia: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.
COMPARABILITY WITH 1996 SURVEY OF ASPECTS OF LITERACY
44 Selected summary results from the 1996 SAL are presented in this publication to allow comparisons over time. Of the five literacy domains available from the 2006 ALLS, only two, prose and document literacy, are directly comparable to those derived from the 1996 SAL. Experts in the field have argued that the quantitative literacy domain derived from the 1996 SAL focused narrowly on numeracy competency and did not sufficiently emphasise real-life tasks demanding more numerous and diverse numeracy skills. Consequently, an expanded concept of adult numeracy was developed and incorporated in the 2006 ALLS. This concept of numeracy is based on a wider range of skills and knowledge dealing with situations that demand effective responses and situational management skills. As a result of these conceptual changes, time series analyses of the numeracy domain are not possible. The problem solving and health literacy domains are new additions to the 2006 ALLS and hence no time series information is available.
45 The statistical significance of data changes between 1996 and 2006 has been investigated and results that are statistically significant are indicated in tables 2, 4, 16 and 19. The scope, survey design, content, data collection and weighting procedures were largely the same in both collections.
46 The sample sizes differed between the 2006 ALLS and the 1996 SAL. In 2006, the number of fully or adequately responding households was 8,988, compared with 9,302 for the 1996 cycle. The 2006 cycle had a larger initial sample (14,311 dwellings) than the 1996 cycle (13,008 dwellings). These differences in sample size should be considered when comparing results.
47 A full list of the data items from the 2006 ALLS is available with the Adult Literacy and Life Skills Survey: User Guide (cat. no. 4228.0.55.002), as a datacube, available on the ABS website <www.abs.gov.au>. For published results from the 1996 SAL, refer to Aspects of Literacy: Assessed Skill Levels, Australia, 1996 (cat. no. 4228.0) available on the ABS website <www.abs.gov.au>.
ALLS PRODUCTS AND SERVICES
48 Below is information describing the range of data to be made available from the 2006 Adult Literacy and Life Skills Survey, both in published form and on request. Products available on the ABS website <www.abs.gov.au> are indicated accordingly.
Adult Literacy and Life Skills Survey: Summary Results, Australia, 2006 datacubes
49 An electronic version of the tables released in this publication is available on the ABS website (cat. no. 4228.0), in spreadsheet format. The spreadsheet presents RSEs relating to estimates and/or proportions for each publication table.
Adult Literacy and Life Skills Survey: User Guide
50 The ALLS User Guide has been released in conjunction with this summary results publication. It provides detailed information about the survey content, methodology and data interpretation. It also contains the list of ALLS data items. The User Guide is available free-of-charge on the ABS website (cat. no. 4228.0.55.002).
State/territory tabulations
51 A set of tables in a spreadsheet format equivalent to those in this publication will be produced for each state and territory (subject to standard error constraints and excluding time series and international comparative tables) and Australia. These tables will be available from the ABS website <www.abs.gov.au> (as Datacubes to cat. no. 4228.0).
Microdata
52 For users who wish to undertake more detailed analysis of the survey data, microdata from the 2006 ALLS will be released in the form of two confidentialised unit record files (CURFs): the basic CURF (Adult Literacy and Life Skills Survey: Basic Confidentialised Unit Record File, cat. no. 4228.0.30.001) and the expanded CURF (Adult Literacy and Life Skills Survey: Expanded Confidentialised Unit Record File, cat. no. 4228.0.30.002). The expanded CURF will contain more detail than the basic CURF and will only be available via the Remote Access Data Laboratory (RADL), a secure Internet-based data query service. The basic CURF will be available via CD-ROM or the RADL.
53 Up-to-date information on the ABS RADL service, including information on pricing, 'Applications & Undertakings', and a training manual outlining obligations and responsibilities when accessing ABS microdata, is available on the ABS website <www.abs.gov.au>. Those wishing to access the 2006 ALLS microdata should contact the ABS, referring to the contact details noted at the front of this publication.
54 Technical information describing the content and use of the basic and expanded ALLS CURFs will be available in the Technical Manual: Adult Literacy and Life Skills Survey, Australia: Confidentialised Unit Record File (cat. no. 4228.0.55.003).
Data available on request
55 Special tabulations of ALLS data are available on request and for a fee. Subject to confidentiality and sampling variability constraints, tabulations can be produced from the survey incorporating data items, populations and geographic areas selected to meet individual requirements. These can be provided in printed or electronic form. Please refer to the contact details noted at the front of this publication.
RELATED ABS PUBLICATIONS
56 Listed below is a selection of other ABS publications on related topics which may be of interest. Information about current ABS publications and products can be found in the Catalogue of Publications (cat. no. 1101.0), or on-line on the ABS website <www.abs.gov.au>.
- A Directory of Education and Training Statistics (cat. no. 1136.0)
- Aspects of Literacy: Profiles and Perceptions, Australia (cat. no. 4226.0)
- Census of Population and Housing 2006 (cat. no. 2015.0)
- Education and Work, Australia (cat. no. 6227.0)
- General Social Survey (cat. no. 4159.0)
- Labour Force Australia (cat. no. 6202.0)
- Education and Training Experience, Australia (cat. no. 6278.0)
57 A separate publication on health literacy is expected to be available early in 2008, through the ABS website <www.abs.gov.au>.