4233.0 - Health Literacy, Australia, 2006  
ARCHIVED ISSUE Released at 11:30 AM (CANBERRA TIME) 25/06/2008   

EXPLANATORY NOTES


INTRODUCTION

1 This publication analyses the relationship between health literacy skills and a range of socio-demographic factors, as collected in the Adult Literacy and Life Skills Survey (ALLS). The ALLS is the second survey of its type to be conducted in Australia. Its predecessor, the International Adult Literacy Survey (IALS), was conducted in Australia in 1996 as the Survey of Aspects of Literacy (SAL). The IALS was the first internationally comparative survey of adult skills, undertaken with three rounds of data collection between 1994 and 1998. Similarly, the ALLS is an international survey that has been completed by participating countries in successive waves. The first wave of countries to participate in the ALLS, in 2003, comprised Bermuda, Canada, Italy, Mexico, Norway, Switzerland and the United States. Second wave countries to take part were Australia, New Zealand, the Netherlands, Hungary and South Korea.

2 The health literacy domain was only derived for some countries, as it was provided as an additional service (requiring additional funding). It also was not derived on the same basis for all countries. To date, Canada is the only country with a health literacy domain comparable to Australia's; however, due to the unavailability of Canada's health literacy microdata, no detailed health literacy comparisons have been made between the two countries.

3 The ALLS provides information on knowledge and skills in the following five domains: health literacy, prose literacy, document literacy, numeracy and problem solving. In the 1996 SAL only three domains were assessed: prose literacy, document literacy and quantitative literacy. The addition of health literacy and problem solving, as well as the expansion of quantitative literacy to numeracy, provides an extra dimension to the assessment of adult skills. A number of prose and document literacy tasks administered in the 1996 SAL have been retained for the 2006 ALLS to allow comparison of levels of literacy over time. However, there is no time series information available for the health literacy domain, since it is a new addition to the 2006 ALLS. Both SAL and ALLS measured skills in the official language, which in Australia was English.

4 Information on the following topics was collected:

  • General demographic information
  • Linguistic information
      • first language learned and most used at home
      • self-assessed proficiency in spoken English
      • reading and writing proficiency in non-English language
  • Parental information
      • occupation
      • educational attainment
      • country of birth
  • Labour Force activities
      • current labour force status
      • labour force activities in the last 12 months
      • hours worked
      • occupation and industry of main job
      • job tenure
  • Literacy and numeracy practices at work and daily life
  • Frequency of reading and writing activities
  • Participation in education and learning
      • involvement in different types of learning
      • incidence of education and learning
      • reasons for pursuing education and learning
      • volume of education undertaken
  • Social capital and well-being
      • volunteering, civic participation
      • physical and psychological well-being
  • Information and communications technology
      • access, types and usage
      • purposes and intensity of computer use
      • purposes and intensity of Internet use
      • self-assessment of skills
  • Personal and household income

5 The statistics included in this report present a broad overview of data items collected. Emphasis has been given to providing information on key measures of health literacy.

6 More detail on the survey content, and how literacy for the different domains was measured, is available in the Adult Literacy and Life Skills, Australia: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.


SCOPE OF THE SURVEY

7 Only people who were usual residents of private dwellings in Australia were covered by the survey. Private dwellings are houses, flats, home units and any other structures used as private places of residence at the time of the survey. People usually resident in non-private dwellings such as hotels, motels, hostels, hospitals and short-stay caravan parks were not included in the survey. Usual residents are those who usually live in a particular dwelling and regard it as their own or main home. Visitors to private dwellings are not included in the interview for that dwelling. However, if they are a usual resident of another dwelling that is in the scope of the survey they have a chance of being selected in the survey or, if not selected, they will be represented by similar people who are selected in the survey.

8 The ALLS was conducted in both urban and rural areas in all states and territories, except for very remote parts of Australia. Queensland, South Australia, Western Australia and the Northern Territory have very remote areas. With the exception of the Northern Territory, the population living in very remote areas represents only a small proportion of the total population (approximately 2%). For this, and other practical reasons, no adjustment was made to state population benchmarks (population benchmarks are discussed below) when deriving survey results. This exclusion is unlikely to impact on national estimates, and will only have a minor impact on any aggregate estimates that are produced for individual states and territories, except the Northern Territory where the excluded population accounts for over 20% of people.

9 People aged 15 to 74 years were included in the survey. The estimated Australian resident population at December 2006, after the exclusion of people living in non-private dwellings and very remote areas of Australia, was 20,182,511 of which 15,105,435 were aged 15 to 74 years.

10 The following non-residents were excluded from resident population estimates used to benchmark the survey results, and were not interviewed:
  • diplomatic personnel of overseas governments
  • members of non-Australian defence forces (and their dependants) stationed in Australia
  • people whose usual place of residence was outside Australia.


SAMPLE DESIGN

11 The ALLS was designed to provide reliable estimates at the national level and for each state and territory.

12 Dwellings included in the survey in each state and territory were selected at random using a multi-stage area sample. This sample included only private dwellings from the geographic areas covered by the survey. The initial sample for the survey consisted of 14,311 private dwellings. This number was reduced to 11,139 dwellings due to the loss of households with no residents in scope for the survey, and of dwellings that proved to be vacant, under construction or derelict. Of the eligible dwellings, 80.7% responded fully (or adequately), yielding a final sample of 8,988 persons (one per dwelling).

13 For Canada, the number of fully (or adequately) responding people aged 16 to 65 years was 20,059. The response rate of eligible dwellings (excluding sample loss) was 66%.


DATA COLLECTION

14 ABS interviewers conducted personal interviews at selected dwellings during the period of June 2006 to January 2007. Interviews were conducted using a Computer Assisted Interviewing (CAI) questionnaire. CAI involves the use of a notebook computer to record, store, manipulate and transmit the data collected during interviews.

15 Much of the detail obtained from the ALLS was provided by one person aged 15 to 74 years randomly selected from each participating household. This person was randomly selected after basic information had been obtained about all household members.

16 Respondents were asked a series of questions to obtain background information of a socio-demographic nature, information about their perceptions of their literacy and numeracy abilities, their literacy-related practices in daily life and at work, and about their use of different languages.

17 After the interview was completed, the respondent was asked to participate in an objective assessment of their literacy skills. The first assessment, a core task booklet (CTB), contained six relatively simple literacy-related tasks. Respondents who completed three or more of these correctly, as assessed by the interviewer, were then given a much larger variety of tasks in a separate main task booklet (MTB). The tasks in the MTB, which are more numerous and diverse in complexity and subject matter, are designed to provide an understanding of the literacy skills of the general adult population. Respondents were asked to use the textual materials provided in the booklet to complete a series of tasks.

18 Those who did not correctly complete at least three of the tasks from the CTB were not asked to attempt the more difficult tasks in the MTB. These respondents were assessed as having attained skill Level 1 for each domain.
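The two-stage assessment flow described in paragraphs 17 and 18 amounts to a simple routing rule. The sketch below is purely illustrative; the function and constant names are hypothetical and not part of the ABS processing system:

```python
# Hypothetical sketch of the CTB -> MTB routing rule described above.
# Names and return values are illustrative only.

CTB_TASKS = 6          # core task booklet: six relatively simple tasks
PASS_THRESHOLD = 3     # three or more correct -> proceed to the MTB

def route_respondent(ctb_correct: int) -> str:
    """Return the next assessment step for a respondent."""
    if ctb_correct >= PASS_THRESHOLD:
        return "MTB"       # administer the main task booklet
    return "LEVEL_1"       # assessed at skill Level 1 for each domain
```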

19 There were 28 different types of booklets used in the assessed part of the interview. One of these booklets was randomly allocated to each selected dwelling.

20 Each booklet consisted of two, of a possible eight, blocks of questions. The blocks of questions measure different skill domains:
  • Blocks 1 - 4 measure Prose and Document Literacy
  • Blocks 5 - 6 measure Numeracy
  • Blocks 7 - 8 measure Problem-solving

21 For health literacy, a fifth scale was produced as a by-product of the above testing process. The survey contained 191 daily tasks that were judged to measure health-related activities in five domains: health promotion (60 items), health protection (65 items), disease prevention (18 items), health-care and disease management (16 items), and navigation (32 items). The production of the health literacy scale was an additional service provided to participating countries. In Australia, the Australian Government Department of Health and Ageing provided funding for the additional calculation of the health literacy scale.

22 Interviewers administered the objective assessment component of the survey in a neutral and non-threatening manner, encouraging respondents to attempt as many tasks as possible. There were no time limits, and no assistance was allowed. Tasks covered each of the types of literacy, and were at varying levels of difficulty.

23 More information on the scoring of the MTBs is available in the Adult Literacy and Life Skills, Australia: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.


WEIGHTING, BENCHMARKING AND ESTIMATION

Weighting

24 Weighting is the process of adjusting results from a sample survey to infer results for the total in-scope population. To do this, a 'weight' is allocated to each person. The weight is a value which indicates how many people in the population are represented by the person in the sample.

25 The first step in calculating weights for each person is to assign an initial weight, which is equal to the inverse of the probability of being selected in the survey. For example, if the probability of a person being selected in the survey was 1 in 600, then the person would have an initial weight of 600 (that is, they represent 600 people).
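The initial-weight calculation described above can be illustrated in a few lines of Python; the selection probabilities here are invented for the example:

```python
# Illustrative initial-weight calculation: each person's weight is the
# inverse of their probability of selection (figures are made up).
selection_probs = [1 / 600, 1 / 450, 1 / 600]
initial_weights = [1 / p for p in selection_probs]

# A person selected with probability 1 in 600 represents 600 people, so a
# weighted population estimate is simply the sum of the weights:
population_estimate = sum(initial_weights)
```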


Benchmarking

26 The initial weights were calibrated to align with independent estimates of the population by sex, age, state and section of state, referred to as 'benchmarks'. Weights calibrated against population benchmarks ensure that the population estimates conform to the independently estimated distribution of the population rather than to the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over- or under-enumeration of particular categories of people which may occur due to either the random nature of sampling or non-response.
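As a rough illustration of this calibration step, the hypothetical sketch below scales initial weights within a single benchmark category (sex) so that the weighted totals match independent benchmarks. The actual ALLS calibration used sex, age, state and section of state jointly, and all figures below are invented:

```python
# Minimal post-stratification sketch: rescale weights within each benchmark
# cell so weighted totals match independent population benchmarks.
from collections import defaultdict

sample = [  # (sex, initial_weight) -- invented example data
    ("M", 500.0), ("M", 450.0), ("F", 480.0), ("F", 520.0),
]
benchmarks = {"M": 1_000_000, "F": 1_050_000}  # hypothetical ERP counts

# Weighted sample totals per cell, before calibration.
weighted_totals = defaultdict(float)
for sex, w in sample:
    weighted_totals[sex] += w

# Scale each weight by (benchmark / weighted sample total) for its cell.
calibrated = [
    (sex, w * benchmarks[sex] / weighted_totals[sex]) for sex, w in sample
]
# After calibration, the weighted totals reproduce the benchmarks.
```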

27 The 2006 ALLS was benchmarked to the estimated resident population (ERP) aged 15 to 74 years living in private dwellings in each state and territory, excluding the ERP living in very remote areas of Australia, at 31 December 2006. The ERP for 2006 ALLS was based on results from the 2001 Census of Population and Housing. The ALLS estimates do not (and are not intended to) match estimates for the total Australian resident population obtained from other sources (which include people and households living in non-private dwellings, such as hotels and boarding houses, and in very remote parts of Australia).


Estimation

28 In order to minimise respondent burden, not all literacy domains were directly assessed for each respondent. The ALLS used a matrix-sampling design to assign assessment booklets to individuals, so that a comprehensive picture of literacy achievement across the country could be assembled from the components completed by each individual. The ALLS relied on Item Response Theory scaling to combine the individual responses into accurate estimates of literacy achievement in the population. With this approach, however, aggregations of individuals' scores can lead to biased estimates of population characteristics. To address this, the ALLS scaling procedures also used a multiple imputation or "plausible scores" methodology to obtain proficiency scores in literacy for all individuals, even though each individual responded to only a part of the assessment item pool. Using all available data for each respondent, five "plausible scores" were generated for each of the five domains measured.

29 All literacy estimates presented in this report are obtained by taking the average of the five weighted estimates from each of the "plausible" scores. More information on measuring literacy is covered in the Adult Literacy and Life Skills Survey: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.
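The estimation rule in paragraph 29 (compute the weighted estimate under each plausible score, then average the five results) can be sketched as follows; all scores and weights are invented for illustration:

```python
# Sketch of "plausible scores" estimation: a weighted estimate is computed
# separately from each of the five plausible scores, then averaged.

def weighted_mean(scores, weights):
    """Weighted mean of one plausible-score draw across respondents."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

weights = [600.0, 450.0, 500.0]  # invented person weights
plausible = [                    # rows = plausible-score draws, cols = respondents
    [270.0, 310.0, 250.0],
    [265.0, 305.0, 255.0],
    [272.0, 315.0, 248.0],
    [268.0, 308.0, 252.0],
    [271.0, 312.0, 249.0],
]

estimates = [weighted_mean(pv, weights) for pv in plausible]
final_estimate = sum(estimates) / len(estimates)  # published figure
```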


RELIABILITY OF ESTIMATES

30 All sample surveys are subject to error which can be broadly categorised as either sampling error or non-sampling error. Sampling error occurs because only a small proportion of the total population is used to produce estimates that represent the whole population. Sampling error can be reliably measured as it is calculated based on the scientific methods used to design surveys. Non-sampling errors occur when survey processes work less effectively than intended. For example, some people selected for the survey may not respond (non-response); some survey questions may not be clearly understood by the respondent; and occasionally errors can be made in processing data from the survey.


Non-Sampling error

31 One of the main sources of non-sampling error is non-response by people selected in the survey. Non-response can affect the reliability of results and can introduce bias. The magnitude of any bias depends upon the level of non-response and the extent of the difference between the characteristics of those people who responded to the survey and those who did not.

32 To reduce the level and impact of non-response, the following methods were adopted in this survey:
  • face-to-face interviews with respondents
  • the use of interviewers who could speak languages other than English where necessary
  • follow-up of respondents if there was initially no response
  • ensuring the weighted file was representative of the population by aligning the estimates with population benchmarks.

33 Of the dwellings selected in the 2006 ALLS, 19.3% did not respond fully or adequately.

34 Every effort was made to minimise other non-sampling errors by careful design and testing of questionnaires, intensive training and supervision of interviewers, and extensive editing and quality control procedures at all stages of data processing.

35 An advantage of the CAI technology used in conducting interviews for this survey is that it potentially reduces non-sampling errors by enabling edits to be applied as the data are being collected. The interviewer is alerted immediately if information entered into the computer is either outside the permitted range for that question, or contradictory to information previously recorded during the interview. These edits allow the interviewer to query respondents and resolve issues during the interview. CAI sequencing of questions is also automated such that respondents are asked only relevant questions and only in the appropriate sequence, eliminating interviewer sequencing errors.

36 In contrast to most other ABS surveys, the ALLS estimates also include significant imputation variability, due to the use of multiple possible MTB questionnaires and the complex literacy scaling procedures. The effect of the plausible scoring methodology on the estimation can be reliably estimated and is included in the calculated RSEs. This is covered in more detail in the Technical Note.


Sampling error

37 Sampling error is the difference between the published estimates, derived from a sample of people, and the value that would have been produced if all people in scope of the survey had been included. For more information refer to the Technical Note. Sampling error is measured for this survey by relative standard errors (RSEs). In this report, estimates with RSEs of 25% to 50% are preceded by an asterisk (e.g. *3.4) to indicate that the estimate should be used with caution. Estimates with RSEs over 50% are indicated by a double asterisk (e.g. **0.6) and should be considered unreliable for most purposes.
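The annotation convention described above can be expressed as a small helper function; this is a hypothetical sketch, not part of any ABS system:

```python
# Flag an estimate according to the RSE convention used in this report:
# RSE 25% to 50% -> single asterisk; RSE over 50% -> double asterisk.

def flag_estimate(value: float, rse_pct: float) -> str:
    if rse_pct > 50:
        return f"**{value}"   # considered unreliable for most purposes
    if rse_pct >= 25:
        return f"*{value}"    # should be used with caution
    return str(value)         # no flag required
```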


SEASONAL EFFECTS

38 The estimates in this report are based on information collected from June 2006 to January 2007, and due to seasonal effects they may not be fully representative of other time periods in the year. For example, the ALLS asked standard ABS questions on labour force status to determine whether a person was employed. Employment is subject to seasonal variation through the year. Therefore, the ALLS results for employment could have differed if the ALLS had been conducted over the whole year or in a different part of the year.


INTERPRETATION OF RESULTS

39 Care has been taken to ensure that the results of this survey are as accurate as possible. All interviews were conducted by trained ABS officers. Extensive reference material was developed for use in the field enumeration and intensive training was provided to interviewers in both classroom and on-the-job environments. There remain, however, other factors which may have affected the reliability of results, and for which no specific adjustments can be made. The following factors should be considered when interpreting these estimates:
  • Information recorded in this survey is essentially 'as reported' by respondents, and hence may differ from information available from other sources or collected using different methodologies. Responses may be affected by imperfect recall or individual interpretation of survey questions.
  • Some respondents may have provided responses that they felt were expected, rather than those that accurately reflected their own situation. Every effort has been made to minimise such bias through the development and use of culturally appropriate survey methodology.

40 Further information on the interpretation of results is contained in the Adult Literacy and Life Skills Survey: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.

41 For a number of the ALLS data items, some respondents were unwilling or unable to provide the required information. Where responses for a particular data item were missing for a person or household they were recorded in a 'not known' or 'not stated' category for that data item. These 'not known' or 'not stated' categories are not explicitly shown in the report tables, but have been included in the totals. Report tables presenting proportions have included any 'not known' or 'not stated' categories in the calculation of these proportions.


CLASSIFICATIONS AND STANDARDS

42 The ALLS used the following classifications:
43 More information on the concepts collected in the ALLS and related international resources is available in the Adult Literacy and Life Skills, Australia: User Guide (cat. no. 4228.0.55.002) available on the ABS website <www.abs.gov.au>.


COMPARABILITY WITH 1996 SURVEY OF ASPECTS OF LITERACY

44 Since the health literacy domain is a new addition to the 2006 ALLS, no time series information is available. However there is time series information for the prose and document domains. See Adult Literacy and Life Skills Survey, Summary Results, Australia, 2006 (cat. no. 4228.0) for more details.


RELATED PRODUCTS AND SERVICES

45 Below is information describing the range of data made available from the 2006 Adult Literacy and Life Skills Survey, both in published form and on request. Products available on the ABS website <www.abs.gov.au> are indicated accordingly.


Adult Literacy and Life Skills Survey: Summary Results, Australia, 2006 publication and datacubes

46 A publication summarising results from the Adult Literacy and Life Skills Survey (cat. no. 4228.0) can be found on the ABS website <www.abs.gov.au>, in PDF format. The results from the ALLS assist in answering the question of whether Australians' literacy skills are adequate for the challenges they face in work and daily life.

47 An electronic version of the tables released in this publication is available on the ABS website <www.abs.gov.au>, in spreadsheet format. The spreadsheet presents RSEs relating to estimates and/or proportions for each report table.


Adult Literacy and Life Skills Survey: User Guide

48 The ALLS User Guide has been released in conjunction with the above summary results publication. It provides detailed information about the survey content, methodology and data interpretation. It also contains the list of ALLS data items. The User Guide (cat. no. 4228.0.55.002) is available free-of-charge on the ABS website <www.abs.gov.au>.


State/territory tabulations

49 A set of tables in spreadsheet format, equivalent to those in Adult Literacy and Life Skills Survey, Summary Results, Australia, 2006 (cat. no. 4228.0), has been produced for each state and territory (subject to standard error constraints, and excluding time series and international comparative tables) and for Australia. These tables are available from the ABS website <www.abs.gov.au> (as datacubes to cat. no. 4228.0).


Microdata

50 For users who wish to undertake more detailed analysis of the survey data, microdata from the 2006 ALLS are available in the form of two confidentialised unit record files (CURFs): the basic CURF (Adult Literacy and Life Skills Survey: Basic Confidentialised Unit Record File, cat. no. 4228.0.30.001) and the expanded CURF (Adult Literacy and Life Skills Survey: Expanded Confidentialised Unit Record File, cat. no. 4228.0.30.002). The expanded CURF contains more detail than the basic CURF and is only available via the Remote Access Data Laboratory (RADL), a secure Internet-based data query service. The basic CURF is available via CD-ROM or the RADL.

51 Up-to-date information on the ABS RADL service, including information on pricing, 'Applications & Undertakings', and a training manual outlining obligations and responsibilities when accessing ABS microdata, is available on the ABS website <www.abs.gov.au>. Those wishing to access the 2006 ALLS microdata should contact the ABS, referring to the contact details noted at the front of this publication.

52 Technical information describing the content and use of the basic and expanded ALLS CURFs is available in the Technical Manual: Adult Literacy and Life Skills Survey, Australia: Confidentialised Unit Record File (cat. no. 4228.0.55.003). Those wishing to access ALLS microdata should contact the ABS, referring to the contact details noted at the front of this publication.


Data available on request

53 Special tabulations of ALLS data are available on request and for a fee. Subject to confidentiality and sampling variability constraints, tabulations can be produced from the survey incorporating data items, populations and geographic areas selected to meet individual requirements. These can be provided in printed or electronic form. Please refer to the contact details noted at the front of this publication.