Qualifications and work methodology

Reference period
2018-19 financial year
Released
29/09/2020

Overview

The Qualifications and Work publication presents detailed information on people's educational qualifications (including multiple and incomplete qualifications) and the relevance of completed qualifications to their current jobs. With data on the level and field of up to five qualifications, year of completion, cultural background, whether qualifications were attained in Australia, demographic characteristics, labour force participation and income, it provides a rich contextual background for understanding interplays between education and work.

How the data is collected

Scope

The scope of the survey was restricted to people aged 15 years and over who were usual residents of private dwellings, and excluded:

  • members of the Australian permanent defence forces
  • certain diplomatic personnel of overseas governments, customarily excluded from Census and estimated resident population counts
  • overseas residents in Australia
  • members of non-Australian defence forces (and their dependants)
  • persons living in non-private dwellings such as hotels, university residences, boarding schools, hospitals, nursing homes, homes for people with disabilities, and prisons
  • persons resident in the Indigenous Community Strata (ICS).

The scope of the Multipurpose Household Survey (MPHS) included households residing in urban, rural, remote and very remote parts of Australia, except those in the ICS.

Coverage

In the Labour Force Survey (LFS), rules are applied which aim to ensure that each person in coverage is associated with only one dwelling, and hence has only one chance of selection in the survey. See Labour Force, Australia (cat. no. 6202.0) for more detail.

Data collection

The survey is one of a number of small, self-contained topics on the MPHS.

Each month, one eighth of the dwellings in the LFS sample were rotated out of the survey and selected for the MPHS. After the LFS had been fully completed for each person in scope and coverage, a usual resident aged 15 years or over was selected at random (based on a computer algorithm) and asked the additional MPHS questions in a personal interview. 

In the MPHS, if the randomly selected person was aged 15 to 17 years, permission was sought from a parent or guardian before conducting the interview. If permission was not given, the parent or guardian was asked the questions on behalf of the 15 to 17 year old (proxy interview).
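As a rough illustration of this selection step only (the ABS's actual computer algorithm and data structures are not published here, and the household records below are hypothetical), the logic can be sketched as:

```python
import random

def select_mphs_respondent(household, rng=None):
    """Illustrative sketch: pick one usual resident aged 15+ at random from a dwelling.

    `household` is assumed to be a list of dicts with 'age' and (optionally)
    'parental_permission' keys; these names are hypothetical, not ABS data items.
    """
    rng = rng or random.Random()
    eligible = [person for person in household if person["age"] >= 15]
    if not eligible:
        return None  # no in-scope resident in this dwelling
    selected = dict(rng.choice(eligible))
    # For 15-17 year olds, parental permission is sought; if it is refused,
    # a parent or guardian answers on their behalf (proxy interview).
    selected["proxy_interview"] = (
        15 <= selected["age"] <= 17
        and not selected.get("parental_permission", True)
    )
    return selected

print(select_mphs_respondent([{"age": 44}, {"age": 16, "parental_permission": False}]))
```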

Data were collected using Computer Assisted Interviewing (CAI), whereby responses were recorded directly onto an electronic questionnaire in a notebook computer, with interviews conducted either face-to-face or over the telephone. The majority of interviews were conducted over the telephone.

Sample size

After taking into account sample loss, the response rate for the 2018-19 survey was 71.8%. In total, information was collected from 28,719 fully responding persons. This included 477 proxy interviews for people aged 15 to 17 years, where a parent or guardian did not give permission for a personal interview.

How the data is processed

Weighting

Weighting is the process of adjusting results from a sample survey to infer results for the total 'in-scope' population. To do this, a 'weight' is allocated to each enumerated person. The weight is a value which indicates the number of persons in the population represented by the sample person.

The first step in calculating weights for each unit is to assign an initial weight, which is the inverse of the probability of being selected in the survey. For example, if the probability of a person being selected in the survey was 1 in 600, then the person would have an initial weight of 600 (that is, they represent 600 people).
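The same arithmetic as the worked example above, expressed as a minimal sketch (the selection probability is the hypothetical figure from the example, not a survey parameter):

```python
# Hypothetical illustration of the initial weight: selection probability of 1 in 600.
selection_probability = 1 / 600
initial_weight = 1 / selection_probability  # = 600.0
print(initial_weight)  # each sampled person represents 600 people in the population
```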

    Benchmarks

    The initial weights were calibrated to align with independent estimates of the population of interest, referred to as 'benchmarks'. Weights calibrated against population benchmarks ensure that the survey estimates conform to the independently estimated distribution of the population rather than the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over or under-enumeration of particular categories of persons/households which may occur due to either the random nature of sampling or non-response.

    The survey was benchmarked to the Estimated Resident Population (ERP) living in private dwellings in each state and territory at December 2018. People living in Indigenous communities were excluded. These benchmarks are based on the 2016 Census.
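A minimal sketch of the idea behind calibration, using simple post-stratification to a single hypothetical state benchmark (the ABS calibration is more sophisticated and uses several benchmark dimensions; the record structure and figures below are assumptions for illustration):

```python
from collections import defaultdict

def poststratify(records, benchmarks):
    """Scale each record's weight so that weights sum to the benchmark for its state.

    `records` is a list of dicts with 'state' and 'weight' keys; `benchmarks` maps
    each state/territory to an independent population estimate. Both are hypothetical.
    """
    weight_sums = defaultdict(float)
    for r in records:
        weight_sums[r["state"]] += r["weight"]
    for r in records:
        r["weight"] *= benchmarks[r["state"]] / weight_sums[r["state"]]
    return records

sample = [{"state": "NSW", "weight": 600.0}, {"state": "NSW", "weight": 550.0}]
calibrated = poststratify(sample, {"NSW": 6_500_000})  # weights now sum to 6,500,000
```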

While LFS benchmarks are revised every five years to take into account the rebasing of the ERP following the latest Census, the benchmarks used for the supplementary surveys and the MPHS (from which the statistics in this publication are taken) are not. Small differences will therefore exist between estimates of the civilian population aged 15 years and over from the LFS and those from other household labour surveys, as well as over time. If comparisons are being made over time, proportions should be used rather than estimates of persons.

      Estimation

      Survey estimates of counts of persons are obtained by summing the weights of persons with the characteristic of interest.
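For example, an estimate of the number of persons with a given characteristic is the sum of the weights of the records that have it. A minimal sketch, with hypothetical weighted records and an illustrative (not actual) data item name:

```python
def estimate_count(records, has_characteristic):
    """Sum the weights of persons with the characteristic of interest."""
    return sum(r["weight"] for r in records if has_characteristic(r))

# Hypothetical weighted records; 'highest_attainment' is an illustrative field name.
records = [
    {"weight": 612.4, "highest_attainment": "Bachelor degree"},
    {"weight": 587.9, "highest_attainment": "Certificate III"},
    {"weight": 601.1, "highest_attainment": "Bachelor degree"},
]
print(estimate_count(records, lambda r: r["highest_attainment"] == "Bachelor degree"))
```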

        Confidentiality

        To minimise the risk of identifying individuals in aggregate statistics, a technique is used to randomly adjust cell values. This technique is called perturbation. Perturbation involves a small random adjustment of the statistics and is considered the most satisfactory technique for avoiding the release of identifiable statistics while maximising the range of information that can be released. These adjustments have a negligible impact on the underlying pattern of the statistics. After perturbation, a given published cell value will be consistent across all tables. However, adding up cell values to derive a total will not necessarily give the same result as published totals. The introduction of perturbation in publications ensures that these statistics are consistent with statistics released via services such as TableBuilder.
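The ABS's actual perturbation algorithm is not described in this publication. The sketch below only illustrates the consistency property mentioned above: because the adjustment is derived from the cell's identity, the same cell receives the same small adjustment wherever it appears, while perturbed cells no longer add exactly to perturbed totals. The hashing scheme and adjustment size are assumptions made for the illustration.

```python
import hashlib

def perturb(cell_key: str, value: float, max_adjustment: int = 2) -> float:
    """Illustrative only: derive a repeatable small adjustment from the cell's identity."""
    digest = hashlib.sha256(cell_key.encode()).digest()
    adjustment = digest[0] % (2 * max_adjustment + 1) - max_adjustment  # in [-2, 2]
    return max(value + adjustment, 0)

# The same cell key always yields the same perturbed value, in any table it appears in.
print(perturb("NSW|Bachelor degree|Employed", 152_300))
```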

          Reliability of estimates

          All sample surveys are subject to error which can be broadly categorised as either sampling error or non-sampling error. For more information refer to Accuracy.

            Classifications

            Country of birth

            Industry

            Language

            Occupation

            Socio-economic Indexes for Areas (SEIFA)

            Key education concepts

            Australian Standard Classification of Education (ASCED)

            Education data are coded to the Australian Standard Classification of Education (ASCED), 2001 (cat. no. 1272.0). The ASCED is a national standard classification which can be applied to all sectors of the Australian education system including schools, vocational education and training and higher education. The ASCED comprises two classifications: Level of Education and Field of Education.

            Level of Education is defined as a function of the quality and quantity of learning involved in an educational activity. There are nine broad levels, 15 narrow levels and 64 detailed levels. For definitions of these levels see the Australian Standard Classification of Education, 2001 (cat. no. 1272.0).

            Field of Education is defined as the subject matter of an educational activity. Fields of education are related to each other through the similarity of subject matter, through the broad purpose for which the education is undertaken, and through the theoretical content which underpins the subject matter. There are 12 broad fields, 71 narrow fields and 356 detailed fields. For detailed definitions of these fields see the Australian Standard Classification of Education (ASCED), 2001 (cat. no. 1272.0). 

            Level of highest educational attainment

            Level of highest educational attainment was derived from information on highest year of school completed and level of highest non-school qualification. The derivation process determines which of the 'non-school' or 'school' attainments will be regarded as the highest. Usually the higher ranking attainment is self-evident, but in some cases some secondary education is regarded, for the purposes of obtaining a single measure, as higher than some certificate level attainments.

            The following decision table is used to determine which of the responses to questions on highest year of school completed (coded to ASCED Broad Level 6) and level of highest non-school qualification (coded to ASCED Broad Level 5) is regarded as the highest. It is emphasised that this table was designed for the purpose of obtaining a single value for level of highest educational attainment and is not intended to convey any other ordinality. 

            Decision table: Level of highest educational attainment

Columns show the level of highest non-school qualification; rows show the highest year of school completed.

Highest year of school completed | Cert IV | Cert III | Cert III & IV n.f.d. | Cert II | Cert I | Cert I & II n.f.d. | Cert n.f.d. | Inadequately described / L.n.d. | Not stated
Year 12 | Cert IV | Cert III | Cert III & IV n.f.d. | Year 12 | Year 12 | Year 12 | L.n.d. | L.n.d. | N.S.
Year 11 | Cert IV | Cert III | Cert III & IV n.f.d. | Year 11 | Year 11 | Year 11 | L.n.d. | L.n.d. | N.S.
Senior Sec. Education n.f.d. | Cert IV | Cert III | Cert III & IV n.f.d. | Senior Sec. n.f.d. | Senior Sec. n.f.d. | Senior Sec. n.f.d. | L.n.d. | L.n.d. | N.S.
Year 10 | Cert IV | Cert III | Cert III & IV n.f.d. | Year 10 | Year 10 | Year 10 | L.n.d. | L.n.d. | N.S.
Year 9 and below | Cert IV | Cert III | Cert III & IV n.f.d. | Cert II | Cert I | Cert I & II n.f.d. | Cert n.f.d. | L.n.d. | N.S.
Sec. Education n.f.d. | Cert IV | Cert III | Cert III & IV n.f.d. | L.n.d. | L.n.d. | L.n.d. | L.n.d. | L.n.d. | N.S.
Junior Sec. Education n.f.d. | Cert IV | Cert III | Cert III & IV n.f.d. | L.n.d. | L.n.d. | L.n.d. | L.n.d. | L.n.d. | N.S.
Not stated | Cert IV | Cert III | Cert III & IV n.f.d. | N.S. | N.S. | N.S. | N.S. | N.S. | N.S.
Never attended school | Cert IV | Cert III | Cert III & IV n.f.d. | Cert II | Cert I | Cert I & II n.f.d. | Cert n.f.d. | L.n.d. | N.S.

            Cert = Certificate
            L.n.d. = Level not determined
            n.f.d. = not further defined
            N.S. = Not Stated
            Sec. = Secondary
             

The decision table is also used to rank the information provided in a survey about the qualifications and attainments of a single individual. It does not represent any basis for comparison between differing qualifications. For example, a person whose highest year of school completed was Year 12, and whose level of highest non-school qualification was a Certificate III, would have those responses crosschecked on the decision table and would as a result have their level of highest educational attainment output as Certificate III. However, if the same person answered 'certificate' to the highest non-school qualification question, without any further detail, it would be crosschecked against Year 12 on the decision table as Level not determined. The decision table, therefore, does not necessarily imply that one qualification is 'higher' than the other. For more information, see Education Variables, June 2014 (cat. no. 1246.0).
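In code, applying the decision table amounts to a simple lookup keyed on the two responses. The sketch below encodes only a few cells of the Year 12 row as an example, using the abbreviated labels from the table above; the data structure is an illustration, not the ABS derivation routine.

```python
# Partial, illustrative encoding of the decision table:
# (highest year of school, highest non-school qualification) -> highest attainment.
DECISION_TABLE = {
    ("Year 12", "Cert IV"): "Cert IV",
    ("Year 12", "Cert III"): "Cert III",
    ("Year 12", "Cert II"): "Year 12",
    ("Year 12", "Cert n.f.d."): "L.n.d.",
    # ... remaining cells of the published decision table would be encoded the same way
}

def highest_educational_attainment(school: str, non_school: str) -> str:
    return DECISION_TABLE[(school, non_school)]

print(highest_educational_attainment("Year 12", "Cert III"))  # -> "Cert III"
```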

            Comparing the data

            Comparability of Time Series

In 2010-11, the Learning and Work survey was conducted on half of the MPHS sample. In 2015, the Qualifications and Work topic was run on the full MPHS sample, resulting in approximately twice the sample size of 2010-11.

In 2018-19 and 2010-11 the data were collected from July to June, whereas in 2015 the enumeration period ran from January to December. All surveys had a continuous 12-month collection period.

There was a change to the way qualification details were collected in 2018-19. Selected respondents were asked directly whether they had any qualifications, whereas in previous surveys they were asked to confirm qualification details collected from any responsible adult on behalf of the household. Due to this change, comparisons with previous surveys should be made with caution.

              Comparability to monthly LFS Statistics

Since the survey is conducted as a supplement to the LFS, data items collected in the LFS are also available in this publication. However, there are some important differences between the two surveys. The LFS had a response rate of over 90%, compared with the MPHS response rate of 71.8%. The scope of the Qualifications and Work Survey (Q&W) and the LFS also differ (refer to the Scope and Coverage sections above). Because of the differences between the samples, data from this survey and the LFS are weighted separately. Differences may therefore be found in the estimates for data items collected in the LFS and published as part of Q&W.

              Comparability with other ABS surveys

Some comparisons can be made with other selected education and training publications. Care should be taken when comparing data from different surveys due to differences in scope, definitions and methodologies. When interpreting the results of this survey, consideration should be given to the representativeness of the sample. This is affected by the response rate (which is generally lower for surveys conducted as personal interviews) and by the fact that the survey covers only people living in private dwellings.

              Additionally, estimates from Q&W may differ from the estimates for the same or similar data items produced from other ABS collections for several reasons. For example, all sample surveys are subject to different sampling errors so users should take account of the relative standard errors (RSEs) on estimates where comparisons are made. Differences also exist in scope and/or coverage, reference periods reflecting seasonal variations, non-seasonal events that may have impacted on one period but not another, or because of underlying trends in the phenomena being measured.

              Differences can occur as a result of using different collection methodologies. This is often evident in comparisons of similar data items reported from different ABS collections where, after taking account of definition and scope differences and sampling error, residual differences remain. These differences are often the result of the mode of the collections, such as whether data are collected by an interviewer or self-enumerated by the respondent and whether the data are collected from the person themselves or from a proxy respondent. Differences may also result from the context in which questions are asked, i.e. where in the interview the questions are asked and the nature of preceding questions. The impacts on data of different collection methodologies are difficult to quantify. As a result, every effort is made to minimise such differences.

              Survey of Education and Work

              The Survey of Education and Work (SEW) (cat. no. 6227.0) has some similarities with Q&W. Conducted annually, SEW provides a range of indicators about educational participation and attainment, and data on people's transition between education and work. Comparison of SEW and Q&W data should be undertaken with caution due to different collection methodologies, scope and sample size. SEW is based on a household interview with any responsible adult who responds on behalf of all persons aged 15-74 years in the household. Q&W is conducted as a personal interview with one randomly selected person, aged 15 years or over, in the household. As such, the Q&W survey has a smaller sample size.

              Recent Migrants

Care should be taken when comparing with other migrant collections due to possible under-coverage. Coverage is always a greater challenge for recent migrants than for longer-term migrants and Australian-born people, for reasons including: age, language and English proficiency, a propensity to live in secure apartment buildings or in temporary accommodation, and increased difficulty in making contact due to study and/or work commitments during peak contact periods.

The calibration to population benchmarks helps to compensate for over or under-enumeration of particular categories of persons/households; however, country of birth and year of arrival are not used in this benchmarking process.

              How the data is released

              Datacubes/spreadsheets 

              Data Cubes containing all tables for this publication in Excel spreadsheet format are available in the Data downloads. The spreadsheets present tables of estimates and proportions, and their corresponding relative standard errors (RSEs) and/or Margins of Error (MoEs).
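As a rough guide to how these precision measures relate (the 95% confidence level and z-score of 1.96 are the conventional assumptions, and the figures below are hypothetical rather than drawn from the tables):

```python
def standard_error(estimate: float, rse_percent: float) -> float:
    """Recover the standard error from an estimate and its relative standard error (%)."""
    return estimate * rse_percent / 100

def margin_of_error(se: float, z: float = 1.96) -> float:
    """Margin of error, assuming the conventional 95% confidence level (z = 1.96)."""
    return z * se

estimate = 250_000   # hypothetical estimate of persons
rse = 4.0            # hypothetical RSE of 4%
se = standard_error(estimate, rse)
print(se, margin_of_error(se))  # 10000.0 19600.0
```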

              As well as the statistics included in this and related publications, the ABS may be able to provide other relevant data on request. Subject to confidentiality and sampling variability constraints, tables can be tailored to individual requirements for a fee. A list of data items from this survey is available from the Downloads tab. All enquiries should be made to the National Information and Referral Service on 1300 135 070, or email client.services@abs.gov.au 

              TableBuilder

For users who wish to undertake more detailed analysis of the data, the survey microdata will be released through the TableBuilder product. Microdata can be used by approved users to produce customised tables and analysis from the survey data. Microdata products are designed to ensure the integrity of the data whilst maintaining the confidentiality of the respondents to the survey. More information can be found at How to Apply for Microdata.

              DataLab

Detailed microdata will be available in the DataLab for approved users who need to undertake interactive (real time) complex analysis of microdata in the secure ABS environment. For more details, refer to About the DataLab.

              Accuracy

              Reliability of the estimates

The estimates in this publication are based on information obtained from a sample survey. Any data collection may encounter factors, known as non-sampling error, which can impact on the reliability of the resulting statistics. In addition, estimates based on sample surveys are subject to sampling variability; that is, they may differ from those that would have been produced had all persons in the population been included in the survey. This is known as sampling error.

              Two types of error are possible in estimates based on a sample survey:

              • non-sampling error
              • sampling error


              Glossary


              Abbreviations

ABS: Australian Bureau of Statistics
ABSCQ: Australian Bureau of Statistics Classification of Qualifications
ANZSCO: Australian and New Zealand Standard Classification of Occupations
ANZSIC: Australian and New Zealand Standard Industrial Classification
ASCED: Australian Standard Classification of Education
ASGS: Australian Statistical Geography Standard
ASCL: Australian Standard Classification of Languages
GCCSA: Greater Capital City Statistical Areas
ISCED: International Standard Classification of Education
LFS: Labour Force Survey
MOE: Margin of Error
n.f.d.: not further defined
Q&W: Qualifications and Work
RSE: relative standard error
SACC: Standard Australian Classification of Countries
SE: standard error
SEIFA: Socio-economic Indexes for Areas
SEW: Survey of Education and Work
TB: TableBuilder