EXPLANATORY NOTES
4 As in 2012–13, the 2013–14 scope for MPHS included households residing in urban, rural, remote and very remote parts of Australia, except the ICS.

5 In the LFS, rules are applied which aim to ensure that each person in coverage is associated with only one dwelling, and hence has only one chance of selection in the survey. See Labour Force, Australia (cat. no. 6202.0) for more detail.

6 The scope of the HSS matches the scope of the MPHS.

7 The coverage of the HSS was the same as the scope, except that persons living in Indigenous Communities in non-very remote areas and persons living in very remote areas were not covered, for operational reasons.

DATA COLLECTION

8 For the MPHS, ABS interviewers conducted personal interviews during the 2013–14 financial year for the monthly LFS. Each month, one eighth of the dwellings in the LFS sample were rotated out of the survey, and these dwellings were selected for the MPHS. In these dwellings, after the LFS had been fully completed for each person in scope and coverage, a usual resident aged 15 years or over was selected at random (based on a computer algorithm) and asked the additional MPHS questions in a personal interview. The publication Labour Force, Australia (cat. no. 6202.0) contains information about survey and sample design, scope, coverage and population benchmarks relevant to the monthly LFS, and consequently the MPHS. This publication also contains definitions of demographic and labour force characteristics, and information about telephone interviewing.

9 For the HSS, ABS interviewers conducted personal interviews between September 2013 and December 2013. The sample for the HSS was selected in particular Medicare Local catchments to improve estimates at this level. In these dwellings, up to two usual residents aged 15 years or over were selected at random (based on a computer algorithm) and asked the Patient Experience questions in a personal interview.
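The random selection step above can be sketched as a minimal example. The actual ABS computer algorithm is not published, so the function below is purely illustrative: it chooses one usual resident aged 15 years or over for the MPHS, or up to two for the HSS, uniformly at random.

```python
import random

def select_respondents(residents, max_selected):
    """Randomly select up to max_selected usual residents aged 15 years
    or over from a dwelling (illustrative only; the actual ABS selection
    algorithm is not published)."""
    eligible = [r for r in residents if r["age"] >= 15]
    k = min(max_selected, len(eligible))
    return random.sample(eligible, k)

# Hypothetical household with two in-scope residents and one child
household = [{"name": "A", "age": 42},
             {"name": "B", "age": 16},
             {"name": "C", "age": 9}]

mphs_person = select_respondents(household, max_selected=1)  # one person (MPHS)
hss_people = select_respondents(household, max_selected=2)   # up to two people (HSS)
```

The `max_selected` parameter captures the one-versus-two difference between the surveys; everything else (names, ages) is invented for the example.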
10 In the MPHS and the HSS, if the randomly selected person was aged 15 to 17 years, permission was sought from a parent or guardian before conducting the interview. If permission was not given, the parent or guardian was asked the questions on behalf of the 15 to 17 year old (proxy interview).

11 Data were collected for both the MPHS and the HSS using Computer Assisted Interviewing (CAI), whereby responses were recorded directly onto an electronic questionnaire in a notebook computer, with interviews conducted either face-to-face or over the telephone. The majority of MPHS interviews were conducted over the telephone, while a larger proportion of HSS interviews were conducted face-to-face.

SAMPLE SIZE

12 After taking into account sample loss, the response rate for the Patient Experience Survey was 77%. In total, information was collected from 27,327 fully responding persons. This includes 445 proxy interviews for people aged 15 to 17 years, where permission was not given by a parent or guardian for a personal interview.

13 After taking into account sample loss, the response rate for the Health Services Survey was 83%. In total, information was collected from 8,541 fully responding persons. This includes 184 proxy interviews for people aged 15 to 17 years, where permission was not given by a parent or guardian for a personal interview.

WEIGHTING, BENCHMARKS AND ESTIMATION

Weighting

14 Weighting is the process of adjusting results from a sample survey to infer results for the total 'in scope' population. To do this, a 'weight' is allocated to each enumerated person. The weight is a value which indicates the number of persons in the population represented by the sample person.

15 The first step in calculating weights for each unit is to assign an initial weight, which is the inverse of the probability of being selected in the survey.
For example, if the probability of a person being selected in the survey was 1 in 600, then the person would have an initial weight of 600 (that is, they represent 600 people).

16 Initial weights were calculated separately for the MPHS and the HSS, as the probability of selection differed between the two surveys. The initial weights were then calibrated to align with independent estimates of the population. The MPHS sample was calibrated to one set of benchmarks; the HSS sample was calibrated to a different set of benchmarks which included only those Medicare Local catchments in scope of the HSS.

17 The weights from the first calibrations were adjusted by the Medicare Local sample ratio for each survey, and the data from the two surveys were then combined. The adjusted weights were then calibrated to align with benchmarks. The population included in the benchmarks is the estimated civilian population aged 15 years and over living in private dwellings in each State and Territory, excluding persons out of scope. This calibration process ensures that the weighted data conform to the independently estimated distribution of the population described by the benchmarks, rather than to the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over- or under-enumeration of particular categories of persons, which may occur due to either the random nature of sampling or non-response.

18 The survey was calibrated to the estimated resident population (ERP) in each state and territory at December 2013.

19 Every five years, the ERP series are revised to incorporate additional information available from the latest Census of Population and Housing (Census). The benchmarks used for the 2013–14 iteration are based on the 2011 Census, whereas the benchmarks for previous iterations were based on the 2006 Census.

Estimation

20 Survey estimates of counts of persons are obtained by summing the weights of persons with the characteristic of interest.
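The initial weighting, calibration and estimation steps can be sketched under simplified assumptions. The ABS calibration uses multiple benchmark categories; the single-benchmark scaling below, and the sample data and the `saw_gp` characteristic, are invented purely to illustrate the arithmetic.

```python
def initial_weight(selection_probability):
    # The initial weight is the inverse of the probability of selection:
    # a 1-in-600 chance of selection gives an initial weight of 600.
    return 1.0 / selection_probability

# Hypothetical sample: each person has a selection probability and a flag
# for a characteristic of interest (e.g. saw a GP in the last 12 months).
sample = [
    {"p": 1 / 600, "saw_gp": True},
    {"p": 1 / 600, "saw_gp": False},
    {"p": 1 / 400, "saw_gp": True},
]

weights = [initial_weight(person["p"]) for person in sample]  # 600, 600, 400

# Calibrate so the weights sum to an independent population benchmark
# (a one-category stand-in for the multi-benchmark ABS calibration).
benchmark = 2000.0
factor = benchmark / sum(weights)
calibrated = [w * factor for w in weights]

# Estimate a count by summing the calibrated weights of persons
# with the characteristic of interest.
estimate = sum(w for w, person in zip(calibrated, sample) if person["saw_gp"])
```

After calibration the weighted sample reproduces the benchmark total exactly, which is what "conform to the independently estimated distribution of the population" means in the one-benchmark case.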
Estimates include data from both the MPHS and the HSS.

21 Analysis was conducted to determine whether there was any difference between the estimates which would have been obtained using the MPHS sample only and estimates obtained using the combined MPHS and HSS sample. This was particularly important given the predominantly different interview modes used in the two surveys. The analysis showed that combining the samples from the two surveys did not produce significantly different estimates. Estimates can therefore be compared over time with other iterations of the Patient Experience Survey, where only the MPHS sample was used.

Confidentiality

22 To minimise the risk of identifying individuals in aggregate statistics, a technique called perturbation is used to randomly adjust cell values. Perturbation involves a small random adjustment of the statistics and is considered the most satisfactory technique for avoiding the release of identifiable statistics while maximising the range of information that can be released. These adjustments have a negligible impact on the underlying pattern of the statistics. After perturbation, a given published cell value will be consistent across all tables. However, adding up cell values to derive a total will not necessarily give the same result as published totals.

23 Perturbation has only been applied to 2013–14 data. Data from previous cycles (2009 to 2012–13) have not been perturbed.

RELIABILITY OF ESTIMATES

24 All sample surveys are subject to error, which can be broadly categorised as either sampling error or non-sampling error.
25 Sampling error is the difference between the published estimate, derived from a sample of dwellings, and the value that would have been produced if all dwellings in scope of the survey had been included. For more information, refer to the Technical Note.

26 Non-sampling error may occur in any collection, whether it is based on a sample or on a full count of the population such as a census. Sources of non-sampling error include: non-response; errors in reporting by respondents or recording of answers by interviewers; and errors in coding and processing data. Every effort was made to reduce non-sampling error by: careful design and testing of the questionnaire; training and supervision of interviewers; follow-up of respondents; and extensive editing and quality control procedures at all stages of data processing.

DATA QUALITY

27 Information recorded in this survey is 'as reported' by respondents, and may differ from that which might be obtained from other sources or via other methodologies. This factor should be considered when interpreting the estimates in this publication.

28 Information was collected on respondents' perception of their health status and experiences with services. Perceptions are influenced by a number of factors and can change quickly. Care should therefore be taken when analysing or interpreting the data.

29 The definition of 'need' (in questions where respondents were asked whether they needed to use a particular health service) was left to the respondents' interpretation.

30 For some questions which called for personal opinions, such as self-assessed health or whether waiting times were felt to be unacceptable, responses from proxy interviews were not collected.

31 A small proportion of respondents were resident in areas with no Socio-economic Indexes for Areas (SEIFA) scores allocated. For the purposes of the Patient Experience Survey, these records have had a SEIFA decile imputed, based on the deciles of the surrounding areas.
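The SEIFA decile imputation just described might look like the following sketch. The ABS does not publish the exact rule, so the rounded median of neighbouring deciles used here is an assumption chosen purely for illustration.

```python
from statistics import median

def impute_seifa_decile(neighbour_deciles):
    """Impute a SEIFA decile for an area with no score allocated, from
    the deciles of surrounding areas (hypothetical rule: rounded median)."""
    if not neighbour_deciles:
        raise ValueError("no surrounding deciles available")
    return round(median(neighbour_deciles))

# A hypothetical unscored area surrounded by areas in deciles 3, 4 and 6
imputed = impute_seifa_decile([3, 4, 6])
```

Any central summary of the surrounding deciles would serve the same purpose; the point is only that the imputed value is derived from neighbouring areas rather than from the record itself.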
For information on SEIFA, see the Socio-economic Indexes for Areas (SEIFA) section below.

DATA COMPARABILITY

Comparability of Time Series

32 The ABS seeks to maximise consistency and comparability over time by minimising changes to surveys. Sound survey practice, however, requires ongoing development to maintain and improve the integrity of the data. The Patient Experience Survey, as a relatively new survey, is subject to a comparatively high level of change from this ongoing development. When comparing data from different cycles of the survey, users are advised to consult the questionnaires (available from the Downloads tab), check whether question wording or sequencing has changed, and consider whether this may have had an impact on the way questions were answered by respondents.

33 For some data items there was a change in the way that data was collected between 2012–13 and 2013–14; however, the ABS view is that these items are still able to be compared over time. These data items are:
34 For some data items there were changes in the way that data was collected between 2012–13 and 2013–14 which may affect comparability of data over time. These data items are:
35 The ABS recommends that, due to changes in data collection methodology, the following data items are not comparable with equivalent items in the 2012–13 cycle of the Patient Experience Survey:
36 The following data items were collected in 2012–13 but not in 2013–14:
37 The following data items were collected in 2013–14 but not in 2012–13:
38 As up to two people per household were selected for the HSS, compared with only one person per household in the MPHS, the collection of household income differed between the two surveys. For the MPHS, household income is collected by asking the selected person their personal income and then the income of the remaining people in the household; summing these provides the household income. For the HSS, household income is collected by asking each selected person their personal income and then asking one of the selected people to provide the income of the non-selected people in the household; summing these provides the household income.

39 The collection of labour force items differed between the two surveys. Fewer questions were asked in the HSS than in the MPHS to determine labour force data items. However, the ABS advises that the differences between the questions asked in the two surveys do not affect the comparability of the output items.

40 All data items shown in time series tables are comparable between the survey cycles presented.

Comparability with other ABS surveys

41 Caution should be taken when making comparisons across ABS surveys and with administrative by-product data that address the access and use of health services. Estimates from the Patient Experience Survey may differ from those obtained from other surveys (such as the Australian Health Survey, General Social Survey and Survey of Disability, Ageing and Carers) due to differences in survey mode, methodology and questionnaire design.

CLASSIFICATIONS

42 Remoteness areas are classified according to the Australian Statistical Geography Standard (ASGS): Volume 5 - Remoteness Structure, July 2011 (cat. no. 1270.0.55.005).

Socio-economic Indexes for Areas (SEIFA)

43 The 2013–14 survey uses the 2011 Socio-economic Indexes for Areas (SEIFA).

44 SEIFA is a suite of four summary measures that have been created from 2011 Census information. Each index summarises a different aspect of the socio-economic conditions of people living in an area.
The indexes provide more general measures of socio-economic status than is given by measures such as income or unemployment alone.

45 For each index, every geographic area in Australia is given a SEIFA number which shows how disadvantaged that area is compared with other areas in Australia.

46 The index used in the Patient Experience publication is the Index of Relative Socio-economic Disadvantage, derived from Census variables related to disadvantage such as low income, low educational attainment, unemployment, jobs in relatively unskilled occupations and dwellings without motor vehicles.

47 SEIFA uses a broad definition of relative socio-economic disadvantage in terms of people's access to material and social resources, and their ability to participate in society. While SEIFA represents an average of all people living in an area, it does not represent the individual situation of each person. Larger areas are more likely to have greater diversity of people and households.

48 For more detail, see the following:

PRODUCTS AND SERVICES

Data Cubes

49 Data Cubes containing all tables in Excel spreadsheet format can be found on the ABS website (from the Downloads tab). The spreadsheets present tables of estimates and proportions, and their corresponding relative standard errors (RSEs).

Customised data requests

50 Special tabulations of the data are available on request. Subject to confidentiality and sampling variability constraints, tabulations can be produced from the survey incorporating data items, populations and geographic areas (including state and territory level data), tailored to individual requirements. These are provided in electronic form. A list of data items from the 2013–14 Patient Experience Survey is available from the Downloads tab.

51 For further information about these and related statistics, contact the National Information and Referral Service on 1300 135 070, or email client.services@abs.gov.au.
The ABS Privacy Policy outlines how the ABS will handle any personal information that you provide to us.

ACKNOWLEDGEMENTS

52 ABS surveys draw extensively on information provided by individuals, businesses, governments and other organisations. Their continued cooperation is very much appreciated; without it, the wide range of statistics published by the ABS would not be available. Information received by the ABS is treated in strict confidence as required by the Census and Statistics Act 1905.

NEXT SURVEY

53 The Patient Experience Survey is conducted annually, with the next survey occurring in 2014–15. There are no plans to run the Health Services Survey again.