EXTERNAL OBSERVERS
Summary of results
Over the course of the observation program, external observers provided feedback on the coding decisions made by the Australian Bureau of Statistics (ABS) on 606,991 survey returns.
|  | Total survey returns (no.) | Survey returns observed (no.) | (%) | Both observers disagreed with ABS coding decision (no.) | (%) | One or both observers agreed with ABS coding decision (no.) | (%) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Automatically coded Yes or No survey returns | 12,090,959 | 311,318 | 2.6 | 1 | 0.0003 | 311,317 | 99.9997 |
| Manually coded Yes or No survey returns | 600,275 | 261,678 | 43.6 | 337 | 0.1 | 261,341 | 99.9 |
| Blank survey returns | 16,912 | 15,659 | 92.6 | 29 | 0.2 | 15,630 | 99.8 |
| Response Not Clear survey returns | 19,774 | 18,336 | 92.7 | 625 | 3.4 | 17,711 | 96.6 |
Rates of agreement with the ABS decision applied to an individual survey return ranged from 99.9997% for automatically coded forms to 96.6% for forms coded as Response Not Clear.
These rates of agreement indicate that, had the observers provided feedback on all 12,727,920 survey responses, they would have agreed with the ABS decision on 99.99% of forms processed, which equates to 9 survey return disagreements per electoral division.
There were comparable numbers of Yes and No disagreements: had all of the disagreements been tallied and the observers' recommendations adopted, the survey margin would have decreased by just 56 survey responses at the national level.
These results demonstrate that the accuracy of the survey return coding was very high, and that there was no bias in the capture of Yes or No responses.
There were additional survey returns observed where only one observer disagreed with the ABS coding decision; these instances are detailed below.
Establishment and processes
Establishment of the observer program
External observers were engaged to provide feedback on a key element of the survey process: the coding of returned paper survey forms as Yes, No or Response Not Clear.
Two Commonwealth parliamentary committees were established solely for the purposes of the survey: one representing the Yes position and one representing the No position. Each committee nominated around 60 external observers.
The role of observers was different to that of a scrutineer in an election. As the survey was conducted under the authority of the Census and Statistics Act 1905, the rights and obligations of scrutineers under the Commonwealth Electoral Act 1918, the Referendum (Machinery Provisions) Act 1984, or other legislation did not apply.
Survey data capture and coding processes
All paper survey returns were opened and scanned at one of five processing centres across Australia and the digital images of scanned forms were then processed at the processing centre in Sydney.
The digital images were processed by an automatic coding system. Survey forms with only a single response box marked, no amendments made to the survey question or survey answers, and no other marks made in the white spaces to the left and right of the response boxes, were automatically coded as Yes or No.
95% of all returned survey forms were automatically coded as Yes or No.
Forms that could not be automatically coded were manually coded by trained coders in the processing centre in Sydney.
Manual coding was informed by a set of coding guidelines which were developed in consultation with the two parliamentary committees and the Australian Electoral Commission.
4.7% of survey returns were manually coded as Yes or No, while a further 0.3% were manually coded as Blank or Response Not Clear.
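The coding flow described above can be pictured with a small sketch. This is illustrative only: the function, field names and structure are assumptions, not the ABS implementation, and the criteria are paraphrased from this section (a single response box marked, no amendments to the question or answers, and no other marks in the white spaces beside the boxes).

```python
# Illustrative sketch of the automatic-coding eligibility rules described above.
# Field names and structure are assumptions, not the ABS implementation.
from dataclasses import dataclass

@dataclass
class ScannedForm:
    yes_box_marked: bool        # a mark was detected in the Yes response box
    no_box_marked: bool         # a mark was detected in the No response box
    question_amended: bool      # amendments detected to the question or answers
    marks_in_white_space: bool  # marks detected beside the response boxes

def code_return(form: ScannedForm) -> str:
    """Code the return automatically if it meets the criteria; otherwise
    refer it to a trained coder applying the coding guidelines."""
    exactly_one_box = form.yes_box_marked != form.no_box_marked
    clean_form = not form.question_amended and not form.marks_in_white_space
    if exactly_one_box and clean_form:
        return "Yes" if form.yes_box_marked else "No"
    return "REFER TO MANUAL CODING"
```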
Observer process
Observers worked over a six-week period, between 29 September and 8 November 2017, at the processing centre in Sydney.
Observers from both committees were shown a sample of digital images of paper survey forms, along with the survey response as coded by the ABS for that form. Survey responses were coded as either Yes, No, Blank or Response Not Clear.
Where one or both observers disagreed with the decision made by the ABS, the disagreement was recorded, along with the observer's suggested response.
The sample of returns selected for observer review was designed to ensure that:
- nearly all returns coded as Blank or Response Not Clear were viewed by the observers
- a significant proportion of survey responses that were manually coded (coded by a human operator rather than the automatic coding system) were viewed by the observers
- an approximately equal number of returns coded as Yes and No were reviewed by observers, so that they would not have any indication of the overall survey result; a sketch of a selection rule consistent with these goals follows this list.
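As flagged above, the following is a minimal sketch of a selection rule consistent with those design goals. The sampling fraction and target counts are assumptions for illustration; the ABS has not published its exact selection method beyond the goals listed here and the observed totals reported below.

```python
import random
from collections import defaultdict

# Assumed figures for illustration only; the ABS published the design goals
# above and the observed totals, not the exact fractions or targets used.
BLANK_RNC_FRACTION = 0.95   # show nearly all Blank / Response Not Clear returns
TARGET_PER_RESPONSE = {
    ("Auto", "Yes"): 155_000, ("Auto", "No"): 155_000,      # equal Yes/No counts
    ("Manual", "Yes"): 130_000, ("Manual", "No"): 130_000,  # equal Yes/No counts
}

def select_for_review(coded_returns):
    """coded_returns: iterable of (form_id, category, coded_response) tuples,
    where category is 'Auto' or 'Manual' and coded_response is the ABS code.
    Returns the form ids to be shown to the observers."""
    selected, pools = [], defaultdict(list)
    for form_id, category, response in coded_returns:
        if response in ("Blank", "Response Not Clear"):
            # goal 1: nearly all Blank and Response Not Clear returns
            if random.random() < BLANK_RNC_FRACTION:
                selected.append(form_id)
        else:
            pools[(category, response)].append(form_id)
    # goals 2 and 3: a large share of manually coded returns, with equal
    # numbers of Yes and No returns so the overall result is not revealed
    for key, target in TARGET_PER_RESPONSE.items():
        pool = pools[key]
        selected.extend(random.sample(pool, min(target, len(pool))))
    return selected
```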
Feedback from observers did not change the recorded result for survey forms; it was recorded for reporting purposes only.
Detailed results
Automatically coded responses
There were 12,090,959 survey returns (95% of the total) that were automatically coded as either Yes or No. Observers provided feedback on 311,318 (2.6%) of these survey returns.
Observer feedback indicates that automatic coding was highly accurate. There was only a single instance where observers from both sides disagreed with the ABS code. In this case, both observers considered that the survey return should have been coded as Response Not Clear.
|  | Survey returns observed (no.) | Both observers disagreed with ABS coding decision (no.) | (%) | One or both observers agreed with ABS coding decision (no.) | (%) |
| --- | --- | --- | --- | --- | --- |
| Automatically coded as Yes | 155,656 | - | - | 155,656 | 100 |
| Automatically coded as No | 155,662 | 1 | 0.0006 | 155,661 | 99.9994 |
| Total automatically coded survey returns | 311,318 | 1 | 0.0003 | 311,317 | 99.9997 |
Manually coded responses
There were 600,275 survey returns (4.7% of the total) that were manually coded as either Yes or No. Observers provided feedback on 261,678 (43.6%) of these survey forms.
Observer feedback indicates that manual coding was very accurate. There were 337 instances where observers from both sides disagreed with the ABS coding decision. This represented 0.1% of all manually coded forms. Of these 337 instances, the ABS coded 236 as Yes and 101 as No.
The ABS developed very conservative rules for the use of automatic coding. These included referring for manual coding any survey return that contained marks in the space to the left of the Yes response box or to the right of the No response box. This ensured that any survey return with additional comments written on the form, which may have been material to the capture of the survey response, could be reviewed.
A consequence of this approach was that a higher proportion of No responses were referred to manual coding. Ticks made with a long tail in the No box were in some cases identified by the automatic coding system as possibly having additional markings and were referred for manual coding; this was not the case for Yes responses marked with a long-tailed tick. Excluding these instances, the overall rates of disagreement for manually coded Yes and No responses are very similar.
|  | Survey returns observed (no.) | Both observers disagreed with ABS coding decision (no.) | (%) | One or both observers agreed with ABS coding decision (no.) | (%) |
| --- | --- | --- | --- | --- | --- |
| Manually coded as Yes | 131,049 | 236 | 0.2 | 130,813 | 99.8 |
| Manually coded as No | 130,629 | 101 | 0.1 | 130,528 | 99.9 |
| Total manually coded survey returns | 261,678 | 337 | 0.1 | 261,341 | 99.9 |
Response not clear and blank responses
There were 19,774 survey returns (0.2% of the total) that were coded as Response Not Clear. Additionally, 16,912 survey forms (0.1% of the total) were returned blank. Observers provided feedback on 33,995 (92.7%) of these survey forms.
Where the intent of the survey respondent could not be determined, the survey response was coded as Response Not Clear. This commonly occurred where the forms were vandalised, where no attempt to answer the question had been made, or where both response boxes were marked.
There were 654 instances where observers from both sides disagreed with the ABS coding of the form as Blank or Response Not Clear. This represented 0.2% of observed Blank survey returns and 3.4% of observed survey returns coded as Response Not Clear.
|  | Survey returns observed (no.) | Both observers disagreed with ABS coding decision (no.) | (%) | One or both observers agreed with ABS coding decision (no.) | (%) |
| --- | --- | --- | --- | --- | --- |
| Blank | 15,659 | 29 | 0.2 | 15,630 | 99.8 |
| Response Not Clear | 18,336 | 625 | 3.4 | 17,711 | 96.6 |
| Total Blank and Response Not Clear survey returns | 33,995 | 654 | 1.9 | 33,341 | 98.1 |
Implications for overall survey results
As observers provided feedback on only a sample of the survey returns, the ABS has extrapolated the observed disagreement rates to the forms that were not viewed in order to measure the potential impact on the survey result.
To extrapolate the results of the observer program to cover all returned survey forms, the ABS took the proportion of observed survey forms where both observers disagreed and applied it to the total number of survey forms returned in each category.
The estimated total number of disagreements, had the observers viewed all survey returns, is shown below.
|  | Observed instances where both observers disagreed with ABS coding decision (no.) | Estimated instances had all survey returns been observed (no.) |
| --- | --- | --- |
| Automatically coded Yes or No survey returns | 1 | 29 |
| Manually coded Yes or No survey returns | 337 | 636 |
| Blank survey returns | 29 | 31 |
| Response Not Clear survey returns | 625 | 674 |
| Total | - | 1,370 |
At the national level, this indicates that there would have been 1,370 survey responses where both observers would have been expected to disagree with the ABS coding decision. This equates to 9 survey responses per electoral division.
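As a check on the figures above, the extrapolation can be reproduced directly for the Blank and Response Not Clear categories, whose observed and total counts are both published in this report. The automatically and manually coded estimates depend on breakdowns not shown here, so they are taken as published, and the figure of 150 electoral divisions is an assumption (the federal divisions in place at the time of the survey).

```python
# Reproducing the published extrapolation for the two categories whose inputs
# appear in full above; the Yes/No category estimates (29 and 636) are taken
# as published because they rely on breakdowns not shown in this report.
def extrapolate(disagreements: int, observed: int, total_returned: int) -> int:
    """Apply the observed disagreement rate to every return in the category."""
    return round(disagreements / observed * total_returned)

blank = extrapolate(29, 15_659, 16_912)   # 31
rnc = extrapolate(625, 18_336, 19_774)    # 674
total = 29 + 636 + blank + rnc            # 1,370 estimated disagreements

DIVISIONS = 150  # assumption: federal electoral divisions in 2017
print(blank, rnc, total, round(total / DIVISIONS))  # 31 674 1370 9
```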
Where observers raised a disagreement with the decision applied by the ABS, the coding result that the observers thought should have been applied was recorded by the ABS.
The following table shows the observer-suggested coding results against the ABS coded response, extrapolated across all survey forms, to show the potential difference to the survey results.
| ABS coded response | Observer suggested: Yes | Observer suggested: No | Observer suggested: Blank | Observer suggested: Response Not Clear | Observers suggested different responses | Total |
| --- | --- | --- | --- | --- | --- | --- |
| Yes | - | 135 | - | 148 | 18 | 301 |
| No | 133 | - | - | 218 | 13 | 364 |
| Blank | 5 | 23 | - | 2 | 1 | 31 |
| Response Not Clear | 266 | 365 | 1 | - | 42 | 674 |
| Total | 404 | 523 | 1 | 368 | 74 | 1,370 |
These figures show that, had the ABS applied the codes proposed by the observers, there would have been, at the national level:
- 30 fewer returns recorded as Blank
- 306 fewer returns recorded as Response Not Clear
- 103 additional returns recorded as Yes responses
- 159 additional returns recorded as No responses.
This would have resulted in a net decrease in the survey margin of 56 responses across Australia.
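The net figures above follow from the table: for each response, subtracting the row total (how the ABS coded the estimated disagreements) from the column total (how the observers suggested they be coded) gives the net change. A short check of that arithmetic, using the totals from the table:

```python
# Totals taken from the table above (estimated disagreements across all forms).
abs_coded = {"Yes": 301, "No": 364, "Blank": 31, "Response Not Clear": 674}
observer_suggested = {"Yes": 404, "No": 523, "Blank": 1, "Response Not Clear": 368}
# (The remaining 74 cases are those where the observers suggested different
#  responses, so they do not add to any single observer-suggested column.)

net_change = {code: observer_suggested[code] - abs_coded[code] for code in abs_coded}
# {'Yes': 103, 'No': 159, 'Blank': -30, 'Response Not Clear': -306}

margin_change = net_change["Yes"] - net_change["No"]  # 103 - 159 = -56
print(net_change, margin_change)  # net decrease of 56 in the Yes-No margin
```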
Instances where only one observer disagreed with the ABS
There were an additional 386 instances where one observer disagreed with the coding decision made by the ABS while the other observer agreed with it. These instances are shown below.
|  | Survey returns observed (no.) | Only the Yes committee observer disagreed with ABS coding decision (no.) | (%) | Only the No committee observer disagreed with ABS coding decision (no.) | (%) |
| --- | --- | --- | --- | --- | --- |
| Automatically coded as Yes | 155,656 | - | - | - | - |
| Automatically coded as No | 155,662 | - | - | - | - |
| Manually coded as Yes | 131,049 | - | - | 40 | 0.03 |
| Manually coded as No | 130,629 | 47 | 0.04 | - | - |
| Blank | 15,659 | 2 | 0.01 | 3 | 0.02 |
| Response Not Clear | 18,336 | 149 | 0.8 | 145 | 0.8 |