Overview

This research team focused its efforts on addressing the most critical barrier to the realisation of routine Unmanned Aircraft (UA) operations – the need for appropriate regulations that provide assurance of the safety of UA operations. The systematic study of the safety risks, and of the social issues relating to the acceptance of those risks, is foundational to the development of effective regulations for UAS.
Selected outputs from this research include:
- Analysis revealed a generally negative sentiment in the concepts associated with the technology, with military-related themes being the most prevalent.
- We also found that the media tend to report how the emerging technology is being used, as opposed to its features and capabilities. This suggests that the public may not have a good understanding of the technology's capabilities or its uses outside theatres of war, which is likely to unfavourably influence how the public perceives the risks associated with UAS.
- We also observed a terminological divide, suggesting that public perception may be influenced by what the technology is called.
- Finally, we observed that the information provided in the media changed significantly over the 14-year data collection period. Consequently, it is likely that public perception has also evolved and will continue to evolve as the technology becomes more widespread.
Analysis of current media articles on drones suggests that the public perceives the risks of drones to be higher than the risks associated with conventional piloted aircraft. However, the results of this study suggest that the public perceives the risks to be no different. Further, safety does not appear to be of principal concern to the public, and in general the public is yet to form a definitive position on the technology. We also found that terminology had minimal effect on public perceptions, which contrasts with widely held industry beliefs and with the results of our analysis of media articles. These perceptions are, however, subject to change as the public gains more knowledge about the technology and the risks and benefits associated with its use. The information made available to the public during this formative period will be influential in shaping public perceptions, and ultimately, acceptance of drone technologies.
This study provides empirical evidence to guide the selection of organisational trust repair strategies following risk events, contributing to existing conclusions around contingency-based approaches to trust repair at the interpersonal level of analysis and to post-crisis response research at the organisational level of analysis. Contrary to these existing positions, the results of this research show that an organisation's responsibility for the risk event does not influence the selection, or the perceived effectiveness, of the trust repair strategy. That is, hypothesis one (denial matched to victim risk) and hypothesis two (excuse matched to technical error) were not supported, but hypothesis three (organisational apology matched to preventable risk) was. The contingency-based view of matching the trust repair strategy to the organisation's responsibility for the risk event does not hold. Instead, a single response strategy – an organisational apology – is the only social account that shows an improvement in trust evaluations. The findings add weight to this conclusion by demonstrating that any other trust repair strategy contributes to a further decline in trust. In other words, when organisations do not apologise, regardless of whether the event is within their control, they face a further decline in trust.
Other outputs from this research have been incorporated into the work of the CASA UAS Standards Sub-Committee and presented at numerous industry and conference workshops, both domestically and internationally.
Papers from the team include: