With autonomy becoming more common in aviation platforms such as drones, it's essential that standards and software systems keep up so we stay safe in this brave new world.
Article by: School of Engineering
Is it a bird? Is it a plane? No, it’s a drone. The hum of drones is becoming a familiar part of the soundscape of our daily lives. From fast food deliveries and hobbyists going for a spin, to monitoring for and responding to bushfires, drones have both fun and fundamental uses. But the systems that underpin these remotely piloted aircraft are far from basic, especially as the drone industry evolves and we see more autonomous capabilities being implemented.
The addition of autonomy to uncrewed air transport introduces a number of challenges. The aviation industry is heavily regulated, and the successful implementation of autonomous drones relies on robust certification and safety assurance.
The ANU aerospace cluster is currently leading work on the assurance and certification of drone autonomy. We have identified some of the key challenges in this space and are developing methods to address them. Two of those challenges are managing the diversity and complexity of autonomous systems, and certifying the software that underpins them.
The word 'autonomy' itself covers a spectrum of capabilities. Partially autonomous systems involve shared control, where humans retain some influence. Robot waiters are a good example of a system with shared control: humans are closely involved in its operation, placing food on the trays, selecting the table for delivery and ensuring the robots return to the counter when their job is complete.
In fully autonomous systems, humans have little or no influence. Warehouse robots, such as those Amazon uses to pick and pack orders, are fully autonomous: they can navigate a warehouse and perform the required tasks without human intervention.
We then have the added layer of complexity that comes with adaptive and learning systems. Adaptive systems can change their behaviour in response to their surrounding environment. Self-driving cars, such as Teslas, are a good example of adaptive systems: their sensors give them awareness of their surroundings, and they respond to that environment to avoid potential collisions with other vehicles or pedestrians.
Learning systems are capable of improving their behaviour over time. Everyday navigation tools that most people use, such as Google Maps, are examples of learning systems. When you use Google Maps, your route from point A to point B is constantly updated, with information on potential delays along the way and alternate routes that can get you to your destination faster.
So how do we begin to build certification around systems that can vary so much across the broad spectrum of autonomy?
When we think about certifying autonomy, what we are really certifying is the software that underpins that capability. Existing aviation standards do not cover the software systems that underpin autonomous aircraft. The regulatory framework needed for these software systems will be vastly different from what is currently in place for civil aviation.
For example, most aviation standards don't change very much over time, but software systems are quite the opposite. Software is amorphous and dynamic, and updates and changes are a constant with most of these systems. Factors like these raise questions such as 'How do we certify software that changes and evolves over time?'
We are likely to see fully autonomous systems become more prevalent in the aviation industry. Drones will soon demonstrate levels of complexity that are not covered by existing aviation standards and regulations. How we certify, assure and manage autonomous systems will require a proper understanding of the technologies we are dealing with.
That’s why the ANU aerospace team is conducting crucial research exploring methods for assuring and certifying autonomy. We have also recently established a community of practice bringing together the Australian Government, industry and academia to develop a coordinated research agenda to ensure drones can operate safely across Australian skies.
Top image: Es sarawuth/stock.adobe.com