
Swiss Startup Sees Artificial Intelligence as Key to UAM

Daedalean AG has a long-term vision to provide AI for vertical flight, step by step.

By Richard Whittle

Vertiflite, Sept/Oct 2020

Galvanized by the promise of electric vertical takeoff and landing (eVTOL) aircraft, a small startup in Switzerland has set out to lead the way to a new age of self-flying aircraft by doing something no one yet knows how to do and regulators don’t know how to certify as safe. Daedalean AG is developing autonomous flight controls that rely on artificial intelligence (AI).

“Our goal is to make it possible to fly without relying on human pilots,” said Luuk van Dijk, cofounder and chief executive officer of Zurich-based Daedalean, which the software engineer started in 2016 with fellow Google alumnus and helicopter pilot Anna Chernova. Their ambitious strategy is to develop aircraft-agnostic, autonomous flight control software that relies on various forms of AI, including computer vision, machine learning and neural networks. Those are mysterious concepts for most audiences and problematic ones for aviation regulators.

Van Dijk conceded that Daedalean’s vision is on the sharpest part of the cutting edge of aviation technology, and therefore a gamble, but he feels the times demand such solutions.

EASA’s W-shaped development cycle for learning assurance

“The inspiration was the eVTOL projects that started springing up three or four years ago,” said van Dijk. Urban air mobility (UAM) concepts envision hundreds or even thousands of eVTOL air taxis crowding city skies as they carry five or fewer people on short hops. “We thought, ‘Okay, that’s only going to be possible if you don’t require someone with a CPL [a commercial pilot license] to fly those aircraft,’” van Dijk said.

Van Dijk and Chernova think the alternative to a human pilot or remote control is AI, and that makes Daedalean an outlier. “Almost nobody does computer vision and neural networks because everybody believes that’s uncertifiable,” van Dijk said. “‘AI — nobody knows how it works and it’s non-deterministic and it’s a black box and you can never get that certified.’ That’s actually a misperception, but it’s very widespread.”

Computer vision, one element of a broader autonomy category called “perception,” refers to software designed to recognize and act on images gathered by cameras, much as humans react to what they see with their eyes. Neural networks are sets of algorithms that “machine learn” by analyzing data fed into them and then altering themselves until the network has “learned” to respond to the data correctly, much as a human would learn, as opposed to being reprogrammed.
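To make that idea concrete, the toy sketch below shows what “machine learning” amounts to at its simplest: a model with a single adjustable parameter nudges that parameter until its outputs match example data, rather than being reprogrammed by hand. It is illustrative only, written in generic Python and NumPy, and is not drawn from Daedalean’s systems.

```python
# Toy illustration of machine learning (not Daedalean's software): a model with a
# single adjustable parameter "learns" to double its input by repeatedly nudging
# that parameter to reduce its error on example data, rather than being
# reprogrammed by hand.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=100)   # example inputs
y = 2.0 * x                           # the "correct" responses we want to learn

w = 0.0                               # the model's single learnable parameter
learning_rate = 0.1
for _ in range(200):                  # training loop: adjust w from the data
    error = w * x - y
    gradient = np.mean(2.0 * error * x)
    w -= learning_rate * gradient     # self-adjustment step

print(f"learned parameter: {w:.3f} (ideal value is 2.0)")
```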

This poses a major challenge for regulators, since the outcomes of computer vision and machine learning using neural networks depend on the nature of the situations they face, and not all situations can be predicted.

“People hate to be subject to forces they don’t understand,” van Dijk observed. But as Daedalean develops such systems, its leaders hope to help regulators figure out how to certify AI-based flight control systems as safe.

Working with Regulators to Demystify AI

An underlying assumption is that the company’s systems will have to be certifiable under existing regulatory philosophy, which van Dijk said “is centered around the human pilot very deep in the control loop. You can’t change the system around. You have to come up with something that does all these things a human pilot does at least equally well, so that you have sort of a drop-in replacement, [a] path of least resistance. That’s our ten-year-long product roadmap. All the things a pilot does, we want to make a system that does it better.”

Daedalean has pursued its regulatory strategy by collaborating with the European Union Aviation Safety Agency (EASA) in a 10-month project to develop a first set of guidelines on how to ensure the safety of machine learning systems in aviation. The study, “Concepts of Design Assurance for Neural Networks (CoDANN),” defined a framework for “Learning Assurance,” a method for imposing “strict requirements on the datasets used” in a machine learning system’s algorithms for “any type of safety-critical application.” It was published in March.

“R2D2” camera pod used in Daedalean test of its visual positioning system.

“The objective of Learning Assurance is to gain confidence at an appropriate level that an ML [machine learning] application supports the intended functionality, thus opening the ‘AI black box’ as much as practically possible and required,” the study said. As a result of the study, EASA adopted a W-shaped development cycle for learning assurance, providing “the essential steps” as a foundation for future agency guidance on machine learning.

EASA has “a roadmap for certification, and we are a part of it,” van Dijk said. On July 1, EASA and Daedalean agreed to conduct another nine-month joint study, each devoting four experts to the task for up to 300 hours, to work out guidelines for open questions raised in the initial study.

The initial study with EASA focused on Daedalean’s first proposed product, a computer-vision system called Raven that provides positioning, including landmark recognition, plus regular and emergency landing guidance, and non-cooperative and cooperative traffic detection for both fixed-wing and VTOL aircraft. Still in the technology demonstration stage, Raven uses machine learning only in a limited way and only while on the ground.

“We don’t implement ‘decision making’ yet,” van Dijk explained by email. “One of the things we avoid in our machine-learned system is that they [would] ‘adapt’ or alter themselves in flight. To keep the problem tractable, we separate the design/learning phase (which can and does of course use offline-gathered data from real flights) from the deployment-in-flight phase.”
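A minimal sketch of that separation might look like the following; the names and the simple least-squares “learning” step are placeholders for illustration, not Daedalean’s implementation. All fitting happens on the ground from logged flight data, and the deployed system only queries parameters that were frozen before flight.

```python
# Hypothetical sketch of the separation described above: all "learning" happens on
# the ground from previously gathered flight data, and the deployed system only
# runs inference with parameters frozen before flight. The names and the simple
# least-squares fit are placeholders, not Daedalean's implementation.
from dataclasses import dataclass
import numpy as np

@dataclass(frozen=True)
class FrozenModel:
    """Parameters fixed at release time; the model never alters itself in flight."""
    weights: np.ndarray

    def infer(self, features: np.ndarray) -> float:
        return float(self.weights @ features)

def design_phase(offline_features: np.ndarray, labels: np.ndarray) -> FrozenModel:
    """Ground-side learning phase: fit parameters from logged flight data."""
    weights, *_ = np.linalg.lstsq(offline_features, labels, rcond=None)
    return FrozenModel(weights=weights)

# Deployment phase: the airborne system only calls infer(); it never retrains.
model = design_phase(np.random.rand(500, 4), np.random.rand(500))
print(model.infer(np.array([0.2, 0.1, 0.4, 0.3])))
```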

Used this way, van Dijk said, machine learning far outstrips the ability of pilots to get better through experience, no matter how many thousands of hours they spend in the cockpit. “We can gather orders of magnitude more flight hours’ worth of data, pool it, and verify performance in the lab before releasing it to the deployed system with the performance guarantees in place,” he said.

Daedalean hasn’t yet engaged with the US Federal Aviation Administration (FAA), but its approach parallels current FAA policy as described by Wes Ryan, unmanned and pilotless aircraft technology lead for the Policy and Innovation Division of the FAA Aircraft Certification Service, during a June 24 autonomy webinar held by the Air Force’s Agility Prime, a project office that promotes eVTOL development.

“Our approach is a build-up from low risk functions to higher risk as we gain confidence or trust in AI and ML,” Ryan said. “We are taking a task-based approach as well, so initially systems may not be allowed to learn during flight, but may be required to learn offline between missions until the algorithms can be shown to be stable, repeatable, reliable and safely bounded in their behavior.”

Daedalean has tested Raven on Cirrus SR22 and Cessna fixed-wing airplanes, on a Volocopter eVTOL prototype, on drones the company flies for cheaper and faster development, and most recently on an MD Helicopters MD 520N no tail rotor (NOTAR) light helicopter. Raven employs a machine learning component in landmark recognition, van Dijk said in an email, but that is not an in-flight neural network.

“We do use neural nets for the emergency and planned landing guidance and for the detect-and-avoid,” he said. “Our work with EASA was to show that given a set of precisely known situations a neural net actually does behave predictably, but of course the crux is to show that it is sufficiently reliable given the expected situations (which do include uncertainty). We don’t use the [neural network] for high-level decision-making, but for recognizing if a blob of pixels looks like an aircraft [for detect-and-avoid] or if a runway or flat piece of land looks unobstructed. The decisions based on that information can then be taken by a higher-level system (or a pilot, if we are in a pilot-advisory setup).”
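The division of labor he describes might look roughly like the sketch below, in which a perception stage emits labeled detections and a separate rule-based layer turns them into advisories. Everything here — labels, thresholds, function names — is hypothetical and for illustration only.

```python
# Hypothetical sketch of the split described above: a perception stage answers
# narrow questions ("does this blob of pixels look like an aircraft?", "does that
# patch look like an unobstructed landing site?"), and a separate higher-level
# layer, or the pilot in an advisory setup, decides what to do with the answers.
# Labels, thresholds and names are illustrative only.
from typing import List, NamedTuple

class Detection(NamedTuple):
    label: str          # e.g. "aircraft" or "clear_landing_area"
    confidence: float   # perception network's score for the label
    bearing_deg: float  # direction of the detection relative to our heading

def decide_advisory(detections: List[Detection]) -> str:
    """Higher-level logic: turn perception outputs into a pilot advisory."""
    for d in detections:
        if d.label == "aircraft" and d.confidence > 0.9 and abs(d.bearing_deg) < 15:
            return f"TRAFFIC ALERT: aircraft at {d.bearing_deg:+.0f} deg relative bearing"
    if any(d.label == "clear_landing_area" and d.confidence > 0.8 for d in detections):
        return "Landing area ahead appears unobstructed"
    return "No advisory"

print(decide_advisory([Detection("aircraft", 0.95, 5.0)]))
```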

Equipping Aircraft to See Their Flight Path

The MD 520N was used in a June 29 test I was invited to watch via Google Hangouts in which Daedalean’s Visual Positioning System (VPS) performed passively, tracking the aircraft’s flight path visually and feeding its data to Daedalean engineers monitoring computer screens. For the test, a black pod nicknamed “R2D2” — which held cameras aimed in different directions — was attached to the nose of the helicopter and connected by cables to a ruggedized laptop in the cabin that was running Daedalean’s software. Chernova explained that the pod is used only for tests; cameras will be held in small, aerodynamic structures when the system is offered as a product.

During the roughly two-hour test, narrated by Chernova, a pilot flew an MD 520N owned by Fuchs Helikopter from its base in nearby Schindellegi, Switzerland, about 70 km (38 nm) west on an outbound leg over two small lakes. It landed in a field in the canton of Aargau, then took off and flew a slightly different course home. The engineers were testing how well the VPS tracked the MD 520N’s path within a 210-ft (64-m) corridor during cruise and landing, as shown on a digital map shared over Google Hangouts.
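A post-flight check of that kind could be as simple as the hypothetical sketch below, which measures the cross-track error of each vision-derived position estimate against the reference track and flags anything beyond the corridor. It assumes the 64 m figure is the corridor’s full width, and all numbers and names are made up for illustration.

```python
# Hypothetical post-flight check of the kind the corridor test implies: measure the
# cross-track error of each vision-derived position estimate against the reference
# track and flag excursions beyond the corridor. Assumes the 64 m (210 ft) figure
# is the corridor's full width; all numbers and names are made up for illustration.
import numpy as np

CORRIDOR_HALF_WIDTH_M = 32.0  # half of a 64 m wide corridor centered on the track

def cross_track_error(estimate: np.ndarray, ref_point: np.ndarray,
                      ref_heading: np.ndarray) -> float:
    """Signed lateral offset (m) of a 2D position estimate from the reference path."""
    offset = estimate - ref_point
    # Component of the offset perpendicular to the direction of travel.
    return float(ref_heading[0] * offset[1] - ref_heading[1] * offset[0])

estimate = np.array([105.0, 28.0])      # where the vision system says we are
ref_point = np.array([100.0, 0.0])      # nearest point on the planned track
ref_heading = np.array([1.0, 0.0])      # unit vector along the track

error = cross_track_error(estimate, ref_point, ref_heading)
if abs(error) > CORRIDOR_HALF_WIDTH_M:
    print(f"Outside corridor by {abs(error) - CORRIDOR_HALF_WIDTH_M:.1f} m")
else:
    print(f"Within corridor (cross-track error {error:+.1f} m)")
```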

“The system performed as designed,” Jakob Rieckh, technical program manager at Daedalean, said afterward by email. “It accurately located the helicopter along the flight path. We were slightly outside the corridor for a brief period while flying over the lake. Presumably due to fewer visual clues (contrast) on the water. Still needs investigation by the engineers in the coming days to avoid those situations entirely in the final solution.” Even so, Rieckh said, “The test shows that if we were to feed the position estimate from our system into the aircraft autopilot, we could navigate in cruise and during landing.”

Rieckh added, “Our long-term product vision is to build a fully autonomous and certified aircraft [flight control system]. The roadmap towards this vision is shaped by the skills that a human pilot fulfills today. Visually localizing the aircraft in addition and as a backup to GPS is a fundamental function of a pilot. The VPS system we tested today fulfills exactly the same [function]. Additional building blocks are planning a mission, detecting and avoiding airborne and ground objects, fusing different sensor inputs, communicating with ground control, steering the aircraft and making decisions in emergency situations.”

Tackling the Hardest Tasks First

Daedalean has raised $17.5M in the four years since its founding and is trying to raise another $25M or more to bring its computer-vision landing and detect-and-avoid system to market in 2021 with hardware partners Avidyne Corp., BendixKing and Honeywell. But this is just one task on a spreadsheet taped to a wall in Daedalean’s headquarters that catalogs “the 30 things you do as a pilot,” van Dijk said. “That was our roadmap.”

MD 520N ready for test flight with the black camera pod on its nose.

That roadmap might logically have begun with developing systems able to fly aircraft under instrument flight rules (IFR), since computers and other elements of autopilot systems are, in a manner of speaking, flight instruments, van Dijk said. Instead, Daedalean started its work at the other end of the list of the “30 things pilots do,” about a third of which depend on using their eyes. “The practice of IFR is, there’s two human beings talking over a voice channel on how not to fly into things, and then at some point you drop out of IFR and you’re in VFR [visual flight rules] anyway,” van Dijk said. “So, we figured, if you want to start with something that’s useful, you want to be able to fly in VFR.”

Other companies working on certifiable autonomy have far different strategies. Lockheed Martin’s Sikorsky, for example, is developing a set of autonomy systems called MATRIX Technology based on deterministic programming rather than artificial intelligence (see “Making Autonomy Certifiable,” Vertiflite July/Aug 2020).

Some companies also are trying to develop full autonomy from the start, said Daedalean system architect Olivier Cornes, but Daedalean is taking the challenge step by step for a compelling reason. “There is no company on the planet that has a clear idea of what it’s going to take to do a fully autonomous [certifiable] aircraft,” Cornes said.

Besides being safe enough to fly passengers or freight over a city, he said, for civilian use an autonomy system must also be cheap and reliable. “We know that these challenges are very big,” Cornes said. “That’s why we’re going at the problem in edible chunks. So, the first part is to do things that will help the pilot — augmentation systems, making flying easier and safer because you are assisting the pilot in different functions.”

This is part of the logic behind the Raven pilot-augmentation computer-vision system Daedalean hopes to sell as its first product. “The system that we’re developing allows you to recognize any runway that you can see, and it will guide you properly to the ground without having any equipment on that runway,” Cornes said. That could be a valuable aid for pilots who are tired, landing at an unfamiliar airfield, or have deficits in training, he noted. The Daedalean system’s ability to see other aircraft, he added, could be critical: “With the cameras you have a better field of view and you spot things much farther away than with your naked eye.”

Van Dijk added that while the company’s first major goal is to produce a system that can fly an airplane under daylight VFR, “Night VFR we think is actually doable. We have not tested it yet [but] we could easily add infrared cameras so that we see more at night.” Developing systems to fly under IFR, van Dijk said, will be “an easier problem” once air traffic control communication with aircraft becomes automated enough. “Once that has been solved, it’s going to be relatively easy to fly in IFR,” he said. “But if you’re flying VFR, you need eyes to obey the rules. And you cannot fly IFR everywhere.”

Exotic as it is and mysterious as it sometimes sounds, AI is already a part of daily life, and the riches it brought some of its early adopters in Silicon Valley have helped fund the rush to develop eVTOL aircraft. But van Dijk, who spent nine years as a software engineer at Google in two stints sandwiched around 14 months at Elon Musk’s SpaceX, ventured that “people who write apps for phones should not be allowed anywhere near a computer that’s used for a safety critical application.

“At Google, the [mission statement is to] ‘organize the world’s information and make it universally accessible and useful,’” van Dijk said. “But whether you’re done organizing the world’s information and whether it’s useful enough and universally accessible enough is ultimately decided by a group of people in a committee who decide that you’ve finished your project and will get promoted or not. What I really liked about SpaceX is, it’s Mother Nature who makes the call if you’ve done your job, and if not, the rocket’s going to explode. So, everybody sleeps a lot better if they’ve done all they can do to prove that the software is safe.

And I find this a very interesting point in the design space, which motivated me to start [Daedalean] to begin with.”

 
