Sensory deprivation trouble

Old-fashioned engineering is boring; computer engineering works. Computer-aided design eliminated the drudgery of creating hardcopy drawings and cross-checking design decisions. Computer-based simulation takes care of testing the ideas. Large engineering teams can collaborate across multiple design and manufacturing sites. Ever more complex machinery can now be designed so that computers, relying on arrays of sensors, take control, make operational manoeuvres consistent and predictable, and let human operators relax and concentrate on handling only the most exceptional events.

Access to ever-increasing computing power and smart software helped engineers reach for the proverbial stars. Jokes about the Apollo Guidance Computer that steered manned missions to the Moon being less capable than today's toaster, or about your smartphone being more powerful than all of NASA's computing power in 1969, are not funny anymore. As processors and integrated circuits shrank, the software running on those computers allowed engineers to equip complex machines with the ability to perform ever more complex calculations and make decisions in real time.

Since a miniaturised sensor can be added to any device to track its performance, systems are no longer designed merely to signal problems to human operators. They are designed to detect and correct their own mistakes. For every discovered exception, a new sensor can be added to the array to feed data into the subsystem's controller.

The subsystems in a complex machine like a ship or an airplane led to the formation of independent design and testing teams concentrated solely on the inner workings of those subsystems. The highly specialised testing tools in use today make it possible to certify that a subsystem works perfectly in isolation. Airplane manufacturers hope to catch any remaining problems during simulator runs performed by real pilots. Shipbuilders are ages behind in that respect, mostly relying on digital twins to assure the seaworthiness of the structures they build.

Along the way, designers and testers missed something important. More sophisticated modes of automation led to even more sophisticated paths of subsystem interaction that no human or machine could have imagined. The separation of the engineering teams left serious gaps in understanding what multiple systems can tell each other, and how they can behave outside of human control and intervention.

The recent saga of the Boeing airplane illustrates this problem. Changes to the flight control system were suggested by a test pilot who didn't need to understand how changes in one system would influence the actions of interconnected systems. Engineers changed its functionality, turning a benign background system into a mission-critical one. Designers made a cost-saving call to feed the control system from a single sensor. The safety analysts didn't know that the now-critical system relied on one sensor. The regulators who accepted the changes, even had they decided to test the whole system's behaviour, would not have had the tools to do so. The competency and efforts of the pilots of the two doomed planes were insufficient to fight off the errant control system. And in the end, innocent people died.

Then came a near disaster of Titanic proportions. A cruise ship, Viking Sky, with 1,300 passengers on board lost power to her engines in severe weather near a dangerous shore. A preliminary report stated that, due to the bad weather, the sensor monitoring lubricating-oil levels signalled low oil to the engine control system, which promptly shut down the engines. There was no time for the 'human in the loop' to decide. It is unclear at this moment why a mission-critical system relied on a single sensor, why the interconnection between the tank monitoring system and the engine control system had no fallback to manual operation, or why tests and certifications had not properly assessed all possible interactions between those two subsystems. Just as in the case of the aviation regulator, I am certain that the classification society certifying Viking Sky had neither the tools to confirm such tests nor an idea of the tests required to certify the safety of this intelligent interconnectivity.
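The fragility of a single-sensor trip can be sketched in a few lines. This is a purely illustrative toy, not the logic of any real engine control system: the function names, readings, and threshold are all assumptions, and real marine automation uses far more elaborate schemes. It only shows why one bad reading can shut down an engine, while a simple two-out-of-three vote over redundant sensors tolerates it.

```python
# Illustrative sketch only: names, threshold, and readings are invented,
# not taken from any real engine control system.

LOW_OIL_THRESHOLD = 0.2  # assumed normalised lube-oil level considered "low"

def single_sensor_trip(level: float) -> bool:
    """Shut down if the one sensor reads low -- no tolerance for a bad reading."""
    return level < LOW_OIL_THRESHOLD

def voted_trip(levels: list[float]) -> bool:
    """Shut down only if a majority of redundant sensors agree the level is low."""
    low_votes = sum(1 for level in levels if level < LOW_OIL_THRESHOLD)
    return low_votes > len(levels) // 2

# One implausible reading (0.05) alongside two healthy ones:
readings = [0.05, 0.80, 0.75]
print(single_sensor_trip(readings[0]))  # True  -> engines shut down
print(voted_trip(readings))             # False -> engines keep running
```

The point of the sketch is not the voting arithmetic but the design decision it encodes: with a single sensor, the control system cannot distinguish a failed sensor from a failed engine, so it must treat every low reading as real.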

The last line of defence against errant automation, the human crew of Viking Sky, got an honourable mention from the Norwegian Maritime Authority, which stated that the competency and efforts of the crew played an important role in the fortunate outcome. This is hardly believable, as the crew struggled to understand what was happening, and the automation provided them with neither vital information nor an opportunity to counteract it. It appears the only thing the crew could do was drop the anchors and pray for a miracle. Thankfully, in that case, the miracle happened.

Automation surprises like those encountered by Viking Sky and the Boeing airplane will keep happening unless we rethink complex-systems testing and quality assurance. More complex automated subsystems can interact in more complex ways in abnormal situations, and with our present tools we cannot test the behaviour of the whole integrated system under those abnormal conditions. The complexity of each subsystem forces its engineering team to focus on the correct design and testing of that subsystem alone. Their simulation tools are good, but they do not allow them to test the broad array of interactions with other subsystems and discover crossed signals and conflicting commands.

It is clear that if the shipbuilding industry dreams of autonomous ships, it must redesign its computer-assisted tooling. What was good in the 1990s clearly does not provide design and test assurance in the face of the far more complex interconnectivity of all the 'smart' things installed on board. The classification societies must equally redesign their certification processes to stay relevant in an age of hyper-automation reaching the brittle boundaries of what is controllable. If this is not done now, no passenger of sane mind will ever board an autonomous vessel of any kind.

Kris Kosmala

Kris Kosmala is the director of smart port operations and digital services for Royal HaskoningDHV.

