Dr Teemu Manderbacka from NAPA Shipping Solutions writes on how to use operational data to validate new ship designs.
Recent research has shown that the cubic law of fuel consumption–speed dependency is simple, but reality is more complicated: the law does not necessarily hold further away from the design speed, and slow steaming might not be the ‘fix-all’ magic bullet many people want it to be. This is potentially disappointing to the many people who had championed the idea, from various sectors of society – from Emmanuel Macron to Extinction Rebellion. It’s easy to see why there was so much support for it, even if it was potentially problematic – on paper, it solves a vital issue, and it seems, at first, remarkably simple, with no need for extensive uptake of new technology.
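To see what is at stake, here is an illustrative back-of-envelope sketch (not any published model, and `k` is an arbitrary constant) of what the idealised cubic law would predict: fuel per unit time scales with speed cubed, so fuel for a fixed distance scales with speed squared.

```python
# Illustrative sketch of the idealised cubic law (arbitrary units, not a
# real vessel model): fuel rate ~ k * v**3, so fuel over a fixed distance
# ~ k * v**2 because voyage time grows as the ship slows down.

def fuel_per_hour(speed_knots: float, k: float = 1.0) -> float:
    """Fuel rate under the idealised cubic law."""
    return k * speed_knots ** 3

def fuel_per_voyage(speed_knots: float, distance_nm: float, k: float = 1.0) -> float:
    """Fuel for a fixed distance: time = distance / speed, so fuel ~ v**2."""
    hours = distance_nm / speed_knots
    return fuel_per_hour(speed_knots, k) * hours

# A 10% speed reduction on a fixed 1,000 nm route:
base = fuel_per_voyage(20.0, 1000.0)
slow = fuel_per_voyage(18.0, 1000.0)
saving = 1 - slow / base
print(f"{saving:.0%}")  # ~19% saving, if - and only if - the cubic law holds
```

The research discussed above shows precisely why this tidy 19% cannot be taken at face value: away from the design speed the exponent drifts, and the saving can be much smaller.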
So, what can we do instead?
First, one of the main solutions is to ensure ships are designed for reasonable speeds, and operated accordingly. Ships are designed to travel within a certain range of speeds throughout their lifetimes; changing this speed changes the load on the engine and affects performance significantly. Given this target range to work with, hull form and propeller modelling then optimise performance against particular speed profiles. This is the main issue with the idea of simply slowing down without addressing design. Unlike road vehicles, which could feasibly operate at a slower speed and save fuel dramatically, ships operating outside their design parameters don’t necessarily become more efficient; hence the need to find routes that keep ships operating within this ‘sweet spot’.
This is a significant operational challenge, but we have the data to make it a reality. By combining hydrodynamic modelling with engine, weather and operational data, optimisation technology can plan routes and operations, and realistically model fuel consumption.
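The idea can be sketched in miniature. The structures below are hypothetical, not NAPA’s software: a cubic-law base load is scaled by a per-leg weather factor, and the cheaper of two candidate routes is selected.

```python
# Minimal sketch of weather-aware route comparison (hypothetical data
# structures and coefficients, not a production voyage optimiser).

from dataclasses import dataclass

@dataclass
class Leg:
    distance_nm: float
    weather_factor: float  # 1.0 = calm water; >1.0 = added wind/wave resistance

def leg_fuel(speed_knots: float, leg: Leg, k: float = 0.001) -> float:
    """Fuel for one leg: cubic-law base load scaled by the weather factor."""
    hours = leg.distance_nm / speed_knots
    return k * speed_knots ** 3 * leg.weather_factor * hours

def route_fuel(speed_knots: float, legs: list[Leg]) -> float:
    return sum(leg_fuel(speed_knots, leg) for leg in legs)

# Two candidate routes: the shorter one crosses heavier weather.
direct = [Leg(480, 1.35), Leg(520, 1.30)]
detour = [Leg(600, 1.05), Leg(550, 1.05)]
best = min(("direct", direct), ("detour", detour),
           key=lambda r: route_fuel(14.0, r[1]))
print(best[0])  # here the longer, calmer detour burns less fuel
```

With these illustrative numbers the detour wins despite the extra distance, which is the kind of counter-intuitive result that only emerges when hydrodynamics and weather data are modelled together.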
Demand for newbuilds is intrinsically linked to the speed that cargo can travel
This then creates a virtuous circle – the more we do this, the better the models become, and the more we are able to make shipping energy-efficient – enabling us to deliver more freight with fewer emissions.
The study referenced makes the case for this based on 11,000 noon reports, covering three years of operation. One of the real advantages of ‘big data’ – a term that has been over-used in recent years, but which we can confidently say applies here – is that we can then crunch the numbers on even larger datasets. The model we use draws on possibly ten times more data than this, and can use all of it to make our models smarter. What makes a big difference is knowing not just what the route is like – the weather, the speed, and so on – but having a picture of the ships we’re modelling. Knowing the design specifics of a vessel – its hull form, its propulsion system, what it was designed to do – is essential to making these cumulative years of logs worthwhile.
Naval architects also gain another beneficial feedback loop because they can then incorporate operational data into the design specifications to create more efficient vessels. In particular, this data could be used to inform the initial feasibility studies that owners rely upon to determine the specifications of a newbuild. If owners and architects can confidently use operational data to validate new designs, and push the boundaries of what is feasible to build, or ask to be built, the cycle then improves again.
The feasibility of how and why
Looking at the mechanics of why we build ships the way we do, and why we operate them as we do, reveals the potential limits of this approach – and why speed limits are not the easy fix they initially appeared.
First, contractual matters encourage conservative voyage plans to avoid demurrage. Setting a speed limit would not help with that, and it is one of the main reasons the authors of the above study came to the conclusion they did.
It could be possible, in future, to design vessels for lower speeds, and therefore higher performance. This, however, demonstrates the challenge of factoring in the wider context of shipping. Unlike cars, which spend most of their time out of action, ships must be in near-constant use, and effective available capacity is a function of tonnage and speed. If speed goes down, tonnage – unless the world economy changes significantly – must go up. Global demand for newbuilds, despite the coronavirus slowdown, is increasing, particularly as many ships are reaching scrapping age this decade: the significant amount of tonnage delivered in 2009–2012 will need to be replaced. This does mean that a new global speed limit could be set at the design stage, but this in turn would require significant changes in global cargo logistics, because the demand for newbuilds is intrinsically linked to the speed that cargo can travel. It may also result in sea cargo being re-routed to land-based transport, with higher emissions.
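The tonnage–speed trade-off is simple arithmetic, and a quick illustrative check (with made-up numbers, not fleet statistics) shows its scale: if annual transport capacity is roughly fleet tonnage times average speed, a fleet-wide speed cut must be matched by a proportional tonnage increase.

```python
# Back-of-envelope check of the tonnage-speed trade-off described above
# (illustrative figures only): capacity in tonne-miles per year is taken
# as roughly fleet tonnage * average speed.

def required_tonnage(current_tonnage: float, current_speed: float,
                     new_speed: float) -> float:
    """Tonnage needed to keep tonne-mile capacity constant at a new speed."""
    return current_tonnage * current_speed / new_speed

# A 20% speed cut, from 20 to 16 knots, with tonnage indexed at 100:
extra = required_tonnage(100.0, 20.0, 16.0) / 100.0 - 1
print(f"{extra:.0%}")  # a 20% speed cut needs ~25% more tonnage
```

That extra quarter of a world fleet would itself have to be built, crewed and fuelled, which is why a speed limit cannot be assessed in isolation from newbuild demand.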
These complications also explain why, in a literature review of different studies (Bouman et al. 2017), the fuel-saving potential of voyage optimisation was so variable – ranging from 0% to 80% of emissions. The potential scenarios vary depending on what target is set, and what target it is ‘possible’ to set strays beyond the realms of data and physics, into the economic and political.
There are better decarbonisation strategies than slow steaming, and far wider possibilities for building better, more efficient ships and reducing emissions per tonne-mile of cargo. However, it is clear that to find these answers, the future optimisation of shipping needs to go beyond vessel-based optimisation and be integrated into wider questions about how we conduct global trade.