
100% visibility in transport – is it even possible? What would it change if we were to achieve it?

You can read this article in 11 minutes

Text by Zuzanna Kosowska-Stamirowska, Jan Chorowski, Adrian Kosowski


100% visibility in transport is not physically possible. It is true that as communication technology becomes more mature, cost- and energy-efficient, we could perhaps envisage a world where we can “see” every asset in real time. However, full visibility encompasses not just “seeing”, but also knowing how to interpret the registered data, understanding when something unexpected is likely to happen, and knowing how to react correctly.

By a loose analogy, visibility is a bit like babysitting shipments. Anyone who has ever looked after a 3-year-old will know that anything can happen in a matter of seconds – it’s enough to look the other way. What’s more, even with perfect monitoring, there is no complete set of rules, no “algorithm”, that fully describes whether something is going wrong and in which cases an intervention is necessary.

The situation is similar with supply chains, but the scale is much larger. With possibly thousands of assets to oversee at once, the story stops being about individual assets and starts being about statistics and minimizing cumulative risks. However, if we are helped to focus our attention, at the right time, only on the few cases that actually require it, and the rest of the time follow aggregated views for regular decision-making, we can get transport visibility just right.

So, what are the ways to achieve close to 100% visibility in transport?

IoT

In terms of obtaining the right data to ensure visibility, one obvious answer is the Internet of Things. The majority of trucks and trailers are nowadays equipped (or being retrofitted) with systems which report at least the location, along with other parameters. Thanks to this technology, fleet managers get precise information on, e.g., fuel consumption and the behaviour of individual drivers. Containers are being turned into so-called smart boxes, with sensors which report location, shocks, humidity, and temperature, send alerts about door openings, etc. This technology enables shippers to follow their goods in near-real time throughout the entire journey – a process which usually involves many actors and modes of transport, with road, sea, and rail legs. Pallets and packages are being traced with smaller and cheaper devices, possibly just scanned barcodes or tags, to ensure continuity of visibility. Thanks to IoT, that visibility is no longer tied only to the mode of transport, which enforces fragmentation, but to the very goods that are being shipped.

Data sharing – collaboration between all the actors in the supply chain

A complementary approach to automatically collected IoT data involves data sourced from logistics partners. The logistics sector is fragmented, and one container can change hands dozens of times before arriving at its destination: there can be multiple carriers involved – sea, air, and land – as well as freight forwarders, the logistics departments of the shippers themselves, and last-mile providers. Sharing data between all these actors could potentially be a way to enable continuous visibility over the cargo’s journey. The individual actors often have the data, which they already exploit for their own operations.

This collaboration between different actors of the supply chain is being supported by standardisation bodies which focus on Trade Facilitation, some notable ones being UN/CEFACT or the Digital Container Shipping Association (DCSA). The first step of DCSA, for instance, was to understand common processes in maritime and port operations in order to create a common language which could ease collaboration and data sharing. These advancements are fairly recent – in fact, we have had a standardised blueprint of maritime processes only since 2019, even though commercial sailing is one of the oldest businesses in the world.

But clearly, something must be holding us back here – what is it?

What is holding us back?

Data sharing is a risky business. There is too much mistrust between the actors involved in transport and supply chain management for data sharing to happen swiftly. Sharing data can mean disclosure of sensitive information or trade secrets, or may simply help the competition advance quicker on a given market. There are multiple ways to overcome or minimize those risks – data pooling being one of the answers. Data pooling, as opposed to data sharing, implies that data is put together in one “pile” but not shared; only the “conclusions” are shared, for the common benefit of the parties. However, data pooling requires careful design of privacy protocols to avoid accidental disclosure of strategic information. This in turn requires highly skilled teams working on the topic, and no – blockchain is not the remedy for everything. If anything, blockchain reduces the willingness to share information, knowing it cannot be rectified later without leaving a trace.
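To make the distinction concrete, here is a minimal sketch of the data pooling idea – purely illustrative, with made-up routes and numbers, and not tied to any particular platform or protocol. Each party computes aggregates over its own shipment records, only those aggregates leave its systems, and the shared “conclusion” is a pooled statistic such as an average delay per route.

```python
# Illustrative sketch of data pooling: each party shares only aggregates,
# never its raw shipment records. Routes and delays are invented.
from dataclasses import dataclass

@dataclass
class Aggregate:
    route: str
    n_shipments: int
    total_delay_hours: float  # sum of delays, not the individual values

def local_aggregate(route: str, delays_hours: list[float]) -> Aggregate:
    """Computed privately by each party on its own data."""
    return Aggregate(route, len(delays_hours), sum(delays_hours))

def pooled_average_delay(aggregates: list[Aggregate]) -> float:
    """The only 'conclusion' shared back with all parties."""
    n = sum(a.n_shipments for a in aggregates)
    total = sum(a.total_delay_hours for a in aggregates)
    return total / n if n else 0.0

# Example: two carriers pool statistics for the same (hypothetical) route.
carrier_a = local_aggregate("Rotterdam-Lodz", [4.0, 6.5, 2.0])
carrier_b = local_aggregate("Rotterdam-Lodz", [12.0, 3.5])
print(pooled_average_delay([carrier_a, carrier_b]))  # 5.6 hours
```

In practice the aggregation and privacy guarantees would be far more involved – hence the need for careful protocol design – but the principle stays the same: raw records never leave their owner.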

IoT for shipments has its own problems, starting with power consumption: in numerous cases the data is not continuous, and we get a measurement point only every couple of minutes or even hours. Aggregating truck telemetry data and mobile phone app-based systems to achieve transport visibility has its issues too: it is error-prone and says little about the condition of the cargo as such. In fact, there are many challenges in analysing data coming from IoT in motion which have not yet been addressed even by the scientific community, let alone in commercial applications. A classic example is the aggregation of data from devices moving along different routes. This is relatively easy when measurements are made every few seconds (as is the case for a mobile phone in a city), but a completely different story for logistics IoT, where consecutive measurements may be dozens of kilometers apart.
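To illustrate why sparse measurements complicate aggregation, here is a minimal sketch – with invented coordinates and timestamps – that resamples pings from two shipments onto a common hourly grid by linear interpolation, the naive first step before any aggregation across devices.

```python
# Minimal sketch (illustrative, not a production map-matcher): resample sparse
# GPS pings from different shipments onto a common time grid by linear
# interpolation, so their positions can be compared or aggregated.
from bisect import bisect_left

Ping = tuple[float, float, float]  # (unix_time, lat, lon)

def position_at(pings: list[Ping], t: float) -> tuple[float, float]:
    """Linearly interpolate a device's position at time t from sparse pings."""
    times = [p[0] for p in pings]
    i = bisect_left(times, t)
    if i == 0:
        return pings[0][1], pings[0][2]
    if i == len(pings):
        return pings[-1][1], pings[-1][2]
    (t0, la0, lo0), (t1, la1, lo1) = pings[i - 1], pings[i]
    w = (t - t0) / (t1 - t0)
    return la0 + w * (la1 - la0), lo0 + w * (lo1 - lo0)

# Two shipments pinging hours apart (made-up data), resampled hourly.
shipment_a = [(0.0, 52.23, 21.01), (7200.0, 52.41, 20.70), (21600.0, 52.77, 19.90)]
shipment_b = [(0.0, 51.76, 19.46), (14400.0, 52.23, 19.37)]
for t in [3600.0 * h for h in range(7)]:
    print(t, position_at(shipment_a, t), position_at(shipment_b, t))
```

The catch sits in the assumption itself: interpolating between pings that are dozens of kilometres apart can place a truck far from any road, which is precisely why techniques that work for dense mobile-phone traces do not transfer directly to logistics IoT.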

Let’s assume we have the data, in real time, and everybody is sharing it – what do I get out of all of this, exactly?

Suppose I learn that my container or truck was at a given geographical location (latitude and longitude), at a given time, and reported given measurements. Does knowing these details really help my business? A bit, but not much – especially not if I am managing thousands of such containers. The eyes need a brain; otherwise, “visibility” would be equivalent to CCTV, and its usefulness would boil down to analysing incidents once they have already happened, after watching hours and hours of video tape and trying to discern faces in blurry images.

In order to make visibility functional, one needs analytics – advanced analytics, for that matter. Just as the eyes need the brain to interpret visual information and to differentiate between danger and a regular situation, analytics is necessary to put the information in context and give it an interpretation which can then be directly useful for business. The better the analytics, the less input it needs to deliver value.

This is where the usefulness of advanced AI kicks in. The ability to work with little information and still deliver meaningful conclusions solves some of the problems discussed above. Less data needs to be shared, so we are less worried about trade secrets; IoT devices need to send measurements less frequently, preserving precious energy and driving down hardware costs, which contributes to mass adoption of devices, which in turn makes the statistics more accurate…

Let’s imagine we have the data and the advanced analytics – what would become possible?

For a start, with a real-time predictive analytics suite, we would be able to prevent losses before they appear. As a shipper, knowing instantly that the cargo was damaged in transport would allow us to activate our “plan B” and, e.g., source the same goods from elsewhere to avoid costly disruptions to our operations and damaged relationships with our partners. A good rule of thumb is to aim to spot 95% of events needing intervention while keeping false alerts manageable – having 2-3 true alerts out of every 10 raised seems reasonable.
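In standard terms, that rule of thumb is a recall target of roughly 95% with a precision of roughly 20-30%. A minimal sketch, with purely hypothetical counts, of how one could check an alerting system against it:

```python
# Hypothetical numbers: checking an alerting system against the rule of thumb
# from the text - catch ~95% of real incidents (recall), and accept that only
# 2-3 out of every 10 alerts raised are true positives (precision 0.2-0.3).

def recall(true_positives: int, missed_incidents: int) -> float:
    return true_positives / (true_positives + missed_incidents)

def precision(true_positives: int, false_alerts: int) -> float:
    return true_positives / (true_positives + false_alerts)

# Imaginary month of operations: 100 real incidents, 95 caught, 5 missed,
# and 330 alerts that turned out to be false.
tp, missed, fp = 95, 5, 330
print(f"recall    = {recall(tp, missed):.2f}")  # 0.95 - meets the 95% target
print(f"precision = {precision(tp, fp):.2f}")   # 0.22 - roughly 2 true alerts in 10
```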

With the right analytics, we could consciously select the best routes for our cargo depending on the specific risks and put pressure on carriers to avoid risky scenarios, such as letting our cargo take a bumpy railroad track – say, the one between Łódź and Kutno.

If we could rely on analytics to provide accurate ETAs and estimate the chances of delay, we could have shorter demand prediction cycles. We could even go as far as imagining a novel model of trade in which goods are traded while already in transit, as opposed to the traditional scenario in which they are first ordered and then shipped. Sounds futuristic? Not really – it goes in the spirit of what Amazon is doing with its Prime service.
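As a toy illustration of what “estimating the chances of delay” can mean in its simplest form – using invented transit times for a single hypothetical leg – one could take the share of comparable historical transits that exceeded the quoted ETA:

```python
# Toy sketch (invented numbers, not a real model): estimate the chance of
# delay on a leg as the share of comparable historical transits that took
# longer than the quoted ETA.

def delay_probability(historical_hours: list[float], quoted_eta_hours: float) -> float:
    late = sum(1 for h in historical_hours if h > quoted_eta_hours)
    return late / len(historical_hours)

# Imaginary history of transit times (hours) for the same port-to-warehouse leg.
history = [46, 48, 50, 47, 55, 61, 49, 52, 48, 70]
print(delay_probability(history, quoted_eta_hours=50.0))  # 0.4
```

A production-grade model would condition on far more context – weather, congestion, carrier, season – but even such an empirical baseline already turns a raw ETA into a risk figure a planner can act on.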

Where’s the catch?

Just like your sight can be influenced by the glasses you are wearing, the insights which you get from your analytics may be biased depending on the interests of the party which supplies the analytics. For shippers, the provider of analytics needs to be neutral and independent from carriers and other providers of transportation services, to ensure that the insights you are getting are really in your best interest. Let’s take the example of ETAs. It will always be in the interest of the carrier to display the most optimistic ETA to the client and wait until the very last minute before informing them about delays. Informing the client too soon would mean that the client calls the carrier and starts asking why nothing is being done about the issues faced, thereby putting pressure on the carrier to adapt. For small carriers, the provider of analytics should rely on responsible pooling of data between carriers, so as to protect their best interests while avoiding disclosure of sensitive information.

In conclusion, the challenge we are facing is not only about having perfect visibility, but more about making sense out of what we see.


Photo by Noel Broda on Unsplash
