As I prepare to teach JNB701 Maritime Informatics, a course on digital transformation in supply chains, at the Australian Maritime College for the upcoming semester, I’ve been reflecting on the flood of AI promotion in our industry.
Each week, companies announce that AI will “reshape everything,” often without much regard for the deeper structural and operational issues that still need to be resolved across global supply chains. This rising hype is exactly why I felt the need to write this article.
AI is one technology in a wider stack that also includes standards, trusted event data, IoT, interoperability, and digital trade instruments. The organisations that win will be those that understand which technologies they need, how those layers fit together, and how to build them with partners. Most importantly, they will resist chasing headlines and instead build capability that compounds.
Here’s the simplest way I explain it: the digital supply chain is a cake. AI is the icing. Icing can add value, but it cannot hold up a cake that hasn’t been baked properly. Transformation requires building each layer deliberately and in sequence.
Every cake needs a base strong enough to bear weight. In global trade, the base is international standards and governance frameworks, such as DCSA, UN/CEFACT, GS1, and others. They provide shared identifiers, event definitions, timestamps, and data structures. Without them, integration becomes bespoke translation, and automation breaks down. Standards are not academic; they are the backbone that enables scale.
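To make this concrete, here is a minimal sketch in Python of what a standards-aligned transport event record might look like. The field names here are illustrative, not taken from any one specification; real schemas such as DCSA Track & Trace define their own identifiers and code lists. The point is that a standard means every party serialises the same fields in the same way.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class TransportEvent:
    """A simplified, illustrative event record.

    Field names are hypothetical; real standards (e.g. DCSA
    Track & Trace, GS1 EPCIS) define their own schemas.
    """
    equipment_reference: str   # container number (ISO 6346 style)
    event_type: str            # e.g. "GATE_IN", "LOADED", "DEPARTED"
    event_time: str            # ISO 8601 timestamp, always UTC
    location_code: str         # UN/LOCODE, e.g. "AUMEL" for Melbourne
    source: str                # "EDI", "API", "IOT", or "MANUAL"

event = TransportEvent(
    equipment_reference="MSKU1234565",
    event_type="GATE_IN",
    event_time=datetime(2025, 3, 1, 8, 30, tzinfo=timezone.utc).isoformat(),
    location_code="AUMEL",
    source="IOT",
)
print(json.dumps(asdict(event), indent=2))
```

Without an agreed structure like this, every integration becomes a bespoke translation exercise between two private formats.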
The next layer is event data, and it has been the hardest to evolve. For years, the industry has mostly shifted from paper forms to typing the same information into digital systems. This is computerisation, not true digitalisation. We still rely on humans re-entering data, which introduces errors and duplication. The system looks digitised, but the data remains messy and fragile. We depend on it, yet we do not fully trust it.
Supply chains depend on milestones like gate-in, loaded, departed, arrived, discharged, and delivered. The event layer gathers data from EDI, API, terminal systems, and manual inputs. The shift is towards IoT devices generating event data directly at the source, rather than relying on manual data entry. This shift is happening faster than people think, moving from digital paperwork to machine-generated operational truth.
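One practical payoff of a trusted event layer is that simple machine checks become possible. As a sketch, assuming the milestone names above arrive as a flat, per-container list in reported order, the expected sequence can be validated automatically:

```python
# Expected milestone order for a single container move,
# using the milestone names from the text.
MILESTONE_ORDER = ["GATE_IN", "LOADED", "DEPARTED",
                   "ARRIVED", "DISCHARGED", "DELIVERED"]

def validate_sequence(events):
    """Check that a list of milestone names, in reported order,
    respects the expected sequence. Returns a list of anomalies;
    an empty list means the history is internally consistent."""
    anomalies = []
    last_index = -1
    for name in events:
        try:
            idx = MILESTONE_ORDER.index(name)
        except ValueError:
            anomalies.append(f"unknown milestone: {name}")
            continue
        if idx < last_index:
            anomalies.append(f"out-of-order milestone: {name}")
        last_index = max(last_index, idx)
    return anomalies

print(validate_sequence(["GATE_IN", "LOADED", "DEPARTED", "ARRIVED"]))
# []
print(validate_sequence(["GATE_IN", "DEPARTED", "LOADED"]))
# ['out-of-order milestone: LOADED']
```

Checks like this are only meaningful once the events themselves are generated at the source rather than re-keyed by hand; validating manually re-entered data mostly surfaces typing errors, not operational truth.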
It is important to understand that not all IoT data is created equal. Two distinct layers of technology are emerging, broadly aligned with the value and risk profile of the commodities they support.
At one level, passive telemetry (low-power sensors and LoRaWAN devices) provides condition- and event-based data through “gateways” hosted at terminals and facilities. This expanding gateway network layer is well suited to lower- and mid-value commodities where condition monitoring and transactional validation are important, but continuous positional tracking is neither commercially justified nor required.
At the higher end, active, live GPS tracking (smart containers) provides real-time positional and movement data. This visibility level allows dynamic asset management, proactive exception handling, and tighter supply chain control. This level is best suited for high-value cargo, but it is expanding as falling costs align with increasing demand for supply chain visibility.
Each supply chain will determine the appropriate IoT layer based on commodity value, risk tolerance, and commercial priorities, but in both cases, the shift is the same: data is created at the source rather than manually re-entered downstream. That shift creates opportunity. When assets generate event data directly, organisations can build automated rules, validations, and workflows. However, understanding differences in data type, reporting frequency, reliability, and validation logic is crucial.
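The difference in reporting behaviour matters for validation logic. As an illustration, with made-up thresholds and device-class names (these are assumptions, not a real vendor taxonomy), the same silence means very different things for an active GPS tracker that reports every few minutes and a passive sensor that only reports when it passes a gateway:

```python
from datetime import datetime, timedelta, timezone

# Illustrative staleness thresholds per device class. An active
# tracker falling silent for an hour is an anomaly; a passive
# LoRaWAN sensor may legitimately be quiet for days between gateways.
MAX_SILENCE = {
    "active_gps": timedelta(hours=1),
    "passive_lorawan": timedelta(days=3),
}

def is_stale(device_class, last_seen, now=None):
    """Flag a device as stale relative to its class's expected
    reporting cadence."""
    now = now or datetime.now(timezone.utc)
    return now - last_seen > MAX_SILENCE[device_class]

now = datetime(2025, 3, 2, 12, 0, tzinfo=timezone.utc)
last = datetime(2025, 3, 2, 9, 0, tzinfo=timezone.utc)  # 3 hours ago
print(is_stale("active_gps", last, now))        # True
print(is_stale("passive_lorawan", last, now))   # False
```

A rule engine that treats both device classes identically will either drown operators in false alarms or miss genuine exceptions.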
Encouragingly, many organisations are not waiting for perfect systems before acting. They are starting with the data they have, steadily improving its quality, clarifying their internal requirements, and extending integrations to key partners. This pragmatic approach strengthens the event layer and prepares the architecture for what comes next.
Above the event and asset intelligence layer sits the most politically complex stage: interoperability. It is a thin yet critical binding layer that connects organisations across boundaries. This is not just a technical interface issue; it is commercial, legal, and strategic. It requires clear decisions about what data is shared, under what conditions, with which liability protections, and through which mechanisms, whether open exchanges or closed, permissioned ecosystems.
The foundational standards work gives trade data a shared language, and that shared language is what makes true interoperability possible. Commercial agreements give it enforceable meaning. Interoperability is not primarily a technology challenge; it is a trust and governance challenge executed through technology.
A mixed environment of open and closed systems is forming, connected by a standards-aligned layer enabling secure, predictable interaction. This unlocks the next stage: a transaction layer driven by electronic bills of lading (eBLs) and digital trade documents. Once interoperability is achieved, operational truth leads directly to commercial consequences.
This layer is where operational events create economic value, triggering payments, transferring title, shifting liability, activating insurance, and influencing financing. Historically paper-based and retrospective, it is now beginning to digitise.
In 2023, DCSA member carriers committed to achieving 100% eBL adoption by 2030, signalling a shift towards transferable digital instruments that facilitate event-driven settlement: verified delivery can release payment, validated condition data can trigger insurance, and trusted ESG metrics can influence trade finance. Emerging platforms are developing the infrastructure to support legally equivalent eBLs across networks, integrating blockchain and traditional systems through the previously mentioned thin interoperability layer.
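Event-driven settlement is, at its core, a rule table mapping trusted events to commercial actions. The sketch below illustrates the idea only; the function names are hypothetical placeholders for integrations with payment, insurance, and trade finance systems, and the trust statuses are assumed to be earned in the lower layers of the stack:

```python
def release_payment(shipment):
    # Placeholder for a payment-system integration.
    return f"payment released for {shipment}"

def open_insurance_claim(shipment):
    # Placeholder for an insurance integration.
    return f"insurance claim opened for {shipment}"

# Only events that carry a verified/validated trust status
# are allowed to trigger a commercial consequence.
RULES = {
    ("DELIVERED", "verified"): release_payment,
    ("TEMP_EXCURSION", "validated"): open_insurance_claim,
}

def on_event(shipment, event_type, trust_status):
    """Dispatch a commercial action for a trusted event,
    or do nothing if the event is untrusted or unmapped."""
    action = RULES.get((event_type, trust_status))
    if action is None:
        return f"no action for {event_type}/{trust_status}"
    return action(shipment)

print(on_event("SHIP-001", "DELIVERED", "verified"))
print(on_event("SHIP-001", "DELIVERED", "unverified"))
```

The logic is trivial; what is hard, and what the earlier layers exist to provide, is the confidence that "DELIVERED, verified" actually means the cargo was delivered.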
Adoption is still in its early stages, and scaling depends more on behavioural, legal, and commercial alignment than on technology. The real turning point will come when financial transactions are carried out directly from trusted events. When this happens, our industry will finally reach Stage 3 on the digitalisation curve, where AI can operate on programmable, reliable infrastructure.
AI does not create; it consumes. If the underlying layers are inconsistent, it is simply garbage in, garbage out. That is why so many grand promises about revolutionary AI in our industry ring hollow. Without trusted standards, reconciled event data, and programmable commercial infrastructure, the intelligence has nothing solid to anchor to. You cannot sweeten structural failure with decoration.
The organisations that will survive and thrive are those that understand their processes in detail, digitise them properly, and then build connected digital ecosystems with their partners and stakeholders. Only once that foundation is in place can advanced technologies be applied in a deep and meaningful way.
Build the cake properly. Then, and only then, apply the icing.