Decentralized computing is no longer the appropriate strategy for global logistics.

Understanding the limitations of our current technology environment is critical to appreciating the fundamental difference the Cloud represents.  We cannot have a meaningful dialogue about throwing out the billions of dollars in technology investment made over the past 20 years unless we can measure the benefits of embracing a new technology in its place.

There has been a wonderful amount of discussion over the past year about Digitalization, and we fully endorse it; it is about time this industry discovered the opportunity that Cloud-based technology presents.  No other industry in the world will be impacted as much by Cloud-based technology as Global Transportation and Distribution.  This industry is enormously challenged by geography, and the Cloud was developed to overcome the issues of distance.  So it is important to take a minute to understand where we are today, and the limitations our current technology faces, in order to appreciate what we will gain through Cloud-based technology and its ability to digitize this industry.

Everyone around the world has come to appreciate the power of centralized computing; the concept of the mainframe goes back to IBM's System/360 in the 1960s.  Clearly, it is always better if you can get everyone to work together on the same computer.  It creates an assembly-line process for the service sector.

However, the biggest limitation to this vision has been the technology of the interface.  We have all experienced a slow system when working on a company server.  That "slowness" is not coming from the server; it is coming from the time it takes to handle the interface.  The old "green screens" and traditional GUI interfaces are "heavy" and do not transmit well over distance.  The problem, in other words, was telecommunications, not the computer hardware.

When global companies such as the ocean carriers and global forwarders found that they could not successfully implement their mainframe systems on a global basis, they turned to an alternative approach: decentralized computing.  This was the next best thing: get everyone within a region working together on a single system, then try to feed data via EDI between the multiple systems within the network.

Decentralized computing was known to have limitations from the start, but it was a "best case" alternative to having no viable global system at all.  It at least allowed users within an office to work together, and that was certainly better than nothing.  But the standards and controls that allowed many people to work well together applied only to the local system.  This meant that interaction with other systems within the same company was hugely problematic, because no global set of standards was, or could be, applied.

This creates the big blockages we have in the industry today.  Despite many promises to enable visibility and interactivity, it simply cannot happen reliably, because the carriers do not work on the same computer hardware in all of their offices.  It is true that they have the same software in most or all of their offices, but that is not the same thing as working in the same software on the same server.

One glaring example of decentralized computing's pitfalls is its failure to enforce a global standard for naming customers.  This is critical for automating things like sending out tracking information, giving a customer visibility to all of its shipments at the same time, or auto-rating.  When users call the same customer by a different name (that is, a different customer code), the system cannot do anything automatically.  It can only react to the customer codes it has been told to treat as triggers, so if someone enters the same customer under a different code, the system does not know how to respond.

It is already problematic enough to get users on the same hardware to name the same customer by the same name.  But when users are not even working on the same server, all of the controls put in place to help them avoid creating duplicate codes cannot work.
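A minimal sketch can make the duplicate-code problem concrete.  The customer codes, names, and shipment records below are hypothetical, not any carrier's actual data; the point is only that automation keys on the code, so two regional systems that code the same customer differently make that customer's shipments invisible to each other:

```python
# Two regional systems, each with its own customer master file.
# The same customer ("Acme Global Logistics") was coded differently
# in each region -- no shared server, so no shared uniqueness check.
region_us = {"ACME01": "Acme Global Logistics"}
region_eu = {"ACMEGL": "Acme Global Logistics"}

def shipments_for(code, shipments):
    """Automation triggers on the customer code, not the name."""
    return [s for s in shipments if s["customer_code"] == code]

shipments = [
    {"id": "SHP-1", "customer_code": "ACME01"},  # booked in the US
    {"id": "SHP-2", "customer_code": "ACMEGL"},  # booked in the EU
]

# A tracking subscription keyed to the US code silently misses the
# EU shipment, even though both belong to the same customer.
visible = shipments_for("ACME01", shipments)
print([s["id"] for s in visible])  # only SHP-1; SHP-2 is invisible
```

On a single shared system, the customer master is one table, so a duplicate code can be rejected at the moment of entry; across separate servers there is no single place to perform that check.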

This basic inability to enforce standards is at the heart of what keeps carriers and forwarders from supplying their customers with the services and data they are looking for in a consistent and meaningful way.

This is the core reason why carriers consistently mis-rate shipments and then fail to correct the mistakes quickly, why a carrier can successfully send an EDI 315 tracking message on one shipment and then miss the next several, and why carriers transmit the Actual Time of Arrival (ATA) as one date but then show a different date on their web site.
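The ATA discrepancy is easy to detect once both sources sit in one place.  A small sketch, using invented shipment references and dates, of the kind of consistency check that a single shared system makes trivial and separate regional systems make impractical:

```python
from datetime import date

# Hypothetical records: the ATA a carrier transmitted in a status
# message vs. the ATA shown on its own web site, per shipment.
edi_feed = {"SHP-1": date(2023, 5, 1), "SHP-2": date(2023, 5, 3)}
website  = {"SHP-1": date(2023, 5, 1), "SHP-2": date(2023, 5, 4)}

# Shipments where the two sources disagree on the arrival date.
mismatches = {
    ref: (edi_feed[ref], website[ref])
    for ref in edi_feed
    if ref in website and edi_feed[ref] != website[ref]
}
print(mismatches)  # SHP-2 disagrees between the two sources
```

When the EDI feed and the web site are generated from different regional servers, each copy of the date can be "correct" on its own system, and no one system is positioned to run this comparison.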

So we can see that Digitalization simply cannot happen in a world that relies on decentralized computing.  There are no global standards among systems, so decentralized computing as a global strategy can only go so far.  It may have been the best strategy for its time, but times have changed, and we now have technology that resolves the distance problem.  Cloud technology has brought the opportunity to get everyone around the world working on the same hardware and in the same system.

Next we will talk about designing a system where people in different countries around the world can work together to provide the same service to customers who are also geographically separated.
