
Big Data & Logistics: 7 Current Trends to Watch


“Hey, ‘Big Data’ is just a big fuzzy word for me,” said the Vice President of an Innovation Center at a major logistics company back in early 2015. Just one year later, he not only admits that big data is the next big revolution, but has already applied big data technology to dramatic effect, significantly growing the business and reducing costs. In fact, he has been so successful that he has been given funding for a new Machine Learning department!

UPS’ $1 billion investment in big data is a wake-up call for logistics companies around the globe to get serious about becoming data-driven or risk being left behind. The gap being opened by those quick to embrace big data is growing rapidly.

Many prototype projects have been initiated by logistics companies to exploit big data analysis, and a few remarkable ones will be part of our daily lives very soon. These include real-time analysis using Spark on Hadoop to assess large volumes of data stored in register logs, Excel files, databases, or HDFS, which has changed the business dynamics completely. Here are several big data projects related to the logistics sector:

Volume Analysis

Predicting parcel volume for a particular day of the week, month, or year has always been a major concern for logistics companies looking to optimize resource allocation and budget. Many logistics companies are investing in this area to determine the patterns that help predict peak volumes. This is an ideal use case where data scientists run batch analysis to generate recommendations, as sketched below.
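As a rough illustration, a minimal PySpark batch job for this kind of volume profiling might look like the following. The dataset path and column names (ship_date, total_parcels) are hypothetical, not a reference to any specific company's data.

```python
# Hypothetical batch job: derive parcel volume per weekday and month from
# historical shipment records, to highlight recurring peaks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("volume-analysis").getOrCreate()

# Historical shipment records stored on HDFS (path is illustrative)
shipments = spark.read.parquet("hdfs:///warehouse/shipments")

volume_profile = (shipments
    .withColumn("weekday", F.dayofweek("ship_date"))
    .withColumn("month", F.month("ship_date"))
    .groupBy("month", "weekday")
    .agg(F.count("*").alias("total_parcels"))
    .orderBy(F.desc("total_parcels")))

# The top rows indicate the month/weekday combinations with peak volumes.
volume_profile.show(10)
```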

Parcel Health Data

It is crucial for medicines in particular, and for some other commodities in general, to be transported in a controlled environment. For example, some medicines must be stored between 2 and 8 degrees Celsius. Some equipment is fragile and requires extra handling care. Managing the entire process is very expensive for logistics companies and for end customers, so companies are investing to find alternate routes that ensure the safest delivery within the required parameters.

Researchers and data scientists are deploying IoT sensors to monitor temperature, shock levels, and other factors on parcels. Offline analysis of this data is used to define the safest and most economical passage for “fragile” commodities, along the lines of the sketch below.
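A minimal sketch of such an offline analysis might rank routes by how often sensor readings leave the 2-8 degrees Celsius band. The dataset and column names (route_id, temp_c, shock_g) are assumptions for illustration.

```python
# Hypothetical offline analysis of IoT sensor readings: for each route, compute
# how often refrigerated parcels left the 2-8 degrees Celsius band.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("parcel-health").getOrCreate()

readings = spark.read.json("hdfs:///iot/parcel_sensors")  # illustrative path

# 1 when a reading is outside the required temperature band, else 0
out_of_range = F.when((F.col("temp_c") < 2) | (F.col("temp_c") > 8), 1).otherwise(0)

route_risk = (readings
    .withColumn("violation", out_of_range)
    .groupBy("route_id")
    .agg(F.avg("violation").alias("violation_rate"),
         F.max("shock_g").alias("max_shock")))

# Routes with the lowest violation_rate are candidates for fragile/cold-chain traffic.
route_risk.orderBy("violation_rate").show()
```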

Routing Economy

Should our company operate its own planes to transport parcels, or should we leverage existing infrastructure? Which provider has better facilities, routing paths, costs, and transparency? Data scientists are working on prototypes to answer these types of questions. They use big data analysis to study massive amounts of parcel data and predict which routes are most reliable, cost-effective, and viable for future growth. The outcome for management is accurate data for decisions such as which airlines to choose or which warehouse services best match their needs, as in the sketch below.
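One simple way to produce that kind of decision data is to aggregate historical shipments into per-carrier, per-route reliability and cost figures. The sketch below assumes hypothetical columns (carrier, route, delivered_at, promised_at, cost_eur).

```python
# Hypothetical comparison of carriers and routes: on-time rate and average cost
# per parcel, aggregated from historical shipment records.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("routing-economy").getOrCreate()

parcels = spark.read.parquet("hdfs:///warehouse/parcel_history")  # illustrative

carrier_stats = (parcels
    .withColumn("on_time",
                (F.col("delivered_at") <= F.col("promised_at")).cast("int"))
    .groupBy("carrier", "route")
    .agg(F.avg("on_time").alias("on_time_rate"),
         F.avg("cost_eur").alias("avg_cost"),
         F.count("*").alias("shipments")))

# Management can sort by reliability first, then cost, to pick carriers per route.
carrier_stats.orderBy(F.desc("on_time_rate"), "avg_cost").show()
```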

Transparency for Management

Real-time risk analysis and resolution are high on the wish list of all product owners. Whether it is political unrest, a major union strike in a particular region, or even just a vehicle breakdown, management always wants a clear view of the problem and potential remedies.

One strategic project currently in the prototyping stage at some logistics firms enables management to proactively look for problems and receive system-generated suggestions for alternate routes or strategies. The challenge is dealing with the enormous volume and variety of unstructured data, coming from everything from social media to IoT sensors on transportation vehicles, in order to both assess risk and define resolutions.

Apache Spark on Hadoop is a natural choice for this type of real-time data processing, enabling management to anticipate and handle crises before they escalate. Spark's ability to analyze data from different streams makes it well suited to predicting events and proposing alternatives, as in the streaming sketch below.
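To make this concrete, here is a minimal Spark Structured Streaming sketch. The Kafka broker address, topic name, and event fields (region, alert_type, event_time) are hypothetical; the point is only to show streaming events being windowed and counted so that operations can spot emerging disruptions.

```python
# Minimal Structured Streaming sketch: consume vehicle IoT alerts from a
# (hypothetical) Kafka topic and count breakdowns per region per minute.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("disruption-monitor").getOrCreate()

schema = (StructType()
          .add("region", StringType())
          .add("alert_type", StringType())
          .add("event_time", TimestampType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # illustrative broker
       .option("subscribe", "vehicle-alerts")              # illustrative topic
       .load())

alerts = (raw
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
    .filter(F.col("alert_type") == "breakdown")
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "1 minute"), "region")
    .count())

# Console sink for demonstration; a real deployment would alert operations staff.
query = alerts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```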

Transparency for End Client 

Customers are very conscious of parcel delivery times, and predicting exactly when parcels will arrive is a nightmare for logistics companies. Timing depends on a range of variables: the number of packages, traffic conditions, vehicle reliability, even driver health. Millions of packages are delivered daily. Only through the analysis of large volumes of data, such as vehicle health, driver efficiency, and current traffic conditions, can an approximate time window be provided for when a package will be delivered.

These pilot projects, currently in the prototype stage, require real-time data analysis. With real-time data analysis tools, logistics companies will be able to provide highly accurate information regarding package status with complete transparency to their clients.
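As a toy illustration of the idea of a delivery "time window" rather than a single timestamp, the function below combines a few of the variables mentioned above. The factors, weights, and bounds are purely illustrative assumptions, not a production model.

```python
# Toy sketch of an ETA window calculation; all weights are illustrative.
from datetime import datetime, timedelta

def estimate_delivery_window(stops_remaining: int,
                             avg_minutes_per_stop: float,
                             traffic_factor: float,
                             now: datetime):
    """Return an (earliest, latest) delivery window for the next parcel."""
    base = stops_remaining * avg_minutes_per_stop * traffic_factor
    earliest = now + timedelta(minutes=base * 0.85)   # optimistic bound
    latest = now + timedelta(minutes=base * 1.25)     # pessimistic bound
    return earliest, latest

# Example: 12 stops left, ~4 minutes per stop, heavy traffic (factor 1.4)
earliest, latest = estimate_delivery_window(12, 4.0, 1.4, datetime.now())
print(f"Expected between {earliest:%H:%M} and {latest:%H:%M}")
```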

Campaigns and Web Analytics

One current use of big data at many logistics companies is the analysis of web traffic. A leading logistics company in Germany, for example, is using big data analysis to provide more personalized services to web visitors. Based on each visitor's individual interests, different campaigns and services can be offered automatically, along the lines of the sketch below.
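A minimal version of this kind of segmentation could assign each visitor the campaign matching their most-viewed service category. The clickstream dataset and columns (visitor_id, service_category) are hypothetical.

```python
# Hypothetical clickstream segmentation: pick the campaign that matches each
# visitor's most-viewed service category.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("web-personalization").getOrCreate()

clicks = spark.read.json("hdfs:///web/clickstream")  # illustrative path

views = (clicks.groupBy("visitor_id", "service_category")
               .agg(F.count("*").alias("views")))

top = Window.partitionBy("visitor_id").orderBy(F.desc("views"))

segments = (views.withColumn("rank", F.row_number().over(top))
                 .filter("rank = 1")
                 .withColumnRenamed("service_category", "suggested_campaign"))

segments.show()
```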

HelpLine

Another interesting project aims to streamline and personalize the customer experience on customer service calls. Packages are typically shipped with sender and receiver phone numbers. This information can be tied to the customer service system so that, when calls come in from these numbers, callers are automatically notified about the status of their shipments and the expected arrival time. Expected times are calculated from real-time data analysis of different sources about that location (traffic, volume, etc.). This project is designed to help reduce the overall load on the help center.
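The lookup side of such a system is straightforward; the sketch below assumes a hypothetical active-shipments table with sender_phone, receiver_phone, status, and estimated_arrival columns.

```python
# Hypothetical helpline lookup: given the caller's phone number, fetch the
# latest status of any shipment where the caller is sender or receiver.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("helpline-lookup").getOrCreate()

shipments = spark.read.parquet("hdfs:///warehouse/active_shipments")  # illustrative

def shipments_for_caller(phone: str):
    """Return tracking number, status, and ETA for all parcels tied to a caller."""
    return (shipments
            .filter((F.col("sender_phone") == phone) |
                    (F.col("receiver_phone") == phone))
            .select("tracking_no", "status", "estimated_arrival"))

shipments_for_caller("+49-170-0000000").show()  # placeholder number
```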

The amount of data we have today is far beyond the processing power of conventional systems. Big data allows us to do things that were not possible before; the quantitative shift is now leading to a qualitative shift. With Spark and real-time NoSQL databases such as Cassandra and MongoDB, everything is changing rapidly. Companies adopting big data are racing ahead and closing gaps that could not be addressed before, for example by taking proactive measures before an ‘event’ even occurs. In short, predictive analytics with big data will be the new norm.

