Getting into the Industry 4.0 mind set
Jul 7, 2016, 14:14 IST
Industry 4.0 is coming. The increasing digitisation of manufacturing is already happening, through a combination of technologies including the Internet of Things (IoT), big data, analytics and next-generation applications. The development of smart factories with secure, flexible and scalable manufacturing processes will bring a raft of benefits that will take companies to a new level with their operations. McKinsey Global Institute predicts that the annual economic impact of operations and equipment optimisation through the use of IoT will range between $1.2 trillion and $3.7 trillion in 2025.
Among the new and powerful capabilities that Industry 4.0 promises are the identification of faults before they develop; increased efficiency and productivity; optimised product development and supply chains; and improved health and safety (e.g. by using remote access to take people out of potentially hazardous environments).
Realising these benefits will require a significant shift, however, not only in how organisations use the data they create, but also in how they share that data across traditionally siloed business departments. This is uncharted territory for the manufacturing industry, and whilst the rewards are significant, the challenges are also numerous.
Working together
One fundamental barrier preventing organisations from moving towards an Industry 4.0 model lies in the way they are structured. To create products and get them to customers, manufacturers perform a wide range of activities, which generally take place in a standard set of functional units: research and development (or engineering), IT, manufacturing, logistics, marketing, sales, after-sales service, human resources, procurement, and finance.
Before products became smart and connected, data was generated primarily by internal operations and through transactions across the value chain—order processing, interactions with suppliers, sales interactions, customer service visits, and so on.
The responsibility for defining and analysing data tended to be decentralised within functions and siloed. Though functions shared data (sales data, for example, might be used to manage service parts inventory), they did so on a limited, episodic basis. By combining the data, companies knew something about customers, demand, and costs—but much less about the functioning of products.
Now, for the first time, these traditional sources of data are being supplemented by another source—the product itself. Smart, connected products can generate real-time readings that are unprecedented in their variety and volume.
This data has inherent value of its own in the production cycle, yet its value increases exponentially when it is integrated with other data, such as service histories, inventory locations, commodity prices, and traffic patterns. As the ability to unlock the full value of data becomes a key source of competitive advantage, organisations are looking at ways to break down traditional silos and turn vast quantities of unstructured data into powerful insights.
Occasional cross-departmental collaboration and data-sharing is no longer sufficient. Intense, ongoing coordination becomes necessary across multiple functions, including design, operations, sales, service, and IT.
A unified data lake
From an operational perspective, transformation of this nature should not be underestimated. Lines will blur, and new functions will emerge with the specific aim of making sense of consolidated data-sets for the benefit of the business as a whole.
As this transformation takes place, technology needs to follow suit in a way which enables data assets to be pooled and analysed to deliver valuable insights.
One challenge is that data from smart, connected products, along with related internal and external data, is often unstructured. It may come in an array of formats, such as sensor readings, locations, temperatures, and sales and warranty history. Conventional approaches to data aggregation and analysis, such as spreadsheets and database tables, are ill-suited to managing such a wide variety of formats.
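To illustrate the contrast, here is a minimal sketch in Python (the record shapes, device identifiers and file path are purely hypothetical) of how such heterogeneous records can be landed in their raw form, with a schema applied later at read time rather than forced into a fixed table up front.

```python
import json
import os
from datetime import datetime, timezone

# Hypothetical examples of records a smart, connected product might emit.
# Each record has a different shape, which a fixed spreadsheet or relational
# table schema would struggle to accommodate.
records = [
    {"type": "sensor_reading", "device_id": "press-07", "vibration_mm_s": 4.2,
     "temperature_c": 61.5, "ts": datetime.now(timezone.utc).isoformat()},
    {"type": "location", "device_id": "agv-12", "lat": 51.5072, "lon": -0.1276},
    {"type": "warranty_event", "serial": "SN-99813", "claim": "bearing failure",
     "resolved": False},
]

# In a data-lake approach the raw records are stored as-is (here as
# newline-delimited JSON in an illustrative "landing zone" folder) and
# interpreted only when they are read for analysis.
os.makedirs("landing_zone", exist_ok=True)
with open("landing_zone/events.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")
```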
In this context of upheaval and change, it’s easy to see why an increasing number of organisations see data lakes as one of the key factors in fully realising the benefits of Industry 4.0. Data lakes allow organisations to store huge amounts of unstructured data so that it can be analysed and interrogated.
To truly tap into the power of the data being produced, and to do so in a timely manner, it needs to be consolidated into a single data lake, so that big data and analytics tools can access the full range of information and develop insight. A single data lake can therefore provide visibility of asset tracking, manufacturing operations, resources, products and key customer trends.
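As a sketch of what this consolidation enables, the following PySpark snippet joins raw product telemetry held in the lake with curated service history to flag assets whose readings are drifting towards a known failure pattern. The paths, column names and the vibration threshold are assumptions for illustration, not taken from any specific deployment.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("unified-data-lake").getOrCreate()

# Hypothetical lake locations; in practice these might sit on scale-out NAS,
# HDFS or object storage.
sensor_readings = spark.read.json("/datalake/raw/sensor_readings/")
service_history = spark.read.parquet("/datalake/curated/service_history/")

# Combine real-time telemetry with service records to spot assets whose
# average vibration exceeds an illustrative threshold.
at_risk = (
    sensor_readings
    .groupBy("device_id")
    .agg(F.avg("vibration_mm_s").alias("avg_vibration"))
    .join(service_history, on="device_id", how="left")
    .filter(F.col("avg_vibration") > 5.0)
)

at_risk.show()
```

Because everything is read from the same lake, the same query can be extended to pull in inventory locations, commodity prices or customer data without moving the underlying files between departmental systems.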
It’s also essential to give detailed and accurate descriptions to this shared data in order to provide the appropriate level of context.
There will be new types of internal and external data that manufacturers need to deal with, such as customer sentiment on social media, so understanding their relevance is critical when taking advantage of emerging analytics capabilities.
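One lightweight way to picture this is a catalogue entry that accompanies each shared data-set. The fields and the data-set below are purely illustrative assumptions, not a specific product's metadata model; the point is that every shared data-set carries enough description for other functions to understand its origin, meaning and relevance.

```python
# Illustrative catalogue entry describing an external, unstructured data-set
# so that teams outside the originating function can interpret it correctly.
catalogue_entry = {
    "dataset": "social_sentiment_daily",
    "description": "Daily aggregated customer sentiment scores from public "
                   "social media posts mentioning our product lines.",
    "owner": "marketing-analytics",
    "source": "external (social media export)",
    "refresh": "daily",
    "fields": {
        "product_line": "string",
        "date": "ISO 8601 date",
        "sentiment_score": "float, -1.0 (negative) to 1.0 (positive)",
    },
}
```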
Flexibility is key
Another key consideration is to ensure that your technology infrastructure is flexible enough to cope with the changing demands presented by Industry 4.0.
When building IT infrastructure around a data lake, organisations need to recognise that predicting data growth is difficult. As a result, an elastic, horizontally scalable architecture is needed, enabling organisations to handle additional load by incrementally growing the infrastructure.
Products like EMC’s Isilon scale-out storage are also useful in this context, due to the simple way in which they allow storage capacity to be expanded. This software-defined approach to storage is a key part of a modern datacentre and enables the level of flexibility needed as businesses move towards Industry 4.0.
Common Standards
Scaling up data-lakes will also require common standards across data-lake providers. The recently launched “open-data platform” initiative aims to support this by promoting big data technologies with open-source software. Pivotal, the agile development services company co-owned by EMC, is one of those to have joined this effort.
It’s also worth noting that block storage is often the best solution for dealing with real-time data created by industrial sensors, as it is well suited to the persistent I/O operations this requires.
Of course, the technology infrastructure required by organisations will vary, depending on what they want to achieve. The ‘one size fits all’ approach just won’t work when it comes to Industry 4.0.
As a result, manufacturing companies must spend time developing a strategy that will act as the foundation upon which their capabilities can be built. Once that is in place, it will become clear what needs to be done on the technology front to deliver on the potential of Industry 4.0.
About the author: Amit Mehta is the Country Manager of Isilon Storage Division at EMC India & SAARC