We recently talked about the renaissance of distributed computing as a consequence of the center of gravity in the IT world shifting back to a more distributed landscape.
After a decade of dominance by centralized data centers and hyper-scale cloud providers, more and more workloads are again being processed at the network edge – and this trend is expected to continue. Analyst firm IDC predicts that 40 percent of enterprises will double their edge spending by 2022.
There are several advantages to processing data close to where it is generated. Firstly, it allows for greater agility, as decisions can be made faster.
You don’t need huge bandwidth to move large volumes of data to the cloud or core data center for analysis, because the computing power is available locally. And when capabilities such as analytics move closer to the sensors and devices that generate the raw data, the result is near real-time feedback and faster actionable insights.
However, the true value of this data is realized when it is made available companywide to a business’ IT systems – whether these are running on-premises or in the cloud. Combining edge information collected by a company’s sensors, cameras and connected ‘things’ with enterprise data such as customer and market information takes analytics to new levels – making it possible to rapidly understand how an entire value chain is performing.
Immediate access to this information gives businesses a competitive advantage in fast-paced and challenging economic and trading environments.
However, this requires getting two entirely different systems to talk to each other. On the factory floor, industrial control systems (ICS) manage operational technology (OT) – the embedded hardware or software that monitors or controls physical devices or processes.
OT has evolved entirely separately from IT – the software, hardware, communications technologies and services that generate data for enterprise use.
Consequently, IT systems simply don’t understand many OT protocols – such as LoRaWAN, NB-IoT, Sigfox, Modbus or OPC Unified Architecture, to name just a few – that help make up what’s generically known as the Internet of Things.
An answer to this is to deploy a Fujitsu INTELLIEDGE device to translate communications between the operational technology of the ICS and an IT infrastructure. By seamlessly connecting the collected data to the enterprise cloud, analytics, AI and machine learning, INTELLIEDGE enables businesses to unlock the value of all data collected from OT ‘things’.
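To make the translation step concrete, the sketch below shows the kind of conversion such a gateway performs – parsing a raw Modbus TCP “read holding registers” response from an industrial device and re-publishing the values as a JSON document an IT system can consume. This is an illustrative example only, not Fujitsu’s actual implementation; the device name (`press-07`) and the field mapping (temperature, rpm) are invented for the sketch.

```python
import json
import struct

def parse_modbus_response(frame: bytes) -> list[int]:
    """Parse a Modbus TCP 'read holding registers' (function 0x03) response
    frame into a list of 16-bit register values."""
    # MBAP header: transaction id, protocol id, length (big-endian u16 each),
    # then unit id and function code (one byte each).
    tx_id, proto_id, length, unit_id, func = struct.unpack(">HHHBB", frame[:8])
    if func != 0x03:
        raise ValueError(f"unexpected function code: {func:#x}")
    byte_count = frame[8]
    # Each holding register is a big-endian unsigned 16-bit word.
    return [v for (v,) in struct.iter_unpack(">H", frame[9:9 + byte_count])]

def to_json_payload(device_id: str, registers: list[int]) -> str:
    """Re-publish raw register values as JSON for enterprise IT consumption.
    The field names and scaling here are illustrative assumptions."""
    return json.dumps({
        "device": device_id,
        "temperature_c": registers[0] / 10.0,  # assume fixed-point 0.1 °C units
        "rpm": registers[1],
    })

# Example frame: two registers (415 -> 41.5 °C, 1200 rpm).
frame = struct.pack(">HHHBBB", 1, 0, 7, 1, 0x03, 4) + struct.pack(">HH", 415, 1200)
print(to_json_payload("press-07", parse_modbus_response(frame)))
```

In practice a gateway appliance would read these frames off the wire continuously and forward the resulting JSON to a message broker or cloud endpoint; the point of the sketch is simply that the binary OT wire format and the IT-friendly representation are entirely different encodings of the same data.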
A further challenge inherent in broadly scattered, distributed networks is keeping data under control. It’s easy to get stuck in functional silos. However, to prevent data loss, fulfill compliance requirements and support business analytics, data must be fully visible and accessible across the organization.
To help achieve this, Fujitsu has formed a strategic partnership with NetApp that allows customers to build an enterprise-wide data fabric for central management and control of all data, regardless of its location. The underlying software connects isolated data resources into a single data fabric, where each element is available to any application across the entire distributed infrastructure.
The Data Fabric is powered by NetApp’s ONTAP data management software. This connects Fujitsu PRIMERGY and INTELLIEDGE servers located at the network edge with resources located in the core and/or cloud, enabling organizations to move information seamlessly from where it is held to where it is needed.
A typical installation that fully integrates all edge OT data with enterprise IT might include a FUJITSU INTELLIEDGE appliance at the edge – connected to a storage device and an enterprise public/private cloud – all seamlessly linked via the Data Fabric.
Contact us to talk about how we can help you connect your OT data with the enterprise to open up a new world of business models and value.