Schneider Electric, a leader in the Digital Transformation of energy management and automation, has joined other industrial leaders and pioneers to form UniversalAutomation.org (UAO), an independent, not-for-profit association managing the reference implementation of a shared source runtime. For the first time, IT and OT software vendors, industrial end-users, OEMs and academics will share a common automation software layer across their automation technology — regardless of brand.
UniversalAutomation.org will drive the development of a vendor-agnostic ecosystem of portable and interoperable ‘plug and produce’ software that can run on almost any hardware. By decoupling software and hardware, sharing a reference runtime implementation of the IEC 61499 standard and merging the Information Technology (IT) and Operational Technology (OT) worlds, the organisation seeks to create an entirely new category of industrial automation and unleash the full potential of Industry 4.0.
“UniversalAutomation.org is the beginning of a new era for industrial automation,” said Dr Barbara Frei, Executive Vice President Industrial Automation at Schneider Electric. “Current architectures have done a great job of advancing industry to where we are now, but to reach next-generation sustainability, innovation and agility, we must embrace portable and interoperable software. Doing so requires a reimagining of our current systems and processes, and collaboration on a new scale, which the UniversalAutomation.org members have agreed to do.”
Organisations involved in UniversalAutomation.org include: Aalto University, Advantech, Asus, Belden, Cargill, eaw Relaistechnik GmbH, ESA, ETP, Flexbridge, Georgia-Pacific, GR3N, Hirschmann, HTW Berlin, Intel, Jetter, Johannes Kepler University Linz, Kongsberg Maritime, Luleå Technological University, Lumberg Automation, Phoenix Contact, ProSoft, R. Stahl, Shell, VP Process, Wilo, Wood and Yokogawa. Others are expected to join soon, with the organisation actively recruiting new members.
Members will work together to develop and adopt the next generation of universal automation solutions by collectively evolving the runtime under shared source principles. Members will have access to, and the ability to shape, the next generation of automation. All entities looking to help advance industrial automation are encouraged to join.
“Industrial operations are undergoing a total transformation,” said John Conway, President of UniversalAutomation.org. “As the IT sector has proven, advances in Machine Learning, Augmented Reality, real-time analytics and the IoT hold great promise for step-change advances in performance, agility and sustainability. However, within industry, this promise is being held back by closed, proprietary automation platforms that restrict widespread adoption, hamper innovation, are difficult to integrate with third-party components, and are expensive to upgrade and maintain. Using a shared runtime changes all that.”
Greg Boucaud, Chief Marketing Officer of UniversalAutomation.org, said: “UniversalAutomation.org is hitting the reset button on automation technology. Using the IEC 61499 standard for distributed systems, we can create a new, open industrial environment that will lead to a more sustainable, efficient future. UniversalAutomation.org will remove the barriers to innovation in automation by decoupling hardware and software and providing end-users with the freedom they have been asking for — to easily integrate different technologies regardless of where they came from and fully optimise their automation systems.”
Intelligent Data Centres spoke to industry experts to hear their predictions on how automation is set to shape the data centre space over the next 12 months.
Herman Chan, President, Sunbird Software
To see how automation will shape the data centre industry over the next 12 months, we only have to look at what the most sophisticated data centre managers are doing today. Modern data centre managers are already saving time and money and improving data accuracy through integration and automation. As their success stories continue to spread, their practices will increasingly be adopted by others.
Workday leverages its Data Center Infrastructure Management (DCIM) software’s bidirectional RESTful web service API to integrate systems and automate most of its data centre operations. Its automation use cases include provisioning and orchestration, VM data management, device state tracking and even parts management.
“DCIM really is for us a source of truth for the tens of thousands of bare metal assets that we have in our global data centres,” said Tim Putney, Workday.
MacStadium uses the same API to automate back-office processing and eliminate manual data entry and human error. It has integrated DCIM software with its billing platform, customer portal, administration system and accounting system to easily manage a high volume of customer orders.
“Using the API, we’re able to poll our existing racks, see where we have space available and assign that space automatically to a customer order via our website,” said Robert Perkins, MacStadium.
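The integration pattern described above can be reproduced against any DCIM platform that exposes a RESTful API. The sketch below is illustrative only: the base URL, endpoint paths, field names and authentication scheme are hypothetical placeholders invented for this example, not the actual API of Sunbird or any other DCIM product.

```python
# Illustrative sketch of DCIM-driven space assignment for a customer order.
# All endpoints and field names below are hypothetical placeholders.
import requests

DCIM_URL = "https://dcim.example.com/api/v1"        # hypothetical base URL
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}   # hypothetical auth scheme


def find_available_rack(required_ru: int) -> dict | None:
    """Poll racks and return the first one with enough free rack units."""
    resp = requests.get(f"{DCIM_URL}/racks", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    for rack in resp.json():
        if rack.get("free_rack_units", 0) >= required_ru:
            return rack
    return None


def reserve_space(rack_id: str, order_id: str, required_ru: int) -> None:
    """Assign free space in a rack to a customer order."""
    payload = {"rack_id": rack_id, "order_id": order_id, "rack_units": required_ru}
    resp = requests.post(f"{DCIM_URL}/reservations", json=payload,
                         headers=HEADERS, timeout=30)
    resp.raise_for_status()


if __name__ == "__main__":
    rack = find_available_rack(required_ru=4)
    if rack:
        reserve_space(rack["id"], order_id="ORD-1234", required_ru=4)
```

Wiring the same calls into a customer portal or billing platform is what removes the manual data entry MacStadium describes.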
Leading data centre professionals at Comcast and eBay are leveraging their DCIM tool’s ability to automate rack power planning with highly accurate power budget profiles, calculated for each device instance based on how it is used in its environment. With this automation, they have improved power capacity utilisation and eliminated stranded power while removing manual effort and risk.
“From an RoI perspective, [Auto Power Budget] is massive for us,” said Michael Piers, Comcast. “We’re getting 40% more usage out of our facilities and power sources.”
The same feature enables eBay to deploy projects with 33% fewer cabinets, saving it around US$10,000 for each server-ready cabinet it can avoid deploying. In one example, it saved US$120,000 in a single project.
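The mechanics behind those savings are simple capacity arithmetic: planning against measured per-device power budgets rather than nameplate ratings lets more devices fit within each cabinet’s power envelope. The toy calculation below illustrates the idea; every figure in it is invented for illustration and is not Comcast’s or eBay’s actual data.

```python
# Toy illustration of rack power planning with measured power budgets.
# All numbers are made up for illustration; they are not vendor data.
import math

RACK_POWER_KW = 8.0            # assumed usable power budget per cabinet
SERVERS_TO_DEPLOY = 600

nameplate_kw_per_server = 0.80   # faceplate rating used in naive planning
measured_kw_per_server = 0.55    # per-instance budget from DCIM telemetry


def cabinets_needed(kw_per_server: float) -> int:
    servers_per_rack = math.floor(RACK_POWER_KW / kw_per_server)
    return math.ceil(SERVERS_TO_DEPLOY / servers_per_rack)


naive = cabinets_needed(nameplate_kw_per_server)     # plan on nameplate ratings
informed = cabinets_needed(measured_kw_per_server)   # plan on measured budgets
saved = naive - informed

print(f"Nameplate plan: {naive} cabinets, measured plan: {informed} cabinets")
print(f"Cabinets avoided: {saved} (~US${saved * 10_000:,} at US$10,000 per cabinet)")
```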
And that’s just the tip of the iceberg. The best data centre managers in the world are also automating management reports, rack PDU configuration backups, request and work order emails, thresholds and alerts on power loads, three-phase balance, circuit breaker state change, temperature and humidity, and more.
“What we’re trying to do as an organisation is to get out of the data world and into the information world,” said Raymond Parpart, The University of Chicago. To achieve this, the university is making DCIM software ‘the centre of the universe for the data centre’.
This is a story we are hearing more often each day, and the trend will only continue over the next 12 months.
All data centre managers need to increase operational efficiency and improve data accuracy. The best managers have already found the solution and it’s catching on quickly.
Sanjay Kumar Sainani, Global SVP of Business Development & Global CTO for the Digital Power Business at Huawei Technologies
Prefabrication, modularisation and standardisation have become the trend in the data centre industry. Prefabrication can greatly shorten the time to market (TTM) of data centres. Modularisation gives data centres the flexibility of on-demand deployment and capacity expansion. Standardisation ensures high-quality data centre delivery with fast construction. In recent years, with the development of digital and AI technologies, data centres have been evolving towards automation in design, production, delivery, operation and O&M. The details are as follows:
1. Achieve the effect of ‘What you see is what you get’. BIM (Building Information Modelling) is used in the design phase to automatically run clash detection, identify pipeline interference in advance and improve design quality. The data shows that design-change workload can be reduced by 80% using 3D BIM design.
2. Achieve the effect of ‘What we design is what you get’. Through digital production, i.e. the use of 3D digital twin models, precision manufacturing can achieve optimal air leakage rates, tightness and dimensional tolerances.
3. Fast delivery. In the delivery phase, prefabrication and testing of the integrated power supply and distribution, cooling and fire extinguishing systems in the container are completed in the factory under the prefabricated modular construction mode. Through accurate calculation and analysis, the Lego-like assembly of the containers is completed onsite during delivery. Take Asiacell, the largest mobile operator in Iraq, as an example: construction was fast and services could be rolled out within just 10 months.
4. The DC status is optimal in real time. In data centre operations, consider the temperature control system as an example. The temperature control system of a traditional large data centre is complex, comprising seven subsystems, such as the cooling tower and water chiller, with more than 60 parameters that need to be adjusted. It is difficult to achieve global optimisation using human expertise and experience alone. A deep neural network AI algorithm is instead used to adjust the temperature control system, implementing global data centre optimisation online, 24x7x365, so that PUE (Power Usage Effectiveness) remains optimal in real time. In actual applications, China Unicom’s Zhongyuan data base in Henan province uses AI energy efficiency technologies to reduce the PUE from 1.54 to 1.35, saving more than 8 million kWh annually.
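A common way to build this kind of optimiser is to train a neural-network surrogate that predicts PUE from the temperature-control settings, then search for the settings with the lowest predicted PUE. The sketch below is a minimal illustration of that idea using synthetic data in place of real sensor history; it is not Huawei’s actual algorithm, and the parameter count and data are assumptions.

```python
# Minimal sketch: fit a neural-network surrogate of PUE as a function of
# temperature-control settings, then search for low-PUE settings.
# Synthetic data stands in for real operating history.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

N_PARAMS = 60   # e.g. setpoints, pump/fan speeds, valve positions (normalised 0..1)
history_settings = rng.uniform(0.0, 1.0, size=(5000, N_PARAMS))
true_weights = rng.normal(size=N_PARAMS)
# Synthetic "true" PUE: a noisy function of the control settings.
history_pue = (1.3 + 0.2 * np.abs(history_settings @ true_weights) / N_PARAMS ** 0.5
               + rng.normal(0, 0.01, size=5000))

# 1. Fit the surrogate model on historical operating data.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
surrogate.fit(history_settings, history_pue)

# 2. Search candidate settings and pick the one with the lowest predicted PUE.
candidates = rng.uniform(0.0, 1.0, size=(20000, N_PARAMS))
predicted = surrogate.predict(candidates)
best_settings = candidates[np.argmin(predicted)]

print(f"Baseline mean PUE:  {history_pue.mean():.3f}")
print(f"Best predicted PUE: {predicted.min():.3f}")
# In a real deployment, recommended settings would be checked against safety
# limits and pushed to the temperature control system continuously.
```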
5. Enable autonomous driving in the data centre. According to Uptime’s latest user report, failures caused by the data centre power supply system account for 43% of all failures. AI technologies can effectively reduce power supply system failures in the data centre. For example, intelligent algorithms can be used to predict the service life of vulnerable components, such as fans and capacitors, and abnormal noises can be collected to determine device running status. In addition, AI inspection robots perform local inspections, collecting images, temperature and sound, and feeding cloud-based analysis functions.
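One simple way to realise the abnormal-noise idea is to train an anomaly detector on acoustic and vibration features from healthy equipment and flag readings that drift away from that baseline. The sketch below uses an isolation forest on synthetic fan data purely as an illustration of the technique; the features, numbers and model choice are assumptions, not a vendor’s implementation.

```python
# Minimal sketch: flag fans whose acoustic/vibration signature deviates from
# normal behaviour. Synthetic data is used purely for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Features per fan reading: [rms_vibration, dominant_frequency_hz, sound_level_db]
normal = rng.normal(loc=[0.2, 120.0, 55.0], scale=[0.02, 2.0, 1.0], size=(2000, 3))
detector = IsolationForest(contamination=0.01, random_state=1).fit(normal)

# New readings: mostly healthy, plus one simulated degraded fan
# (higher vibration, shifted dominant frequency, louder).
new_readings = np.vstack([
    rng.normal(loc=[0.2, 120.0, 55.0], scale=[0.02, 2.0, 1.0], size=(10, 3)),
    [[0.45, 95.0, 63.0]],
])
flags = detector.predict(new_readings)   # -1 = anomalous, 1 = normal

for i, flag in enumerate(flags):
    if flag == -1:
        print(f"Reading {i}: abnormal signature, schedule inspection/replacement")
```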
Steve Hone, CEO, Data Centre Alliance (with special acknowledgement to Albert Adhoot, Colocation America)
Before the rise of cloud computing and the consequent proliferation of remote compute and storage assets, data centres were relatively straightforward systems that could be staffed by just a handful of qualified, well-trained and experienced professionals. However, the emergence of new — and generally more complex — offerings in the cloud computing space (SaaS, PaaS, IaaS, etc.) has transformed the typical data centre into a high-tech, mission-critical clearinghouse for a variety of critical corporate workloads. The challenge of servicing this new demand is not only being felt by hyperscalers such as Amazon AWS, Microsoft Azure and Google GCP, but is having an impact right across the hosting/provider community. As the entire sector scrambles to keep up, demand for IT professionals with the requisite skillsets to manage these increasingly complex and mission-critical environments has skyrocketed.
Unfortunately, the number of qualified candidates for these positions still seems to be decreasing rather than increasing. As such, data centre management teams are facing a severe staffing shortage that, if unaddressed, may one day threaten a company’s ability to adequately maintain its own digital assets, or a provider’s ability to maintain its service offering. Given that, as a society, we are being driven towards what appears to be a fully automated and digitally dependent world, this is not great news. Most corporate stakeholders are fighting tooth and nail to hang on to the talent they still have in what has become a very competitive marketplace; however, this will only work for so long before the cracks become too big to paper over.
Maybe in order to keep pace with the growing demand being placed on data centres, we need to let go of the past and start looking to the future with a more hybrid approach, which not only works collectively to attract new talent into the sector but also invests in solutions that allow data centres to thrive in the absence of extensive human oversight.
Thankfully, Machine Learning (ML) technology offers a solution, assisting with a range of server functions without automating IT management entirely. ML platforms combined with advanced heuristics can autonomously perform routine tasks like system updates, security patching and file backups, while leaving more nuanced, qualitative tasks to IT personnel. Without the burden of handling each and every user request or incident alert, IT professionals can assume oversight roles for tasks that previously required their painstaking attention, freeing up more time to focus on the more holistic management challenges.
For both individual companies and third-party data centre providers, this partnership-based approach may well provide a happy medium, stopping short of outright automation while helping to alleviate, rather than completely solve, the chronic understaffing issues the sector is soon to face. If fully embraced, this ‘hybrid’ management model could well become the norm throughout the data centre industry. Machines are not going to replace human workers — well, at least not anytime soon — but what ML can do is help overworked IT teams do everything required to keep a data centre running smoothly.