April 19, 2024


We are rapidly entering the era of the new smart machine economy. This is when machines join—not replace—humans as intelligent participants in the software-defined, AI-driven environment.

For this era to flourish, we need to start the second wave of digital transformation. Many companies entered the first wave of digital transformation when they invested in information technology. This first wave gave us the ability to search, shop or conduct business transactions using a browser or mobile device, gave us access to collaboration tools that enable remote work, and enabled many other features and functions. The first wave focused primarily on people using technology to find information, connect with other people, and do things more efficiently.

The second wave does something similar — but for machines. The second wave of digital transformation is the migration of cloud and AI-based application investments from the world of IT (Information Technology) to the world of OT (Operational Technology). It is applied to devices and machines in the physical world around us in many industries such as aerospace, automotive, defense, industrial, medical and telecommunications.

After years of IT-related digital transformation — focusing primarily on the flow of information in the digital ether — the focus has now turned to machines. McKinsey’s Digital Manufacturing Global Expert Survey reveals that most manufacturing companies (68%) consider connectivity, intelligence and flexible automation as their top priority. The global industrial automation market is expected to reach US$326.14 billion by 2027 after a decade-long CAGR of 8.9%, according to Fortune Business Insights.

Unlocking opportunities on the smart edge

Stand-alone devices, such as a heart monitor whose only function was to measure heart rate, are no longer cutting edge. Today’s devices collect and analyze data, communicate with each other, and act on that data.

A heart monitor can transmit a patient’s data to a doctor or trigger a real-time alarm when readings indicate a health risk. Self-driving cars can talk to road infrastructure, sense other vehicles in the area, and act on that information in real time by initiating accident avoidance. AI-based power grids can automatically manage generation and consumption across multiple, distributed energy resources.

To run such connected applications, networks depend heavily on cloud computing, analytics, artificial intelligence and machine learning, and 5G as the connectivity mechanism that ties them together. All of these new opportunities sit at the smart edge.

Edge is a location, not a thing. It defines where data is sensed and processed. The edge of the network sits farthest from the central data center, in or very close to machines such as cars, airplanes or robots. Some of the data from sensors embedded in those machines must be processed in situ, at the edge, making it the smart edge, while other data can be pushed to the cloud for further processing.

Multiple machines and devices operating at the intelligent edge share information with each other and with data centers, forming digital feedback loops. These loops feed big data systems that perform functions such as predictive outage avoidance, event correlation for functional faults in subsystems, software automation and monitoring, and event detection and resolution.
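As a rough illustration of this split, the sketch below simulates a heart-rate monitor that makes the safety-critical decision locally and forwards only periodic summaries upstream for further analysis. The threshold, the simulated sensor and the print-based “uplink” are hypothetical stand-ins for real device and cloud APIs, not any particular product.

```python
import random
import statistics
import time

ALERT_BPM = 120  # hypothetical threshold for a dangerous heart rate


def read_heart_rate() -> int:
    """Simulated sensor read; a real device would sample hardware here."""
    return random.randint(55, 130)


def edge_loop(cycles: int = 10) -> None:
    """Process samples in situ and forward only compact summaries to the cloud."""
    window = []
    for _ in range(cycles):
        bpm = read_heart_rate()
        window.append(bpm)

        # In-situ decision: raise the alarm locally, with no cloud round trip.
        if bpm > ALERT_BPM:
            print(f"LOCAL ALERT: heart rate {bpm} bpm exceeds {ALERT_BPM}")

        # Feedback loop: push a summary upstream for big-data analysis.
        if len(window) == 5:
            summary = {
                "mean_bpm": round(statistics.mean(window), 1),
                "max_bpm": max(window),
                "samples": len(window),
            }
            print(f"UPLINK -> cloud analytics: {summary}")
            window.clear()
        time.sleep(0.1)


if __name__ == "__main__":
    edge_loop()
```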

How to grow into the new smart machine economy

The complexity of intelligent systems means that embedded systems companies must digitally transform to enable the development, deployment, operation and servicing of such systems. To this end, they must adopt tools, capabilities and processes, such as:

Cloud-native and edge-friendly development techniques and tools, which are necessary to keep pace with time to market, system complexity and resource constraints. As computing moves toward the edge, cloud-hosted platforms will need to adapt to become edge-friendly or be rebuilt as edge-native. An edge-native platform retains the capabilities of a cloud platform but also meets the new demands created by the edge. Wind River Studio provides a cloud-native edge platform for developing, deploying, operating and servicing mission-critical intelligent systems. These cloud-native tools also allow developers to work anywhere, at any time and on any device (desktop, remote, PC, tablet, etc.).

High-level software automation. With intelligent systems, deployment at the edge often means that payloads must be deployed at scale across hundreds or even tens of thousands of geographic locations. It is not possible to manually deploy, operate or service apps at the edge. Automation is the key to cost reduction for distributed edge systems, for both appliance and cloud-infrastructure deployment types.
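As a hedged sketch of what that automation can look like, the example below rolls a payload out to a simulated fleet in batches, checks health after each batch, and halts if anything fails. The site list, batch size and deploy function are illustrative placeholders rather than any particular fleet-management API.

```python
import concurrent.futures
import random

SITES = [f"site-{n:04d}" for n in range(1, 501)]  # stand-in for thousands of edge locations
BATCH_SIZE = 50


def deploy(site: str, version: str) -> tuple[str, bool]:
    """Pretend to push a payload and run a health check; real code would call a fleet API."""
    healthy = random.random() > 0.01  # simulate occasional failures
    return site, healthy


def rollout(version: str) -> None:
    """Update the fleet batch by batch, stopping as soon as a health check fails."""
    for start in range(0, len(SITES), BATCH_SIZE):
        batch = SITES[start:start + BATCH_SIZE]
        with concurrent.futures.ThreadPoolExecutor(max_workers=16) as pool:
            results = list(pool.map(lambda s: deploy(s, version), batch))
        failed = [site for site, ok in results if not ok]
        if failed:
            print(f"Halting rollout of {version}; failed health checks: {failed}")
            return  # a real pipeline would trigger an automated rollback here
        print(f"Batch {start // BATCH_SIZE + 1}: {len(batch)} sites updated to {version}")


if __name__ == "__main__":
    rollout("v2.4.1")
```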

DevOps is the key to assembling complex embedded software at the intelligent edge. Traditionally, embedded software developers wrote code; once it was complete and the implementation had gone through QA, embedded ‘Ops’ (production) installed the systems. This cascading “waterfall” model is too slow for the smart edge, which works in real time.

Under the DevOps banner, different embedded developer personas (e.g. platform developers, application developers, operators, data scientists or DevOps engineers) work in scrums. They push out new software releases as part of agile teams, and do it so fast that it is best to integrate the Ops and QA (quality assurance, testing) teams into the development process.

Continuous integration and continuous delivery (CI/CD) tools take new code and put it directly into a production application without interrupting functionality. The rate of code releases has increased so much, and many releases are just small updates to existing applications, that it no longer makes sense to run a long uninstall/reinstall routine every day. Continuous integration (CI) and continuous delivery (CD) were introduced to solve this problem. It is similar to the old “change the tire while the car is moving” idea, but here it works.
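A minimal sketch of such a pipeline, assuming placeholder make targets rather than any specific CI product, chains the stages in order and stops before production is touched if any stage fails:

```python
import subprocess
import sys

# Hypothetical stage commands; a real pipeline would read these from its CI configuration.
STAGES = [
    ("build",   ["make", "all"]),
    ("test",    ["make", "test"]),
    ("package", ["make", "image"]),
    ("deploy",  ["make", "deploy-canary"]),  # push to a small canary slice first
]


def run_pipeline() -> int:
    """Run each stage in order; abort before deployment if anything fails."""
    for name, cmd in STAGES:
        print(f"[CI/CD] stage: {name} -> {' '.join(cmd)}")
        try:
            result = subprocess.run(cmd)
        except FileNotFoundError:
            print(f"[CI/CD] stage '{name}' could not start; aborting")
            return 1
        if result.returncode != 0:
            print(f"[CI/CD] stage '{name}' failed; the running application is left untouched")
            return result.returncode
    print("[CI/CD] all stages passed; the update is live without stopping the service")
    return 0


if __name__ == "__main__":
    sys.exit(run_pipeline())
```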

Certification. Critical-infrastructure software development has moved toward cloud-based DevOps principles, but the security certification of such software still follows old-fashioned development patterns and involves expensive manual work, which drives up the cost per line of code, prevents rapid adoption of new features, and slows development and operation. To reduce certification costs and achieve faster time to market, a new certification approach is required, one aligned with a modern DevSecOps methodology and integrated into the continuous delivery process using automation, AI/ML and digital feedback loops. The constant release of new code also creates security risk exposure, so development teams began adding security practices to the software development and delivery process to protect valuable assets at startup, at runtime and at rest. The result is a workflow known as DevSecOps.
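As one hedged example of a DevSecOps gate wired into continuous delivery, the sketch below blocks a release when a dependency matches a purely illustrative advisory list; a real pipeline would call an actual scanner and feed its findings back into the certification evidence.

```python
# A minimal DevSecOps-style gate, assuming a hypothetical local advisory list.
KNOWN_VULNERABLE = {("libexample", "1.2.0"), ("oldtls", "0.9.8")}  # illustrative only


def security_gate(dependencies: dict[str, str]) -> bool:
    """Fail the delivery pipeline if any dependency matches a known advisory."""
    findings = [
        (name, version)
        for name, version in dependencies.items()
        if (name, version) in KNOWN_VULNERABLE
    ]
    for name, version in findings:
        print(f"SECURITY GATE: {name} {version} has a known vulnerability")
    return not findings


if __name__ == "__main__":
    manifest = {"libexample": "1.2.0", "sensor-sdk": "3.4.1"}  # hypothetical build manifest
    if not security_gate(manifest):
        raise SystemExit("blocking release until findings are resolved")
    print("security gate passed; artifact can continue toward certification")
```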

The new smart machine economy promises not only to unlock economic value but also to make life easier and safer. To succeed, embedded systems companies must undergo the second wave of digital transformation and use modern, digital-friendly platforms, tools and processes.

