April 19, 2024

Things are getting smaller. Not only in terms of gadget miniaturization, medical nanotechnology, increasingly sophisticated industrial electromechanical units and the process of so-called shrinkflation, which leaves our candy bars thinner or shorter for the same price, but also in terms of data: data, too, is getting smaller.

Data is getting smaller in two key senses: a) we break application data streams down into smaller, containerized components so they can work within similarly partitioned and containerized application services; and b) the time windows within which a business must react to data events are shrinking.

This latter temporal constraint on data naturally leads us to the reality of real-time data and the need to be able to work with it.

No real-time data, really

In terms of how the space-time universe we live in actually works, real-time data is something of a misnomer: data always carries some temporal cost that must be paid for it to move and exist. Data may travel at the speed of light, but that is still a finite speed. When we talk about real time, we mean data transfers that happen fast enough that a human cannot perceive any delay. Real time therefore expresses a human perception of time rather than a machine perception or definition of it.

This all matters because we are now expected to embrace an Industry 4.0 world in which our factories are managed by AI-augmented, intelligent automation. However, manufacturers may not be ready for Industry 4.0 if they face complex data issues created by production bottlenecks stemming from disparate information systems across the organization, many of which still require human intervention: from manually entering sensor readings into databases, to ineffective Clear-to-Build status monitoring (i.e. tracking whether all parts are on hand to build a product), to a lack of integration with Enterprise Resource Planning (ERP) systems.

Eager to right a few of these wrongs is Palo Alto-based KX. Variously known as KX and KX Systems, the company is recognized for its work in real-time, high-speed data streaming analytics within intelligent systems that can also concurrently handle historical data workloads.

The analytics maturity curve

Looking at the speed of current industrial data processing and the drive toward a nirvana state of rapid, analytics-intensive data flow, KX describes any given company's state of evolution as a point on the “analytics maturity curve.” Marketing-friendly naming aside, KX has a point: the commercial window for creating differentiated value is narrowing for organizations in every market and sector. It stands to reason, then, that the faster they can act on insights derived from real-time data, the better the outcome.

As KX CTO Eric Raab has stated before, “The opportunities for streaming analytics have never been greater. In fact, according to my company’s research, 90% of businesses believe that to remain competitive over the next three years, they need to increase investment in real-time data analytics solutions. Whether it’s a financial institution that needs to adjust client portfolio settings according to ever-changing stock prices, or an e-commerce site that needs to generate a monthly report, data accuracy at speed is a huge challenge.”

What kind of data analytics can we get from enterprise software platforms that perform at this kind of speed? KX says finding (and acting on) anomalous data will be a key use case.

Generally defined as data points, events or observations that fall outside the normal behavior of a data set, anomalous data can be a key flag and indicator, alerting a business that something has already caused (or is likely to cause) a problem somewhere in the business.
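To make that definition concrete, below is a minimal sketch of one common approach: flagging readings that drift several standard deviations away from a rolling baseline. The window size, threshold and simulated sensor feed are all illustrative assumptions here, not drawn from any KX product.

```python
from collections import deque
import math

def rolling_zscore_anomalies(stream, window=50, threshold=3.0):
    """Yield (value, z_score) for readings that deviate sharply
    from the rolling mean. All parameters are illustrative."""
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) >= 10:  # wait for a minimal baseline first
            mean = sum(recent) / len(recent)
            var = sum((x - mean) ** 2 for x in recent) / len(recent)
            std = math.sqrt(var) or 1e-9  # guard against zero variance
            z = (value - mean) / std
            if abs(z) > threshold:
                yield value, z
        recent.append(value)

# Hypothetical usage: a steady sensor feed with one injected spike.
readings = [20.0 + 0.1 * (i % 5) for i in range(200)]
readings[120] = 45.0  # simulated fault
for value, z in rolling_zscore_anomalies(readings):
    print(f"anomaly: {value} (z = {z:.1f})")
```

In a production system a test like this would run inside the streaming engine itself rather than in application code, but the statistical idea carries over.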

“The ability to quickly detect and react to anomalous incidents is critical, particularly because having the ability to react in real time can limit the cost of anomalies. In addition to preventing problems from spreading within the business, adopting real-time data can also improve process efficiency. The types of positives [advancements and innovations possible here include] faster service, increased sales, better product quality and reduced prices, showing how far-reaching and diverse the impact of real-time data can be,” notes KX in its research report on speed to business value.

The company insists that using real-time data systems brings productivity gains by reducing the man-hours spent on data processing and management. This type of platform enables users to automate complex workflows that would otherwise be time-consuming, and to apply proven Machine Learning (ML) models that provide some level of actionable insight to guide business actions.
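As a purely hypothetical illustration of that kind of automated workflow (the Reading type, score function and threshold below are invented for this sketch and do not reflect KX's actual API), the idea is to route each streaming record through a model and fire an alert automatically, replacing the manual data-entry step described earlier.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Reading:
    sensor_id: str
    value: float

def score(reading: Reading) -> float:
    """Stand-in for a trained ML model; here, a trivial rule.
    In practice this would be a model loaded from disk."""
    return 1.0 if reading.value > 40.0 else 0.0

def automated_workflow(stream: Iterable[Reading],
                       model: Callable[[Reading], float],
                       alert: Callable[[Reading, float], None],
                       threshold: float = 0.5) -> None:
    """Score each incoming reading and raise an alert when the
    score crosses the threshold, with no human in the loop."""
    for reading in stream:
        s = model(reading)
        if s >= threshold:
            alert(reading, s)

# Hypothetical usage with an in-memory stream of readings.
feed = [Reading("press-7", v) for v in (20.1, 20.3, 45.0, 20.2)]
automated_workflow(
    feed, score,
    lambda r, s: print(f"ALERT {r.sensor_id}: {r.value} (score {s})"),
)
```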

The road to the microsecond business

If, collectively, we accept this argument and agree (even by a single percentage point) that we need an increased focus on real-time data and analytics technologies capable of working with complex, high-velocity information sources, then we may be on our way to implementing platforms like KX and/or its competitors.

In this orange tree, KX is not the only fruit. A list of streaming specialists today might include Confluent for its fully managed Kafka services, Tibco for its Tibco Spotfire product, Amazon Web Services Kinesis, Microsoft Azure’s IoT offerings and, of course, Apache Kafka itself for open-source purveyors. That’s not to say KX isn’t special; it just underscores and perhaps validates the company’s position in a well-defined technology discipline working to solve a critical need.
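To ground that list, here is a minimal sketch of consuming a stream from Apache Kafka via the kafka-python client; the topic name and broker address are placeholder assumptions, and each vendor above offers its own managed equivalent of this loop.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Placeholder topic and broker; adjust for a real cluster.
consumer = KafkaConsumer(
    "sensor-readings",
    bootstrap_servers=["localhost:9092"],
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

# Each iteration blocks until the next event arrives, which is
# the basic loop that any streaming-analytics job builds on.
for record in consumer:
    event = record.value
    print(record.topic, record.offset, event)
```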

Businesses in any industry that implement this level of technology are on the way to what we might soon call “split-second business operations,” a term that may stick.
