In my previous articles for The Record, we explored the first two steps in the process of exploiting data: collecting the data and making it coherent, then starting to unlock its potential with visualisation, applications, analytics and artificial intelligence (AI). In this article, we’ll drill down into the higher-order techniques for extracting knowledge and making data actionable, along with other ways of increasing the value of your data.
First, I wanted to highlight a couple of key analytics platforms from Microsoft itself: Stream Analytics and Synapse Analytics. Stream Analytics is designed to provide real-time analytics for inbound data streams, such as those from connected vehicles, the plant floor, or a mobility or freight-as-a-service system. Being serverless, Stream Analytics is easy to deploy and configure, and it integrates with Azure internet of things (IoT) technologies. Structured Query Language (SQL) syntax can be used for ease of query generation, or you can insert your own code or even take advantage of integrated machine learning (ML). It can be run at the point of ingest, higher up the stack on the results of a data subscription, or nearer to data sources with Azure IoT Edge. This flexibility allows one solution to serve a wide variety of use cases and needs.
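To make the windowed-aggregation pattern concrete, here is a minimal pure-Python sketch of the tumbling-window averaging that Stream Analytics expresses in SQL (via its TumblingWindow function). The event fields and values here are hypothetical illustrations, not from an actual deployment.

```python
from collections import defaultdict

# Conceptual sketch of a tumbling-window aggregate: fixed, non-overlapping
# time windows, one average per (window, vehicle). Stream Analytics would
# express this declaratively in SQL; field names are hypothetical.
def tumbling_window_avg(events, window_seconds):
    """Average engine_temp per vehicle within each tumbling window."""
    sums = defaultdict(lambda: [0.0, 0])  # (window_start, vehicle) -> [sum, count]
    for e in events:
        window_start = (e["ts"] // window_seconds) * window_seconds
        acc = sums[(window_start, e["vehicle_id"])]
        acc[0] += e["engine_temp"]
        acc[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

events = [
    {"vehicle_id": "v1", "engine_temp": 90.0, "ts": 0},
    {"vehicle_id": "v1", "engine_temp": 110.0, "ts": 10},
    {"vehicle_id": "v1", "engine_temp": 100.0, "ts": 35},
]
print(tumbling_window_avg(events, 30))
# the first window (0-30s) averages the two early readings; the 35s reading
# falls into the second window
```

In the real service, the same logic would run continuously over an unbounded stream and write results to an output sink such as an alert queue or a dashboard.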
While Synapse Analytics can also support real-time data and streams, think of it as your tool for managing a broad set of historic data analysis needs. Synapse accelerates time to insight by simplifying and integrating the acquisition, manipulation, management and extraction of data, leveraging well-known paradigms and tools such as SQL and Apache Spark. Synapse Studio provides a one-stop shop for everything the data engineer needs and can be embedded into continuous integration/continuous delivery pipelines.
The goal of collecting a large and representative set of coherent data is to automate the unlocking of meaning and insight. The highest yield here comes from cognitive approaches leveraging AI and ML. Scepticism is sometimes expressed about the viability of these solutions, but it is worth noting that, even in traditionally challenging areas, Microsoft AI solutions have achieved, and in some cases exceeded, parity with human performance across most areas of perception; see the ‘Microsoft AI breakthroughs’ illustration for examples.
With AI ready for prime time, there are hundreds of ways in which it can be applied across the automotive industry. But our focus here is on gaining knowledge from large datasets. In this space, the most useful scenarios are those where ML is used to spot patterns and derive predictions. For example, given a sufficient set of telemetry and known outcomes, a predictive maintenance model can be derived, validated and put into production to minimise downtime and waste. Information can also be inferred from previously unexploited but extractable correlations with other real-world properties. A good example is Bridgestone’s tyre-damage alert system, which uses limited tyre and non-tyre data from Microsoft’s connected vehicle platform to infer when and where tyre damage has taken place.
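The derive–validate–deploy loop can be illustrated with a deliberately simple sketch: learn a single decision threshold from labelled telemetry, then check it against held-out data. The vibration readings and labels below are invented for illustration; a production model would be built, validated and deployed with proper ML tooling such as Azure ML rather than this toy classifier.

```python
# Toy predictive-maintenance sketch: learn one vibration threshold from
# labelled telemetry (label 1 = component failed soon after the reading),
# then validate it on held-out data. All values are hypothetical.
def fit_threshold(readings, labels):
    """Pick the threshold that best separates failures from healthy readings."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(readings)):
        preds = [1 if r >= t else 0 for r in readings]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Training set: vibration amplitude with known outcomes.
train_x = [0.2, 0.3, 0.4, 1.1, 1.3, 1.5]
train_y = [0, 0, 0, 1, 1, 1]
threshold = fit_threshold(train_x, train_y)

# Validation on held-out readings before putting the model into production.
val_x = [0.25, 1.2]
val_y = [0, 1]
val_preds = [1 if r >= threshold else 0 for r in val_x]
accuracy = sum(p == y for p, y in zip(val_preds, val_y)) / len(val_y)
print(threshold, accuracy)
```

Real models would of course use many telemetry features and far richer algorithms, but the shape of the workflow — fit on known outcomes, validate on unseen data, then deploy — is the same.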
The Microsoft Azure cloud can be used to host an array of open-source and commercial AI and ML solutions, but there is also an extensive portfolio of Microsoft’s own technologies. These include Azure ML, a suite of tooling and support for the end-to-end lifecycle of ML solution development; Azure Cognitive Search, which supports extraction of meaning from unstructured data; Cognitive Services, which allows non-experts to derive value from AI in a number of scenarios; the Bot Service, which simplifies development of conversational interfaces and organises access to support information; and the Bonsai service, which allows compact, high-performance ‘brains’ to be developed with a combination of directed and unsupervised learning. An example of the latter comes from Siemens, which reduced computer numerical control machine calibration from a process requiring an engineer visit and two to three hours to one taking only 13 seconds.
How do you put this all together and extract immediate value from investments in data?
A coherent enterprise data platform can be constructed leveraging the power, scale and efficiency of the Azure cloud, supporting data from within the enterprise, data shared with partners, and even collective solutions shared with consortia or a city. An example of this approach with enterprise data is the eXtollo solution implemented by Daimler in partnership with Microsoft. This scalable solution supports the entire group and its full scope of data, and provides fine-grained access control to ensure that business units retain their autonomy and data security.
In many cases the value of data is higher when information is combined with that from partners and other stakeholders, which requires data to be shared while ownership and security are maintained. Microsoft offers a range of technologies, along with design patterns that inform how they can be deployed, to create maximum value and win-win partnerships.
Such an overall data platform approach can deliver immediate value, with data-driven solutions implemented quickly and cheaply to address use cases such as connected insurance and breakdown services, shared freight solutions, and the coordination of mobility services. Microsoft teams in your region are ready to help you explore these approaches and unlock the value of your data assets.
This is the third in a series of articles by John Stenlake, automotive industry solutions director for connected vehicle and mobility at Microsoft
This article was originally published in the Autumn 2020 issue of The Record. To get future issues delivered directly to your inbox, sign up for a free subscription.