January-March 2022

Technology – Machine learning

Calibrating the complex: Machine learning in action


Machine learning is developing into an effective tool in automotive design and engineering applications. Secondmind CEO Gary Brotman shares some insights into how this works as the industry transitions to electric vehicles. Report by Nick Holt

CEO Gary Brotman describes Secondmind as a machine learning-based SaaS solutions and products company. Founded six years ago, the company focuses on helping engineers solve complex, high-dimensional problems. Now concentrating on the automotive sector after exploring a number of industries, it has recently been working with Mazda on powertrain optimisation, specifically optimising ICE engines as well as EV and hybrid control systems.

OEMs are facing increasing complexity in developing and producing both existing ICE and new electric-powered vehicles. Could you share some insights into what you see as the major challenges being faced by car makers in this area?

There are three main kinds of pressure. One is consumer demand and the demands of the driving experience. These have increased, especially as sustainability becomes a key factor in a purchase decision. Every automotive manufacturer wants to achieve and exceed those carbon-neutrality goals, but there's also the business reality of how quickly you can adapt and what the right pace of transition is.


Then there's the cost and overhead associated with legacy processes and systems built around the way things have always been done, alongside a strong push to move investment and strategic resources, green power and otherwise, into strategic areas of growth – where the industry is really going. But when you're entrenched in established ways of working and still trying to get as much value as possible out of existing designs and components, it creates more complexity.


Take, for example, the parameters and constraints that must be considered in the design or optimisation of an engine, or of a system overall, like a hybrid system. All these pressures increase the volume of parameters and constraints. Whether it's a pure ICE engine, a hybrid system or a battery-electric one, the design of experiments is time-consuming and costly and, in some cases, this complexity is a barrier to getting a product to market.


The last big challenge is around the acquisition, management, and usage of data. The storage and processing that goes into it is considerable, especially in the design of experiments where you are going through tens of thousands of experiments. The numbers are sometimes off the charts.

I’ve been in machine learning and AI since around 2014, and there’s still some mystique, fear and excitement about this breed of software that is not fully understood, it hasn’t fully proven itself yet

Gary Brotman CEO Secondmind

Are you involved in digital prototyping and, if so, how far upstream of the production process do you get involved?

The production grade solution that we currently have on the market is our active learning platform for powertrain. Right now, its initial application is on the calibration phase with an ICE engine, and we’ve begun work on EV motor inverter calibration.


The same underlying platform, the same technology, is also being used for system design. It can help OEMs from the early development phase in the V-model by optimising the required design specifications of each complex component based on the expected system performance. By doing so, we envision enabling OEM management to make the right design decisions for new vehicles that contain so many complex components. The ultimate goal is to be able to calibrate in a virtual way, so what we wind up with is a prototype that's closer to production grade, and all the effort that goes into the churn of the V-model ends up being flattened and shortened.


There’s a distinction here because we talk about digital prototyping and simulation. There’s a visual approach to doing that, then there’s a numerical approach. We’re very much on the numerical side of digital twinning. Today we can develop a model which is a numerical representation, a highly accurate numerical representation of an engine. That same numerical representation could be developed in a virtual setting as well as with a physical engine. If you can virtualise more of this process without the hardware, the work that you can do with software modelling today is much more efficient.

Machine learning is a big part of optimising production operations. How do you see this being developed/applied?

I've been in machine learning and AI since around 2014, and there's still some mystique, fear and excitement about this breed of software that is not fully understood and hasn't fully proven itself yet. We're past the hype cycle, but we still aren't in the phase where there is broad adoption of state-of-the-art ML.


There's classic data and analytics, and there's classic base-level probabilistic modelling. The Gaussian Process as a machine learning technique is not new in automotive; it just hasn't been advanced in the same way as deep learning and deep neural networks, where all the attention and gravity really have been.
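As an illustration of the kind of base-level probabilistic modelling being described, here is a minimal Gaussian Process regression sketch in plain NumPy. The one-dimensional "engine response" data, the kernel settings and the function names are all invented for illustration; this is not Secondmind's implementation.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sq_dist = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """Posterior mean and per-point variance of a zero-mean GP at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    K_inv = np.linalg.inv(K)
    mean = K_s.T @ K_inv @ y_train
    cov = K_ss - K_s.T @ K_inv @ K_s
    return mean, np.diag(cov)

# Hypothetical response measured at four operating points
x_train = np.array([0.0, 1.0, 2.5, 4.0])
y_train = np.sin(x_train)  # stand-in for a measured engine output
x_test = np.linspace(0.0, 4.0, 9)

# The posterior mean interpolates the observations; the variance shows
# where the model is uncertain -- the signal that active learning exploits.
mean, var = gp_posterior(x_train, y_train, x_test)
```

The useful property for experiment design is the variance output: unlike a plain curve fit, the GP reports where it knows least, which is where the next test point buys the most information.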


The challenge that machine learning faces overall, where it hasn't yet fully come into its own, is the data challenge. Data can be sparse, meaning that you have a ton of it but there's very little signal in the noise. And accounting for noise is incredibly difficult to do well. How do you manage all that noise from a processing, storage and analytics standpoint? You wind up with data-hungry models, but the critical questions are: do you have the right data, and do you have the technology that can make sense of the data you have?


The experimentation process is time-consuming, and machine learning hasn't fully helped that process. Explainability is a challenge. The conventional wisdom in AI today is that deep learning will solve all problems, but it's also something you cannot interrogate. It's a black-box approach to development. That doesn't work well for gaining the trust and confidence of engineers, or from a regulatory standpoint. If you must understand why a decision was made and why something went wrong, you can't interrogate a DNN.


The last thing we shouldn't forget is the value of the subject matter expert's knowledge, which cannot be baked into data. The approach today is binary. Either the human who knows their stuff completes the task and the effort is largely manual, or the expert is on the sideline while the algorithm gets the job done in an automated fashion. But rarely do you see an effective blend of the two, and for things to work out well you need the combined intelligence of the engineer and the algorithm.


With Secondmind Active Learning, we're enabling an intelligent, automated approach to the design of experiments that allows you to incorporate subject matter expert knowledge at appropriate points. The platform's underlying algorithms take the initial simple experiment that the engineer provides and develop a model, which has the intelligence to pick more data from the next region of interest to get closer to the optimum for a chosen objective. Across iterations, the model and algorithm intelligently select a region of interest, do the modelling, pick the next region, and accumulate knowledge as they go. The model gets smarter with each iteration.
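The loop described here (model, pick the next region of interest, measure, repeat) can be sketched as a simple Bayesian-optimisation-style routine. Everything in this sketch is an illustrative assumption rather than the Secondmind platform: the toy one-knob "engine response", the RBF kernel, and the upper-confidence-bound acquisition rule.

```python
import numpy as np

def rbf(a, b, length_scale=0.8):
    """Unit-variance RBF covariance between 1-D input sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale**2)

def gp_predict(x_obs, y_obs, x_grid, noise=1e-4):
    """Zero-mean GP posterior mean and variance over a candidate grid."""
    K_inv = np.linalg.inv(rbf(x_obs, x_obs) + noise * np.eye(len(x_obs)))
    K_s = rbf(x_obs, x_grid)
    mean = K_s.T @ K_inv @ y_obs
    var = 1.0 - np.sum((K_s.T @ K_inv) * K_s.T, axis=1)
    return mean, var

def engine_response(x):
    """Hypothetical objective, e.g. torque vs. one calibration knob.
    Unknown to the algorithm; its true optimum sits at x = 2."""
    return -(x - 2.0) ** 2

x_grid = np.linspace(0.0, 4.0, 81)   # candidate test points
x_obs = np.array([0.0, 4.0])         # engineer's initial simple experiment
y_obs = engine_response(x_obs)

for _ in range(6):                   # each pass = one "experiment"
    mean, var = gp_predict(x_obs, y_obs, x_grid)
    # Acquisition: be optimistic where the model is uncertain (UCB)
    ucb = mean + 2.0 * np.sqrt(np.clip(var, 0.0, None))
    x_next = x_grid[np.argmax(ucb)]  # next region of interest
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, engine_response(x_next))

best = x_obs[np.argmax(y_obs)]       # ends up near the true optimum
```

The acquisition rule trades off exploiting the current best estimate against exploring high-uncertainty regions, which is how an intelligently sequenced handful of runs can replace a fixed grid sweep over the whole operating space.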


This results in greater efficiencies on the experimentation side, and we're seeing about a 50% reduction in time. That comes from having less data in the mix: the number of iterations required is halved because of the accuracy that results from active learning.

Companies are getting more comfortable with this idea of the need for data in some way, shape or form

Gary Brotman CEO Secondmind

How much of a challenge is data management in digitalising and optimising the production engineering process?

There are a few threads to this. One: is it getting better? I think so, as companies are becoming more comfortable with the idea that they need data in some way, shape or form.


Also, there used to be a mindset that if you started to collect data, you’ll figure out what to do with it later. You don’t. You should have an objective first and then ask the question “what kind of data do I need?”. That by nature will streamline what you end up collecting.


The truth is, the more data you have on a problem, the better the accuracy is going to be. But there's a point where you say: what's good enough, what will it cost in processing time, is it worth it? Or do you find an optimal point and focus on other KPIs?


Additionally, there’s no global or regional standardisation around data. It’s collected and formatted in different ways and schemas. The lack of standardisation is an endemic problem that will create problems regardless of the industry.


Also, you can’t roadmap data. So, from a planning standpoint, even if you’re presented with two very similar datasets, there are differences. And making something turnkey and creating a roadmap around data is highly risky.


But when you get into the automotive space, the problems that we see are complex engineering problems, involving control systems where there is quite a bit of data. It's really a matter of finding the right data at the right moment and working with what you have.

You have worked with Mazda on optimising its powertrains. Could you offer some insights into this project in terms of challenges, goals and solutions?

Mazda's R&D team had been using Gaussian Processes and all the tools that have been available. They have some very sharp machine learning engineers who hit barriers due to the complexity involved and ran into challenges operationalising the technology, so they came to Secondmind for help. Mazda have been leaders in model-based design for some time; they share our philosophy from a machine learning standpoint and see the same potential.


We had a two-year R&D collaboration around powertrain optimisation, with ICE calibration at the centre. SKYACTIV is the most complex engine platform on the market, but Mazda is facing the same challenges that everybody else is.


Mazda’s goal is to reduce the time taken and anything that results from that, from a cost perspective. Model-based calibration is still at the core of their business, so they licensed our technology and helped us co-develop the resulting solution. We’re also working with them in an R&D capacity to evolve our Active Learning Platform and virtualise and calibrate hybrid and electric powertrain control systems. And beyond that, we collectively see other opportunities in CASE.

Looking ahead do you see different challenges emerging in optimising battery electric and hydrogen fuel cell powertrains?

There are several application-specific challenges. On the battery-electric front, the electric motor and inverter present a seemingly less complex calibration problem than an ICE engine. In the context of a hybrid system the complexity rises again, but that particular component is not as high-dimensional a problem as ICE.


But you have other problems with energy and the battery itself. Looking at battery management systems, there are different approaches to optimising range and battery life. It’s not always a ‘one size fits all’ where you calibrate the engine - everybody’s usage is different.


However, we've seen this myopic focus on range in electric vehicles starting to change. The ultimate KPI so far has been to achieve a range that's satisfactory, so you don't have to worry about the infrastructure challenge of there not being enough charging stations. Tesla has signalled a shift by changing its battery strategy: the company announced that it's not going to use only nickel-based lithium-ion batteries, and will instead go back to some iron-based units, which are 30% cheaper and offer a shorter range. Along with environmental and cost issues, this choice also considers usage. If you're creating big, expensive batteries that are consistent across every vehicle to get a 500-mile range, but the consumer is primarily driving locally over short distances, it raises the question: is that really a good use of materials? The carbon footprint of creating that battery, and amortising its cost over its life, is much harder to justify if you're not getting full utilisation out of it.


We haven't studied hydrogen fuel cells yet, but there is interest in hydrogen ICE and how you can take an existing ICE engine and redesign it to be hydrogen-based. Could this be a middle ground that satisfies the carbon-neutrality demands and preserves or extends technology that exists, with some tweaks? I don't yet know how big or small those tweaks are, or how big the market could be, but it's an interesting domain.