Predictive engineering analytics (PEA) is a development approach for the manufacturing industry that helps with the design of complex products (for example, products that include smart systems). It concerns the introduction of new software tools, their integration, and a refinement of simulation and testing processes to improve collaboration between analysis teams that handle different applications. This is combined with intelligent reporting and data analytics. The objective is to let simulation drive the design, to predict product behavior rather than react to issues that may arise, and to install a process that lets design continue after product delivery.
In a classic development approach, manufacturers deliver discrete product generations. Before bringing those to market, they use extensive verification and validation processes, usually combining several simulation and testing technologies. But this approach has several shortcomings given how products are evolving. Manufacturers in the automotive, aerospace, marine and other mechanical industries all share similar challenges: they have to reinvent the way they design to be able to deliver what their customers want and buy today.[1]
Besides the mechanics, products include ever more electronics, software and control systems. These help to increase performance on several characteristics, such as safety, comfort and fuel economy. Designing such products using a classic approach is usually ineffective. A modern development process should be able to predict the behavior of the complete system for all functional requirements, including physical aspects, from the very beginning of the design cycle.[2] [3] [4] [5] [6] [7] [8] [9]
To reduce costs or improve fuel economy, manufacturers need to continually consider adopting new materials and corresponding manufacturing methods.[10] [11] That makes product development more complex, as engineers can no longer rely on decades of experience, as they could when working with traditional materials, such as steel and aluminium, and traditional manufacturing methods, such as casting. New materials, such as composites, behave differently in terms of structural behavior, thermal behavior, fatigue or noise insulation, for example, and require dedicated modeling.
On top of that, as design engineers do not always know all the manufacturing complexities that come with using these new materials, the "product as manufactured" may differ from the "product as designed". All such changes need to be tracked, and an extra validation iteration may even be needed after manufacturing.[12] [13]
Today's products include many sensors that allow them to communicate with each other and to send feedback to the manufacturer. Based on this information, manufacturers can send software updates to continue optimizing behavior, or to adapt to a changing operational environment. Together, such products form the internet of things, and manufacturers should be part of it. A product "as designed" is never finished, so development should continue while the product is in use. This evolution is also referred to as Industry 4.0,[14] or the fourth industrial revolution. It challenges design teams, as they need to react quickly and make behavioral predictions based on an enormous amount of data.[15]
The ultimate intelligence a product can have is to remember the individual behavior of its operator and take that into consideration. In this way, it can for example anticipate certain actions, predict failure or maintenance needs, or optimize energy consumption in a self-regulating manner. That requires a predictive model inside the product itself, or accessible via the cloud. This model should run very fast and behave exactly like the actual product. It requires the creation of a digital twin: a replica of the product that remains in sync over the entire product lifecycle.[16] [17]
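The in-sync requirement can be sketched in a few lines of Python: a model parameter of the twin is continuously re-estimated from streamed operating data, so that the virtual model tracks the physical product as it ages. The signals, drift model and update rule below are illustrative assumptions, not a prescribed digital twin architecture.

```python
# Minimal sketch of the "in-sync" idea behind a digital twin: a simple model
# parameter (here, a damping coefficient) is re-estimated continuously from
# streamed operating data, so the virtual model tracks the aging product.
# All signals and the drift model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
b_true = 0.10                   # product's real damping, drifting with wear
b_twin = 0.10                   # twin's current parameter estimate
alpha = 0.02                    # learning rate of the sync update

for step in range(5000):
    b_true += 1e-5              # slow physical degradation
    v = 1.0 + 0.1 * rng.normal()                     # measured operating speed
    f_measured = b_true * v + 0.001 * rng.normal()   # measured friction force
    f_twin = b_twin * v                              # twin's prediction
    # sync update: nudge the twin's parameter toward the observed behavior
    b_twin += alpha * (f_measured - f_twin) * v

print(f"true damping {b_true:.3f}, twin estimate {b_twin:.3f}")
```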
Consumers today can get easy access to products designed in any part of the world. That puts enormous pressure on time-to-market, cost and product quality. This trend has been going on for decades, but with people making ever more buying decisions online, it has become more relevant than ever. Products can easily be compared in terms of price and features on a global scale, and reactions on forums and social media can be very grim when product quality is not optimal. On top of that, consumers in different parts of the world have different preferences, and different standards and regulations may apply. As a result, modern development processes should be able to convert very local requirements into a global product definition, which then should be rolled out locally again, potentially with part of the work done by engineers in local affiliates. That calls for a robust, globally operating product lifecycle management system that starts with requirements definition. And the design process should have the flexibility to effectively predict product behavior and quality for various market needs.[18]
Dealing with these challenges is exactly the aim of a predictive engineering analytics approach to product development. It combines the deployment of new tools with a close alignment of processes. Manufacturers gradually deploy the following methods and technologies, to the extent that their organization allows it and their products require it:
In this multi-disciplinary, simulation-based approach, the global design is considered as a collection of mutually interacting subsystems from the very beginning. From the very early stages on, the chosen architecture is virtually tested for all critical functional performance aspects simultaneously. These simulations use scalable modeling techniques, so that components can be refined as data becomes available. Closing the loop happens on two levels: within development, by continuously balancing the mutually interacting subsystem simulations against each other; and after delivery, by feeding data from the product in use back into design.
Closed-loop, systems-driven product development aims at reducing test-and-repair. Manufacturers implement this approach in pursuit of designing right the first time.[19] [20]
1D system simulation, also referred to as 1D CAE or mechatronic system simulation, allows scalable modeling of multi-domain systems. The full system is presented schematically, by connecting validated analytical modeling blocks of electrical, hydraulic, pneumatic and mechanical subsystems (including control systems). It helps engineers predict the behavior of concept designs of complex mechatronics, either transient or steady-state. Manufacturers often have validated libraries available that contain predefined components for different physical domains; if not, specialized software suppliers can provide them. Using those, engineers can make concept predictions very early, even before any computer-aided design (CAD) geometry is available. During later stages, parameters can then be adapted.

1D system simulation calculations are very efficient. The components are analytically defined and have input and output ports. Causality is created by connecting the inputs of one component to the outputs of another (and vice versa). Models can have various degrees of complexity and can reach very high accuracy as they evolve. Some model versions may allow real-time simulation, which is particularly useful during control systems development or as part of built-in predictive functionality.[21]
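As a minimal illustration of this block-based, causal modeling style, the following Python sketch couples two hypothetical analytical blocks: a DC motor (electrical domain) whose torque drives a rotational load (mechanical domain). All parameter values are illustrative assumptions, and a plain fixed-step integrator stands in for the dedicated solvers of commercial 1D simulation tools.

```python
# Minimal sketch of 1D (lumped-parameter) system simulation: a DC motor
# block (electrical) coupled to a rotational inertia block (mechanical).
# Each block's outputs feed the other block's inputs, which creates the
# causality described above. Forward Euler keeps the example short.

R, L_ind = 1.0, 0.5e-3      # motor resistance [ohm], inductance [H]
kt = 0.05                   # torque constant [N*m/A] (= back-EMF constant)
J, b = 1e-4, 1e-5           # load inertia [kg*m^2], viscous friction [N*m*s]

dt, t_end = 1e-5, 0.5
i, omega = 0.0, 0.0         # states: motor current [A], shaft speed [rad/s]

t = 0.0
while t < t_end:
    v_supply = 12.0                             # input port: supply voltage [V]
    # electrical block: di/dt = (v - R*i - back_emf) / L
    di = (v_supply - R * i - kt * omega) / L_ind
    # mechanical block: J*domega/dt = motor torque - friction torque
    domega = (kt * i - b * omega) / J
    i += di * dt
    omega += domega * dt
    t += dt

print(f"speed after 0.5 s: {omega:.1f} rad/s, current: {i:.3f} A")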
3D simulation or 3D CAE is usually applied at a more advanced stage of product development than 1D system simulation, and can account for phenomena that cannot be captured in 1D models. The models can evolve into highly detailed representations that are very application-specific and can be very computationally intensive.
3D simulation or 3D CAE technologies were already essential in classic development processes for verification and validation, often proving their value by speeding up development and avoiding late-stage changes. They remain indispensable in the context of predictive engineering analytics, where they become a driving force in product development. Software suppliers put great effort into enhancements, adding new capabilities and increasing performance on the modeling, process and solver sides. While such tools are generally based on a single common platform, solution bundles are often provided to cater for certain functional or performance aspects, while industry knowledge and best practices are provided to users in application verticals. These improvements should allow 3D CAE to keep pace with ever shorter product design cycles.[22] [23] [24]
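The following sketch is not representative of commercial 3D CAE solvers, but it illustrates the class of field problems they solve and why such models are computationally intensive: a steady-state heat conduction problem discretized on a small 3D grid and relaxed iteratively. Real tools use finite element methods on unstructured meshes with far more degrees of freedom; this is a deliberately simplified stand-in.

```python
# Minimal sketch of a 3D field problem: steady-state heat conduction on a
# small cubic grid, solved with Jacobi sweeps over the discretized Laplace
# equation. Illustrative stand-in only; real 3D CAE uses far larger models.
import numpy as np

n = 20                                   # grid points per edge
T = np.zeros((n, n, n))
T[0, :, :] = 100.0                       # one hot face held at 100 degC

for _ in range(2000):                    # Jacobi relaxation sweeps
    # update interior points to the average of their 6 neighbors;
    # boundary faces are never touched, so the Dirichlet values persist
    T[1:-1, 1:-1, 1:-1] = (
        T[:-2, 1:-1, 1:-1] + T[2:, 1:-1, 1:-1] +
        T[1:-1, :-2, 1:-1] + T[1:-1, 2:, 1:-1] +
        T[1:-1, 1:-1, :-2] + T[1:-1, 1:-1, 2:]
    ) / 6.0

print(f"temperature at grid center: {T[n//2, n//2, n//2]:.2f} degC")
```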
As the closed-loop, systems-driven product development approach requires concurrent development of the mechanical system and controls, strong links must exist between 1D simulation, 3D simulation and control algorithm development. Software suppliers achieve this by offering co-simulation capabilities for Model-in-the-Loop (MiL), Software-in-the-Loop (SiL) and Hardware-in-the-Loop (HiL) processes.[25] [26]
Already when evaluating potential architectures, 1D simulation should be combined with models of the control software, as the electronic control unit (ECU) will play a crucial role in achieving and maintaining the right balance between functional performance aspects once the product is in operation. During this phase, engineers cascade the design objectives down to precise targets for subsystems and components, using multi-domain optimization and design trade-off techniques. The controls need to be included in this process. By combining them with the system models in MiL simulations, potential algorithms can be validated and selected. In practice, MiL involves co-simulation between virtual controls from dedicated controller modeling software and scalable 1D models of the multi-physical system. This provides the right combination of accuracy and calculation speed for investigating concepts and strategies, as well as for controllability assessment.[27] [28]
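A minimal sketch of the MiL principle, assuming a hypothetical speed-control task: a discrete PI controller model (the "virtual controls") is co-simulated with a scalable 1D plant model, each running at its own time step. Gains and plant parameters are illustrative.

```python
# Minimal sketch of Model-in-the-Loop co-simulation: a discrete PI
# controller runs at its own sample rate against a 1D plant model
# (first-order motor speed dynamics) integrated at a finer step.
J, b, kt = 1e-4, 1e-5, 0.05     # plant: J*domega/dt = kt*u - b*omega
kp, ki = 0.02, 0.4              # PI gains (illustrative values)
target = 100.0                  # speed setpoint [rad/s]

dt_plant, dt_ctrl = 1e-4, 1e-2
ctrl_every = round(dt_ctrl / dt_plant)
omega, integ, u = 0.0, 0.0, 0.0

for k in range(int(2.0 / dt_plant)):        # simulate 2 s
    if k % ctrl_every == 0:                 # controller sample instant
        err = target - omega
        integ += err * dt_ctrl
        u = kp * err + ki * integ
    omega += (kt * u - b * omega) / J * dt_plant   # plant integration step

print(f"speed after 2 s: {omega:.1f} rad/s (setpoint {target:.0f})")
```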
After the conceptual control strategy has been decided, the control software is further developed while constantly taking the overall system functionality into consideration. The controller modeling software can generate new embedded C code and integrate it with possible legacy C code for further testing and refinement. SiL validation on a global, full-system multi-domain model helps anticipate the conversion from floating-point to fixed-point arithmetic once the code is integrated in the hardware, and helps refine gain scheduling when the controller's action needs to be adjusted to operating conditions.
SiL is a closed-loop simulation process to virtually verify, refine and validate the controller in its operational environment, and includes detailed 1D and/or 3D simulation models.[29] [30]
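One concern named above, the float-to-fixed-point conversion, can be sketched as follows: the same illustrative PI control law is executed in floating point and in Q15 fixed-point arithmetic, and the deviation is monitored. This is the kind of check that SiL validation automates on full-system models.

```python
# Minimal sketch of one Software-in-the-Loop concern: anticipating the
# float-to-fixed-point conversion. An illustrative PI law is evaluated in
# floating point and in Q15 fixed point, and the quantization error reported.

Q = 15                       # Q15 format: 1 sign bit, 15 fractional bits
SCALE = 1 << Q

def to_q15(x: float) -> int:
    # saturate to the representable range, as production code would
    return max(-SCALE, min(SCALE - 1, int(round(x * SCALE))))

def pi_float(err, integ, kp=0.25, ki=0.05):
    integ += ki * err
    return kp * err + integ, integ

def pi_q15(err_q, integ_q, kp_q=to_q15(0.25), ki_q=to_q15(0.05)):
    integ_q += (ki_q * err_q) >> Q          # fixed-point multiply
    return ((kp_q * err_q) >> Q) + integ_q, integ_q

integ_f, integ_q = 0.0, 0
for step in range(100):
    err = 0.5 * (0.95 ** step)              # decaying error signal
    u_f, integ_f = pi_float(err, integ_f)
    u_q, integ_q = pi_q15(to_q15(err), integ_q)
    drift = abs(u_f - u_q / SCALE)

print(f"float u = {u_f:.6f}, fixed u = {u_q / SCALE:.6f}, error = {drift:.2e}")
```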
See main article: Hardware-in-the-loop simulation.

During the final stages of controls development, when the production code is integrated in the ECU hardware, engineers further verify and validate it using extensive and automated HiL simulation. The real ECU hardware is combined with a downsized version of the multi-domain global system model, running in real time. This HiL approach allows engineers to complete system and software troubleshooting upfront, limiting the total testing and calibration time and cost on the actual product prototype.
During HiL simulation, engineers verify whether regulation, security and failure tests on the final product can happen without risk. They investigate the interaction between several ECUs where required, and they make sure that the software is robust and provides quality functionality under every circumstance. By replacing the global system model running in real time with a more detailed version, engineers can also include pre-calibration in the process. These detailed models are usually available anyway, since controls development happens in parallel with global system development.[31] [32] [33]
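A minimal sketch of the HiL scheduling principle, with the real ECU replaced by a software stub for illustration: a downsized plant model is stepped at a fixed rate paced by the wall clock. Actual HiL rigs run this loop on dedicated real-time hardware with physical I/O to the ECU; the stub controller and plant parameters below are assumptions.

```python
# Minimal sketch of a HiL scheduling loop. The real ECU is replaced by a
# stub; a soft-real-time loop with wall-clock pacing stands in for the
# dedicated real-time hardware that actual HiL rigs use.
import time

def read_ecu_actuator_command(omega):
    # stub: stands in for the I/O layer that talks to the real ECU;
    # it mimics a proportional speed controller inside the ECU
    return 0.02 * (100.0 - omega)

J, b, kt = 1e-4, 1e-5, 0.05          # downsized 1D plant parameters
omega = 0.0
dt = 1e-3                            # 1 kHz real-time step
next_tick = time.perf_counter()

for _ in range(2000):                # 2 s of simulated = wall-clock time
    u = read_ecu_actuator_command(omega)     # sample the ECU's output
    omega += (kt * u - b * omega) / J * dt   # step the plant model
    # feeding the sensor value back to the ECU would happen here
    next_tick += dt
    time.sleep(max(0.0, next_tick - time.perf_counter()))

print(f"plant speed after 2 s: {omega:.1f} rad/s")
```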
Evolving from verification and validation to predictive engineering analytics means that the design process has to become more simulation-driven. Physical testing remains a crucial part of that process, both for validating simulation results and for testing final prototypes, which will always be required prior to product sign-off. The scale of this task will become even bigger than before, as more conditions and parameter combinations will need to be tested, in a more integrated and complex measurement setup that can combine multiple physical aspects as well as control systems.
In other development stages as well, combining test and simulation in a well-aligned process is essential for successful predictive engineering analytics.[34]
Modal testing or experimental modal analysis (EMA) was already essential in the verification and validation of purely mechanical systems. It is a well-established technology that has been used for many applications, such as structural dynamics, vibro-acoustics and vibration fatigue analysis, often to improve finite element models through correlation analysis and model updating. The context, however, was very often troubleshooting. As part of predictive engineering analytics, modal testing has to evolve, delivering results that increase simulation realism and handle the multi-physical nature of modern, complex products. Testing has to help define realistic model parameters, boundary conditions and loads. Besides mechanical parameters, different quantities need to be measured. Testing also needs to be capable of validating multi-body models and 1D multi-physical simulation models. In general, a whole new range of testing capabilities (some modal-based, some not) in support of simulation becomes important, and much earlier in the development cycle than before.[35] [36] [37]
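The correlation analysis mentioned above is commonly based on the Modal Assurance Criterion (MAC), which compares measured mode shapes with their finite element counterparts. A minimal sketch, using fabricated mode shapes in place of real test data:

```python
# Minimal sketch of a test/simulation correlation step used with
# experimental modal analysis: the Modal Assurance Criterion (MAC) compares
# measured mode shapes with finite element mode shapes. The shapes below
# are illustrative stand-ins, not real measurement data.
import numpy as np

def mac(phi_test: np.ndarray, phi_fe: np.ndarray) -> np.ndarray:
    """MAC matrix between the columns of two mode-shape matrices."""
    num = np.abs(phi_test.T @ phi_fe) ** 2
    den = np.outer(np.sum(phi_test * phi_test, axis=0),
                   np.sum(phi_fe * phi_fe, axis=0))
    return num / den

# fabricated example: 3 modes sampled at 8 sensor locations
x = np.linspace(0, 1, 8)
phi_fe = np.column_stack([np.sin(k * np.pi * x) for k in (1, 2, 3)])
phi_test = phi_fe + 0.05 * np.random.default_rng(0).normal(size=phi_fe.shape)

print(np.round(mac(phi_test, phi_fe), 3))   # near-identity => good correlation
```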
As the number of parameters and their mutual interactions explodes in complex products, testing efficiency is crucial, both in terms of instrumentation and in the definition of critical test cases. A good alignment between test and simulation can greatly reduce the total test effort and boost productivity.
Simulation can help to analyze upfront which measurement locations and parameters are most effective for a certain objective. It also allows engineers to investigate the coupling between certain parameters, so that the number of sensors and test conditions can be minimized.[38]
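One established pretest technique of this kind is effective independence (EfI) sensor placement, which uses simulated mode shapes to rank candidate sensor locations. A minimal sketch, with illustrative mode shapes as in the MAC example above:

```python
# Minimal sketch of simulation-driven sensor placement using the effective
# independence (EfI) method: candidate locations contributing least to the
# independence of the target modes are removed iteratively.
import numpy as np

x = np.linspace(0, 1, 30)                  # 30 candidate sensor locations
phi = np.column_stack([np.sin(k * np.pi * x) for k in (1, 2, 3)])

keep = list(range(len(x)))
while len(keep) > 6:                       # reduce to 6 sensors
    A = phi[keep]
    # effective independence value of each remaining candidate location:
    # the diagonal of A (A^T A)^-1 A^T
    E = np.einsum("ij,jk,ik->i", A, np.linalg.inv(A.T @ A), A)
    keep.pop(int(np.argmin(E)))            # drop the least informative one

print("chosen sensor positions:", np.round(x[keep], 2))
```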
On top of that, simulation can be used to derive certain parameters that cannot be measured directly. Here again, a close alignment between simulation and testing activities is a must. Especially 1D simulation models can open the door to a large number of new parameters that cannot be accessed directly with sensors.[39]
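A classic way to derive such unmeasurable quantities is state estimation, for example with a Kalman filter that combines a simulation model with available sensor data. A minimal sketch, assuming a simple constant-velocity model in which only position is measured and the hidden velocity is estimated:

```python
# Minimal sketch of deriving a quantity that cannot be measured directly:
# a linear Kalman filter estimates an internal state (here, velocity) from
# noisy position measurements. Model and noise levels are illustrative.
import numpy as np

dt = 0.01
F = np.array([[1, dt], [0, 1]])       # constant-velocity state transition
H = np.array([[1.0, 0.0]])            # only position is measured
Q = 1e-4 * np.eye(2)                  # process noise covariance
R = np.array([[0.02]])                # measurement noise covariance

rng = np.random.default_rng(1)
x_true = np.array([0.0, 1.0])         # true position and (hidden) velocity
x_est, P = np.zeros(2), np.eye(2)

for _ in range(500):
    x_true = F @ x_true
    z = H @ x_true + rng.normal(0, np.sqrt(R[0, 0]))   # noisy position
    # predict
    x_est, P = F @ x_est, F @ P @ F.T + Q
    # update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

print(f"estimated hidden velocity: {x_est[1]:.3f} (true 1.000)")
```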
As complex products are in fact combinations of subsystems that are not necessarily developed concurrently, systems and subsystems development increasingly requires setups that combine hardware, simulation models and measurement input. These hybrid modeling techniques allow realistic real-time evaluation of system behavior very early in the development cycle. Obviously, this requires dedicated technologies as well as a very good alignment between simulation (both 1D and 3D) and physical testing.[40] [41] [42]
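The simplest form of such a hybrid setup can be sketched as follows: a measured signal (synthesized here as a stand-in for test data) is replayed as input to a simulated subsystem, so that hardware data and a virtual model are combined in one evaluation. The subsystem parameters are illustrative assumptions.

```python
# Minimal sketch of a hybrid test/simulation setup: a recorded measurement
# signal (synthesized stand-in here) drives a simulated subsystem.
import numpy as np

dt = 1e-3
t = np.arange(0, 2, dt)
measured_force = 5.0 * np.sin(2 * np.pi * 3 * t)   # stand-in for test data

m, c, k = 2.0, 4.0, 800.0        # simulated subsystem: mass-spring-damper
x, v = 0.0, 0.0
response = []
for f in measured_force:          # replay the measurement through the model
    a = (f - c * v - k * x) / m
    v += a * dt
    x += v * dt
    response.append(x)

print(f"peak simulated displacement: {np.abs(response).max() * 1000:.2f} mm")
```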
Tomorrow's products will live a life after delivery. They will include predictive functionalities based on system models, adapt to their environment, feed information back to design, and more. From this perspective, design and engineering are more than turning an idea into a product: they are an essential part of the digital thread through the entire product value chain, from requirements definition to product in use.
Closing the loop between design and engineering on the one hand, and the product in use on the other, requires that all steps be tightly integrated in a product lifecycle management software environment. Only this can enable traceability between requirements, functional analysis and performance verification, as well as analytics of usage data in support of design. It allows models to become digital twins of the actual product: they remain in sync, undergoing the same parameter changes and adapting to the real operational environment.[43] [44] [45]