Advanced data analytics and machine learning allow mathematics and statistics to be applied to datasets previously considered too complex to analyse, such as combined real-time and historical production data and environmental effects, where patterns and relationships have hitherto been intractable. Data analytics enables materials performance to be evaluated by weighing information from dedicated materials testing, general experience, and fundamental theory and models.
The emergence of the digital twin concept, for assessing and optimizing manufacturing processes as well as an asset’s operational performance, is transforming the ways production and asset integrity management are done. In manufacturing and integrity management using digital twins, material characteristics and behaviours will play an important role. Digital twin technology will drive product lifecycle support by running digital ‘what-if’ scenarios to evaluate component and system performance, optimise process conditions and eliminate costly trial-and-error experiments, whilst minimising failures.
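Such a ‘what-if’ scenario run can be pictured as a sweep over candidate process conditions evaluated against a process model. The sketch below is a minimal illustration under assumed names: the surrogate `cure_quality` and its parameters (temperature, pressure) are invented for the example and do not come from any specific twin platform.

```python
# Hedged sketch: evaluating digital 'what-if' scenarios against a process model
# instead of running physical trial-and-error experiments.

def cure_quality(temperature_c, pressure_bar):
    """Toy surrogate for a curing process: quality peaks at a nominal
    operating point (assumed here: 180 degC, 6 bar) and falls off around it."""
    return 1.0 - 0.001 * (temperature_c - 180) ** 2 - 0.05 * (pressure_bar - 6) ** 2

def best_scenario(temps, pressures):
    """Sweep candidate process conditions in simulation and return the
    condition the surrogate rates highest."""
    scenarios = [(t, p) for t in temps for p in pressures]
    return max(scenarios, key=lambda s: cure_quality(*s))

t_opt, p_opt = best_scenario(range(160, 201, 5), range(4, 9))
print(t_opt, p_opt)  # prints 180 6, the surrogate's nominal optimum
```

In practice the surrogate would be replaced by a calibrated process simulation, but the pattern of ranking simulated scenarios before committing to a physical run is the same.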
A multi-scale digital twin of a manufacturing process will enable analysis of the various uncertainties and scatter in the production process and models, supported by:
- Advanced sensors giving input to the production control and models.
- A digital framework in which the multi-scale models are implemented and enhanced by machine-learning (ML) algorithms. The vast amount of data collected during the manufacturing of components is fed to the ML algorithms, which modify and maintain the digital twin of the manufacturing process.
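The loop in which streamed production data keeps a twin model up to date can be sketched very simply. Everything below is an illustrative assumption: a linear model, a stochastic-gradient update rule, and synthetic sensor pairs standing in for shop-floor measurements.

```python
# Minimal sketch of an ML loop maintaining a digital-twin model: each new
# sensor reading nudges the model parameters so the twin tracks the process.

class TwinModel:
    def __init__(self, lr=0.1):
        self.w = 0.0   # model slope (illustrative: e.g. shrinkage per degree)
        self.b = 0.0   # model offset
        self.lr = lr   # learning rate (assumed value)

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        """One stochastic-gradient step on squared error for a new
        (process input, measured outcome) pair."""
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

twin = TwinModel()
# Synthetic stream of (process input, measured outcome) pairs; the underlying
# relation here is y = 2x + 1, which the twin should recover.
for _ in range(200):
    for x in (0.0, 0.5, 1.0):
        twin.update(x, 2 * x + 1)
print(round(twin.w, 2), round(twin.b, 2))  # converges towards 2.0 and 1.0
```

A production twin would use far richer models, but the principle — continuously re-fitting the twin against measured outcomes — is the one the bullet above describes.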
The insights derived from such analyses will allow manufacturers to reduce waste and minimize quality variations. A premise for such a digital twin model is the development of multiscale materials-modelling methodologies at different levels, aiming to predict properties at the end of the manufacturing process by linking together knowledge about the behaviour at the different scales.
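A toy illustration of such scale linking: constituent (fibre and matrix) moduli predict a ply-level stiffness via the rule of mixtures, which then feeds a component-level estimate. The property values are assumed, typical-order figures for a glass/epoxy system, not measured data.

```python
# Illustrative scale linking: micro (constituents) -> meso (ply) -> component.

def ply_modulus(e_fibre, e_matrix, v_fibre):
    """Micro -> meso: longitudinal modulus of a unidirectional ply from
    constituent moduli and fibre volume fraction (rule of mixtures)."""
    return v_fibre * e_fibre + (1 - v_fibre) * e_matrix

def bar_stiffness(e_ply, area_mm2, length_mm):
    """Meso -> component: axial stiffness k = EA/L of a simple bar
    made of that ply material."""
    return e_ply * area_mm2 / length_mm

e1 = ply_modulus(e_fibre=72_000.0, e_matrix=3_500.0, v_fibre=0.6)  # MPa
print(e1)                                               # 44600.0 MPa
print(bar_stiffness(e1, area_mm2=100.0, length_mm=500.0))  # 8920.0 N/mm
```

Each level's output becomes the next level's input, which is exactly the linkage a multiscale twin formalises, with far more sophisticated models at each step.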
For complex physical phenomena, such as the influence of environment and ageing, data analytics and machine learning will become standard design tools. An early example is the ageing of composite materials, where data analytics combined with multi-scale material modelling is used as a tool to understand what is happening physically to cause the degradation of composites. The multiscale approach follows a design-pyramid strategy, relying on extensive, detailed finite element analysis and testing at the constituent level, i.e. fibre, matrix and fibre–matrix interface, and performing limited confirmation tests at larger scales to verify, validate and calibrate the predictions from the smaller scales and the developed analytical models.

What lies ahead?
An optimal coupling of materials selection and structural design is essential to successfully achieve lighter and more purpose-built assets. Modelling, simulations and data analytics, combined with judicious testing and analysis, allow verification of new manufacturing strategies, design load scenarios, etc. in an efficient, reproducible and well-documented manner. Future data-driven design efforts will need to move beyond the borders of what is currently known, and indeed beyond the current state of the art in scientific models, into complex systems that those models are not yet sophisticated enough to explain or analyse and where theory and experience are lacking.
Verification and validation using data analytics and twin models, containing information about the geometry, the manufacturing process (such as thermal history) and the material properties (both experimental data and the assumed behaviour models), will play a significant role in how assurance services are carried out in 2030.
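One way such twin-based verification can handle the scatter mentioned earlier is Monte Carlo sampling over material-property distributions. The sketch below is a deliberately simple assumption-laden example: the normal yield-strength distribution, the cross-section area and the design load are all invented figures, and the capacity model is a toy.

```python
# Sketch of twin-based verification under uncertainty: sample assumed
# material-property scatter and estimate how often a simple capacity
# model fails to meet a design load.
import random

def capacity(yield_mpa, area_mm2=200.0):
    """Toy component model: axial capacity in newtons (stress * area)."""
    return yield_mpa * area_mm2

def failure_probability(n=100_000, seed=1, design_load_n=60_000.0):
    random.seed(seed)  # fixed seed for a reproducible estimate
    failures = 0
    for _ in range(n):
        # Yield-strength scatter assumed normal: mean 355 MPa, s.d. 20 MPa.
        y = random.gauss(355.0, 20.0)
        if capacity(y) < design_load_n:
            failures += 1
    return failures / n

print(failure_probability())  # small but non-zero, driven by the assumed scatter
```

With a real twin, the sampled inputs would be the measured and modelled properties it stores, so the estimated failure probability becomes auditable evidence in the assurance process.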