Alexander Epple, as a Senior Engineer at the Laboratory for Machine Tools and Production Engineering (WZL) of RWTH Aachen University, you head the Machine Data Analytics and NC Technology Department: how did this come about? Wouldn’t the post have been more suitable for a mathematician?

Alexander Epple: I admire mathematicians for their powerful algorithms and their capacity for tackling problems at a high level of abstraction. These abilities also help when it comes to analysing big data. In the production world, however, the multiplicity of machines and processes involved produces highly disparate kinds of data. Machines running the same processes, which would permit direct comparisons, are thus quite rare. Under these preconditions, purely statistical approaches are not very fruitful, and abstract big data approaches quickly come up against their limits in a production environment. It's more productive to link knowledge of production technology, in the form of models, for instance, to the data concerned. This is why engineers have a place in the big data world as well.

Does your team reflect this interdisciplinary approach?

Alexander Epple: We have six academics working in my team, supported by highly qualified programmers and machinery technicians. The team is indeed very interdisciplinary: we have not only mechanical engineers, but also computer scientists and electrical engineers. What's more, I work very closely with Dr. Marcel Fey and his Machine Technology Department, since his people possess extensive modelling expertise. Together, we harness the capabilities of almost 30 academics, which enables us to drive ideas forward effectively. At the moment, however, we're still seeking to expand our team.

How widespread is this cross-disciplinary approach in practice?

Michael Königs: In the field of simulation, particularly, we've had interdisciplinary teams at the WZL for a long time already. This approach has likewise proved its worth at other universities and research institutions. But we are also observing that, in the context of model-based near-real-time data processing, interdisciplinary teams are no longer merely optional; on the contrary, in future there will be no alternative to this sort of collaboration. Linking and bringing together methods and models from different specialisms is essential for unlocking the vast potential of data analysis. To sum up: there have always been interdisciplinary approaches, but in future they will gain even further in importance.

Michael Königs, you are one of the computer scientists: how do you approach the world of mechanical engineering?

Michael Königs: When it comes to practical applications, you very quickly learn that the quality of the recorded signal data is crucial to the success of an analysis. Contrary to what a lot of people think, the data don't always contain everything you need. For example, the metrological systems in a machine tool do supply position data, but these provide only an approximation of the tool's real path during a milling operation. There's usually no way to draw conclusions about deflections caused by process forces, for instance, or about geometric-kinematic inaccuracies of the machine tool being used. Modelling knowledge can be employed to enrich the pure signal data with this missing information. And this refined data record is essential for predicting the workpiece quality achieved during the actual machining process.
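
To make the enrichment step concrete, here is a minimal Python sketch of model-based refinement of a recorded position signal. The linear spring model, the stiffness value and the signal shapes are purely illustrative assumptions, not the WZL's actual models.

```python
import numpy as np

# Minimal sketch: enriching recorded axis positions with a modelled static
# tool deflection. Stiffness value and signals are illustrative assumptions.

STIFFNESS_N_PER_UM = 25.0  # assumed static stiffness of the tool/spindle system

def enrich_with_deflection(positions_mm: np.ndarray,
                           process_forces_n: np.ndarray) -> np.ndarray:
    """Correct the measured tool path by the deflection estimated
    from the process force (simple linear spring model)."""
    deflection_um = process_forces_n / STIFFNESS_N_PER_UM
    return positions_mm - deflection_um / 1000.0  # µm -> mm

# Example: a 1 kHz position signal plus a force estimate from a cutting-force model
positions = np.linspace(0.0, 50.0, 1000)                   # nominal path, mm
forces = 200.0 + 50.0 * np.sin(np.linspace(0, 20, 1000))   # fluctuating cutting force, N
real_path = enrich_with_deflection(positions, forces)
print(f"max deviation from nominal path: {np.max(np.abs(real_path - positions)):.4f} mm")
```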

How to deal with the gigantic quantities of data involved?

Industry 4.0 leads to transparent production facilities which, thanks to increasingly extensive sensor technology and powerful evaluation electronics, will generate big data. But how can the valuable raw data of a machine tool be acquired in the first place – can a relatively old machine without any sensor technology be retrofitted with it?

Alexander Epple: There are research projects examining how relatively old machines can be retrofitted with the requisite sensors. In addition, we are currently pursuing approaches that start with the sensor technology already installed in the machine. Besides the motor current, every machine continuously acquires its axis positions; direct and indirect path measuring systems are usually installed. We can use these signals, for example, to determine the propensity to vibration while the process is running. Process forces and component loads can also be determined for predictive maintenance approaches. This is true for old and new machinery systems alike.
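
A minimal sketch of what such process-concurrent vibration monitoring could look like using only an axis position signal the machine already provides. The sampling rate, the synthetic signal and the simple peak search are illustrative assumptions.

```python
import numpy as np

FS = 1000.0  # assumed sampling rate of the position measuring system, Hz

def dominant_vibration(signal: np.ndarray, fs: float = FS):
    """Return the dominant oscillation frequency and amplitude of a signal."""
    n = np.arange(len(signal))
    # remove the feed motion (linear trend) so only the oscillation remains
    residual = signal - np.polyval(np.polyfit(n, signal, 1), n)
    spectrum = np.abs(np.fft.rfft(residual))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    peak = np.argmax(spectrum[1:]) + 1  # skip the DC bin
    return freqs[peak], 2.0 * spectrum[peak] / len(signal)

# Synthetic axis signal: slow feed motion plus an 80 Hz chatter component
t = np.arange(0.0, 1.0, 1.0 / FS)
axis_position_mm = 0.5 * t + 0.002 * np.sin(2 * np.pi * 80.0 * t)
freq, amp = dominant_vibration(axis_position_mm)
print(f"dominant vibration: {freq:.0f} Hz, amplitude {amp * 1000:.2f} µm")
```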

But big isn't always beautiful: 100-per-cent data acquisition from the work of a machining centre, recorded in real time (35 process parameters per millisecond), already means an annual data volume of 5.8 terabytes. How do you filter out the interesting facts from it?

Michael Königs: To extract interesting information and process parameters, we use both statistical methods (machine learning) and algorithms developed specifically for this purpose, which allow expert and domain knowledge to be integrated. Quite generally, though, it's correct to say that continuous data acquisition entails a huge quantity of data. There are now approaches that do not acquire all data continuously at the maximum sampling rate, but only at particular times, after defined events – e.g. a threshold value being exceeded – or for certain processes. Other approaches compress the data volume by forming process parameters. Still others deliberately exploit the large quantities of data in order to identify patterns using appropriate mathematical algorithms. Which approach is most likely to succeed depends very much on the application involved.
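
One of the event-based acquisition strategies Königs mentions, sketched in Python under illustrative assumptions: the buffer size, the trigger threshold and the sample format are hypothetical.

```python
from collections import deque

PRE_TRIGGER_SAMPLES = 100   # history kept before the event (assumed)
FORCE_THRESHOLD_N = 900.0   # assumed trigger condition

def acquire(sample_stream):
    """Yield a window of buffered samples only when a threshold is exceeded,
    instead of streaming every sample to storage."""
    ring = deque(maxlen=PRE_TRIGGER_SAMPLES)
    for sample in sample_stream:
        ring.append(sample)
        if sample["force_n"] > FORCE_THRESHOLD_N:
            yield list(ring)  # persist the pre-trigger history for analysis
            ring.clear()

# Example with a synthetic stream: one overload event at t = 500 ms
stream = ({"t_ms": t, "force_n": 1200.0 if t == 500 else 400.0}
          for t in range(1000))
for window in acquire(stream):
    print(f"event at t = {window[-1]['t_ms']} ms, "
          f"stored {len(window)} pre-trigger samples")
```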

Can this quantity of data still be handled with customary hardware, or is a supercomputer needed – or even a quantum computer?

Michael Königs: Present-day technology, if used correctly, will mostly suffice in our disciplines as long as individual or partial models are developed by experts. The broad, interdisciplinary interconnection and use of these models, however, quickly brings the currently available hardware up against its limits. A cloud environment geared to these needs, in which both statistically and physically motivated individual models can be interlinked and executed with responsively scaled, fit-for-purpose computing resources, can provide the requisite connectivity and computing power here. But I don't believe that quantum computers are needed to achieve an environment of this kind.

Academics at the Fraunhofer Institute for Production Systems and Design Technology (IPK) have expressed the view that it would be more sensible to make an intelligent preselection in the vicinity of the machine prior to storage, and only then transfer a downsized data record ("smart data") to the cloud. What do you think of this approach?

Alexander Epple: That is an idea I can basically go along with. At the WZL, too, we're examining local data processing and interpretation – refining the data into smart data in the immediate vicinity of the machinery system concerned. The advantages of this local pre-processing are obvious. However, there are also firms that store and process all their raw data unfiltered in a central system such as a cloud.
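
A minimal sketch of such local refinement into smart data: the edge device condenses each acquisition window into a few descriptive parameters before anything is sent to the cloud. The feature set, machine identifier and record format are illustrative assumptions, not a specific WZL or vendor format.

```python
import numpy as np

def to_smart_data(window: np.ndarray, machine_id: str) -> dict:
    """Condense one window of raw samples into a compact record."""
    return {
        "machine": machine_id,
        "samples": len(window),
        "mean": float(np.mean(window)),
        "rms": float(np.sqrt(np.mean(window ** 2))),
        "peak": float(np.max(np.abs(window))),
    }

# One second of raw data (35,000 values) becomes a record of a few bytes
raw = np.random.default_rng(0).normal(50.0, 5.0, 35_000)
print(to_smart_data(raw, machine_id="machine-01"))  # "machine-01" is hypothetical
```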

What is your conception of a cloud, and how can it be utilised?

Alexander Epple: By “cloud”, we mean a model for location-independent, on-demand network access to a shared pool of configurable IT resources that can be provisioned and released again as demand requires. These include not only network bandwidth and computing hardware, but also services and applications. In the context of big data, cloud platforms – precisely because of this scalability and the broad availability of analytical algorithms – offer good preconditions for the downstream analysis of data sets that are too large, too complex, too weakly structured or too heterogeneous to be evaluated manually or with classical methods of data processing. The upstream data transmission may prove technically challenging, however. A local data acquisition system at the machine is comparatively simple to design if the data only have to be forwarded. That has substantial advantages in terms of maintenance and roll-out, but conversely it poses tough challenges for the bandwidth required to transmit the data. Local pre-processing and compression can reduce this bandwidth requirement. However, every compression operation entails a loss of information – information that may be irrelevant for current analyses but totally crucial for future scenarios. Sometimes you only realise afterwards that the information no longer available would actually have been helpful for interpreting a particular phenomenon.
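
The information-loss trade-off Epple describes can be illustrated with a small sketch: block-averaging compresses the signal a hundredfold, but a one-millisecond overload spike vanishes from the compressed record. All values are illustrative.

```python
import numpy as np

FS = 1000     # raw sampling rate, Hz (assumed)
BLOCK = 100   # compress by averaging 100-sample blocks (100:1)

raw = np.full(FS, 400.0)   # 1 s of steady force signal, N
raw[500] = 1500.0          # a single 1 ms overload spike

compressed = raw.reshape(-1, BLOCK).mean(axis=1)
print(f"raw peak:        {raw.max():.0f} N")         # 1500 N - spike visible
print(f"compressed peak: {compressed.max():.0f} N")  # ~411 N - spike gone
```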

Both approaches have their advantages, and which one is pursued will depend on the strategy of the application partner concerned. Quite generally, we are observing a certain amount of scepticism when it comes to storing data centrally in a cloud system. But there are also options for a local “company cloud”. Even data evaluation directly at a single local machinery system can already offer major potential for raising productivity.

Big Data raises productivity by 30 to 150 per cent

How did you come to be collaborating with the experts at SAP, who together with Cisco and Huawei have developed a big data client that acquires and stores all data in the clock cycle of the CNC?

Alexander Epple: The close collaboration was initiated by SAP, who were looking for a research partner from the field of production technology – one offering not only excellent basic research but also a long track record of application-focused collaboration with industrial enterprises. We have supported SAP in customer projects in highly disparate fields, and the results were a surprise even to us. At a German automaker, for instance, we and SAP increased productivity in the powertrain section by 30 per cent and substantially reduced the rejection rate. In the aerospace industry, we've likewise succeeded in raising productivity by almost 30 per cent, and at one German manufacturer of large machines by nearly 150 per cent.

What are the typical questions you’re confronted with?

Alexander Epple: Machine operators, process developers and quality engineers are often worried that in the long term their expert knowledge will no longer be needed. We believe, however, that all essential decisions will still have to be taken by experts: they are familiar with numerous boundary conditions that may not even be capturable in data. Data evaluation has to support operators in their decision-making with appropriate visualisation of machinery and process states. What the new solutions will do, above all, is spare operators the laborious search for and pre-processing of individual pieces of process information – work that contributes very little added value.

The workload this involves in the metalworking sector is high: could virtual prototyping or try-out be sufficiently improved with the aid of big data for the number of real trials to be reduced – or even eliminated entirely?

Alexander Epple: Substantial reductions are indubitably possible, yes. In our view, learning from data with the support of models has huge potential.

What do you and your team expect from the EMO Hannover, the world's premier trade fair for the metalworking sector? And what role will big data play there?

Alexander Epple: Thanks to our collaboration with numerous industrial partners, who will also be represented at the EMO Hannover, we already have a pretty definite idea. I believe it's becoming progressively clearer that the use of big data in production can be substantially enhanced by incorporating specialist expertise. That should dispel the worry expressed by many specialist employees that their expert knowledge will soon be rendered superfluous by big data. I am hoping for greater acceptance from the visitors, and for a certain amount of curiosity from a sector that otherwise tends to be rather conservative. So I'm looking forward to plenty of specific solution approaches.


The interview was conducted by Nikolaus Fecht, specialist journalist from Gelsenkirchen.
