
20 years of PC-based control technology – looking back and looking forward

Hans Beckhoff gives a review of the last 20 years in PC-based control technology and an outlook on future developments.

A lot changes in two decades. In automation technology in particular, there are exciting innovations every year – sometimes even revolutionary ones; however, the actual impact on the market is not usually seen until 10 years later.

At Beckhoff, we delivered our first Industrial PC back in 1986, which means that we have been involved in PC-based control technology ever since. As early as 1990, during our first presentation at the Hannover Messe trade show, a journalist asked me how long the PLC would still be around. As a young engineer, I leaned back and said: another five years – which seemed an incredibly long time to me back then.

When this journalist asked me the same question again in 1995, Beckhoff was doing well and we had grown wonderfully with our technology – but PC-based control technology still accounted for only a negligible share of the overall market. On the one hand, this was due to the time constant mentioned at the beginning. On the other hand, there is of course a certain inertia on the part of the large suppliers of control technology, which encourages them to stick with tried-and-tested technologies – such as PLC technology.

Nevertheless, we are convinced that IPC technology is by far the most powerful and often the least expensive platform. It is also the platform that enables the best possible integration of IT and automation features.

Around the turn of the millennium, the hour of Ethernet also began to strike in the industrial environment. In 2003, Beckhoff presented its own solution, EtherCAT, which is now internationally widespread and accepted.

We were optimistic and knew that what we had was something good. But we weren’t aware at the time that we were defining a kind of global standard with EtherCAT. As has happened so often in our company’s history, we proceeded with a certain ‘naïve’ optimism and belief in our own strength and developed this technology out of our own conviction.

At that time, however, we were already seasoned fieldbus experts: we had our own communication systems on the market, and we knew all the other fieldbus systems – essentially CAN bus and PROFIBUS. Compared with these existing solutions, the development of EtherCAT ultimately represented a real quantum leap. In terms of performance, we optimized the protocol so that a single Ethernet telegram could collect the pieces of process information from many participants in the field. In addition, we built distributed clocks into the system from the outset in order to give the automation system an absolutely accurate common system time. Another novelty we introduced: at the time, every bus required a dedicated master card – a fact that is almost forgotten today. With EtherCAT, this was no longer necessary; the system could be operated on any standard Ethernet port.
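To make the single-telegram idea more concrete, here is a minimal, purely illustrative Python sketch: one shared frame buffer travels past a list of simulated devices, and each device reads and writes only its own byte range "on the fly". The class names, offsets and data sizes are invented for illustration and do not reflect the real EtherCAT frame format or any Beckhoff API.

```python
# Illustrative sketch of "one telegram for many devices":
# a single frame buffer passes every device, and each device reads its
# command bytes and writes its status bytes at a fixed offset.
from dataclasses import dataclass

@dataclass
class SimulatedSlave:
    name: str
    offset: int   # byte offset of this device's data within the shared frame
    size: int     # number of bytes exchanged with this device

    def process(self, frame: bytearray) -> None:
        outputs = bytes(frame[self.offset:self.offset + self.size])  # read commands
        inputs = self.sample_inputs(outputs)                         # device-specific I/O
        frame[self.offset:self.offset + self.size] = inputs          # write status back

    def sample_inputs(self, outputs: bytes) -> bytes:
        # Dummy behaviour: echo the commanded values. A real device would
        # return sensor readings, drive status words, etc.
        return outputs

def run_cycle(frame: bytearray, slaves: list[SimulatedSlave]) -> bytearray:
    # The "master" is just a standard port sending one frame; every device
    # along the line touches the same buffer exactly once per cycle.
    for slave in slaves:
        slave.process(frame)
    return frame

slaves = [SimulatedSlave("drive_1", 0, 4), SimulatedSlave("io_block", 4, 2)]
frame = bytearray(b"\x01\x02\x03\x04\xaa\xbb")
print(run_cycle(frame, slaves).hex())
```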

After the first positive reactions from the market, we finally decided to make the EtherCAT technology available for open use. In this context, we founded the EtherCAT Technology Group. The release of the technology has certainly contributed significantly to the worldwide success of EtherCAT.

In 1998, we were able to offer IPCs with one CPU core and a clock frequency of 1 to 2 GHz for controlling a machine. Today, we supply Industrial PCs with up to 36 cores and a clock frequency of 4 GHz. This shows how much progress hardware development has made – in other words, Moore’s law has proven its validity over the years. And we believe that this will remain the case for at least the next 10 years. If today we can integrate image processing or measurement technology into the control system, if we can synchronize 100 axes in one machine instead of 20, and if path control is possible at the same time, then we owe that to this increase in performance.

Another decisive development over the past 20 years has been the combination of functional areas, for example by integrating safety into standard control technology. As far as drive technology is concerned, new drive types such as our XPlanar, the levitating planar motor system that we nicknamed the flying carpet, and of course the eXtended Transport System (XTS), based on inverse linear motors, have been successfully introduced to the market. Fundamentally, I see specialized magnetic drive concepts as a trend for the future, because today they can be mastered algorithmically, which means that a great deal of mechanical effort on the machine can be replaced by software functionality.

Especially with regard to software, the last 20 years have also been the period in which the IT world and the automation world have moved even closer together. In the case of TwinCAT 3, for example, this has meant the integration of the various tool chains – Visual C/C++ and IEC 61131 – into Microsoft Visual Studio. A further advantage lies in the integration of MATLAB/Simulink and, building on that, of measurement technology and image processing. In short: I consider this consistent integration of functions originating from different areas, or even from different companies, into one software package to be one of the most important development trends of the last two decades.

All in all, automation technology has, in retrospect, become simpler and more cost-effective. Think, for example, of one-cable technology or the electronic motor nameplate – 20 years ago, these were either rare or non-existent. At the same time, the cost per axis in control technology has decreased by between 20 and 40 per cent over this period.

One topic that has been on the Beckhoff agenda for over six years, but for which Beckhoff has not yet presented a market-ready solution, is completely PC-based, freely programmable safety technology. There are two different aspects to consider here. First, we have been supplying hardware-based safety – i.e. the input and output terminals or safety logic terminals – for around 10 years now. These are freely programmable with a graphical editor and cover around 80 % of all standard safety functions. Second, we have decided to do without the dedicated safety hardware CPU and replace it with a purely software-based runtime. We have already developed the mathematical foundations and special compiler techniques required to do so. Internally, this is now a finished product – the only thing still missing is a simple graphical editor. It will be available by the end of next year, and then the official market launch will take place.

Industrie 4.0 is a complex topic. Let’s start with digitisation: digitisation is something that industry and the world have been experiencing since 1970. The further development of hardware and software concepts has permeated more and more areas of life – and thus also industry – with electronic data processing aids. In this respect, I don’t see a major leap in development, but rather a development that has been going on for a long time and is now accelerating. The fact that German industry is still very competitive shows that domestic companies have done their homework quite well in this respect compared with other countries.

Until recently, we found ourselves in the third industrial age, which – according to the Acatech model that, as we know, coined the term Industrie 4.0 in 2011 – is characterized by the local intelligence of machines. The fourth industrial age, which has just begun, is characterized by the fact that this local intelligence is combined with cloud intelligence. That, for me, is the core concept of Industrie 4.0 – machines that can ‘talk’ to each other via the cloud or call up services from the cloud and use them for processes on the machine. Conversely, a higher-level intelligence can also use the machines as an extended executing arm.

At Beckhoff, we can well imagine some machine intelligence shifting towards the cloud – we call this the ‘avatar concept.’ Examples are the control of a machine with speech recognition running in the cloud, or vibration analyses for predictive diagnoses, which do not have to be carried out online on the machine but can be performed offline in the cloud. Even today, we can ‘cloudify’ the entire PLC – depending on availability, bandwidth and achievable response times. With technologies such as 5G, a lot seems feasible here; however, the response times are still above 1 ms – so a packaging machine, for example, cannot yet be controlled in this way.
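As a rough illustration of the kind of offline vibration analysis mentioned above, the following Python sketch batches accelerometer samples and checks the spectral amplitude in a chosen frequency band. The sampling rate, frequency band, threshold and synthetic sensor data are all illustrative assumptions, not part of any Beckhoff product.

```python
# Illustrative offline vibration check for predictive diagnosis:
# flag a machine when the amplitude in a monitored band exceeds a limit.
import numpy as np

SAMPLE_RATE_HZ = 10_000            # assumed accelerometer sampling rate
DEFECT_BAND_HZ = (2_000, 3_000)    # assumed frequency band of a bearing defect
ALERT_THRESHOLD = 0.05             # assumed amplitude limit within that band

def band_peak_amplitude(samples: np.ndarray) -> float:
    """Largest spectral amplitude inside the monitored frequency band."""
    window = np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(samples * window)) * 2.0 / window.sum()
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE_HZ)
    mask = (freqs >= DEFECT_BAND_HZ[0]) & (freqs <= DEFECT_BAND_HZ[1])
    return float(spectrum[mask].max())

def needs_maintenance(samples: np.ndarray) -> bool:
    return band_peak_amplitude(samples) > ALERT_THRESHOLD

# Synthetic example: a quiet machine vs. one with a 2.5 kHz defect tone.
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE_HZ)
healthy = 0.01 * np.random.randn(len(t))
worn = healthy + 0.2 * np.sin(2 * np.pi * 2_500 * t)
print(needs_maintenance(healthy), needs_maintenance(worn))   # False, True
```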

Now we can make a projection and ask: what will communication look like in 20 years? Personally, I think that we will then be at around 100 Gbaud and, with the help of special switching and wireless technologies, will be able to reduce the response times for centralized applications to well under a millisecond. And so in 20 years, your colleagues will be able to write in retrospect: 2018 was the time when machines hesitantly began to talk to the cloud and retrieve services from it – today, this is completely normal!

The basis for intelligence on the machine is, among other things, the hardware. This will continue to be determined by Moore’s law in the next few years, so that in 20 years we will certainly be able to use computers on machines that are 100 times more powerful than today. That would mean that you could control 100 times as many axes or cameras, or operate a machine with many cameras 10 times faster. In this respect, we believe that, for example, the use of image processing systems on the machine – also as sensors and not just for workpiece evaluation – will increase dramatically.

On the other hand, as computing power and communication bandwidth increase, so do the cloud’s capabilities – at least by the same factor. Here, too, it is ultimately up to the engineer’s imagination to decide what can happen in this cloud. In this context, terms such as artificial intelligence (AI) and machine learning come up – topics that will certainly have repercussions on machine functionality not in 20 years’ time, but in the next two to three years. At Beckhoff, we have already set up a working group that investigates artificial intelligence algorithms for possible applications in automation – including path planning in robotics and sensor data fusion. The first results in these fields are very promising!
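To give a sense of what sensor data fusion means in its simplest form, here is a short Python sketch that combines two noisy measurements of the same quantity by weighting them with the inverse of their variances – the most basic special case of a Kalman-style update. The sensor types and accuracy figures are invented for illustration and do not describe the working group’s results.

```python
# Inverse-variance weighting: the more accurate sensor dominates the estimate.
def fuse(m1: float, var1: float, m2: float, var2: float) -> tuple[float, float]:
    """Fuse two measurements of the same quantity; returns (estimate, variance)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * m1 + w2 * m2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)          # fused estimate is better than either sensor alone
    return fused, fused_var

# Example: a laser distance sensor (accurate) and an ultrasonic sensor (noisy)
# both measure the position of a workpiece in millimetres.
laser = (102.1, 0.04)        # (reading, variance) – assumed sensor spec
ultrasonic = (104.8, 4.0)
estimate, variance = fuse(*laser, *ultrasonic)
print(f"fused position: {estimate:.2f} mm (variance {variance:.3f})")
```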

I have been asked whether the established automation technology manufacturers are now running the risk of losing their ‘piece of the pie’. I don’t think so. After all, the big IT companies – Google, Microsoft and SAP – are approaching the application level from above. In other words, they have introduced edge computing concepts that, in turn, can contain local intelligence as well as machine control intelligence. In this respect, the traditional machine control manufacturers are still way ahead in terms of their knowledge base, because automation technology is genuinely complex. So I’m not worried that Google might suddenly offer motion control or more complex measurement technology. And what’s more, the market is simply too small for these companies.

The large IT companies are primarily interested in the data because lucrative business models can be derived from it. Controllers or machine builders can supply this data.

As far as that is concerned, there will certainly be competition between automation suppliers and data processors. In addition, many machine end users have also developed their own strategies in this regard.

Let me put it positively: first of all, I think that concerns about data security are much more pronounced in Germany than in other countries. However, anyone who wants to develop successful business models in this area should put that fear aside and consider what could be gained from all the data. Within the German AI community, and even within the Federal Government’s ‘key issues paper on artificial intelligence’, there is a proposal to develop an anonymized general database into which personalized data can be imported and then made available anonymously, as a general data pool, for a wide range of possible uses.

There are also many other practical approaches: we have agreed with some of our customers, for example, that they occasionally run a test cycle on the machine that reveals nothing about what has just been produced. During this test cycle, data is recorded that can then be used for predictive maintenance.

There are solutions to the problem of data security. I would always recommend not putting too much emphasis on fear at first, but rather looking positively at the different options available.
