Vision sensors, the brain and intelligent data processing

At a certain age, around 70, our bodies begin to show signs of wear. What becomes apparent is that our built-in control loops and data processing software step in to compensate, covering the gaps as best they can with the degraded sensors and equipment still functioning.

Dual sensors – the eyes!

This first became obvious to me when I started to try to monitor the effects of glaucoma, which produces blind spots in the field of vision of each eye. This is not apparent in normal life, as what you see is the brain-processed image from two eyes. Where one eye has a blind spot, data from the second eye is used to complete the single image in the mind – which is why we do not notice the normal blind spots everyone has where the optic nerve leaves each eyeball. In the early stages of glaucoma, the smallish blind areas are only obvious when one eye is closed, so that the brain works from a single sensor. Even then, our own sophisticated data processing fills in each blind spot with a plain area in the same colour and style as its surroundings, so you think you see the whole panorama. With luck, opening the other eye adds detail from the second sensor to complete the picture.

 

The brain as an image store…

When glaucoma becomes severe, the blind spots from each eye begin to overlap, meaning that the processor has no data coming in from either of its two sensors for certain areas of your view. The processor then moves up a gear and fills the area with a plain colour, the same as the surrounding areas. Actually it tries harder than that: if you were viewing an array of books on shelves, the mind can insert a sort of composite image of books to fill the blind spot, drawing on past image information from when it last had an input from that area – when you last looked at that spot, perhaps. But the brain is not good at this, and in any case it is old data. The driving authorities do not allow glaucoma sufferers to drive cars, as a child, animal or bollard can disappear into a blind spot, replaced by an image of the surrounding tarmac road surface.
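As a data-processing analogy, the brain's fill-in strategy resembles a simple sensor-fusion rule: take valid data from either sensor where available, and fall back to the surrounding 'colour' where both are blind. A toy NumPy sketch (the array sizes, pixel values and blind-spot positions are invented purely for illustration):

```python
import numpy as np

def fuse(left, right, left_blind, right_blind):
    """Merge two 'retina' images, each with a blind-spot mask.

    Where only one sensor is blind, take the other's pixel;
    where both are blind, fall back to the mean of the pixels the
    left eye can still see -- a crude 'fill with the surroundings'.
    """
    fused = np.where(left_blind, right, left)   # patch left's gaps from right
    both_blind = left_blind & right_blind       # overlapping blind spots
    surround = left[~left_blind].mean()         # stand-in for surrounding colour
    return np.where(both_blind, surround, fused)

# Uniform grey scenes, slightly different exposure per eye
left = np.full((5, 5), 10.0)
right = np.full((5, 5), 12.0)
left_blind = np.zeros((5, 5), bool)
right_blind = np.zeros((5, 5), bool)
left_blind[2, 2:4] = True      # left eye blind at row 2, columns 2-3
right_blind[2, 3:5] = True     # right eye blind at row 2, columns 3-4

view = fuse(left, right, left_blind, right_blind)
```

Pixel [2, 2] is patched from the right eye; pixel [2, 3], blind in both eyes, gets the surrounding-colour guess.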

Astronomers already use a similar approach to refine telescope pictures of planets: they take multiple repeat images, each disturbed by atmospherics, vibration etc., select the sharpest parts of each, and combine them to produce an unblemished composite image.
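This 'lucky imaging' idea can be sketched in a few lines: score each frame with a focus metric (the variance of a discrete Laplacian is a common choice), keep only the sharpest fraction, and average those. A NumPy illustration using synthetic frames:

```python
import numpy as np

def sharpness(frame):
    """Variance of a discrete Laplacian: blurred frames have weak
    edges, so they score low on this focus metric."""
    lap = (-4 * frame
           + np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1))
    return lap.var()

def lucky_stack(frames, keep=0.5):
    """Average only the sharpest `keep` fraction of the frames."""
    order = np.argsort([sharpness(f) for f in frames])[::-1]  # sharpest first
    n = max(1, int(len(frames) * keep))
    return np.mean([frames[i] for i in order[:n]], axis=0)

# Synthetic data: one crisp frame plus a smeared copy of it
rng = np.random.default_rng(0)
crisp = rng.random((32, 32))
smeared = (crisp + np.roll(crisp, 1, 0) + np.roll(crisp, 1, 1)) / 3
best = lucky_stack([smeared, crisp], keep=0.5)
```

With only two frames and `keep=0.5`, the stack correctly selects the crisp frame and discards the smeared one.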

Cataract operation – a sensor upgrade

Cataracts affect the vision by making the image less precise, almost as if adding a fog. I have just had the right eye operated on, and a ‘clean’ plastic lens inserted to replace the old cloudy lens. This does not help the glaucoma, but it gives the processor a whole new set of problems. Having worn glasses for myopia for 60 years, I could discard them, as the new lens sees perfectly at distance. The brain still uses the two sensors, but for distance viewing it preferentially presents the sharply focused image from the right eye, suppressing the fuzzy, out-of-focus image still available from the left eye. For close-up views, maybe when reading, the left eye image is used, as the muscles of the right eye have lost some of their strength and cannot focus up close. So the brain switches sensors. All this happened straight away, with no learning time needed.

Lazy muscles?

It was more of a problem when my glasses came back, one lens replaced with plain glass. Maybe they had been distorted by the optician, but the two images were displaced vertically with respect to one another. Correcting this requires the eyeballs not to move together, as they do when tracking from side to side, but for one eye to move up with respect to the other. Not easy, particularly as the processor called on muscles unaccustomed to such work to operate separately. So there was a mechanical delay in the control loop of around a second whenever I moved my gaze onto a different subject. The brain was doing the job; it was just the body complaining! However, a better option here is to adjust the frame of the glasses to restore normal operations… maybe I “Should have gone to”… a better optician!

The next step?

But do I need to wear glasses at all … or can the brain do the job without them? People use this technique with contact lenses, working with one for long distance sight, and one for close-up work.

The above is based on an article supplied for the Journal ‘South African Instrumentation and Control’, published October 2018 by technews.co.za


Leaps in Technology, and the spin-off

Whilst pacemakers and other implants have become fairly commonplace in medical treatment, they still rely on battery technology, and so have a limited life. When electrodes or sensor devices are positioned carefully, sometimes deep in the body, the battery capsule is embedded under the skin to allow future access for replacement. A new development project at MIT, to be described more fully at an August conference, has produced a very small medical implant that can be powered and interrogated using radio frequency waves, even though it is deep within the body.

Medical devices that can be ingested or implanted in the body could offer doctors new ways to diagnose, monitor, and treat diseases. In their study, the researchers tested a prototype about the size of a grain of rice, but they anticipate that it could be made smaller. Giovanni Traverso, a research affiliate at MIT’s Koch Institute for Integrative Cancer Research, is now working on a variety of ingestible systems that can be used to deliver drugs, monitor vital signs, and detect movement of the GI tract.

In the brain, implantable electrodes that deliver an electrical current are used for deep brain stimulation, which is often used to treat Parkinson’s disease or epilepsy. Wireless brain implants could also help deliver light to stimulate or inhibit neuron activity through optogenetics.

In animal tests the researchers have shown that the radio waves can power devices located 10cm deep in tissue from a distance of 1m. Until now, this has been difficult to achieve because radio waves tend to dissipate as they pass through the body. To overcome that, the researchers devised In Vivo Networking (IVN), a system that relies on an array of antennas that emit radio waves of slightly different frequencies. As the radio waves travel, they overlap and combine in different ways. At certain points, where the high points of the waves overlap, they can provide enough energy to power an implanted sensor.
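The effect the researchers exploit can be illustrated numerically: sum several sinusoids at slightly different frequencies, and the combined field shows moments where the peaks pile up well above anything a single antenna delivers. The frequencies below are arbitrary illustrative values, not those used by the MIT group:

```python
import numpy as np

# Four 'antennas' at slightly different frequencies (illustrative values)
freqs_hz = np.array([1.30e9, 1.31e9, 1.32e9, 1.33e9])
t = np.linspace(0.0, 1e-6, 200_000)          # 1 microsecond, finely sampled

combined = np.sin(2 * np.pi * freqs_hz[:, None] * t).sum(axis=0)
single = np.sin(2 * np.pi * freqs_hz[0] * t)

# A single unit-amplitude wave never exceeds 1; the combined field
# briefly approaches 4 wherever the four waves drift into phase.
peak_single = np.abs(single).max()
peak_combined = np.abs(combined).max()
```

Those brief alignment peaks are the 'hotspots' that can deliver enough energy to power an implant, even though the average field stays low.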

Mobile phone developments

The ubiquitous mobile phone: in various past articles I have mentioned the spin-off effects of the technology behind telecommunications and the mobile phone in creating new industrial sensors, relying on the research and production capabilities developed for the devices required by that industry. These spin-offs include the rise of radar level measurement systems, the use of wireless in many industrial sensors, and the availability of many laser diodes, used for interferometry, liquid analysis etc.

Another major development is the liquid lens, used in these same mobile phones. This gets really personal: for the last 60 years I have been an avid aero-spotter, keenly watching light aircraft arrive at our local airport, using a telescope to identify them and then, at or near the airport, long and heavy telephoto lenses to photograph them. Later, I collected antique telescopes, manufactured from 1780 to maybe 1850, as these were still really the best quality optical systems, despite modern (commercial) developments. Again, long and heavy things.

But along came the liquid lens. This is a very small lens device, now commonly used in iPads and mobile phones. The liquid droplet forming the lens has its shape changed electronically by a control system, which can change the focal length (to focus) and shift the optical axis (for optical image stabilization, ie to reduce camera-shake effects) – all within a few milliseconds.

The idea for this invention came from research on the phenomenon known as “electro-wetting” by Professor Bruno Berge, in Lyon, France, with the original patents issued in 2002. Prof Berge had worked on liquid interfaces from 1991 at the Ecole Normale Supérieure in Lyon. A drop of water affected by electro-wetting can function as a variable magnifying glass: two clear, immiscible liquids of the same density, one of them water whose shape is controlled electrically, form a lens whose power depends on the curvature of the interface between them. The two liquids are sealed and held in a metal casing that is typically smaller than 10mm in diameter.
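As a rough model, the droplet boundary acts as a single refracting surface between two media, so its focal length follows f = R / (n2 − n1): electro-wetting changes the radius of curvature R, and with it the focal length. The refractive indices below are plausible placeholders (water about 1.33, an insulating oil about 1.48), not VariOptic's actual figures:

```python
# Single refracting spherical surface between two media:
#   power = (n_oil - n_water) / R   =>   f = R / (n_oil - n_water)
n_water = 1.33   # assumed index of the conducting (water) side
n_oil = 1.48     # assumed index of the insulating oil

def focal_length_mm(radius_mm):
    """Focal length of the liquid-liquid interface, in mm."""
    return radius_mm / (n_oil - n_water)

# Electro-wetting flattens or bulges the droplet, i.e. changes R;
# a negative radius means the curvature (and the lens) has reversed.
for r_mm in (5.0, 10.0, -10.0):
    print(f"R = {r_mm:+5.1f} mm  ->  f = {focal_length_mm(r_mm):+7.1f} mm")
```

Because the index difference is small, a modest change in droplet curvature sweeps the focal length over a wide range, which is what makes the device so fast and compact.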

Berge first approached Canon cameras with the invention, but attracted no funding. So with French state funding, and investment fund backing, Berge founded the company VariOptic in 2002. In 2007 they established a production line in China, and in 2009 the first industrial barcode reader with a VariOptic lens appeared on the market. Machine vision manufacturer Cognex was an early adopter of the technology, for barcode ID readers.

A new module now available from IDS (Imaging Development Systems) is a single-board USB interface camera, designed for use with, and control of, liquid lenses. These low-cost uEye LE industrial cameras, with a twist-proof USB Type-C connection and practical USB power delivery, are quoted as being of interest for logistics systems (eg package acceptance and sorting), for microscopy and traffic monitoring, and for installation in small medical or industrial devices.

So, I am still waiting for a lightweight long focal length telephoto ‘liquid’ lens for my Canon camera. Maybe not the telescope – for as I pointed out to Prof Berge, one of my favourite telescopes dating from the 1790s was made by Matthew Berge, his namesake!

The full story about the Prof Berge development of liquid lenses was first reported by me as the very first blog post on www.telescopecollector.co.uk, back in December 2013.

This article was first published in the South African journal of Instrumentation and Control issue of August 2018, published by technews.co.za

Plant control systems and the internet

The following is my personal view of the business planning quandary faced by the major automation companies, first expressed in a Comment page published by Technews.co.za in the South African Journal of Instrumentation and Control, SAIC, March 2018 issue:

It is a common saying that the pace of technology change accelerates with time: although possibly as the observers get older, they become set in their ways, and cannot keep up.

This is certainly true, in my experience: I am getting older, set in my ways, and struggle to keep up. However:

It is not only the pace of such changes, but the speed at which the changes are spread across the ‘world market’, that makes new technologies so rapidly applied and, sometimes, profitable. In consumer markets, the effect is most evident, with the spread of mobile phones and mobile computing: possibly this would all not have come to pass without the availability of the Internet fuelling the spread of information. But for automation, and industrial sensors, has the technology change been rapid? I believe it has, and believe it is now accelerating ever faster, taking advantage of the advances made to meet the demands of other users. This has been evident, and mentioned in these columns, in referring to wireless sensors, batteries for self-powered devices, and self-power from solar or vibration or heat energy. There are many more developments that should be included in that list.

The problem for Automation companies

But how are the major sensor and automation companies driving this growth into their businesses using advances in technology: what are they researching? Where are they investing to get a business advantage? I think that their business planners are having a difficult time at the moment.

Around ten years ago, the big new technology coming to the fore was wireless communication from battery powered sensors. The large automation companies, like Emerson and Honeywell, invested heavily into this technology, and there was the inevitable confrontation between two rival systems – WirelessHart and ISA100. The automation marketplace thrives on such confrontations, for example the spat between Foundation Fieldbus and Profibus. It happens in other markets too; think of Blu-Ray and standard DVDs, PAL and NTSC TV systems etc.

Other perceived growth areas

After the wireless investments blossomed, the Internet was looming, and everyone believed they had to take advantage of the data that could be collected, and networked. Certainly Emerson and ABB went heavily into power network control systems, but ABB had major product availability and systems installation capability in the power industry and has made real progress. Emerson eventually sold out of this network power business, but retains the Ovation DCS used for thermal power station control on site.

Automation companies also bought into the long-established, relatively dormant and slow market of condition monitoring systems, by acquiring the companies said to be ‘active’ in the field – those with the ‘black art’ knowledge of industrial condition monitoring. Personal experience back in the ‘70s taught me what a hard sell and difficult market even the simpler condition monitors offer, monitoring bearing wear etc, and that hardly suits the major project potential that might interest big contractors. Complex systems, such as those applied to turbines in power stations, did offer potential, but needed real specialist back-up.

Additionally, the people already in the business, such as Schaeffler (once again, product suppliers with an existing customer base), slowly developed their own bearing monitoring systems, ranging from portable hand-held units to bigger wired/wireless systems – these are the ones I believe will succeed in this market. An alternative approach was based on wireless technology developments, which needed a central monitoring system – the ultimate goal for the automation guys. Sensors for steam trap monitoring were designed by majors such as Emerson, to expand their plant control systems into condition monitoring for the plant engineers.

Sure enough, after a slower start, steam trap companies such as Anderson (US) and Spirax Sarco (UK) developed their own systems, and had the market entry with the customers using their traps. The opposite approach was adopted by Yokogawa, which is the pioneer of ISA100 industrial wireless systems. They created alliances with people like Bently Nevada, the bearing condition monitoring sensor people, and with Spirax Sarco on steam traps. Maybe this was to be able to reverse sell them the back-up products and technology for wireless systems, or maybe to hope for the potential of a plant monitoring control system supply.

Software systems

Most of the automation majors have alliances with the large software and computing companies, like Cisco and HP. The current approach seems to be to use these alliances to piggy-back a 24/7 plant monitoring system onto the Internet, supplied as a service across the world. Again, I believe the companies with the product on the ground – the stuff that needs monitoring – will be the major players. Here GE, monitoring its own brands of refrigeration compressors, large pumps and gas turbines at power stations, offshore etc., looks best placed.

The future

The quandary is where the Internet will help the industrial control systems and sensor suppliers expand their businesses in the future. The answer deduced above is to stick to what you know and what you are known for. The irony is that the major with the best potential now is Rockwell Automation, with its systems based around Ethernet communications, interfacing with anything, plus its onsite Ethernet hardware and control systems already configured to deal with such varied inputs. Maybe this was why Emerson made an abortive take-over offer for Rockwell late last year. The potential has also been seen by Profibus, who are pushing forward with their Profinet, and where they go, Siemens will always be in the background.

Confusion over radar level measurement

We have learned not to get too confused over suppliers using buzz-words and clever marketing names, but recently it seems the major level measurement system vendors have been introducing new and higher radar frequency systems as their latest development – and therefore, by implication, maybe the best. We were used to 6 GHz, and then 26 GHz radar frequencies, but why should we suddenly go to 80 GHz? Then, perhaps just to add a little excitement to the mix, Endress+Hauser started talking about 113 GHz!

The E+H radar line up that offers 113GHz!

This article was first featured in the journal South African Instrumentation & Control in September 2017, a journal published by Technews

Let’s dispel a few myths. Firstly, in the same way that lasers for fibre-optic communications made it possible to create infrared optical systems for process gas analysers, and mobile phone technology arguably provided the hardware for the first radar level measurement systems, the 80 GHz versions are the result of measurement technology made commercially viable on the back of production investment in the distance measurement systems and parking sensors used in modern cars. The suppliers take the available sensors and chipsets to create a new industrial product, and then have to find the best applications – in this case, the ones that might benefit from 80 GHz.

Secondly, E+H do not have a 113 GHz system: this is a marketing statement, made to catch attention – ‘with a wink’ is their expression. They claim a ‘complete radar competence of 113 GHz’ because this is the sum of the different frequencies their various sensors use: 1, 6, 26 and 80 GHz.

So why have different frequencies?

Possibly the best explanation for the applications suited to the different frequencies has been provided by the Rosemount measurement division of Emerson, in their “Engineer’s Guides”. The Emerson expertise stretches back many years, having acquired the Saab Tank Radar business. Per Skogberg, from the Gothenburg HQ in Sweden, separates the devices into low, medium and high frequency, to generalise.

Radar signals are attenuated, i.e. they lose signal strength, as they pass through the air or vapour above the liquid. Higher frequencies are more severely affected than lower ones. When the air contains moisture, steam or liquid droplets (from spray or filling), the attenuation is higher. Equally, in solids applications, dust particles have the same effect. So low and medium frequency radar are best when dust or moisture is present.

At lower frequencies, the wavelength is longer (30-50 mm), so surface ripples in a tank have a small effect. At higher frequencies, surface ripples and foam on the surface can be a problem. But the shorter wavelength of the high frequency units (4 mm) allows accurate operation over short ranges, for example in small tanks. The higher frequency units can use a smaller sensor construction, so the unit is easier to install. The beam angle is narrower, so it can be aimed at a smaller target area, and therefore can be positioned more easily to avoid any obstructions in the tank. But even this can be a disadvantage: the installation needs to be exactly vertical, and in larger tanks any turbulence of the surface during filling or stirring can cause the signal to be lost temporarily.
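These trade-offs follow from two rules of thumb: wavelength = c / f, and a half-power beam width of roughly 70·λ/D degrees for a circular antenna of diameter D. A quick check of the figures quoted above (the 40 mm antenna diameter is an assumption for illustration):

```python
C = 2.998e8  # speed of light, m/s

def wavelength_mm(freq_ghz):
    """Free-space wavelength in mm for a frequency in GHz."""
    return C / (freq_ghz * 1e9) * 1e3

def beam_width_deg(freq_ghz, antenna_mm):
    """Rule-of-thumb half-power beam width, ~70 * lambda / D degrees."""
    return 70.0 * wavelength_mm(freq_ghz) / antenna_mm

for f_ghz in (6, 26, 80):
    print(f"{f_ghz:>2} GHz: wavelength {wavelength_mm(f_ghz):5.1f} mm, "
          f"beam {beam_width_deg(f_ghz, 40.0):5.1f} deg (40 mm antenna)")
```

So 6 GHz gives the ~50 mm wavelength and 80 GHz the ~4 mm wavelength mentioned above, and for the same antenna size the 80 GHz beam is more than ten times narrower.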

When reading these suggestions, it is important to remember that Emerson does not offer an 80 GHz unit yet, so their marketing approach would naturally bias users to look at low and medium frequency units. The suppliers of high frequency units (Vega, Krohne and E+H) would point out that in many liquid storage tanks the surface is undisturbed, since any foam, turbulence and significant ripples (>2 mm) caused by filling or liquid transfer will only cause short-term interference. Plus the small antenna size and short range performance make 80 GHz units very useful for smaller process vessels and tanks.

Radar system types

There are two types of radar systems: Guided Wave Radar (GWR) and Free Space Radar. GWR systems use a conducting rod, or similar, extending down into the liquid, often working in a stilling chamber attached to the main process tank. These operate at low microwave frequencies, and are independent of surface turbulence and foam. They are useful for shorter range measurements and for interface measurement between liquids, as well as for longer ranges.

The Free Space Radar systems are more widely used, since they are top-mounted with nothing in the tank: indeed, some can operate through non-conducting windows in the tank roof. Low and medium frequency radar systems generally transmit a signal pulse and measure the liquid distance by the time delay for the returned pulse. High frequency (80 GHz) systems use an FMCW radar measurement, where the frequency of the transmission is swept, and the frequency difference of the returned signal is measured to assess the distance. The FMCW technique is also used at 26 GHz in some recently launched sensors.
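The FMCW arithmetic reduces to one formula: the echo returns after a round-trip delay 2d/c, during which the transmitted frequency has moved on by f_beat = (B/T)·(2d/c), so d = c·f_beat·T/(2B). The sweep bandwidth and time below are illustrative values, not any vendor's specification:

```python
C = 2.998e8   # speed of light, m/s
B = 2.0e9     # assumed sweep bandwidth: 2 GHz
T = 1.0e-3    # assumed sweep duration: 1 ms

def beat_hz(distance_m):
    """Beat frequency produced by a surface at the given distance."""
    return (B / T) * (2.0 * distance_m / C)

def distance_m(f_beat_hz):
    """Invert the sweep: distance from the measured beat frequency."""
    return C * f_beat_hz * T / (2.0 * B)

fb = beat_hz(10.0)   # liquid surface 10 m below the antenna
```

Measuring distance thus becomes a frequency measurement, which is why FMCW units can resolve millimetres over tens of metres, at the cost of the extra signal processing (and often extra power) noted below.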

Radar systems can transmit their measurement data using 4-20 mA, fieldbus systems like HART, FF, Profibus PA and Modbus, or indeed via wireless systems like Bluetooth. The low and medium frequency pulsed radar systems generally operate over a two-wire interface: some of the higher frequency FMCW systems require more power and use a separate power connection.
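For the 4-20 mA option, the level is recovered by scaling the 'live zero' loop current linearly across the calibrated span; currents well below 4 mA conventionally signal a fault (NAMUR NE43-style limits). A sketch, assuming a hypothetical 8 m tank span:

```python
SPAN_M = 8.0          # assumed calibrated measuring span
LIVE_ZERO_MA = 4.0    # 4 mA represents an empty tank
FULL_SCALE_MA = 20.0  # 20 mA represents a full tank

def level_from_current(current_ma):
    """Convert a 4-20 mA loop current to a level in metres."""
    if current_ma < 3.8:   # below the live zero: treat as a loop fault
        raise ValueError(f"{current_ma} mA: loop or sensor fault?")
    fraction = (current_ma - LIVE_ZERO_MA) / (FULL_SCALE_MA - LIVE_ZERO_MA)
    return fraction * SPAN_M

print(level_from_current(12.0))   # mid-scale current -> 4.0 m
```

The offset zero is the point: a broken wire reads 0 mA, which is unambiguously a fault rather than an empty tank.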

Major applications

Simple low-cost radar level measurement sensors have been specifically designed for water industry use, in sewage sumps and flume flow measurement, by Vega and Endress+Hauser. Vega suggest that 40,000 such sensors are now in use in the water industry, mainly in Europe, and claim their total output of such sensors exceeds 550,000 units over the last 25 years.

Several of these devices use simple Bluetooth interrogation and programming from a handheld PDA: E+H demonstrates this at its facility in Maulburg, working on the stream that runs through the factory complex, as seen below.

Micropilot FMR10/FMR20 on the test stream at Maulburg, with an operator using Bluetooth

Both E+H and Vega produce further industrial units for use on process vessels, and on storage vessels for solids and liquids. Recently, E+H has extended its capability with long-range units, such as the 80 GHz FMR62, working at up to 80 m range with an accuracy of 1 mm. Other units work up to 125 m range, at 3 mm accuracy. These units are ultimately aimed at the large petrochemical industry storage tank markets, and are specifically being developed towards custody transfer duties.

Krohne has similarly announced a new range of 80 GHz Optiwave sensors. Some of these can operate at up to 700°C, for example on molten salt vessels in solar power plants. Lower specification units, rated at up to 150°C, can be used through a tank roof made of plastic or similar materials. Suitable for small or narrow tanks, the units can measure ranges of up to 100 m. Krohne also offers lower frequency Optiwave systems for use on solids and powders, or to electronically monitor the float position in magnetic level indicator columns attached to process vessels.

Postscript: Krohne is organising a webinar with the title “80 GHz Radar Level – Allrounder or Overrated?” to discuss their recent developments with such systems. This webinar will take place on 18th October 2017 at 3pm London time/10am New York time.

The value of Specialist Automation Suppliers

Engineers around the world are looking at how to benefit from the various solutions to the IIOT on offer: the article posted on 2 February entitled “How DCS Vendors see their IIOT future” covered the approaches being adopted by some of the major DCS vendors. This follow-up article, written for and first published in South Africa, in the Technews South African Instrumentation & Control Journal, March 2017, covers the approach of some of the smaller, specialist suppliers to their own selected sectors of the process industries.

While the major DCS suppliers try to work out how to provide revenue earning services from the growth of the IIOT, there are many specialist engineering product and systems suppliers who are investing in making their products easier for engineers to use in networks, and operate within the IIOT.

Most of these specialists are primarily focussed on the production of their valves, sensors, controllers or drives: this is their business – and they need their products to work with any interface the customer requires. Their expertise in interfacing their own products is the best available; they have an in-house systems knowledge base and capability. Most now offer this capability to would-be product users as a service – offering a custom designed system incorporating their products. So look to these suppliers to offer the best engineering at an economic price, within their specialist field.

Typically these single-minded companies were set up by a design engineer with a good original product idea, which has been developed and refined over the years. Often the company is family owned – and engineering / R&D investment takes precedence over profit distribution. Some such companies still exist in the USA, and a few in the UK, like JCB and Rolls-Royce. Several specialist engineering product examples are found in suppliers originating from Germany, Scandinavia and middle Europe, where the culture seems to have encouraged their survival.

Beckhoff Automation

Arnold Beckhoff started his company in 1953: Beckhoff Automation now has a turnover of Euro 620 million, and employs 3350 people. The company implements open automation systems based on PC control technology, scalable from high performance Industrial PCs to mini PLCs, I/O and fieldbus components, plus drive technology and automation software. Supplying systems to many industries, Beckhoff works with and supplies components for over 15 major fieldbus systems. Motion control solutions solve single and multiple axis positioning tasks, and their servomotors offer combined power and feedback over a standard motor cable.

The Beckhoff TwinCAT 3 engineering and control automation software integrates real-time control with PLC, NC and CNC functions in a single package, and all Beckhoff controllers are programmed using TwinCAT in accordance with IEC 61131-3. The built-in TwinCAT condition monitoring libraries allow the on-site controllers to monitor the status of the sensors, reducing downtime and maintenance costs, and also allow wider comparisons through connections to cloud services such as Microsoft Azure or Amazon Web Services. Other data connections are available; for example, a smartphone app enables immediate local and mobile display of a machine’s alarm and status messages.

Bürkert Fluid Control Systems

Bürkert was founded in 1946 by Christian Bürkert: it now has sales of Euro 412 million and employs over 2500 people. The product base is gas and liquid control valves, systems for measuring and controlling gases and liquids, plus sensors for monitoring such fluids, extending to complete automation solutions and fluid systems – this capability is known as their ‘Systemhaus’. While their products are now applied across many industries, their particular specialisations have been in sanitary, sterile and hygienic applications (food, beverage, biotech and pharmaceuticals), micro applications (medical, inkjet and beverage mixing/vending), and water treatment industries.

From the UK operation, Bürkert provide locally engineered solutions and systems for their pharma, food and brewery customers in particular. Locally made craft beers are a major growth area in the UK, and most producers start small, with no real automation. One example was Stroud Brewery, which needed to expand production by a factor of five, preferably without increasing staff numbers: Bürkert designed a PLC system and intelligent control panel, which automated the temperature control of the cold and hot liquor tanks and of the mash pan. In addition, a system for controlling the run-off rate from the mash tun simply uses three separate Bürkert level sensors.

Bürkert have also developed their own ‘Device Cloud’, which they call ‘mySITE’. This collects data from Bürkert sensors around the world, using an on-site interface known as mxConnect – which can also accept data inputs from other sensors.

National Instruments

National Instruments was only started in 1976, in the USA, by Dr James Truchard and a colleague, who are still involved in the business. Now sales are $1320 million, and they have 7400 employees worldwide. Their declared Mission is to “equip scientists and engineers with systems that accelerate productivity, innovation, and discovery” – and their focus has always been to supply research establishments and engineers with open, software-centric platforms with modular, expandable hardware. This gives its own logistics problems, with 35,000 customers served annually.

It is difficult for me, as an outside observer, to relate the NI systems to an oil refinery or chemical plant application: but they come into their own when the data handling grows in complexity – for example in pharmaceutical and biotech applications, and in the sort of plants where engineers have a major input in monitoring the application. Mention a cyclotron or Tokamak, CERN or the Large Hadron Collider, and NI and its LabVIEW are embedded in the engineering control systems. All 108 collimators on the LHC are position controlled using LabVIEW.

National Grid UK, which controls the distribution and transmission of electric power round the country, has adopted a control system based on the NI CompactRIO for the whole network. With many new power generating sources, HVDC connections, variable inputs from solar and wind farms, and the phasing out of major fossil fuelled plants, National Grid found that traditional measurement systems did not offer adequate coverage or response speed to handle these new challenges and risks. They adopted a platform, based on the CompactRIO, to provide more measurements – and also adapt with the evolving grid for generations to come. This interconnected network includes 136 systems, with 110 permanently installed in substations throughout England and Wales and 26 portable units that provide on-the-go spot coverage as needed.  The associated software systems provide their engineers with customized measurement solutions that can be upgraded in the future as new grid modernization challenges arise.

In terms of IoT developments, NI has just opened an Industrial IoT lab at the NI Austin HQ in the USA, to focus on intelligent systems that connect operational technology, information technology and the companies working on these systems. Many other companies are co-operating in this venture, like Cisco and SparkCognition, and the lab intends to foster such collaboration to improve overall interoperability. In addition NI has partnered with IBM and SparkCognition to collaborate on a condition monitoring and predictive maintenance testbed: this will use the SparkCognition cognitive analytics to proactively avoid unplanned equipment fatigue and failure of critical assets.

(c) Nick Denbow 2017

How DCS Vendors see their IIOT future

Engineers around the world are looking at how to benefit from various IIOT offerings: the survey below covering the approaches being adopted by some of the major DCS vendors was first published in South Africa, in the Technews South African Instrumentation & Control Journal, February 2017. Next month a similar article will cover the approach of some of the specialist suppliers to the process industries.

The last year saw all the major DCS and process control system suppliers re-assess their business positioning, in the face of the downturn in capital spending caused by the continuing recession and the fall in commodity prices, led by oil. Their problem is that their main business cycles between feast and famine, as it depends on investment project business. Harry Forbes of ARC Advisory Group notes that automation companies will do nearly anything to protect their installed user base, because that is where they believe future revenues will come from – and come more easily than from winning new projects. So the way to survive the famine is to provide on-going services to these asset owners, to maintain the business relationship and be better positioned when capital investment returns. It also stops competitive suppliers gaining a foothold via similar service contracts.

The current area of interest for most manufacturing plants is the IIOT, so the automation vendors have been focusing on this, plus Big Data and analytics, offered as remote 'cloud-based' services. The different suppliers come from different market positions, so their approaches, while offering much the same capabilities, are tailored in different ways.

Emerson Automation Solutions

Peter Zornio of Emerson expressed his very clear view of this market back in April at the Emerson Global Users Exchange in Brussels. For Emerson, the IIOT is just 'Manufacturing': it does not include the 'Smart Cities' that Siemens and ABB talk about, nor Industrie 4.0, which extends from production back up into design concepts. I believe Emerson also recognise that their process control systems cannot be part of the IIOT: they must be fenced off with firewalls and the like, blocked from external inputs, to prevent cyber-security worries. But this does not stop them transmitting information outwards, and the whole Emerson approach of 'Pervasive Sensors' – their major new topic for 2015 – is now an important feed into IIOT analytics.

The resulting offering is a cloud-based service developed in co-operation with Microsoft, using their Azure IoT Suite of cloud services. Emerson has worked with Microsoft for over 20 years, and Windows 10 IoT technology will be incorporated into both the DeltaV and Ovation control systems, and into data gateways that serve plant data to the Azure IoT Suite. Emerson will then provide the data analysis services that feed information and recommendations back to the relevant plant personnel, for example about plant performance or equipment maintenance. Zornio described this as a remote service similar to the 'Monitoring Centre' typical of the electricity generation industry, or the 'iOps centre' typically described in the oil and gas industry – which shows the areas of focus for the Emerson control system business.
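As a rough sketch of the pattern described here – plant data flowing one way, out through a gateway, into a cloud analytics service – the snippet below packages sensor readings as an outbound-only telemetry message. It is purely illustrative: the function, tag and field names are assumptions for this article, not Emerson's or Microsoft's actual APIs.

```python
import json
import time

def build_telemetry(plant_id: str, readings: dict) -> str:
    """Package plant sensor readings as a one-way, outbound telemetry
    message. The gateway only publishes data; it accepts no inbound
    commands, keeping the control system itself 'fenced off'."""
    return json.dumps({
        "plantId": plant_id,
        "timestamp": time.time(),
        "readings": [{"tag": t, "value": v} for t, v in readings.items()],
    })

# A gateway might send one such message per scan cycle
# (hypothetical temperature and pressure transmitter tags):
msg = build_telemetry("site-01", {"TT-101": 74.2, "PT-205": 3.1})
```

The point of the design is the direction of flow: the control network pushes snapshots outward, and any recommendations come back to plant personnel, not to the controllers.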

Since then, Emerson has restructured its widely separated divisions, Process Management and Industrial Automation, into one business, Emerson Automation Solutions, under newly appointed president Michael Train. This brings in some of the factory automation aspects covered by the old Industrial Automation Division, and extends the potential for the same IIOT monitoring into other areas of the manufacturing plant, such as power supplies, packaging and even discrete manufacturing. However, as part of the restructuring, Emerson has sold off significant parts of what was the Industrial Automation business, bringing in significant amounts of cash. In December the Network Power business, serving mainly data centre and telecommunications customers, was sold to Platinum Equity for $4Bn: the business will be rebranded 'Vertiv'. Then, just this month, the deal to sell the alternators, drives and motors businesses known as Leroy-Somer (France) and Control Techniques (UK) to the Nidec Corporation was finalised: their combined annual sales were $1.7Bn, but, of more relevance to Emerson now, the cash payment received from Nidec is $1.2Bn. So Emerson Automation Solutions has probably earmarked at least part of that $5.2Bn of cash for some interesting, relevant acquisitions, maybe in this IIOT services area.

Rockwell Automation

Rockwell Automation has a totally different customer profile, perhaps the reverse of Emerson's, with great strength in factory automation, food processing and discrete control in general. Their product portfolio is strong in motor control, actuators, energy management and the like, using Ethernet-based systems and controllers, which give simple interfaces to remote data systems. Steven Meyer of SAIC reported that Rockwell's South African MD, Barry Elliot, commented at the Electra Mining show that the challenge is 'to do more with the assets the organisation already owns'. He added: "In most cases the data already exists: our challenge is to implement systems that enable us to turn this into actionable information to streamline productivity and efficiency." Just what the customer audience wanted to hear.

In November Rockwell launched their 'FactoryTalk Analytics for Machines' cloud application, based on the Microsoft Azure cloud capability – yes, them again! OEMs using Rockwell/Allen-Bradley controllers on their machinery can embed a FactoryTalk Cloud gateway device to interface to this Rockwell remote analytical service. Back at corporate level, the new Rockwell CEO is Blake Moret, and his attention is also on developing the oil and gas process systems business, which was actually doing well within Rockwell but is smaller than that of rivals like Emerson: so he has acquired Maverick Technologies, one of their system integrator customers. First, this gives Rockwell access to Maverick's five years of experience in supplying remote operations support as a service. Second, Walt Boyes of the Industrial Automation Insider has pointed out that Maverick has craftily recruited many otherwise-retiring process experts from first-tier companies such as Dow, DuPont and ExxonMobil, amassing a couple of hundred very valuable grey heads with continuous process management expertise. These are very useful for remote service support and advice, supplied even from their retirement homes!

ABB and IoTSP

Maybe ABB will have an alternative approach? ABB has a concept described as the Internet of Things, Services and People (IoTSP). Last year they joined the Steering Committee of the Industrial Internet Consortium, an organisation founded by AT&T, Cisco, General Electric, IBM and Intel in 2014. Then in September they recruited Guido Jouret as their 'Chief Digital Officer' – he was at one time the General Manager of the Cisco 'Internet of Things' division. October, however, brought them back into line with Rockwell and Emerson, when their new ABB Ability offering was announced as standardised on Microsoft Azure, "expanding the ABB leadership in energy and the fourth industrial revolution": ABB will take "full advantage of Azure services such as Azure IoT Suite and Cortana Intelligence Suite to capitalise on insights gathered at every level from device, to system, to enterprise, to cloud". Although ABB say they have had many years of successful collaboration with Microsoft, from the website it appears Ability is a new venture – looking for applications in transport infrastructure, digital power substations, fleet management services, smart buildings and the like.

Yokogawa

Yokogawa started 2016 with two acquisitions: first, 'Data-as-a-Service' provider Industrial Evolution Inc, who provide cloud-based plant data sharing services, followed by KBC Technologies, who specialise in offering oil and petrochemical production plants the advanced software needed for process optimisation and simulation. The two were combined to create their new Industrial Knowledge Division. Executive VP Satoru Kurosu commented that "Key strategic objectives of Yokogawa's Transformation 2017 plan are to expand the solution service business, focus on customers, and co-create new value with customers through innovative technologies and services".

They then followed up with a strategic investment in FogHorn Systems Inc, a Silicon Valley specialist in fog computing – processing IIOT data close to its source, rather than sending it all to the cloud, for faster response. At the year-end, Yokogawa made further significant investments in IIOT technology: first a $900k investment in Bayshore Networks, who specialise in cybersecurity and have developed the cloud-based Bayshore IT/OT Gateway, separating IT department networks from OT (Operational Technology) infrastructure networks. In addition, Yokogawa announced the establishment of a new Architecture Development Division in California, to pursue the development of the core technologies needed to establish the robust and flexible architecture required to improve operational efficiency and productivity when using the IIOT. Their aim is to expand this US engineering centre to over 50 staff in the next five years.

In February 2017 Yokogawa published their own release describing how these businesses will work together, and introducing another co-operation, with Telit IoT Platforms LLC, who are said to offer "unmatched expertise, resources, and support to make IoT on-boarding easy – reducing risk, time to market, complexity, and costs for asset tracking, remote monitoring and control, telematics, industrial automation, and predictive maintenance across many industries and vertical markets worldwide". The most interesting aspect of their approach is that they seem to be moving towards 'plug-and-play' technology that enables sensors to automatically join and adapt to plant networks, plus cloud reporting and condition monitoring, making the plant engineer's job a lot simpler!

Obviously Yokogawa have major ambitions to develop and offer IIOT cloud data services with the best in technology and cybersecurity, all while requiring much less detailed input from the customer.

Developments in South Africa

With so many major suppliers stepping up to offer cloud based IIOT data analysis and reporting services, what do the plant managers do? Steven Meyer’s report on the recent conference on the topic organised by the African branch of the Manufacturing Enterprise Solutions Association highlighted the recent PricewaterhouseCoopers report showing that South African companies plan to spend around R6Bn per year, until 2020, to implement the ideas of the fourth industrial revolution. In a keynote speech, local PwC director Pieter Theron made the telling comment that companies will need to find the right collaboration partners in order to improve their business efficiency through the technologies of the fourth industrial era – very few have the capability to go it alone.

These comments ring true for many large businesses all around the world, and it is clear that there are several interesting potential partners for these prospective IIOT users to evaluate!

©Processingtalk.info

The latest Robots are Friendly

We all know what a robot is. But the image that springs to mind depends on whether you think first of 'sci-fi' films, or paint spray booths, or welding on automotive production lines, or stacking in automated warehouses. These have been the big applications, in big automated factories, with around 240,000 robots sold last year. The article below was written for a column published in the November issue of South African Instrumentation & Control; see a digital copy at http://www.instrumentation.co.za/archives.aspx

The emergence of the cobot supplier

However, there is a new breed of robot now: collaborative robots, or cobots, which have only really emerged as practical devices in the current decade. A cobot is a robot intended to interact physically with humans in a shared workspace, so the safety pens and protective light curtains around the robot operating area are gone. The cobot is designed to work alongside a human operator, typically lifting the heavier items involved in, say, electronic device assembly operations: it has smooth surfaces with no sharp edges, and protected joints, so a human working alongside cannot trap their fingers, and it stops at the slightest external touch.

Additionally, the cobot is flexible: it can be trained (taught) by the assembly operator, who guides its arms and grippers to show it what to do. Currently the cobot market is around 5% of the total robot market – some $100m last year. These robots are lower in cost, say $24,000 each, and are aimed at the small to medium-sized companies that account for 70% of global manufacturing, where flexibility is essential. New international standards for their safe design and use are emerging, and there are many suppliers, as the market is forecast to reach $1Bn by 2020.
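The 'teach by guiding' idea can be sketched in a few lines: while the operator moves the arm, the controller samples the joint positions as waypoints, and in playback it steps through them again. This is a hypothetical illustration only – the class and method names are invented, and a real cobot controller would interpolate smoothly between waypoints and enforce force and speed limits throughout.

```python
class TeachableArm:
    """Toy model of a cobot arm taught by guiding (names are invented)."""

    def __init__(self):
        self.waypoints = []          # taught positions, in order
        self.joints = [0.0] * 6      # six joint angles, radians

    def record(self):
        """Sample the current joint positions as a waypoint
        (called while the operator physically guides the arm)."""
        self.waypoints.append(list(self.joints))

    def playback(self):
        """Replay the taught waypoints in order; a real controller
        would interpolate between them rather than jump."""
        for wp in self.waypoints:
            self.joints = list(wp)
            yield wp

arm = TeachableArm()
arm.joints = [0.1, 0.2, 0.0, 0.0, 0.0, 0.0]   # operator guides arm here
arm.record()
arm.joints = [0.5, 0.2, 0.3, 0.0, 0.0, 0.0]   # ...then here
arm.record()
taught = list(arm.playback())                  # replay the taught path
```

Re-teaching a new task is then just clearing the waypoint list and guiding the arm again – which is exactly why cobots suit small firms where the product changes often.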

ABB’s YuMi

One such product is the ABB YuMi ('you and me') desktop robot: a dual-arm small parts assembly robot that has flexible hands, and incorporates parts feeding systems, camera-based part location and automated control – yet it has twice the reach and more strength than a human operator. It can collaborate side-by-side (or across the bench) with humans in a normal manufacturing environment, enabling companies to get the best of both humans and robots, working together.

In April, the ABB YuMi was recognised for outstanding achievements in commercialising innovative robot technology with the prestigious Invention and Entrepreneurship Award at the Automatica trade fair in Munich. There followed a Golden Finger award as 'one of the best industrial robots of 2016' at the China International Robot Show in Shanghai. One out of every four robots sold today is sold in China, the world's leading robotics growth market: 68,000 units were sold there in 2015, 17% up on 2014.

YuMi was specifically designed to help consumer electronics manufacturers meet the challenges of customised personal electronics products, by enabling operators and cobots to share tasks, with easy retraining when the task changes. The YuMi appears to be targeted at the assembly operations common in electronic equipment manufacture, significantly in Southeast Asia.

Universal Robots – another successful start-up

Universal Robots (UR) was formed in Odense, Denmark, in 2005, with the goal of making robot technology accessible to small and medium-sized enterprises. It introduced its first cobot in 2008, and has particularly focused on food industry applications, with 3, 5 and 10 kg payload cobots. Their average customer payback period of 195 days is claimed to be the fastest in the industry.

Recently its cobot arms have been awarded certification for use in clean room applications, so UR robots can now be used in areas where purity and hygiene – low particle emission, easy-to-clean surfaces and extreme reliability – are decisive criteria for precise automation processes. This opens up more applications in the food industry, in the production of microchips and semiconductors, and in the electrical and optoelectronic industries.

At the end of 2014, more than 3,500 UR robots were installed worldwide; currently they claim the figure is 6,000 – annual sales maybe growing 2.5 times in just over a year. Mercedes-Benz has replaced old robots with humans on some lines, to better manage customised products, and is moving to production workers guiding part-automatic robots. Scientists at MIT, working with BMW, have found that robot-human teams can be about 85% more productive than either working alone. Subsequently Universal Robots was rated #25 on the MIT Technology Review's list of the world's 50 smartest companies, and Teradyne Inc then acquired UR for $285m in 2015.