Power transmission, and the internet!

Changes in the technology we work with, and even create, in automation and control are having a wider effect on society as a whole. Two areas influenced in this way are the growth of alternative methods of power generation and transmission, and the enormous power demands of the Internet, which will lead to a crisis somewhere.

The following article was written for the September issue of the journal “South African Instrumentation and Control”, which is published by Technews in South Africa.

Last November, this column described the long-distance HVDC power transmission systems being installed by ABB, taking power across China, and also those used on undersea links between the mainland and offshore islands, or even offshore oil industry platforms. In reverse, similar DC power links deliver new green power from offshore wind farms to national networks. Now GE has described how MVDC technology from its GE Power Conversion business has been applied by Scottish Power to deliver extra power across existing lines between North Wales and the island of Anglesey (it is a quirk of the UK power industry structure that Scottish Power also supplies England and Wales).

The GE project converted the existing 33 kV AC transmission links to work with 27 kV MVDC, using GE power electronic converters in sub-stations at either end of the line. This will increase the power available over the existing cables by 23%, enabling the supply to meet future needs on Anglesey without any additional environmental impact. GE points out that the same techniques are being applied in wind and solar farms, facilitating direct connection to an efficient MVDC power collection grid, giving lower cable costs and less expensive installations.
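To see why a lower DC voltage can still carry more power over the same cables, it helps to run some back-of-envelope numbers. The Python sketch below is my own illustration, not GE’s engineering analysis: it assumes the cable insulation, already rated for the AC peak phase voltage, can run continuously at that level under DC, and that the conductor current limit is unchanged.

```python
import math

# Back-of-envelope AC vs DC comparison over existing cables.
# Assumptions are mine, for illustration - not GE's engineering figures.
V_LL = 33e3                      # existing AC line-to-line voltage (volts)
V_PHASE = V_LL / math.sqrt(3)    # AC phase-to-ground RMS voltage
V_PEAK = V_PHASE * math.sqrt(2)  # peak phase voltage -> usable DC pole voltage
PF = 0.95                        # assumed AC power factor

print(f"AC phase voltage (RMS) : {V_PHASE / 1e3:.1f} kV")
print(f"DC pole voltage (peak) : {V_PEAK / 1e3:.1f} kV")   # ~27 kV, as in the GE scheme

# Per-conductor power at the same thermal current limit I (I cancels in the ratio)
ac_per_conductor = V_PHASE * PF   # times I
dc_per_conductor = V_PEAK         # times I
print(f"Theoretical DC/AC uplift: {dc_per_conductor / ac_per_conductor:.2f}")
```

The theoretical uplift of roughly 1.5 shrinks once converter losses, cable deratings and the chosen DC configuration are allowed for, which is consistent with GE’s more conservative 23% figure for the real link.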

Needless to say, the installations in Wales and Anglesey will be monitored by remote asset management systems, operated by GE engineers via the Internet.

DC power networks

DC power is becoming more prominent, both at the beginning and at the end of the grid. It is produced by wind turbines and solar PVs and used by everything from smartphones, laptops and electric cars, to the data centres that keep the Internet running.

However, having to convert back and forth between AC and DC along the way leads to wasted energy through resistance and heat – is this just to enable an interface with our old-fashioned infrastructure? Our office buildings have computer network access on every desk, and even at home, power sockets are now fitted with added USB power outputs. Modern LED lighting systems, including ordinary domestic lamp bulbs, use low-power DC supplies. Why then do we need AC for anything more than heavy power duties such as heating and cooking? Maybe it is time to convert homes so that most outlets simply provide a DC supply from one power source housed in the local sub-station.
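To put a rough number on that waste, the sketch below compounds a chain of assumed conversion efficiencies. These are ballpark values I have picked for illustration, not measurements of any particular installation.

```python
# Assumed stage efficiencies - illustrative ballpark values only
stages = [
    ("solar PV DC -> grid AC (inverter)", 0.96),
    ("AC distribution to the building",   0.95),
    ("AC -> DC again (device adapter)",   0.90),
]

delivered = 1.0
for name, eff in stages:
    delivered *= eff
    print(f"{name:36s} {eff:.0%} efficient, cumulative {delivered:.0%}")

print(f"Energy lost end to end: {1 - delivered:.0%}")
```

Even with every stage at 90% efficiency or better, nearly a fifth of the generated energy is gone by the time it reaches the DC device – which is the case for keeping it DC all the way.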

Internet burnout

There is a problem with placing too much emphasis on interrogating, monitoring and controlling everything via the Internet. The problem is the amount of power needed to run the data centres that store and distribute our data. In 2015, data centres worldwide consumed 30% more electricity than the UK’s entire power demand – they took 3% of the global electricity supply. Ian Bitterlin, Britain’s foremost data centre expert and a visiting professor at the University of Leeds, says the amount of energy used by data centres is doubling every four years, and he points to a study focused on Japan which suggests that its data centres will consume the entire Japanese electricity supply by 2030. Carry on at this rate, and at worst the whole Internet will fail – at the very least there will need to be access charges and taxes to control the growth in Internet use.
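The arithmetic behind that warning is simple exponential growth. The short projection below uses the article’s own figures – 3% of global supply in 2015, doubling every four years – while (unrealistically) holding total supply flat, just to show why the trend cannot continue.

```python
# Project the "doubling every four years" claim forward from 2015.
share_2015 = 0.03   # data centres' share of global electricity supply in 2015
for year in range(2015, 2036, 4):
    doublings = (year - 2015) / 4
    share = share_2015 * 2 ** doublings
    print(f"{year}: {share:.0%} of the 2015 global electricity supply")
```

By 2035 the naive projection demands essentially the whole 2015 world supply, so something – efficiency, pricing or taxation – has to give well before then.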

Most data centres are sited in cold climates to assist with cooling the electronics, as most of the power they use seems to be consumed by large cooling fans. As well as that heat contributing directly to global warming, the power used in 2015 accounted for 2% of total global greenhouse gas emissions, giving data centres the same carbon footprint as the whole airline industry.

I had hoped that there would be an answer to this problem in using solar or other green DC sources to power these centres, but this seems unlikely if the major power requirement is for the fans. Naturally, research continues on reducing data centre demand for power, but it may be too late!


Vision sensors, the brain and intelligent data processing

At a certain age, around 70, our bodies begin to show signs of wear. What becomes apparent is that our built-in control loops and data processing software step in to compensate, covering the gaps in the best way possible and working with the degraded sensors and equipment still functioning.

Dual sensors – the eyes!

This was first obvious to me when I started to try to monitor the effects of glaucoma, which results in blind spots in the areas of sight for each eye. This is not apparent in normal life, as what you see is the brain-processed image from two eyes. Where one eye has a blind spot, data from the second eye is used to complete the single image in the mind – and this is why we do not notice the normal blind spots everyone has where the optic nerve leaves each eyeball. With the early stages of glaucoma, the smallish blind areas are only obvious when one eye is closed, so that your brain only works from one sensor. But our own sophisticated data processing fills in the blind spots with a plain area in the same colour/style as its surroundings, so you think that you see the whole panorama. With luck, opening the other eye will add detail from the second sensor, to complete the picture.
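For readers who like the control-system framing, the toy sketch below mimics that redundancy: two ‘eye’ sensors with different blind spots, and a combiner that fills each gap from the other sensor where it can. It is only an analogy, of course, not a model of the visual system.

```python
# None marks a blind spot in that sensor's field of view
left_eye  = [5, None, 7, 3, None, 9]
right_eye = [5, 6, None, 3, 8, None]

def fuse(a, b):
    """Prefer sensor a; fall back to sensor b; keep None if both are blind."""
    return [x if x is not None else y for x, y in zip(a, b)]

print(fuse(left_eye, right_eye))        # [5, 6, 7, 3, 8, 9] - every gap covered
# When both sensors are blind in the same spot (severe glaucoma, in the
# article's analogy), fusion cannot help and the gap must be invented:
print(fuse([1, None, 3], [1, None, 3])) # [1, None, 3] - nothing to fill from
```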


The brain as an image store…

When glaucoma gets severe, the blind spots from each eye begin to overlap, meaning that the processor has no data coming in from either of its two sensors for certain areas of your view. So the processor moves up a gear and fills the area with a sort of plain colour, the same as the surrounding areas. But actually it tries harder than that: if you were viewing an array of books on shelves, the mind can insert a sort of composite image of books to fill the blind spot. It almost tries to fill the space with past image information, from when it last had an input from that area – when you were looking at that spot, maybe. But the brain is not so good at this, and anyway it is old data. The driving authorities do not allow glaucoma sufferers to drive cars, as a child, or animal, or bollard can disappear in a blind spot, replaced by an image of the surrounding tarmac road surface.

Astronomers already use a similar approach to refine telescope pictures of planets: taking the sharpest parts of multiple repeated images, each disturbed by atmospherics, vibration and so on, and processing them together to produce an unblemished image.
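This technique is often called ‘lucky imaging’, and a minimal version is easy to sketch. The code below uses synthetic frames and a simple gradient-variance sharpness score of my own choosing; a real astronomical pipeline would also align the frames before stacking.

```python
import numpy as np

rng = np.random.default_rng(0)
frames = [rng.normal(size=(64, 64)) for _ in range(100)]  # stand-in image frames

def sharpness(img):
    """Variance of the image gradient - higher means more fine detail survived."""
    gy, gx = np.gradient(img)
    return float(np.var(gx) + np.var(gy))

# Keep the sharpest 10% of frames and average ("stack") them
best = sorted(frames, key=sharpness, reverse=True)[: len(frames) // 10]
stacked = np.mean(best, axis=0)
print(f"kept {len(best)} of {len(frames)} frames; stacked image shape {stacked.shape}")
```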

Cataract operation – a sensor upgrade

Cataracts affect the vision, basically by making the image less precise, almost as if adding a fog. I have just had my right eye operated on, with a ‘clean’ plastic lens inserted to replace the old cloudy one. This does not help the glaucoma, but it gives the processor a whole new set of problems. Having worn glasses for myopia for 60 years, I could discard them, as the new lens sees perfectly at distance. The brain still uses the two sensors, but preferentially presents the sharply focused image from the right eye for distance viewing, suppressing the fuzzy, out-of-focus image still available from the left eye. For close-up views, maybe when reading, the left-eye image is used, as the muscles of the right eye have lost some of their strength and cannot focus up close. So the brain switches sensors. All this happened straight away, with no learning time needed.

Lazy muscles?

It was more of a problem when my glasses came back, one lens replaced with plain glass. Maybe they had been distorted by the optician, but the two images were displaced vertically with respect to one another. Correcting this requires the eyeballs not to move together, as they do when scanning from side to side, but for one eye to move up with respect to the other. Not easy, particularly as the processor was calling on muscles not accustomed to such separate operation. So there was a mechanical delay in the control loop, of around a second, whenever I moved my gaze onto a different subject. The brain was doing the job; it was just the body complaining! However, a better option here is to adjust the frame of the glasses to restore normal operations… maybe I “should have gone to” … a better optician!

The next step?

But do I need to wear glasses at all … or can the brain do the job without them? People use this technique with contact lenses, wearing one lens for long-distance sight and one for close-up work.

The above is based on an article supplied for the Journal ‘South African Instrumentation and Control’, published October 2018 by technews.co.za

Italian Pharma co selects Emerson to enable a digital transformation…

FIS – Fabbrica Italiana Sintetici, a leading active pharmaceutical ingredients manufacturer, has selected Emerson to digitize operations and work processes at three manufacturing sites in Italy. With these $20m (€16.1m) contracts, Emerson will provide automation technology to help create a fully electronic manufacturing environment for increased efficiencies, quality and regulatory compliance.

“It is vital that FIS develops the right relationship to help it expand operations in a measured and prudent manner,” said Franco Moro, general manager of FIS. “Working with Emerson provides FIS with a trusted partner as we digitize work processes and invest in automation to improve production and efficiency.”

As part of its growth strategy, FIS constructed a new $123m (€100m) unit at its Termoli site, doubling capacity to produce active pharmaceutical ingredients. Emerson will implement its Syncade manufacturing execution system at the Termoli site, as well as the Montecchio facility, providing automated workflows and paperless procedures and record-keeping. Paperless manufacturing improves production efficiency and offers widespread benefits in compliance, product quality, inventory and document control, which are critical in the highly regulated pharmaceutical industry.

These two leading companies have partnered before; Emerson provided its DeltaV distributed control system to monitor and control manufacturing at the Termoli site in 2017. As part of this latest agreement, Emerson will expand the automation system to incorporate additional measurement and control instrumentation. By standardising on DeltaV across its Termoli, Montecchio and Lonigo sites, FIS aims to improve efficiency and ensure consistent operations.

“This project reinforces Emerson’s strong relationship with FIS, and, as a trusted advisor, we will continue to support its business objectives on a long-term basis,” said Mike Train, executive president of Emerson Automation Solutions. “Our expertise will help FIS further automate work processes and boost profitability across these three sites as part of a company-wide digital transformation strategy.”

The contracts are part of a 10-year strategic framework agreement signed with Emerson for the supply of its DeltaV and Syncade systems, measurement instrumentation and control valves, as well as a 10-year service agreement that covers the control systems at all three sites plus the new MES systems at Termoli and Montecchio.

Siemens and Bentley Systems consolidate alliance

Siemens has announced that its successful strategic alliance with Bentley Systems, the software development company that concentrates on infrastructure project management, will be further expanded to strengthen their joint business cooperation and investment initiatives. Under this latest agreement between the two companies, the initial €50 million investment programme will be doubled, taking it to €100 million. In addition, the Siemens stake in Bentley Systems has been increased to over 9%, as a result of the continuing investment by Siemens in secondary shares of Bentley’s common stock.

Klaus Helmrich, member of the Managing Board of Siemens AG, declared: “I’m very pleased with how well our alliance started, how strong the relationship is. We are now investing up to the next collaboration level with Bentley. For example we will strengthen their engineering and project management tools using the Siemens enterprise wide collaboration platform ‘Teamcenter’ to create a full ‘Digital Twin’ for the engineering and construction world. Integrated company-wide data handling and IoT connectivity via Siemens ‘MindSphere’ will enable our mutual customers to benefit from the holistic ‘Digital Twins’.”

Greg Bentley, the CEO of Bentley Systems, added: “In our joint investment activities with Siemens to date, we have progressed worthwhile opportunities together with virtually every Siemens business ‘going digital’ in infrastructure and industrial advancement. As our new jointly offered products and cloud services now come to market, we are enthusiastically prioritizing further digital co-ventures. We have also welcomed Siemens’ recurring purchases of non-voting Bentley Systems stock on the NASDAQ Private Market, which we facilitate in order to enhance liquidity, primarily for our retiring colleagues.”

Background to Bentley Systems

Bentley Systems is a software development company that supports the professional needs of those responsible for creating, building and managing the world’s infrastructure, including roadways, bridges, airports, skyscrapers, industrial and power plants as well as utility networks.

Founded in 1984, Bentley has more than 3,500 colleagues in over 50 countries, and is on track to surpass an annual revenue run rate of $700 million. Since 2012, Bentley has invested more than $1 billion in research, development, and acquisitions. The collaboration with Siemens commenced in 2016.

Editor’s note: The Bentley website ‘About us’ page has an excellent video illustrating some of the worldwide applications where the Bentley infrastructure project software has been applied.

Algae control at Sellafield 

LG Sonic, a leading international manufacturer of algae and biofouling control systems, has installed multiple LG Sonic Industrial Wet systems at the Sellafield nuclear site in the UK. This has led to a significant improvement in the clarity of the water, and in visibility into the storage ponds. As a result of these ultrasonic processing systems there has been an exceptional reduction in blue-green algae and chlorophyll levels in the treated storage ponds.

Sellafield, a nuclear fuel reprocessing and nuclear decommissioning site, handles nearly all the radioactive waste generated by the 15 operational nuclear reactors in the United Kingdom. In 2015, the UK government started a major clean-up of the stored nuclear waste facilities at Sellafield because of the poor condition of the storage ponds. One of the main problems was poor visibility in the water due to algae growth.

The Solution: Ultrasound technology

To improve water visibility in the storage ponds, four LG Sonic Industrial Wet systems were installed. The systems have 12 ultrasonic programmes to effectively control different types of algae, and are able to treat algae in a relatively short time. GPRS control allows the user to monitor and change the ultrasound programme remotely. Furthermore, status updates and alerts are received when power outages occur.

Within only three weeks of the installation of the LG Sonic ultrasonic systems, there was a significant reduction in blue-green algae counts and chlorophyll levels. As a result of this reduction, the water started to clear, and it became possible to see vessels and containers in the storage ponds that in recent years had only been visible using a tethered underwater camera device.

Over 10,000 LG Sonic systems have been installed worldwide, including many in current European FP7 projects.


A recent picture of a storage pond

Concern over EDF reactor faults

HazardEx, the UK journal covering industrial hazards and regulations worldwide, has published an interesting update on the state of the EPR nuclear reactor being built at Flamanville, in Normandy. Potential problems at this site are of as much concern to UK residents in Southern England as those surrounding the future reactors of the same type planned for Hinkley Point in Somerset.

HazardEx says:

The French state electricity generator Electricité de France (EDF) has put the cost of repairing recently discovered flaws at the new EPR reactor being built at Flamanville in Normandy at €400 million ($468 million). This takes the total cost of the project to €10.9 billion, more than three times its original budget.


The Flamanville plant – an EDF picture

EDF had previously warned that problems with welds at the reactor under construction in Flamanville were worse than first expected. The utility said on July 25 that out of the 148 inspected welds at the latest generation reactor, 33 had quality deficiencies and would need to be repaired.

The most recent projections envisaged the Flamanville 3 reactor loading nuclear fuel at the end of the fourth quarter of 2018, but EDF said this was now scheduled for the fourth quarter of 2019. The reactor was originally scheduled to come on stream in 2012.

Flamanville was the second EPR reactor to be constructed: the first was Olkiluoto in Finland, which has suffered comparable delays and cost overruns, and which is also now due to enter service in 2019.

This means that the first EPR to enter production will probably be at the Taishan nuclear plant in China. Work on Taishan 1 and 2 reactors has also suffered repeated delays, but not on the scale of the French and Finnish plants. At least one of the Chinese reactors is expected to be commissioned this year.

In the UK, there are currently plans to build two of these EPRs at Hinkley Point in Somerset. These reactors could be further delayed if the new problems at Flamanville are not easily resolved. The UK EPRs are already mired in political controversy over the high cost of the project.


Leaps in Technology, and the spin-off

Whilst pacemakers and other implants have become fairly commonplace in medical treatment, these still rely on battery technology, and so have a limited life. When electrodes or sensor devices are positioned carefully, sometimes deep in the body, a battery capsule is embedded under the skin to enable future access for replacement. A new development project at MIT, to be described more fully at an August conference, has produced a very small medical implant that can be powered and interrogated using radio frequency waves, even though it is deep within the body.

Medical devices that can be ingested or implanted in the body could offer doctors new ways to diagnose, monitor, and treat diseases. In their study, the researchers tested a prototype about the size of a grain of rice, but they anticipate that it could be made smaller. Giovanni Traverso, a research affiliate at MIT’s Koch Institute for Integrative Cancer Research, is now working on a variety of ingestible systems that can be used to deliver drugs, monitor vital signs, and detect movement of the GI tract.

In the brain, implantable electrodes that deliver an electrical current are used for deep brain stimulation, often employed to treat Parkinson’s disease or epilepsy. Wireless brain implants could also help deliver light to stimulate or inhibit neuron activity through optogenetics.

In animal tests the researchers have shown that the radio waves can power devices located 10 cm deep in tissue from a distance of 1 m. Until now, this has been difficult to achieve, because radio waves tend to dissipate as they pass through the body. To overcome that, the researchers devised In Vivo Networking (IVN), a system that relies on an array of antennas emitting radio waves of slightly different frequencies. As the radio waves travel, they overlap and combine in different ways. At certain points, where the high points of the waves overlap, they can provide enough energy to power an implanted sensor.
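The underlying interference effect is easy to demonstrate. The sketch below sums several carriers at slightly offset frequencies and shows the combined signal occasionally peaking at many times the power of any single carrier; the frequencies and time window are arbitrary illustration values, not the MIT system’s actual parameters.

```python
import numpy as np

t = np.linspace(0.0, 1e-6, 100_000)        # one-microsecond window
freqs = [1.30e9, 1.31e9, 1.32e9, 1.33e9]   # slightly offset carriers (Hz)

combined = sum(np.sin(2 * np.pi * f * t) for f in freqs)
power = combined ** 2   # instantaneous power; a single unit carrier peaks at 1

print(f"combined peak power      : {power.max():.1f}x a single carrier")  # up to 16x
print(f"time spent above 4x power: {(power > 4).mean():.1%}")
```

Those brief high-power bursts at the overlap points are what the researchers describe as providing enough energy to power the implanted sensor.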

Mobile phone developments

The ubiquitous mobile phone. In various past articles I have mentioned the spin-off effects of the technology behind telecommunications and the mobile phone being used to create new industrial sensors, drawing on the research and production capabilities developed for the devices that industry requires. These spin-offs include the rise of radar level measurement systems, the use of wireless in many industrial sensors, and the wide availability of laser diodes, used for interferometry, liquid analysis and more.

Another major development is the liquid lens, used in these same mobile phones. This one is really personal: for the last 60 years I have been an avid aero-spotter, keenly watching light aircraft arrive at our local airport, using a telescope to identify them and then, on arrival at or near the airport, long and heavy telephoto lenses to photograph them. Later, I collected antique telescopes, manufactured from 1780 to maybe 1850, as these were still really the best quality optical systems, despite modern (commercial) developments. Again, long and heavy things.

But along came the liquid lens. This is a very small lens device, now commonly used in iPads and mobile phones. The liquid droplet forming the lens has its shape changed electronically, using an electronic control system. This can change the focal length (to focus) and shift the optical axis (for optical image stabilisation, i.e. to reduce camera-shake effects) – all within a few milliseconds.

The idea for this invention came from research on the phenomenon known as “electrowetting” by Professor Bruno Berge, in Lyon, France, with the original patents being issued in 2002. Prof Berge had been working on liquid interfaces since 1991 at the Ecole Normale Supérieure in Lyon. A drop of water affected by electrowetting can function as a variable magnifying glass: two clear, non-miscible liquids of the same density – one of them water, controlled electronically – can serve as a lens, with the focus depending on the curvature of the interface between them. The two liquids are sealed and held in a metal casing that is typically smaller than 10 mm in diameter.
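The optics can be sketched with the standard formula for a single refracting surface: the power of the liquid-to-liquid interface is (n2 − n1)/R, so electronically changing the curvature R sweeps the focal length. The refractive indices below are typical illustrative values (water about 1.33, an oil about 1.5), not the actual VariOptic materials.

```python
n_water, n_oil = 1.33, 1.50   # assumed refractive indices, for illustration

def focal_length_mm(radius_mm):
    """Focal length of the curved liquid-liquid interface (thin-surface approximation)."""
    power_dioptres = (n_oil - n_water) / (radius_mm / 1000.0)
    return 1000.0 / power_dioptres

# Sweeping the interface curvature (what the applied voltage does) sweeps the focus
for r in (20.0, 10.0, 5.0, -10.0):   # a negative radius gives a diverging lens
    print(f"interface radius {r:6.1f} mm -> focal length {focal_length_mm(r):7.1f} mm")
```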

Berge first approached Canon cameras with the invention, but attracted no funding. So with French state funding, and investment fund backing, Berge founded the company VariOptic in 2002. In 2007 they established a production line in China, and in 2009 the first industrial barcode reader with a VariOptic lens appeared on the market. Machine vision manufacturer Cognex was an early adopter of the technology, for barcode ID readers.

A new module now available from IDS (Imaging Development Systems) is a single-board USB interface camera designed for use with, and for the control of, liquid lenses. These low-cost uEye LE industrial cameras, with a twist-proof USB Type-C connection and practical USB power delivery, are quoted as being of interest for logistics systems (e.g. for package acceptance and sorting), for microscopy and traffic monitoring, and for installation in small medical or industrial devices.

So, I am still waiting for a lightweight long focal length telephoto ‘liquid’ lens for my Canon camera. Maybe not the telescope – for as I pointed out to Prof Berge, one of my favourite telescopes dating from the 1790s was made by Matthew Berge, his namesake!

The full story about the Prof Berge development of liquid lenses was first reported by me as the very first blog post on www.telescopecollector.co.uk, back in December 2013.

This article was first published in the August 2018 issue of the journal ‘South African Instrumentation and Control’, published by technews.co.za