512-800-6031 editor@ramreview.com

We are moving into an evolving age of “green energy” and a “green economy.” In the process, we’re facing questions about how to integrate legacy systems with new technologies and how to protect them from each other. As noted in several previous articles, including my recent wind-turbine case study (Part 9 of my “Reliability & Maintenance Opportunities” series, see link below), various unexpected conditions can arise when modern systems incorporating non-linear components, i.e., electronics and variable frequency drives (VFDs), are connected to a power grid that was conceived well before such components went into use.



Click Here To Read The Referenced “Reliability & Maintenance Opportunities” Series Installment,
“(Part 9) ‘Bigger Picture’ Case Study” (June 11, 2021)


THE CURRENT STATE OF ELECTRICAL RELIABILITY
What we have in the present is an outdated transmission and distribution system full of aging equipment and components. The grid of the past was built for large, base-loaded power plants and a vastly different operating context than what is required to accommodate today’s electrical needs. Consequently, suppliers everywhere are patching existing systems or adding new technologies to legacy systems, in an attempt to keep facilities online, and equipment owners are dealing with increasingly complex failure mechanisms.

As I mention in my IEEE editorial, “From Green Coils to a Green Economy,” there is good news (click here to read it). As things evolve in the electrical-reliability community, materials and systems continue to advance. The advancements may be seriously lagging, but there is movement in the right direction, along with challenges and opportunities for the engineers, technicians, and reliability specialists, new and experienced alike, entering this new era.

Just how new is “this new era”? The kickoff for what we are trying to manage today came in 1994, when the decision to de-regulate power generation was made. I remember it well, as I was serving as the Midwest energy representative to USAB (government relations) for IEEE at the time. This was around the same time that the Energy Policy Act of 1992 was driving creation of USDOE’s “Challenge” programs addressing electric motors, variable frequency drives, steam systems, and a number of other technologies. The programs’ original focus was on energy-intensive industrial sectors, where PWM (pulse-width-modulated) VFDs, energy-efficient machines, and new operating contexts would become popular. The Internet opened up during that time, increasing the demand for servers, which in turn drove heavier HVAC loads that were still distributed and housed at individual company locations. At this point, protecting electric motors from the new electrical controls became a priority for OEM and service companies, leading to the development of new filtering and insulation technologies.

Research into the impacts of the new electrical environment increased in the mid-1990s, and a multitude of theories on failure modes were published. In the end, an older concept, what happens in the micro-universe of less than a millimeter between conductors in the winding of a high-power machine, became a concern in much smaller, lower-voltage units. Mechanical faults were tied to the new electrical environment, and whole new fields of grounding-system engineering, insulation-system research, nanotechnology, and power-quality engineering and standards began to appear or be updated. It was a heady time for young engineers, in terms of reliability improvements and research: New areas of study were being discovered, and older ones were rediscovered after decades of stagnation, during which gray-haired engineers and their followers had done the same things the same way. This era also brought advances in testing and predictive-maintenance methodologies, as the field became more intriguing and new problems needed solving.

Then the new century arrived, bringing with it the need to handle increased electrical stress, as more and more operations moved to electronically controlled automation and greater Internet usage. The pace continued at a steady clip until smart phones hit the market in the late 2000s, firing up the race to provide increased support for data and related data centers. By 2008, following the unexpected spike in fuel costs a few years earlier, the pressure to produce hybrid-electric and electric cars increased, and established auto manufacturers entered the market along with a few new ones. Discussion of the impact of plug-in and all-electric vehicles began around this time in the engineering communities, whereupon new standards were initiated and older ones updated. Research and development surged, as did publication of academic papers on new technologies designed to protect everything from cables to machines, and from transformers to transmission lines and insulators, as changes to the system evolved and continued to have greater impacts, as they do now.

In the late 1990s, additional distributed energy had begun entering the grid, ranging from methane recovery in landfills to solar, wind-power, and geothermal systems, fuel cells, and energy storage in pumping stations, among others. At the time, no one was quite sure which new energy system would take the lead. However, research into each of the new distributed-energy sources was being pursued and supported through national laboratories and universities. Around 2006, wind energy took the lead, and it continues to expand at an accelerated pace, with supporting technologies, such as energy storage, not far behind. Older legacy-system facilities that reach the end of their useful life are being allowed to close. Among them are many traditional coal-fired baseload power plants that are being replaced (often quickly) by natural-gas facilities.

To manage the long distances associated with distributed plants, high-voltage DC (HVDC) transmission and distribution systems began to receive more attention. These systems have large AC-to-DC converters at one end and inverters to change the DC back to AC at the other. While HVDC systems have existed since the 1800s for short distances (roughly a 1-mile radius from the DC generating station), and Europe adopted HVDC systems in the 1950s, it was not until the 1970s that the USA installed a major system, in the Pacific Northwest. Now, with the national labs, academia, and others seeing HVDC distribution as a method of stabilizing the grid as additional distributed-energy systems come online, new standards have been, and are being, developed to provide guidance on interoperability between the AC grid and the DC grid. This includes the potential use of HVDC for the offshore wind-energy installations now being planned off the coasts of the United States.

In the mid-2010s, the “cloud” was developed as a method for placing information systems on servers outside a data-owner’s facility. That, of course, led to increased construction of data centers. These facilities concentrated high energy consumption, with its attendant power-quality considerations, in a variety of unique locations as services, companies, and the public consumed more and more data through the Internet.

Use of new distributed technologies continues to grow, even as those installed in the late 1990s and 2000s enter the later part of their aging cycles. Whereas regulated power plants in the early 1990s would go through scheduled 5-, 10-, 15-, and 20-year overhauls of different levels, today’s aging systems are just going into their re-manufacturing and overhaul periods. In the meantime, the loads we are putting on grid systems, including some that are almost 100 years old, have not let up. In fact, much of the critical equipment currently in use across the U.S. power grid has exceeded its expected life, even as we add and patch in new systems and technology.


THE FUTURE STATE OF ELECTRICAL RELIABILITY
What would happen if your operations lost power for a day?  Several days?  A week?  Indefinitely? These are the questions that smart-grid and electrical-materials engineers have been asking for quite a while.

The term “smart grid” is a direct result of the implementation of new power-generation technology into a grid whose existing infrastructure has seen only limited improvement or expansion. There is an entire science surrounding this new concept, including such things as smart cities, and whole new companies and corporate divisions are stepping up to participate. We will return to much of the science surrounding smart cities in future articles, including concepts around islanding and managing power distribution.

The concepts around “big data” and “machine learning” stem from this massive growth in the electrical and information economy. The original ideas discussed in engineering, academic, and information-technology communities in the 1990s focused on marketing and finance. Other groups then realized there were technical opportunities for business and productivity. The idea now called IoT was initiated through the need to manage distributed generation and through gaming technology. Both have been pushed to this point by a collision of technologies and the availability of information, and they will continue to evolve. We can, though, expect them to be different from what we currently envision.

The result is that we have growing information on how things work. Human beings are visual by nature, so the ability to “see” complex interactions through digital information has allowed us to dig deeper into complex problems, such as those being experienced in power systems.  Societal pressure to reduce the consumption of organic materials (oil) is also having an impact on how we are looking at electrical materials, given the majority of insulation systems are plastics. The dielectric materials used within oil-filled transformers are organic, as well, with many having moved to plastics in place of wooden materials.

Physics, chemistry, and engineering have managed to keep up with the impact of the growing changes in the national grid, local distribution systems, and electrical machinery, from switchgear to electric machines. While basic materials, such as copper, are still the primary conductors, the insulation systems and enamels that prevent short circuits and grounds have advanced significantly since the 1970s. Partial-discharge-resistant materials (often misnamed “spike-resistant wire”) that became available in the 1990s are gradually being replaced with 2010-era nano-dielectric enamels. The earlier changes in partial-discharge materials required changes in how we viewed test results in electric motors and generators, and the nano-dielectric materials have created the same situation again. Changes to standards are on the horizon.

Other materials have been added to the market, but they are primarily based on original materials. The result is a change in how we must perform something as simple as an insulation-resistance test. Where the old days had us looking at a few hundred megohms and a polarization index of several digits, we are now seeing insulation systems in electric motors measuring in the gigaohms and teraohms, with polarization-index results as low as 1.0 for a good system. As a result, we had to make changes to the standards surrounding insulation-resistance testing, changes that can make what used to be a basic test complex.
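To illustrate the shift described above, here is a minimal sketch of the arithmetic behind the polarization index (PI): the 10-minute insulation-resistance reading divided by the 1-minute reading. The function names and the gigaohm threshold below are my own illustrative assumptions for this sketch, not values from any standard; always defer to the applicable standards and OEM guidance for real test interpretation.

```python
# Illustrative sketch only, not a substitute for the governing test standards.
# PI = IR(10 min) / IR(1 min); readings are in ohms.

def polarization_index(ir_1min_ohms: float, ir_10min_ohms: float) -> float:
    """Return the polarization index from 1- and 10-minute IR readings."""
    return ir_10min_ohms / ir_1min_ohms

def pi_is_meaningful(ir_1min_ohms: float, threshold_ohms: float = 5e9) -> bool:
    """Capture the trend the article describes: once the 1-minute reading is
    already in the gigaohm range (the 5-gigaohm threshold here is an assumed
    illustration), a PI near 1.0 no longer indicates a problem."""
    return ir_1min_ohms < threshold_ohms

# Older insulation system: a few hundred megohms, PI of "several digits."
print(polarization_index(200e6, 600e6))    # 3.0
# Modern system: teraohm-range readings, PI close to 1.0, yet still good.
print(polarization_index(1.2e12, 1.3e12))  # ~1.08
print(pi_is_meaningful(1.2e12))            # False -> judge by IR magnitude instead
```

The point of the sketch is the interpretation change: the same PI value of 1.0 that once condemned a winding can now accompany a perfectly healthy, very-high-resistance insulation system.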

We can expect the next major advancement in materials to be either nanotube-related or carbon-fiber materials. Presently, the best insulation-system materials are Nomex (an aramid fiber) and mica (a flaked rock), but both still require organic materials for strength and flexibility. These, along with polyester varnish (required for nuclear-power and high-radiation environments) and epoxy materials, have seen some changes but, overall, remain similar to what they’ve been over the past 50+ years. New sheet-insulation materials are in academic and materials companies’ R&D programs, with relatively few published papers identifying a replacement for Nomex, a staple for almost all electric machines, which was almost retired at the end of the 2010s. For smaller machines, polymer systems such as PEEK (a plastic material) are virtually partial-discharge-proof, making them outstanding for the severe voltage spikes caused by poorly installed VFDs. They are, however, subject to bruising (crush damage), which causes them to delaminate from the conductor material, so they require a different approach to handling. More concerning, there currently are no testing methods that will detect their degradation. Thus, they will measure “good” until they catastrophically fail.

Major cabling systems have also aged. This means the insulation materials around the conductors fracture and “tree,” which indicates a migration of contamination, such as moisture, through the system. Failing insulation is often repaired by splicing. In some local distribution systems, such as Washington, DC’s, a different approach is taken: A special foam-filler insulating material is pressure-inserted through the conduit to fill the voids and provide an additional 10 to 20 years of cable life. Use of the material does require a change in how testing is performed on the cabling so that additional damage won’t occur, but the deferred cost pushes the need to fund a major upgrade into someone else’s tenure.


POTENTIAL BREAKTHROUGHS
It happened during a conference where a discussion about the future of skilled trades and funding was part of a panel session. A college student stood up in the back of the room and stated, “We don’t need skilled trades in maintenance because systems will be self-healing.”  Then he stormed out.

If he had been my student, I would have flunked him. However, he was on to something, although not exactly what he may have thought. There is active research into “self-healing” systems, which are much like the “never-flat” tires already on the market. The concept is that the material will react to a defect, causing a scar or scab to form. In the electrical-insulation community, this could be an additional layer in an insulation system that melts and flows to fill an electrical path between conductors or to ground. Some of these materials have been labeled “smart,” but their form and function are relatively straightforward and based on electro-chemical or thermal reactions.

Technical societies associated with the research and development of these materials have been discussing them for the past decade. To put this into perspective, it took 20 years to take nano-dielectric materials from concept to market, and then only for low-voltage wire applications. On that timeline, we can expect self-healing insulation systems to become available sometime in the 2020s. This will impact cables, motors, generators, and similar systems that enter the market.

However, in the work that’s presently underway, the self-healing works once, and the damaged area must then be addressed later. What is being explored is whether, when self-healing does take place, the material can communicate that damage has occurred, a concept of self-diagnostics. This could be a color change, some type of signal, or some type of connected response system that communicates with an alarm display. Is this possible? Of course.

The future of insulation materials, and electrical reliability, is communication.  Electrical systems already carry information that we must read. For instance, with Electrical Signature Analysis (ESA), Partial Discharge, and other technologies, we’re using detectors to pick up the signals produced by the system.  Eventually we can expect a combination of advanced neural networks and the material itself acting as a sensor to produce the information to identify impending defects far in advance of failures.  Self-healing systems would then react to such faults and help greatly extend the life of an asset.
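As a rough illustration of the “reading the signals” idea behind ESA, the sketch below simulates a motor-current waveform carrying a small off-frequency component alongside the 60-Hz supply, then inspects the current’s spectrum for sideband peaks. The sampling rate, the 55-Hz “defect” frequency, and the detection threshold are all invented for this sketch; real ESA tools use far more sophisticated signal processing and fault models.

```python
# Minimal sketch of the ESA concept using simulated data (illustrative only):
# sample a motor-current waveform, take its spectrum, and look for sideband
# peaks near the 60-Hz supply frequency, the kind of signature certain
# electrical or load-related defects can imprint on the current.
import numpy as np

fs = 2000                       # sampling rate in Hz (assumed)
t = np.arange(0, 4.0, 1 / fs)   # 4 seconds of signal
supply = np.sin(2 * np.pi * 60 * t)        # 60-Hz fundamental
fault = 0.02 * np.sin(2 * np.pi * 55 * t)  # small simulated 55-Hz sideband
current = supply + fault

spectrum = np.abs(np.fft.rfft(current))
freqs = np.fft.rfftfreq(len(current), 1 / fs)

# Flag any component above 1% of the fundamental, excluding the fundamental.
fund = spectrum[np.argmin(np.abs(freqs - 60))]
for f, a in zip(freqs, spectrum):
    if a > fund / 100 and abs(f - 60) > 1:
        print(f"possible sideband at {f:.1f} Hz")
```

Even this toy version shows the principle the article describes: the machine’s own electrical signal carries diagnostic information, and a detector (here, a simple spectral threshold) only has to read it.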

The ability to self-heal and form a temporary bridge from fault to repair is not limited to electrical systems.  The big step will be the testing required before these new technologies can come to market.


A FEW FINAL THOUGHTS
As we advance in areas of electrical reliability, we need to be prepared for the impacts of the changes discussed here. For one, the information needed to properly perform prognostic and even predictive maintenance testing is more than what we required in the past.  Another is the impact of the present environment or specific operating context of electrical equipment and systems. Overall, a level of expertise is required to consider and manage new systems that change depending on the electrical stresses in an asset.

By the way, the student speaking out during the referenced panel discussion was wrong. It is not that we won’t need skilled technicians in the future, but that technicians will need advanced knowledge and understanding based on future job-skills requirements. Future electrical- and mechanical-reliability technicians will have levels of technical competency far beyond those of today. TRR



ABOUT THE AUTHOR
Howard Penrose, Ph.D., CMRP, is Founder and President of Motor Doc LLC, Lombard, IL and, among other things, a Past Chair of the Society for Reliability and Maintenance Professionals, Atlanta (smrp.org). Email him at howard@motordoc.com, or info@motordoc.com, and/or visit motordoc.com.



Tags:
reliability, availability, maintenance, RAM, electric motors, electrical systems, electrical equipment, power plants, power generation, wind turbines, wind energy, solar power, nuclear power, workforce issues, skills training, job training