Condition monitoring of rotating or vibrating structures often involves a lengthy process of attaching sensors, gathering data and processing any collected information. But what if you could just see vibrations and motion? Recently developed computer analysis techniques use video to detect and measure vibration remotely, monitoring anything from a single component to a massive structure in one go.
A video camera takes a brief video of the machinery (or plant, or structure) to be inspected, while it is in operation. A laptop computer then processes the video – detecting motion between frames at every pixel – and outputs a modified version that exaggerates areas in motion. Even motion that is completely invisible to the human eye becomes clear.
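The core idea — detect each pixel's change over time and exaggerate it — can be illustrated in a few lines. This is a minimal sketch of Eulerian-style amplification on a synthetic clip, not RDI's proprietary algorithm; the function name and the amplification factor are illustrative.

```python
import numpy as np

def amplify_motion(frames, alpha=20.0):
    """Boost each pixel's deviation from its temporal mean by a
    factor alpha, exaggerating any change between frames (sketch)."""
    frames = np.asarray(frames, dtype=float)
    mean = frames.mean(axis=0, keepdims=True)   # static background
    return mean + alpha * (frames - mean)       # exaggerate per-pixel change

# Synthetic 8x8 "video": one pixel wobbles by +/-0.5 grey levels,
# far too little to see on screen
t = np.arange(60)
video = np.full((60, 8, 8), 128.0)
video[:, 4, 4] += 0.5 * np.sin(2 * np.pi * 5 * t / 60)

out = amplify_motion(video, alpha=20.0)
print(round(float(np.ptp(out[:, 4, 4])), 1))  # → 20.0 (a clearly visible swing)
```

Real systems use more sophisticated, phase-based processing to avoid amplifying noise, but the principle — a small temporal change made large — is the same.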
The author has put together a YouTube playlist containing several videos that show how motion amplification works. It is available at: www.is.gd/xukuyu
Dr Jeff Hay is CEO of Tennessee-based RDI Technologies and one of the inventors of what he calls ‘Motion Amplification’ (it’s a trademark): “Instead of thinking of our camera as a video camera, think of it as having 2.3 million data points. In the vibration measurement world, one to three sensors is typical – we give you a million times that.”
Keith Gallant, reliability consultant at RMS, a UK distributor of RDI’s systems, adds: “Once you’ve set up your camera, data collection takes about 30 seconds.” The next step is to select areas of particular interest. “You draw a reading of interest [ROI] anywhere on the screen and it’s like having an accelerometer on that spot,” says Gallant. These areas can then be analysed more closely, and displacement can be measured. And it can detect small motions; in a controlled environment, says Dr Hay, “we can see a deflection of 0.25 microns [micrometres] from about a metre”.
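The "virtual accelerometer" idea can be sketched simply: average the pixels inside a drawn ROI in every frame, and the result is a time waveform that can be analysed like a sensor trace. The function and the synthetic clip below are illustrative, not RDI's implementation.

```python
import numpy as np

def roi_waveform(frames, y0, y1, x0, x1):
    """Mean intensity of a drawn ROI in every frame: a 'virtual
    sensor' waveform, analysable like an accelerometer signal."""
    return np.asarray(frames, dtype=float)[:, y0:y1, x0:x1].mean(axis=(1, 2))

# Synthetic clip: a 16x16 scene whose top-left patch flickers at 25 Hz
fps, n = 100, 200
t = np.arange(n) / fps
frames = np.full((n, 16, 16), 100.0)
frames[:, 0:4, 0:4] += 2.0 * np.sin(2 * np.pi * 25 * t)[:, None, None]

wave = roi_waveform(frames, 0, 4, 0, 4)
spectrum = np.abs(np.fft.rfft(wave - wave.mean()))
peak_hz = np.fft.rfftfreq(n, d=1.0 / fps)[np.argmax(spectrum)]
print(peak_hz)  # → 25.0
```

A spectrum of the ROI waveform immediately reveals the dominant vibration frequency at that spot, just as it would from a contact sensor.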
One of the most powerful features is frequency filtering: you can set it to only show motion happening within (or outside) a certain range of frequencies, to isolate the source of a particular vibration.
|"When you amplify and you've filtered it, that cleans up the image, and you can very easily see the problems. We say start wide and then move in. You can change your lens or move the camera and zoom in on the problem," Keith Gallant, reliability consultant at RMS|
As with any sophisticated measurement tool, training is vital. “If you buy a camera, you do two days’ training,” says Gallant, who adds: “An important part of that is practical experience, so we do that at an operating industrial site.”
The frequencies the system can detect are limited by the frame rate: if the camera operated at the typical 24fps (frames per second), the highest frequency of vibration that could be detected would be 12Hz. For this reason, RDI uses a high frame rate (HFR) camera that runs at up to 1,300fps, for a maximum frequency of 650Hz – plenty for most industrial applications. And motion amplification can deal with the lowest of subsonic vibrations: “You just take a longer sample of video,” says Dr Hay. Ten seconds of video allows you to see 0.1Hz vibrations, for instance.
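The two limits described here follow directly from sampling theory: the Nyquist limit caps the highest detectable frequency at half the frame rate, while the length of the recording sets the frequency resolution at the low end. A worked check of the article's figures:

```python
def max_detectable_hz(fps):
    # Nyquist: highest vibration frequency a camera can resolve
    return fps / 2.0

def resolution_hz(seconds):
    # lowest frequency step resolvable from a capture of this length
    return 1.0 / seconds

print(max_detectable_hz(24))    # → 12.0 (a standard video camera)
print(max_detectable_hz(1300))  # → 650.0 (RDI's HFR camera)
print(resolution_hz(10))        # → 0.1 (ten seconds of video)
```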
Rotating machinery is the obvious application, showing issues where you might expect them – at the coupling between a motor and a pump, for instance, or where a mounting foot is loose – and where you might not, such as where a control box or housing resonates. It can also help position other types of sensors. “You can use our device to see where the nodes and antinodes are, to determine where to put a permanent sensor. Pipework has been an increasing trend,” says Gallant. “We’ve had most popularity with oil and gas and petrochemical industries, who are looking at asset integrity.”
|"This is a huge area because you can look at a complete set of pipework at once and instantly see which pipes are moving and which are not," Dr Jeff Hay, CEO of RDI Technologies|
RDI’s Motion Amplification system is not the only system of its type. MIT demonstrated an ‘Eulerian Video Magnification’ (EVM) or ‘motion magnification’ prototype in 2012, calling it “a microscope for temporal variations.” It was reported, at the time, that the researchers revamped portions of their work based on feedback, posted the code online for inventors and programmers and hoped to create a smartphone app.
Dragon Vision is a software product, available in the UK from CBM Partners, using ‘video deflection technology’ to provide vibration analysis. The firm says: “Thousands of vibration points are mapped in a single video and converted into vibration signals.” Unlike the RDI system, this uses video from any suitable camera; the firm has used an iPhone XS to give a resolution of 2.5 microns at 1m and 120Hz which, it says, “is very useful for 95% of common machines”.
The Dragon Vision system first detects motion, showing this as a ‘heat map’ of the target. Like motion amplification, areas of interest can be selected and analysed in greater detail, and particular frequencies emphasised or filtered out. Dragon Vision can measure the frequency, amplitude and phase of vibrations at selected points.
The IRIS M is RDI’s mainstream unit, a conventional-looking camera with interchangeable lenses to tailor the field of view, mounted on a conventional tripod. “We have built-in stabilisation software and we put vibration suppression pads under the tripod,” says Dr Hay. The IRIS M offers HD resolution (1,920 x 1,080 pixels) at a frame rate of 180fps, or reduced resolution at up to 1,300fps. Hay adds that “camera sensors get better every day, and we benefit from that,” but “not all cameras are created equal… and we want to make sure that the customer has a good experience”.
The IRIS MX is a more advanced unit with a far higher frame rate – up to 1,400fps at full HD resolution, or 29,000fps at reduced resolution. And the MX can also be used like a conventional high-speed camera, to slow down or freeze motion – useful for inspecting rotating components or detecting belt slippage.
RDI also produces IRIS CM for continuous monitoring. This consists of three cameras connected to a central recorder/computer. “You can set it on a piece of machinery that’s erratic and troubleshoot it,” says Dr Hay. “It’s autonomous and triggered by an event such as a certain level of motion.” The system records 90 minutes in total, buffering 10 seconds before and after an event. “We also have a scenario when people want multiple views of an event — like a coastdown.”
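Event-triggered capture of this kind typically keeps a rolling pre-trigger buffer so the moments leading up to the event are not lost. This is a minimal sketch of the idea, assuming a motion-level trigger; the `EventRecorder` class and its parameters are illustrative, not RDI's API.

```python
from collections import deque

class EventRecorder:
    """Keep a rolling buffer of the last `pre` frames; when motion
    exceeds `threshold`, capture them plus the next `post` frames."""
    def __init__(self, pre, post, threshold):
        self.buf = deque(maxlen=pre)   # pre-trigger ring buffer
        self.post = post
        self.threshold = threshold
        self.capturing = 0
        self.clip = None

    def feed(self, frame, motion_level):
        """Returns the finished clip once capture completes, else None."""
        if self.capturing:
            self.clip.append(frame)
            self.capturing -= 1
            return self.clip if self.capturing == 0 else None
        if motion_level > self.threshold:
            self.clip = list(self.buf) + [frame]   # keep the lead-up
            self.capturing = self.post
            return None
        self.buf.append(frame)
        return None

# Frames 0-13 arrive; frame 10 trips the trigger, so the clip
# contains the four frames before it plus the three after it
rec = EventRecorder(pre=4, post=3, threshold=0.5)
clip = None
for i in range(14):
    result = rec.feed(i, 1.0 if i == 10 else 0.0)
    if result is not None:
        clip = result
print(clip)  # → [6, 7, 8, 9, 10, 11, 12, 13]
```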
So, what of end users? The author spoke to the head of condition monitoring at EDF Nuclear, who said: “We are growing ever more impressed with Motion Amplification technology. We have had three issues to date where it has proved invaluable in that it has prevented wrong decisions being taken for a repair strategy. It is also the first time where nobody has questioned the findings of our work when we are asked to investigate a plant fault.”
Keith Gallant of RMS says the motion amplification system complements existing vibration analysis tools, such as ODS (operation deflection shape) software, used to model equipment as it runs. “You could draw 20-30 ROIs (readings of interest), extract the time waveforms and export them into modal analysis software.” He adds that other firms are using the system at the design stage, to verify that real-life components behave like their simulated counterparts.
The uses can be even more varied: EVM has demonstrated the pulsing of veins, the breathing of a baby and even the variations in shape of a larynx as somebody sings. One of the first applications was bridge inspection — looking at deflection as traffic moved over it. RDI also produces CableView, a system that is said to be completely non-contact, utilising a camera to measure the tension of guy wires.