Machine Vision: How to see the World | Avnet Silica

Machine Vision: How to see the World

Machines aren’t just becoming smarter every day – they are also developing the ability to see the world around them. In fact, artificial eyes can penetrate to greater depths and pick out much more detail than our weaker human eyes can ever hope to see. And yes, they can see in the dark, too. For IoT applications, this could be a real eye-opener.

Light detection and ranging (Lidar) is a remote sensing method that uses pulsed laser light to measure ranges (variable distances). Like its close cousins radar, which uses radio waves, and sonar, which emits sound waves under water, Lidar measures distances accurately – something that is critical for many sectors and industrial processes. In theory, such measurements can be performed with other technologies, but Lidar has multiple advantages over them: it offers higher resolution than radar, a longer range than camera sensors and can even perform in the dark. Furthermore, Lidar produces 3D data and can detect and differentiate objects. As a result, it can be used to track objects, detect physical protrusions and survey landscapes. Lidar can also measure carbon dioxide, sulfur dioxide and methane in air or water, making it ideal for applications like geographical surveys, autonomous driving, industrial processes and logistics.
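The ranging principle itself is simple: a pulsed Lidar derives distance from a laser pulse's round-trip time of flight. A minimal sketch in Python (illustrative only, not any vendor's API):

```python
# Speed of light in vacuum, m/s.
C = 299_792_458

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to a target from a pulse's round-trip time.

    The pulse travels out and back, so the one-way range is half
    the total distance light covers in that time.
    """
    return C * round_trip_s / 2

# A pulse returning after roughly 667 ns corresponds to a target
# about 100 m away.
print(range_from_time_of_flight(666.7e-9))
```

At these speeds, centimetre-level range resolution requires timing the return to well under a nanosecond, which is why Lidar receivers need very fast electronics.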

 

The Face of the Earth

Geographical surveys are not just for mapping mineral and water resources; they are also used to map zones prone to earthquakes, tsunamis, landslides, flooding and volcanic activity so that potential disasters can be planned for. In addition, coastlines need to be surveyed for effective management and planning for navigational, environmental and homeland security purposes.

 

Lidar is helping us understand sea-level rises and coastal flooding/inundation impacts, and marine life habitat mapping.

Jon Stine, Enterprise Sales GM, Retail at Intel

 

As global warming raises sea levels and causes extreme weather events like enormous floods and droughts, high-resolution geographical surveys will be crucial for monitoring the effects of global warming, such as coastline erosion. This information can help governments to devise coping strategies, such as safe and responsible land use and emergency preparedness, to protect civilian lives and marine wildlife.

Monitoring the Warming: A view looking north-east from Virginia Key shows the topobathymetric surface of the intertidal zone near Fisher Island. The image was created from the Lidar bare-earth model, colored by elevation. (source ©: NOAA)

The National Oceanic and Atmospheric Administration (NOAA), an agency primarily responsible for mapping US shorelines, sought to increase the efficiency and reduce the subjectivity of older technologies, such as tide-coordinated aerial photography. Data collection in intertidal zones was also poor because these areas are too shallow for survey vessels to approach safely. NOAA now uses airplanes and helicopters equipped with two types of Lidar: topographic and bathymetric. Topographic Lidar typically uses a near-infrared laser to map the land, while bathymetric Lidar uses water-penetrating green light to measure seafloor and riverbed elevations. The two images can be combined to produce a single ‘topobathymetric’ image of the coastal area.
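The merging step can be pictured as filling the gaps of one elevation grid with the other: the near-infrared sensor returns nothing over water, the green-light sensor nothing over land. The sketch below is a simplified illustration, assuming two already-aligned grids where a missing return is recorded as NaN; the function name and grid layout are invented for the example:

```python
import math

def merge_topobathy(topo, bathy):
    """Combine a topographic grid (land elevations, NaN over water)
    with a bathymetric grid (seafloor elevations, NaN over land)
    into one 'topobathymetric' surface.

    Both grids are equal-sized lists of rows of elevations in metres.
    """
    merged = []
    for topo_row, bathy_row in zip(topo, bathy):
        row = []
        for t, b in zip(topo_row, bathy_row):
            if not math.isnan(t):
                row.append(t)         # dry land: near-infrared return
            elif not math.isnan(b):
                row.append(b)         # water: green-laser seafloor return
            else:
                row.append(math.nan)  # no return in either survey
        merged.append(row)
    return merged

nan = math.nan
topo = [[3.2, 1.1, nan], [2.5, nan, nan]]
bathy = [[nan, nan, -0.8], [nan, -1.5, -2.4]]
print(merge_topobathy(topo, bathy))
# → [[3.2, 1.1, -0.8], [2.5, -1.5, -2.4]]
```

Real pipelines must also reconcile the two surveys' datums, tides and point densities, but the core idea is this gap-filling merge.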

 

Driving Force

How do we make sure autonomous cars and trucks pay proper attention and avoid disasters in near darkness, pouring rain, heavy snow or on a winding road? Lidar’s high speed and accuracy enable distant object detection on a moving vehicle, even in low light. This makes it a powerful technology for advanced driver-assistance systems (ADAS) in drones and vehicles. It performs less well in heavy rain, snow or fog, but work is in progress to improve its reliability in all but the worst conditions.

Earlier Lidar technologies were typically used in applications with a high tolerance for failure. In contrast, the bar is much higher for safety, accuracy and reliability when using it in autonomous vehicles. Products need to be designed from scratch to meet the 3D sensing requirements of automotive active safety systems, ADAS implementations and autonomous vehicles. In addition, solutions must be small and unobtrusive, which is a challenge for Lidar.

Aerial Mapping: YellowScan uses a mix of drone-mounted passive and active laser sensors to create 3D maps for mining, civil engineering, forestry, environmental research and archeology.

 

The latest high-resolution 3D systems, such as MicroVision’s Long Range Automotive Lidar, meet these stringent 3D sensing requirements. They are compact and unobtrusive, offer fast frame rates and can characterise an environment more accurately with a higher level of information analysis. These features enable vehicles to respond more rapidly to obstacles and avoid them.

 

Making Things Work

Industrial applications for Lidar include manufacturing and robotics automation. Systems such as Baraja’s Spectrum-Scan are being deployed in mining vehicles to map locations and visualise environments. The technology can also help improve worker safety in mines and caves.

 

Vehicles need to see further out and understand more about what they see, such as which objects are moving and where they are heading.

Jari Honkanen, MicroVision

 

Mines are often located in remote areas and underground, where pockets of lethal gases tend to accumulate. Methane is explosive, carbon dioxide and carbon monoxide are lethal to humans at high concentrations, and hydrogen sulfide is poisonous and highly flammable. Therefore, using laser light to detect gas pockets allows miners to deal with them promptly, keeping them safe and making their time in the mine more efficient.

Loggerhead Lighthouse: This topobathymetric point cloud, colorised by intensity, was captured during a survey of the Dry Tortugas and shows the lighthouse on Loggerhead Key. (source ©: NOAA)

Just as Lidars image the 3D topography of the environment for autonomous vehicles, they can be used above ground to detect unwanted gas concentrations during gas mapping. For example, a methane leak can be detected by pointing a laser beam of a specific wavelength at the suspected location or along a survey line. Some of the laser light is absorbed by the target gas and the rest bounces back as a diffused beam. The Lidar receives the reflected beam, measures the light absorption and calculates the density of the leaking methane. This process generates geo-specific images of gas plumes that can be combined with satellite and other data to create a map illustrating the leak’s extent and location. The GPS coordinates on the map can quickly direct a maintenance crew to the leak. So far, ExxonMobil and SoCalGas have used Lidar gas-mapping technology to detect leaks.
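The absorption measurement rests on the Beer-Lambert law: comparing the returned power at an absorbed ('on') wavelength with that at a nearby reference ('off') wavelength yields a path-averaged gas concentration. The sketch below is a simplified illustration of that relation, not any vendor's algorithm; the function name and the example numbers are assumptions:

```python
import math

def path_avg_concentration(p_on, p_off, delta_sigma_cm2, path_m):
    """Path-averaged gas concentration (molecules/cm^3) from a
    two-wavelength differential-absorption measurement.

    p_on / p_off        received powers at the absorbed and
                        reference wavelengths (same units)
    delta_sigma_cm2     differential absorption cross-section, cm^2
    path_m              one-way path length through the gas, metres

    Beer-Lambert: p_on / p_off = exp(-2 * sigma * N * L),
    where the factor 2 accounts for the out-and-back path.
    """
    path_cm = path_m * 100
    return math.log(p_off / p_on) / (2 * delta_sigma_cm2 * path_cm)

# Example (invented figures): a 20% weaker 'on' return over a 100 m
# path with a differential cross-section of 1e-19 cm^2 implies a
# concentration on the order of 9e13 molecules/cm^3.
print(path_avg_concentration(1.0, 1.2, 1e-19, 100))
```

Practical systems range-resolve this calculation along the beam to localise the plume, but the per-segment arithmetic is the same ratio-and-logarithm step.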

A Sentry in the Sky: At least 2 percent of gas resources are wasted through methane leaks. A drone-mounted, laser-based Remote Methane Leak Detector (RMLD) by Physical Sciences might solve the problem. (source ©: Bridger Photonics)

 

What Will the Future Bring?

In all of these applications, high-definition cameras supplement and complement the other sensors to add visual context. Color cameras also provide the ability to monitor traffic lights and read road signs, which is beyond the reach of Lidar, radar and sonar. This is useful where emergency services have put out temporary signs, or for roadworks that may not be mapped on satnav systems. Lidar sensors are being developed that are even more miniaturised and lighter, so they can be used on unmanned aerial systems, such as drones performing geographical surveys. Meanwhile, reducing costs, increasing reliability and enhancing data resolution will be common goals for the next generation.

Lidar systems can accurately map and visualise surrounding hazardous environments, increasing safety and efficiency.

Jim Kane, Baraja Industries

 

We are also likely to see further democratisation of Lidar applications. For example, the low-performance 3D Lidars appearing in smartphones are likely to be replaced by ones with higher resolution and more processing power. In short, Lidar will be more compact, more capable and lower cost.

An Eye on the Road
Scientists have approached the challenge of determining range, angle and velocity in several ways, chiefly using radio waves, laser light and sound. Lidar, radar and sonar each have their benefits and deficiencies in autonomous vehicles. Google’s spin-off company Waymo and Aurora, which took over Uber’s ill-fated project, both use Lidar but rely on radar for support in foggy, rainy or snowy conditions, where Lidar’s results are less reliable. Tesla uses ultrasonic sonar to map the near environment, along with a front-facing radar.

■ Radar
Radio detection and ranging, coupled with hi-res cameras, is a proven way of providing machines with ‘sight’. In 1886, German physicist Heinrich Hertz demonstrated that radio waves are reflected by metallic objects. British scientists, spurred on by the militarization of Germany, used this property to develop radio-based detection and ranging in 1935. Independently, Germany, the USA and Russia developed radar systems of their own in the following year or two. These systems were all designed to detect huge hunks of metal at sea or in the air; in the years since then, refinements have extended their detection capabilities to all manner of objects, including humans.

■ Lidar
In 1960, the invention of the laser created a flurry of research activity to find uses for the powerful light beams. In 1961, a US scientist named Malcolm Stitch, who worked for the Hughes Aircraft Company, introduced the first functional Lidar system, but it wasn’t until the arrival of commercially available satellite-based global positioning systems (GPS) and inertial measurement units (IMUs) in the late 1980s that accurate positional data became available. Lidar’s reach is constantly increasing. Currently, Aeva, based in Silicon Valley, is selling its 4D Lidar-on-chip system, which has a range of 500 m.
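The ‘4D’ in such systems refers to measuring each point’s radial velocity as well as its position, which frequency-modulated continuous-wave (FMCW) Lidar recovers from the Doppler shift of the returned light. A minimal sketch of the underlying relation (the 1550 nm wavelength is a typical assumption for this class of sensor, not a published figure for any product):

```python
# Assumed operating wavelength, metres (telecom-band lasers are common
# in FMCW Lidar).
WAVELENGTH_M = 1550e-9

def radial_velocity(doppler_shift_hz: float) -> float:
    """Radial (line-of-sight) target velocity from the Doppler shift.

    v = lambda * f_d / 2; the factor 2 accounts for the round trip.
    A positive shift means the target is approaching.
    """
    return WAVELENGTH_M * doppler_shift_hz / 2

# A 12.9 MHz Doppler shift at 1550 nm corresponds to roughly 10 m/s.
print(radial_velocity(12.9e6))
```

Measuring velocity per point, rather than inferring it from successive frames, is what lets such sensors tell a moving pedestrian from a stationary pole in a single sweep.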
