Hyperspectral Imaging – Basics and Agriculture Apps  

At the SPIE Photonics West 2015 event, Imec talks about the basics of hyperspectral imaging and miniaturized applications in precision agriculture.

By John Blyler, Editorial Director

At the SPIE Photonics West 2015 event, I talked about commonplace visual and new spectral camera technologies and applications with Andy Lambrechts, Ph.D., hyperspectral imaging team leader at Imec – the international nanoelectronics research institute based in Leuven, Belgium. What follows is a portion of that interview. — JB

Blyler: Please remind the readers of the importance of hyperspectral imaging.

Lambrechts: Hyperspectral imaging is a combination of spectroscopy (visible light dispersed according to its wavelength by a prism) and visible imaging. Instead of just taking a picture and getting a color image, you obtain a spectral measurement for each pixel in the image of the scene. Hyperspectral imaging is used to find objects, identify materials and detect processes.

Blyler: When you say “imaging,” do you mean using traditional red-green-blue (RGB) color models to display images in electronic systems?

Lambrechts: RGB models give you three measurement points which are very wide bands in the visible range that mimic what the human eye sees. However, for most industrial and medical applications, the information the human eye obtains is not optimal. You need more information.

In the lab, spectroscopes can give you 1,000 measurement points from which you can create a very accurate spectral curve across all wavelengths for a single point. The hyperspectral camera combines these two technologies – spectrographic and visible images. You will still have a two-dimensional image so you can recognize the scene. But for every point in the scene you’ll also have tens to hundreds of measurement points. Basically, you take the wide-band RGB image and split the light from the visible or infrared range into many narrow bands. Those bands give you a different set of measurements which give you much more information about what you see.
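To make the difference concrete, here is a minimal sketch (in Python, not Imec code) of the two data layouts Lambrechts describes; the image size and the band count of 100 are arbitrary assumptions for illustration.

```python
import numpy as np

height, width = 480, 640
rgb_image = np.zeros((height, width, 3))     # 3 wide bands: R, G, B
hyper_cube = np.zeros((height, width, 100))  # e.g. 100 narrow spectral bands

# The spectrum of one scene point is a 100-sample curve, not 3 numbers.
spectrum = hyper_cube[240, 320, :]           # shape: (100,)
```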

Blyler: How is that result engineered?

Lambrechts: There are three different types of hyperspectral imaging sensors that have been built at Imec – a line-scan sensor, a snapshot mosaic sensor and a snapshot tiled sensor. Commercial hyperspectral cameras typically have a line mask that selects one line of the scene, which is then split using a grating or a prism into the different wavelength bands. A sensor captures the three-dimensional data set: two dimensions of the object and a third dimension of the spectral wavelengths (see Figure 1). You map this three-dimensional data set onto your image and continue scanning the object. Other types of cameras use liquid crystal tunable filters, which are placed in front of the image sensor. The filters are sequentially switched in these tunable cameras so that you capture all of the different wavelengths.
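As a rough illustration of the line-scan acquisition described above, the sketch below (hypothetical Python; `capture_line` is a stand-in for the real camera readout, and the scan geometry is assumed) stacks one dispersed line per scan position into the three-dimensional data set.

```python
import numpy as np

n_lines, line_width, n_bands = 480, 640, 100  # assumed scan geometry

def capture_line():
    """Stand-in for one camera readout: one spatial line of the scene,
    dispersed by the grating or prism into n_bands wavelength bands."""
    return np.random.rand(line_width, n_bands)

# Scan the object (conveyor belt, drone, satellite): one line per position.
cube = np.empty((n_lines, line_width, n_bands))
for y in range(n_lines):
    cube[y] = capture_line()  # stack lines into the 3-D (x, y, wavelength) set
```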

Blyler: How does Imec create spectral image technology?

Lambrechts: We have used our material science knowledge and nanometer semiconductor manufacturing skills to make optical filters at individual pixels on a sensor. We have about 220 image sensors on the wafer. At design time, we can select which wavelength we want each individual pixel to see. We work with our customers to see which wavelengths are needed by their particular applications. Often the images can be scanned, as with a camera on a conveyor belt or from space on a satellite. Typically, a line scanner will give you the best performance for moving objects.
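For the snapshot mosaic variant mentioned earlier, a simplified sketch of the readout follows; the 4x4 filter pattern (16 bands) is an assumption for illustration, not Imec's actual layout.

```python
import numpy as np

raw = np.random.rand(1024, 2048)  # stand-in raw frame from a mosaic sensor

def demosaic_4x4(raw_frame):
    """Split an assumed 4x4 per-pixel filter mosaic into a 16-band cube.
    Each band keeps 1/16th of the spatial resolution of the raw frame."""
    bands = [raw_frame[r::4, c::4] for r in range(4) for c in range(4)]
    return np.stack(bands, axis=-1)  # shape: (h // 4, w // 4, 16)

cube = demosaic_4x4(raw)             # (256, 512, 16)
```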

Blyler: Does the process node of the semiconductor wafer affect the performance of the image sensor?

Lambrechts: It really depends on the image sensor that we start from, because in this case we aren’t developing the transistors. That’s why the process node doesn’t really matter. We’re depositing optical film stacks on the wafers, with individual layer thicknesses measured in nanometers. The total thickness of the filter stack is about 1 micron. From the end-user perspective, the camera with a normal visual image sensor looks the same as the camera with the spectral image sensor. The only difference is the way you interface the images to the software processor.

Blyler: This makes the camera very lightweight. What are some of the new and unusual applications?

Lambrechts: A very hot topic these days is precision agriculture. You can monitor how the plants are growing and when they need more water or fertilizers.

Hyperspectral cameras are typically mounted on small flying drones or on tractors to inspect the crops. It’s a very hot topic because people are continuously trying to optimize their product while staying within legislation governing the application of fertilizer and other treatments to plants. If you can detect diseases early on in the field, then you don’t need to apply very aggressive chemicals. You can monitor crop health carefully enough to meet the organic and biological standards that more and more people want in their food.

Many vendors at this show (SPIE Photonics West 2015) are using our hyperspectral wafer chips. For example, Bayspec has some really cool demos that use our sensors in handheld, battery-operated field applications. They add motorized lenses and touch-screen technology to target industrial applications. This one incorporates a smart camera with a dedicated imaging pipeline to handle all of the image processing (see Figure 1).

Another example is Adimec. They are a company from the Netherlands that creates ruggedized cameras with very good correction technology for high-end applications like medical or security apps.

Figure 1: Examples of hyperspectral image cameras. (Imec booth, SPIE Photonics West 2015)

Blyler: So hyperspectral imaging adds a great deal of information beyond what you can see.

Lambrechts: Yes – you have a much higher discriminative power. For example, a dish full of geometrically shaped blue objects will just look blue to the human eye in the visible spectrum (390 to 700 nm). But in the near-infrared range, these same objects look very different and provide more information to be analyzed. (Editor’s note: The infrared spectrum has longer wavelengths than those of visible light, extending from the nominal red edge of the visible spectrum at 700 nm to 1 mm (300 GHz).)

Or consider a group of Lego blocks. After scanning the blocks, I can open a spectral viewer to reveal the spectral response for every one of them. If you click on one pixel you’ll get a particular wavelength or visible color. If you drag the mouse across multiple pixels you’ll see variations within the group (see Figure 2). You’ll get information about the different intensities and wavelengths, more information than just the color. From this information you can tell a lot about the object’s composition, density, etc.
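A sketch of those two viewer interactions, assuming the scanned scene has been loaded as a NumPy cube; the array sizes and pixel coordinates are arbitrary stand-ins.

```python
import numpy as np

cube = np.random.rand(480, 640, 100)  # stand-in for the scanned Lego scene

# Click one pixel: the full spectral response of that scene point.
pixel_spectrum = cube[100, 200, :]    # shape: (100,)

# Drag across a region: per-band mean and spread expose variation that
# a uniformly "blue" patch hides from the eye.
region = cube[90:110, 190:210, :]
mean_spectrum = region.mean(axis=(0, 1))
std_spectrum = region.std(axis=(0, 1))
```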

Figure 2: Common objects reveal much more when viewed through a hyperspectral imaging camera. (SPIE Photonics West 2015)

Blyler: Let’s talk more about precision agriculture applications.

Lambrechts: Very accurate measurements of plant health can be obtained by flying drones carrying hyperspectral cameras over crop fields. Ximea is a company that put two of its cameras on a drone: one was a traditional RGB camera and the other was our 16-band multispectral camera weighing only 27 grams. The drone wouldn’t have been able to lift off the ground with a traditional hyperspectral camera.

Blyler: These nanoscale spectral devices must also be very low power.

Lambrechts: Some of them are battery operated and run for multiple days in the field.

Blyler: Tell us more about the agricultural applications.

Lambrechts: A lot of calibration is needed to correctly interpret hyperspectral data results for precision agriculture. What would a field with too much fertilizer look like in terms of its wavelength spectrum? Or a field that was too wet? Or that was inoculated with a known disease? These are critical agronomic issues. To answer these questions, various cameras were placed on an octocopter – an eight-propeller drone – that included a line-scanner type of hyperspectral camera. Images from the line scanner were stitched together while the drone flew over the field. The result was a very high-resolution “picture” of the field (see Figure 3). In terms of the visual image, you can actually see the individual strawberry plants. The analysis of the spectral image was added to give more information. For example, a redder color corresponds to an increase in chlorophyll in the plant.

Companies have used our sensors in the field, taking the outputs from the cameras to compute a vegetation index that displays certain plant parameters. This data is then recomputed into a more meaningful display that gives the end user a course of action, such as adding less fertilizer. A translation stage is needed to offer the information in a more useful way for the end user. You don’t want your farmers to first get a PhD in agro-precision theory.
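The NDVI named in Figure 3 is one such vegetation index. A minimal sketch of the computation follows; the band positions for red and near-infrared are hypothetical and would depend on the actual camera's filter layout.

```python
import numpy as np

cube = np.random.rand(480, 640, 16)  # stand-in 16-band multispectral cube
RED_BAND, NIR_BAND = 5, 12           # hypothetical band indices

red = cube[:, :, RED_BAND]
nir = cube[:, :, NIR_BAND]

# NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; higher values track
# denser, chlorophyll-rich vegetation.
ndvi = (nir - red) / (nir + red + 1e-12)  # epsilon avoids divide-by-zero
```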

Figure 3: Here are the results of hyperspectral imaging applied to a strawberry field. One way to measure chlorophyll content in crops is with the reflectivity-based Normalized Difference Vegetation Index (NDVI).

Blyler: What about other applications, for example, in the medical industry?

Lambrechts: I talk about agriculture because it is nice and people can relate to it. It is also one of the few applications where we are not under an NDA.

Blyler: Thank you.

 
