
“Eye-in-the-Sky” Made More Useful to Farmers

Based on remote sensing from satellite or aircraft along with meteorological data, the Water Deficit Index (WDI) shows crops that have ample moisture (blue), crops that need irrigation (yellow), and bare, dry soil (orange).
(K7510-2)

Information picked up by airplanes and satellites will someday help farm operators maintain healthy, high-yielding crops, with minimum use of irrigation water, fertilizer, and pesticides.

An Agricultural Research Service project now under way in Arizona is aimed at demonstrating how remote sensing can be used in farm management. It’s called MADMAC—Multispectral Airborne Demonstration at Maricopa Agricultural Center. The project’s team is analyzing data obtained from 15 airplane overflights at 3,900 and 7,500 feet above fields of cotton and other crops from mid-April to the end of September 1994.

“Satellites have been beaming data back to Earth for more than two decades,” says Thomas R. Clarke, a physical scientist with the U.S. Water Conservation Laboratory in Phoenix, Arizona. “But most of that information hasn’t helped farmers because no one could figure out how to interpret it.

“One important goal of the Arizona project is to develop ways to turn those numbers and readings into reliable, timely, and meaningful information,” Clarke says. “That will one day enable farmers to micromanage areas as small as a few acres within each of their fields. It will guide them to specific areas that need more fertilizer, irrigation water, or weed and insect control.”

Until recently, adds Clarke, researchers have had an overwhelming amount of data but no efficient way to use it. For example, they might obtain billions of bits of information about a 1,000-acre farm over a growing season. Their challenge was to translate all that into recommendations for farmers and farm management advisers.

Physical scientists M. Susan Moran and Jiaguo Qi, biologist Paul J. Pinter, Jr., and agricultural engineer Edward M. Barnes are also members of the ARS research team now working on analysis of the MADMAC data. They are in the Environmental and Plant Dynamics Research Unit at the Phoenix lab.

The airplane used in taking measurements was equipped with a frame holding four video cameras. Three of the cameras were filtered to allow each to receive only one kind of light on the recording tape—near-infrared, red, or yellow-green. The fourth camera measured the far-infrared energy that is related to surface temperature.

The goal of MADMAC is to combine these four measurements to create maps of crop growth and crop stress related to irrigation schedules, fertilizer applications, and weed and insect infestations.
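The article does not give the team's actual algorithm, but a minimal sketch in Python (with NumPy) can illustrate how the four bands might be combined. It computes a standard vegetation index, NDVI, from the near-infrared and red bands and uses the canopy-minus-air temperature difference as a rough thermal stress signal; the class thresholds below are hypothetical, chosen only for demonstration.

```python
import numpy as np

def crop_condition_map(nir, red, surface_temp, air_temp):
    """Illustrative combination of spectral and thermal bands.

    All inputs are same-shaped arrays: near-infrared and red
    reflectance, plus surface and air temperature in deg C.
    """
    # NDVI: healthy, dense vegetation reflects strongly in the
    # near-infrared and absorbs red light.
    ndvi = (nir - red) / (nir + red + 1e-9)

    # A canopy much warmer than the air suggests water stress,
    # since transpiration normally cools the leaves.
    dt = surface_temp - air_temp

    classes = np.zeros(nir.shape, dtype=np.uint8)
    classes[(ndvi > 0.3) & (dt < 2.0)] = 1    # ample moisture
    classes[(ndvi > 0.3) & (dt >= 2.0)] = 2   # vegetated but stressed
    classes[ndvi <= 0.3] = 3                  # sparse cover or bare soil
    return ndvi, classes
```

A per-pixel map like `classes` is the kind of product a farm adviser could overlay on a field map to target irrigation.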

To be useful, the MADMAC digital video data first had to be converted into values of surface reflectance and temperature. So the researchers placed specially coated 25-foot-square tarps on the ground during every aircraft overpass.

After each flight, they retrieved the digital numbers associated with the centers of the tarps and computed a “normal” reading. The scientists then calculated a correction factor from on-site recordings to use in calibrating the hundreds of reflectance and temperature measurements taken by the airborne sensor. This approach eliminated the need for ground personnel to take reference readings in every field during each overflight.
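The team's exact formulas are not given in the article, but the procedure resembles the standard empirical-line calibration. The sketch below assumes the tarps' true reflectances were measured on the ground beforehand; all numbers are hypothetical.

```python
import numpy as np

def calibrate_to_reflectance(image_dn, tarp_dn, tarp_reflectance):
    """Empirical-line calibration sketch for one spectral band.

    tarp_dn: mean digital numbers read at the tarp centers.
    tarp_reflectance: the tarps' known reflectances (assumed
    measured on the ground). A least-squares line then maps raw
    digital numbers anywhere in the image to surface reflectance.
    """
    gain, offset = np.polyfit(tarp_dn, tarp_reflectance, 1)
    return gain * image_dn + offset

# Hypothetical example: three tarps of 4%, 32%, and 48% reflectance.
tarp_dns = np.array([21.0, 118.0, 176.0])
tarp_refl = np.array([0.04, 0.32, 0.48])
frame = np.random.randint(0, 256, (480, 640)).astype(float)  # raw band
reflectance = calibrate_to_reflectance(frame, tarp_dns, tarp_refl)
```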

ARS scientist Susan Moran adjusts a fixed-position, four-camera monitoring device to take soil readings. Specially coated tarps in the background enable scientists to obtain baseline digital readings during aircraft or satellite observations.
(K7510-1)

On the ground, the team made up to 875 separate observations of crop and soil conditions during each overflight at the University of Arizona's Maricopa center, 20 miles south of Phoenix. Those observations, which included crop type, estimated plant height, growth stage, percent crop cover, soil surface texture and dampness, and presence of insects and weeds, were matched to the video images.

The advantages of video images for farm management are the fine spatial resolution—about 3 to 6 feet—and the potential availability of data immediately after the flight. For comparison, the spatial resolution of data from currently orbiting satellites is about 60 to 90 feet, and the data are not available to researchers for several days or weeks.

To provide timely, reliable maps of crop conditions, the MADMAC engineers and scientists developed methods to process the video images and deliver information within hours of a flight. This automated processing included corrections for the effects of atmospheric conditions, camera misalignment, and aircraft motion.

Researchers noted that video images can vary in brightness by as much as 40 percent, depending on the sun’s position in the sky and the camera’s viewing angle. The team developed a computer model to correct for this phenomenon, known as the bidirectional reflectance effect. The model corrects readings for all points in the picture and allows comparison between pictures taken over the same fields at different times.
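The team's model itself is not published in the article. The fragment below only sketches the underlying idea for a single band: fit and divide out a smooth brightness trend across the image so that frames of the same field taken at different times become comparable. The second-order polynomial and column-wise fit are assumptions for illustration.

```python
import numpy as np

def flatten_brightness_trend(band):
    """Simplified empirical correction for across-image brightness.

    Fits a 2nd-order polynomial to the column-wise mean brightness
    (brightness varies with viewing angle across the frame) and
    divides it out, preserving the image's overall mean level.
    """
    band = np.asarray(band, dtype=float)
    cols = np.arange(band.shape[1])
    col_means = band.mean(axis=0)
    trend = np.polyval(np.polyfit(cols, col_means, 2), cols)
    return band * (col_means.mean() / trend)
```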

Because of aircraft motion and video camera optics, the horizontal lines in the video images are often offset, producing a zigzag pattern along field edges. This geometric flaw is complicated further because the offset varies, and some required shifts are smaller than a pixel, the smallest unit of the image. The team developed software to scan the images and make the corrections automatically.
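Assuming a single global offset, rather than the variable shifts the team actually handled, one plausible automation is to cross-correlate the odd scan lines against the even ones and apply the estimated sub-pixel shift by interpolation; a sketch:

```python
import numpy as np

def correct_zigzag(image, search=5):
    """Sketch of an automatic scan-line offset correction.

    Estimates one horizontal shift between odd and even lines by
    cross-correlation, refines it to sub-pixel precision with a
    parabolic fit around the peak, and shifts the odd lines back.
    """
    img = np.asarray(image, dtype=float)
    even, odd = img[0::2], img[1::2]
    n = min(len(even), len(odd))

    # integer-shift correlation scores over a small search window
    scores = np.array([(even[:n] * np.roll(odd[:n], s, axis=1)).sum()
                       for s in range(-search, search + 1)])
    k = scores.argmax()

    # parabolic interpolation around the peak -> sub-pixel estimate
    frac = 0.0
    if 0 < k < len(scores) - 1:
        denom = scores[k - 1] - 2 * scores[k] + scores[k + 1]
        if denom != 0:
            frac = 0.5 * (scores[k - 1] - scores[k + 1]) / denom
    shift = (k - search) + frac

    # resample every odd line by linear interpolation
    x = np.arange(img.shape[1])
    out = img.copy()
    for i in range(1, img.shape[0], 2):
        out[i] = np.interp(x - shift, x, img[i])
    return out
```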

Misalignment of the cameras and aircraft motion can also cause band-to-band image offsets that diminish the accuracy and value of the video images. Software the scientists developed corrects these errors as well, giving the team exceptional registration for all bands, including the critical near-infrared band, with a processing time of only 15 minutes for 80 sets of the 3-band images.
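That software is not described in detail either, but phase correlation is one standard way to estimate a pure translation between two bands; the sketch below assumes no rotation or scale difference, and the sign convention should be checked against test imagery.

```python
import numpy as np

def band_offset(reference, band):
    """Estimate the (row, col) translation between two bands
    by phase correlation; a pure-shift sketch only."""
    f1 = np.fft.fft2(reference)
    f2 = np.fft.fft2(band)
    cross = f1 * np.conj(f2)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    peak = np.unravel_index(corr.argmax(), corr.shape)
    # fold shifts past the halfway point into negative values
    return tuple(p - s if p > s // 2 else p
                 for p, s in zip(peak, corr.shape))

# usage sketch: align the red band to the near-infrared band
# red_aligned = np.roll(red, band_offset(nir, red), axis=(0, 1))
```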

"The processing techniques and farm management products resulting from the MADMAC experiment are useful for both aircraft-based cameras and upcoming satellite sensors," says Moran.

Major improvements in satellite-based technology are planned. For example, a new satellite system, expected to be in operation within 3 years, will provide crop updates as often as every 3 days. But cloudy days will still hide many fields from view, leaving those farmers who need daily guidelines in a bind. The Phoenix team is using computer models to bridge the information gap for the missed days.
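The article does not specify those models. The simplest stand-in is interpolation of a field's index between clear-sky observation dates, sketched here with made-up numbers:

```python
import numpy as np

def fill_cloudy_days(obs_days, obs_values, all_days):
    """Gap-filling sketch: linearly interpolate a per-field index
    between the days a clear satellite or aircraft view existed."""
    return np.interp(all_days, obs_days, obs_values)

# hypothetical series: day 6 was cloudy, so it is estimated
obs_days = np.array([0, 3, 9, 12])
obs_index = np.array([0.20, 0.25, 0.40, 0.35])
daily_index = fill_cloudy_days(obs_days, obs_index, np.arange(13))
```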

The team also plans to merge the remotely sensed data with a decision support system that will help farmers decide when to begin certain farming operations, like insecticide or irrigation applications.

"The work here in Arizona complements work by our agency in Weslaco, Texas," says Pinter. "There, ARS' Remote Sensing Research Unit plays a prominent role in developing suitable equipment to record images from remote."

The Texans have also established spectral signatures of dozens of plant, soil, and water conditions that can be used to identify pest and nutrient problems on range and croplands. [See "Orbiting Eye Will See Where Crops Need Help," Agricultural Research, April 1996, pp. 12-14.]

"When we've succeeded, the system will be what scientists only dreamed of just 20 years ago," says Moran. -- By Dennis Senft, ARS.

Thomas R. Clarke and Paul J. Pinter, Jr., are with the USDA-ARS Environmental and Plant Dynamics Research Unit, Phoenix, AZ 85040; phone (602) 437-1702. M. Susan Moran is at the Southwest Watershed Research Center, Tucson, AZ.

"Eye-in-the-Sky Made More Useful to Farmers" was published in the December 1996 issue of Agricultural Research magazine.
