Title:
SYSTEM AND METHOD FOR MAPPING GEOSYNCHRONOUS REAL IMAGE DATA INTO IDEALIZED IMAGES
Document Type and Number:
WIPO Patent Application WO/1986/001592
Kind Code:
A1
Abstract:
A satellite imaging system (100) provides successive coregistered images from the perspective of an idealized reference position, such as the nominal geosynchronous orbit position of the satellite. The disclosed satellite (101) is spin-stabilized, and so includes a despun platform (107) which supports a communications subsystem (109), and a spun rotor (111) which supports a sensor (103). The communications subsystem transmits sensor data along a path (115) to a ground station (105). Linear interpolation of real image data as a function of satellite position and imager attitude provides the ideal images. The disclosed invention is also applied to a pointing sensor such as one employed on a three-axis stabilized satellite. The coregistered idealized images facilitate analysis of weather feature movements and enhance weather forecasting.

Inventors:
MACPHERSON DUNCAN (US)
Application Number:
PCT/US1985/001377
Publication Date:
March 13, 1986
Filing Date:
July 19, 1985
Assignee:
HUGHES AIRCRAFT CO (US)
International Classes:
G01C11/00; G01C11/02; G06T1/00; (IPC1-7): G01C11/02
Other References:
Information Processing 74, 5-10 October 1974, Amsterdam, (NL) R. BERNSTEIN: "Digital Processing of Earth Observation Imagery", pages 733-737, see paragraphs 2, 3, 3.3, 4.5
Claims:
CLAIMS
1. What is Claimed is: A satellite imaging system for providing idealized images of a scene from the perspective of a reference position, said system comprising: a satellite nominally in geosynchronous orbit; an imager for producing image data representing said scene, said image data being in the form of pixel intensities and locations, said imager being located onboard said satellite; means for providing imager attitude and satellite position determinations; and means for transforming said image data into an image representing said scene from the perspective of said reference position, said transforming means utilizing said attitude and position determinations.
2. The system of Claim 1 further characterized in that said reference position is the nominal geosynchronous orbit of said satellite.
3. The system of Claim 1 further comprising a remote station, and means for transmitting image data from said satellite to said remote station, said remote station including said transforming means.
4. A processing station for producing successive images of a scene from a predetermined reference position by transforming image data from an imaging system including a satellite nominally in geosynchronous orbit, said satellite including an imager for producing image data representing said scene, said image data being in the form of pixel intensities and locations, said system including means for providing imager attitude and satellite position determinations, said processing station comprising: means for transforming said data into an image representing said scene from the perspective of said reference position, said transforming means utilizing said attitude and position determinations.
5. The station of Claim 4 further characterized in that said reference position is the nominal geosynchronous orbit of said satellite.
6. A method for producing an idealized image of a scene, said method comprising: gathering pixel intensity data from a satellite with an imager; determining satellite location and imager attitude; defining a reference position at a predetermined longitude, latitude and altitude; and transforming said pixel intensity data into a scene image from the perspective of said reference position using said location and attitude determinations.
7. The method of Claim 6 further characterized in that said imager data is in the form of scan lines, each scan line including plural pixels, each pixel having an associated intensity, each scan line being at least one pixel high and plural pixels long, said method further comprising the step of: buffering said data so that multiple scan lines of data are stored at a given time.
Description:
SYSTEM AND METHOD FOR MAPPING GEOSYNCHRONOUS REAL IMAGE DATA INTO IDEALIZED IMAGES

BACKGROUND

The present invention relates to meteorological satellites, and, more particularly, to a system and method providing for accurate absolute location of idealized images and, therefore, precise coregistration of successive images.

Meteorological satellites support weather forecasting by providing continuous imaging of the earth from geosynchronous orbit. Each earth image indicates the location of identifiable weather features, e.g., clouds. Weather forecasting involves, in part, extrapolating weather feature movements, determined by comparing successive images.

The nominal geosynchronous orbit of a satellite is usually only approximate due to deviations in inclination, eccentricity, and/or period. As the relative positions of satellite and earth shift, so does the perspective of the satellite camera or imager. If, for example, the orbit inclination is not 0°, an image taken during a 20-minute period will be compressed or stretched in the North-South (NS) dimension. These perspective shifts introduce image distortion and location errors. As a result, successive images are not generally coregistered, making it difficult to separate the effects of feature movements and perspective changes.

With a 0.1° inclination, the distortion has a root-mean-square (RMS) value of 95 μRAD if the satellite attitude is held fixed inertially. This is considerably in excess of the 14 μRAD limit, due to all effects, specified for future weather satellites by the National Weather Service (NWS).

These image distortions and errors in location of part or all of the image compromise the utility of the images by introducing errors in locating natural or artificial reference lines (grids) on the images, and by introducing errors in coregistration of successive images, making it difficult to separate the effects of weather feature movements from perspective changes. The present approach to coregistration is to examine a series of images and align them using ground landmarks. This is done after the images are collected, and is vulnerable to cloud obscuration. This approach has been supplemented by extrapolating forward to avoid operational delays, at the expense of degraded accuracy. The approach has proved cumbersome and results in unacceptably large errors.

A problem with computerized manipulation of completed images is the computational power required. Each image comprises on the order of 10⁸ pixels, all of which must be processed, placing heavy demands on the computer. Clearly, computational reconstruction of entire images is very costly in processing power and storage capacity. Thus, there has heretofore been no practical approach providing for real time coregistration of successive satellite images with the accuracy required by current meteorological applications.

SUMMARY OF THE INVENTION

The present invention provides for near real-time construction of idealized images to permit coregistration of successive images. A satellite system includes a satellite nominally in geosynchronous orbit, an imager onboard the satellite, subsystems for determining satellite position and imager attitude, and a computer for producing an idealized image from the perspective of a reference position. The idealized image is constructed on an ongoing basis from real image data together with the position and attitude information.

The present invention is based on the recognition that the variations in satellite position and imager attitude are small enough to allow the use of simple linear interpolations to permit construction of an image from the perspective of a reference position. The reference position can be the nominal geosynchronous position of the satellite generating the real image data. Generally, the real image data is collected in the form of pixel locations and intensities. The pattern of pixels sampled depends on the detector arrangement and the scanning pattern of the imager. The present invention transforms the real data into an idealized image on a batch basis, so that data collected earlier in an image scan is corrected prior to the completion of the image. Since the data is processed on a batch basis, the amount of data being managed at any given time is greatly reduced. For example, where a scene is scanned line-by-line, only a small percentage of the total lines constituting the image need be in computer storage for transformation to the ideal coordinates.

The challenge in this processing is to develop a system and procedure simple enough to permit the voluminous image data to be processed at a real time rate and with acceptable computer storage requirements. This is achieved by performing complex calculations only once per line and making appropriate adjustments to the individual pixels.

Accordingly, the present invention provides for the near real-time production of a succession of coregistered ideal images using only a fraction of the computing power and memory required where entire images are to be manipulated. This implementation provides image registration substantially better than that required, for example, by the NWS for future weather satellites. Thus, the promptness and accuracy of weather forecasting are significantly enhanced.

BRIEF DESCRIPTION OF THE DRAWINGS

FIGURE 1 is a schematic perspective view of a satellite imaging system in accordance with the present invention.

FIGURE 2 depicts the relationship between the collected real image data and the constructed ideal image produced by the system of FIG. 1.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

A meteorological satellite imaging system 100, illustrated in FIG. 1, includes a satellite 101 with an imaging sensor 103 directed at the earth 99, and a ground station 105 for producing idealized images.

The illustrated satellite 101 is spin-stabilized, and so includes a despun platform 107 which supports a communications subsystem 109, and a spun rotor 111 which supports the sensor 103 along with its radiative cooler 113 used to cool the sensor's detector array. The sensor 103 includes visible and infrared detectors to provide real image data at multiple frequency bands. The communications subsystem 109 transmits sensor data along a path 115 to the ground station 105.

The ground station 105 includes a computer which transforms the received real image data into an image of the earth 99 as viewed from the satellite's nominal geosynchronous orbit position. Alternatively another reference perspective could be selected.

A portion of a grid 200 for an idealized image is shown in FIG. 2, with two segments of rows 260 and 270 of real image pixels superimposed thereon. The format of the idealized image is an array of pixels which are aligned and contiguous in both EW and NS directions.

The illustrated portion includes segments of five contiguous rows 210, 220, 230, 240 and 250; each row segment is shown with six contiguous pixels, e.g., 221, 222, 223, 224, 225 and 226. The full grid 200 is on the order of 11,000 rows of 11,000 pixels each. The rows 260 and 270 of real pixels are shown as staggered and overlapping. Within each row, adjacent real pixels are shown as contiguous. However, the present invention provides for the production of idealized images where adjacent pixels and rows are contiguous, overlapping or underlapping.

In general, the rows of real pixels will be oriented obliquely with respect to the idealized grid, as shown. Typically, the magnitude of the angle between the idealized rows and the real rows is such that a real row would cross no more than four rows of the idealized grid over its 11,000 pixel length. However, the obliqueness is exaggerated in FIG. 2 for expository purposes.

To form the desired idealized image, an intensity must be assigned to each of the pixels in the idealized grid. This is done by computing a weighted average of the intensities of the overlapping real pixels. For example, the intensity to be assigned to the ideal pixel 222 at the second row 220 and second column 280 of the illustrated portion of the idealized grid can be computed from the intensities of the four real pixels 262, 263, 272 and 273 which overlap the ideal pixel. The contribution of the real pixel 262 is its intensity multiplied by the ratio of the area of overlap between that real pixel 262 and the ideal pixel 222 to the area of the ideal pixel 222. The contributions of the other three real pixels 263, 272 and 273 are calculated in a similar fashion. The contributions are summed to obtain the intensity to be assigned the ideal pixel 222.
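
By way of illustration only, the following sketch carries out this area-weighted averaging for one ideal pixel. The axis-aligned rectangular pixel footprints, the function names, and the toy intensities are assumptions for the example; real pixel rows are generally oblique to the ideal grid, as noted above, and the patent's actual implementation uses the more efficient line-by-line procedure described below.

```python
# Sketch of the area-weighted intensity assignment described above.
# Assumes each pixel footprint is an axis-aligned rectangle given as
# (x_min, y_min, x_max, y_max); oblique footprints would instead need
# a polygon-clipping routine.

def overlap_area(a, b):
    """Area of intersection of two axis-aligned rectangles."""
    dx = min(a[2], b[2]) - max(a[0], b[0])
    dy = min(a[3], b[3]) - max(a[1], b[1])
    return dx * dy if dx > 0 and dy > 0 else 0.0

def ideal_pixel_intensity(ideal_rect, real_pixels):
    """Weighted average of real-pixel intensities over one ideal pixel.

    real_pixels is a list of (rect, intensity) pairs for the real pixels
    that may overlap the ideal pixel (e.g. pixels 262, 263, 272, 273).
    """
    ideal_area = (ideal_rect[2] - ideal_rect[0]) * (ideal_rect[3] - ideal_rect[1])
    total = 0.0
    for rect, intensity in real_pixels:
        # Each contribution is intensity times (overlap area / ideal pixel area).
        total += intensity * overlap_area(rect, ideal_rect) / ideal_area
    return total

# Toy example: four real pixels, each overlapping one quarter of the ideal pixel.
ideal = (0.0, 0.0, 1.0, 1.0)
reals = [((-0.5, -0.5, 0.5, 0.5), 10.0),   # stands in for real pixel 262
         (( 0.5, -0.5, 1.5, 0.5), 20.0),   # 263
         ((-0.5,  0.5, 0.5, 1.5), 30.0),   # 272
         (( 0.5,  0.5, 1.5, 1.5), 40.0)]   # 273
print(ideal_pixel_intensity(ideal, reals))  # 25.0
```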

The challenge is to assign intensities on an ongoing row-by-row basis to minimize delay and computational requirements. For this reason, the procedure actually implemented is mathematically equivalent to the process described above, but performs the calculations in a different, more efficient sequence.

The present embodiment processes pixels in batches (of the order of 30,000 batches per image). Within each batch the intensity interpolations are performed first in the EW direction for all pixels, then in the NS direction.

Thus, intensities can be assigned to the idealized grid as real pixel data is received, with only a small delay on the order of a few seconds. The view in the idealized image (except for points near the limb) is substantially that which would be obtained from a satellite at exactly the desired longitude and exactly 0° latitude, with no attitude errors.

Since image intensities in the ideal image are derived from linear interpolations of intensities in the real pixels, the interpolation process introduces no error if scene intensity varies linearly. The interpolation error is always small compared to the effects of errors in knowledge of real pixel location.

In the illustrated embodiment, each line contains eight rows of real pixels. The specific procedure for each line of real pixels follows. The line is read into a seven-line wrap-around buffer. As a new line is read in, it replaces the oldest input line currently in memory. Seven input lines suffice to allow correction for all attitude and orbit error contributions to image distortion. The required EW real pixel adjustment due to spacecraft location, spacecraft attitude, and spin phase error is calculated (ΔM integer pixels plus δM fractional pixel, both constants). The required NS real pixel adjustment due to spacecraft location, imager mirror position, and encoder position is calculated (ΔF, a constant). The required NS real pixel adjustment due to spacecraft attitude (a Taylor series in EW location) is calculated (ΔV, one point of this vector being calculated for every 8 horizontal pixels in the line). Two ideal output lines are buffered; one line is calculated while the other is available as data output. For each ideal output line, the following steps are performed: 1) the ideal output pixels are located in real coordinates; 2) the ideal pixel intensities are obtained by interpolating EW and NS between the four appropriate real pixel intensities; and 3) the line of pixels is gridded (if desired). Note that the grid points are always at the same ideal pixel locations, so the gridding process is trivial.
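
The buffering and once-per-line bookkeeping just described might be organized as in the following sketch. The class and function names, the NumPy arrays, and the placeholder geometry callback are assumptions for illustration; the patent does not specify an implementation language or data layout.

```python
import numpy as np

LINE_HEIGHT = 8      # rows of real pixels per scan line
BUFFER_LINES = 7     # depth of the wrap-around buffer described above

class LineBuffer:
    """Seven-line wrap-around buffer: a new line overwrites the oldest."""
    def __init__(self, line_width):
        self.lines = np.zeros((BUFFER_LINES, LINE_HEIGHT, line_width))
        self.slot = 0

    def push(self, line):
        self.lines[self.slot] = line           # overwrite the oldest line
        self.slot = (self.slot + 1) % BUFFER_LINES

def process_real_line(buffer, line, line_geometry):
    """Per-line steps: buffer the new line, then do the once-per-line math.

    line_geometry is a stand-in for the attitude/orbit calculations that
    yield dM (integer EW shift), delta_m (fractional EW shift), dF (NS
    shift), and the coefficients K0, K1, K2 of the attitude term dV(M).
    Only these few numbers are recomputed per line; the per-pixel work
    merely applies them.
    """
    buffer.push(line)
    dM, delta_m, dF, (K0, K1, K2) = line_geometry()
    return dM, delta_m, dF, (K0, K1, K2)
```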

The computational process is detailed below for the visible image. Infrared image processing is similar but less computationally demanding (the pixels are larger in size and fewer in number). Each line of visible image data has 8 pixels NS and (up to) 11,000 pixels EW. The largest image has about 1400 lines of data. To ensure that the data is available to generate the ideal image, each border of the ideal image is 24 pixels closer to the image center than the real image.

Herein, pixel variables are defined as follows: d is the detector number in the line (1 ≤ d ≤ 8); Q is the ideal line number (4 ≤ Q ≤ 1399); q is the real line number (1 ≤ q ≤ 1402); N is the ideal NS pixel location (N = 8(Q−1) + d); n is the real NS pixel location (n = 8(q−1) + d); M is the ideal EW pixel location (25 ≤ M ≤ 10976); m is the real EW pixel location (1 ≤ m ≤ 11000); L is the real NS pixel location adjusted for interpolation; P_R is real pixel intensity; P_E is real pixel intensity with EW alignment; and P_I is ideal pixel intensity.

Initialize calculations for each q as follows. Calculate the EW offset of real pixels relative to ideal pixels due to spacecraft location, spacecraft attitude and spin phase error: ΔM whole pixels and a pixel fraction δM. Calculate the NS offset of real pixels relative to ideal pixels due to spacecraft location, imager mirror position and encoder position: ΔF pixels (not necessarily an integer). Calculate the constants K0, K1, K2 used to represent the offset of the real pixels relative to the ideal pixels due to spacecraft attitude, where the representation is ΔV(M) = K0 + K1·M + K2·M² pixels.

Calculate the NS pixel location adjustments ΔV(M) = K0 + K1·M + K2·M² at every 8th value of M; each value is used for 64 pixels (8 EW × 8 NS).
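
As a small illustrative sketch of this step (NumPy and the function name are assumptions), the quadratic is evaluated only once per 8-pixel EW block:

```python
import numpy as np

def delta_v_samples(K0, K1, K2, line_width=11000, step=8):
    """Evaluate dV(M) = K0 + K1*M + K2*M**2 at every 8th EW location M.

    Each sampled value is then reused for an 8 (EW) x 8 (NS) block of
    pixels, so roughly line_width / 8 quadratic evaluations are needed
    per line rather than one per pixel.
    """
    M = np.arange(1, line_width + 1, step, dtype=float)
    return K0 + K1 * M + K2 * M ** 2
```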

Ideal output lines are generated as follows. Divide the ideal output line into blocks of 8 × 498 pixels each (maximum of 22 blocks for the largest image). For each ideal block of this type, find a block of real pixels 12 × 500 such that the area defined by the ideal block is completely contained within the real block.

EW alignment of the real input pixel locations is performed according to the equation:

P_E(n, M) = δM·P_R(n, m) + (1 − δM)·P_R(n, m+1)

where m = M + ΔM.
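
A direct, illustrative transcription of this EW alignment step follows (the 0-based array indexing and the boundary handling are assumptions not specified in the text):

```python
def ew_align(P_R, dM, delta_m):
    """EW alignment of one row of real pixel intensities.

    P_R is a sequence of real intensities indexed by m (0-based here,
    unlike the 1-based indices in the text); dM is the whole-pixel EW
    offset and delta_m the pixel fraction, computed once per line.
    Implements P_E(M) = delta_m * P_R(m) + (1 - delta_m) * P_R(m + 1),
    with m = M + dM.
    """
    width = len(P_R)
    P_E = [0.0] * width
    for M in range(width):
        m = M + dM
        if 0 <= m and m + 1 < width:      # skip pixels falling outside the line
            P_E[M] = delta_m * P_R[m] + (1.0 - delta_m) * P_R[m + 1]
    return P_E
```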

To compute the NS interpolation coefficient Z for each idealized pixel (M and N), find n where

L(n) < N < L(n+1) and L(n) = n − (ΔF + ΔV)

If n/8 is not an integer (pixel locations n and n+1 in the same line):

Z = N − L(n)

If n/8 is an integer (pixel locations n and n+1 in different lines):

Z = (N − L(n)) / (L(n+1) − L(n))

The value calculated for Z is valid for 8 consecutive horizontal pixels. It is also valid for the 8 adjacent pixels at pixel locations N + 1, N + 2, etc., until the boundary between lines in the real data is encountered. Finally, calculate the ideal pixel intensity:

P_I(N, M) = Z·P_E(n, M) + (1 − Z)·P_E(n+1, M)
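
The NS step, as reconstructed above, might look like the following sketch. The naive search for n, the Python indexing convention, and the explicit division in the cross-line case (where consecutive L values need not differ by exactly one pixel) are illustrative assumptions:

```python
def ns_interpolate(N, M, P_E, L):
    """NS interpolation for one ideal pixel at (N, M).

    P_E[n][M] is the EW-aligned intensity of real NS pixel n at EW
    location M, and L[n] = n - (dF + dV) is its adjusted NS location
    (indices treated as 1-based, with index 0 unused, to match the text).
    """
    # Find n such that L[n] <= N < L[n + 1] (naive linear search for clarity).
    n = max(k for k in range(1, len(L) - 1) if L[k] <= N)

    if n % 8 != 0:
        # n and n+1 lie in the same scan line, so L[n + 1] - L[n] = 1.
        Z = N - L[n]
    else:
        # n and n+1 straddle a line boundary; the spacing may differ from 1.
        Z = (N - L[n]) / (L[n + 1] - L[n])

    # Weights follow the equation above: Z on pixel n, (1 - Z) on pixel n+1.
    return Z * P_E[n][M] + (1.0 - Z) * P_E[n + 1][M]
```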

The output consists of 1 line (8 × 10976) of intensities for idealized pixels.

In the illustrated embodiment, seven registers are used, and each register is filled in 0.8 seconds with one scan line of data. Thus, in any image, the first line of ideal pixels is delayed 5.6 seconds from the generation of the first line of real pixels.

Following this, a line of ideal pixels is generated for each line of real pixels until the real pixels are complete. One final line of ideal pixels is generated in the next 0.8 seconds to complete the ideal image.

In accordance with the foregoing, the present invention provides for the transformation of satellite image data into a scene as viewed from a reference perspective, which can be the satellite's nominal position in geosynchronous orbit. By rendering in this idealized perspective, successive images are automatically coregistered so that accurate comparisons, and thus more reliable forecasts, can be made.

The invention is applicable to pointing sensors as well as spinning sensors, and to three-axis stabilized satellites as well as spin-stabilized satellites.

These and other modifications and variations are within the scope of the present invention.