
CN111754392B - Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter - Google Patents

Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter

Info

Publication number
CN111754392B
CN111754392B (application CN202010480409.3A)
Authority
CN
China
Prior art keywords
image
model
simulation
imaging
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010480409.3A
Other languages
Chinese (zh)
Other versions
CN111754392A (en)
Inventor
刘世杰
童小华
金雁敏
贺钒
陈晨
刘大永
陈鹏
谢欢
冯永玖
王超
许雄
柳思聪
魏超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202010480409.3A priority Critical patent/CN111754392B/en
Publication of CN111754392A publication Critical patent/CN111754392A/en
Application granted granted Critical
Publication of CN111754392B publication Critical patent/CN111754392B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/06 Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • G06T3/067 Reshaping or unfolding 3D tree structures onto 2D planes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021 SAR image post-processing techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter, comprising the following steps: S1, import orbit dynamic parameters; S2, calculate camera orientation parameters and establish an imaging model; S3, import the three-dimensional model and the viewpoint under the given position, attitude and illumination parameters; S4, generate a dynamic simulated image; S5, generate multispectral and panchromatic images; S6, simulate various degradation effects: for the generated simulated image, the influence of geometric factors, optical-device degradation factors and transmission distortion factors is considered, an influence model is built for each, and the simulated image under these influences is computed in sequence by pixel-level traversal to produce the final simulated degraded image. Compared with the prior art, the method dynamically generates optical simulated images under multiple influence factors by accounting for degradation effects such as satellite geometric offset, optical-device degradation and transmission distortion, and offers dynamics, rigor, realism and portability.

Description

Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter
Technical Field
The invention relates to satellite imaging simulation technology, and in particular to a dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter.
Background
Remote sensing satellite imaging simulation technology was developed earlier abroad. As early as the mid-20th century, the Optical Sciences Center of the University of Arizona established the world's first physical simulation system for space remote sensors, using artificial light sources and targets arranged in a laboratory to simulate a satellite's on-orbit flight and image acquisition process. By the 1990s, with the rapid development of computer technology, software for simulating remote sensing imaging systems began to appear abroad. In 1995, NASA combined multiple general-purpose software packages to develop the PATCOD platform for spacecraft analysis and design. In 2000, the remote sensing imaging simulation software DIRSIG, developed by the Rochester Institute of Technology, was engineered to simulate the image degradation and related optical MTF effects caused by the scanning mode of a satellite imaging system. In 2001, Börner et al. at the German Aerospace Center developed SENSOR, an end-to-end hyperspectral remote sensing system simulation kit that has been used successfully in European Space Agency projects. In 2003, a French laboratory in Toulouse constructed a three-dimensional earth surface scene based on the DART radiative transfer model and successfully simulated remote sensing imaging results under different atmospheric conditions and sensor responses. At present, three-dimensional display platforms for the Earth, the Moon, Mars and other extraterrestrial bodies exist in map display software such as that built by Google; Google Moon, for example, displays a three-dimensional topographic map of the Moon. However, these platforms stop at three-dimensional viewing and navigation: they rarely simulate the imaging results of a remote sensing satellite and cannot output a simulated image according to the simulated satellite's position.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and to provide a dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter.
The aim of the invention can be achieved by the following technical scheme:
a dynamic imaging simulation method of a high-resolution optical camera of a Mars orbit device comprises the following steps:
s1, importing track dynamic parameters: simulating the running orbit of the satellite, and acquiring the magnitude of the currently set error parameter, and the position parameter, the attitude parameter and the illumination parameter of the satellite at each moment;
s2, calculating camera orientation parameters and establishing an imaging model: calculating the internal and external azimuth elements of the simulated image at the moment through the acquired orbit dynamic parameters, and deducing an imaging model from the ground simulated three-dimensional coordinates to the two-dimensional image plane coordinates;
s3, importing a three-dimensional model and a view position under the conditions of the position, the gesture and the illumination parameters;
s4, generating a dynamic simulation image: generating a pair of dynamic two-dimensional simulation images in a pixel point traversing mode according to the imaging model through the acquired local three-dimensional ground model;
s5, generating a multispectral full-color image based on the generated two-dimensional simulation image;
s6, simulating various degradation effects: for the generated simulated image, the influence of geometric factors, optical device degradation factors and transmission distortion factors on the image is considered, an influence model is respectively built, various influenced simulated images are sequentially calculated in a pixel level traversing mode, and a final simulated degradation image is generated.
Preferably, S3 includes:
obtaining the solar azimuth parameters of the sub-satellite point from the ephemeris and the orbit three-dimensional coordinates, judging day and night conditions within the field of view from these parameters, weakening the brightness values within the field of view for night-time environments during imaging, and thereby simulating the planet's day-night alternation.
Preferably, in step S4, the influence of the aperture parameters on the amount of incoming light is simulated in the final stage of simulated image generation.
Preferably, S4 includes:
setting several aperture parameter options and adjusting the brightness of the image derived from the imaging model by different proportional coefficients, so as to obtain simulated image brightness information that accounts for the amount of incoming light; converting the brightness information into integer data according to the simulated image's data type, and applying simulated overexposure to pixels exceeding the maximum display range of that data type, generating the final two-dimensional simulated image.
Preferably, step S5 includes: dividing the generated two-dimensional simulated image into three single-channel images containing the red, green and blue bands of the original image respectively, simulating the infrared band, and simulating the panchromatic sensor, thereby generating multispectral and panchromatic images.
Preferably, the influence model of the geometric factors includes: an attitude angle disturbance simulation model and a scanning line frequency desynchronization model.
Preferably, the influence model of the optical device degradation factor includes: a defocus effect model, a modulation transfer function simulation model and a sampling model.
Preferably, the influence model of the transmission distortion factors includes: a sensor noise model, a photoelectric conversion and quantization model, and a signal conversion and transmission noise model.
Preferably, the defocus effect model simulates the defocus effect by Gaussian convolution, and the sampling model adopts a uniform sampling model.
Preferably, the sensor noise model simulates, using Gaussian noise, the disturbances arising when the sensor processes the photoelectric signal; the photoelectric conversion and quantization model quantizes the grey levels of the simulated image, uniformly grades the discretized radiation intensity, stores the ground-object radiation intensity per band electronically as grey values of 0-255, and then reduces the grey values of each band in proportion to the spectral responsivity; the signal conversion and transmission noise model assumes that bit errors follow a Gaussian distribution and adds impulse noise to the binary data streams in the image simulation to simulate bit errors.
Compared with the prior art, the invention provides a dynamic imaging simulation method for the high-resolution optical camera of a Mars orbiter. By considering degradation effects such as satellite geometric offset, optical-device degradation and transmission distortion, it dynamically generates optical simulated images under multiple influence factors and has the following advantages:
1. Dynamics: the simulated image of the Mars orbiter's high-resolution optical camera is generated dynamically according to changing orbit, attitude and other data.
2. Rigor: a rigorous geometric imaging model is derived and imaging is simulated according to satellite position, attitude and other data, ensuring the rigor of the generated images.
3. Realism: the degradation factors of the dynamic satellite imaging process, such as geometric deviation, optical-device degradation and transmission distortion, are all modelled, achieving dynamic imaging simulation of the Mars orbiter's high-resolution optical camera under these degradation factors.
4. Portability: the methods used are general photogrammetry and computer graphics methods whose inputs and outputs are not strictly constrained; imaging simulation of other planets or asteroids can be achieved after simple program rewriting and data modification.
Drawings
FIG. 1 is the overall flow chart of the method of the invention;
FIG. 2 shows the rotational relationships between the coordinate systems involved in satellite sensor imaging;
FIG. 3 is an image without flutter;
FIG. 4 is an image after adding a disturbance;
FIG. 5 shows imaging results with successively increasing line frequency;
FIG. 6 is a resulting true-colour image of the multispectral and panchromatic simulation;
FIG. 7 shows imaging results with successively increasing defocus;
FIG. 8 compares an ideal image (left) with a degraded image (right) after adding the other degradation factors affecting image quality (solar altitude 90°).
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples. The embodiments are implemented on the premise of the technical scheme of the invention, and a detailed implementation and specific operation process are given, but the protection scope of the invention is not limited to the following examples.
In the dynamic generation of the virtual optical image, each pixel of the linear-array sensor is virtually imaged using multithreading to generate an image row; the ground is then continuously push-broom imaged at the camera's sampling frequency to produce a continuous two-dimensional image. During imaging, information such as the orbit parameters, attitude parameters, illumination conditions and optical parameters of the orbiter's high-resolution camera is used: scene information at different longitudes and latitudes on the planetary surface is mapped to different positions of the two-dimensional image matrix according to geometric-optical imaging principles, different terrain heights correspond to different defocus degrees, and scene information under different illumination conditions corresponds to the grey values of the two-dimensional image, achieving a full record of the three-dimensional ground scene information.
Two simulation models can be realized. One is the traditional photogrammetric approach, i.e. an imaging model based on the collinearity equation; its calculation process is rigorous, but because the ray-by-ray computation is heavy and no mature programming framework integrates it, simulated image generation takes a long time and the dynamic requirement is hard to meet. The other is a computer graphics simulation model that converts a three-dimensional model into a two-dimensional image; it benefits from mature programming framework libraries and high computational efficiency, but the three-dimensional model and the two-dimensional image are related only by a rotation matrix, two-dimensional points cannot be traced back to three-dimensional points, and rigor is sacrificed. The method combines the advantages of both: the pose and velocity, and their disturbance of the sensor, are simulated in the traditional photogrammetric way, and the result is converted into a two-dimensional image in the computer graphics way, achieving the dynamic performance required by the simulation while maintaining adequate accuracy.
In the rigorous sensor model, a ground point is rotated through a series of coordinate systems and then imaged on the focal plane of the sensor. The collinearity equation is the basic mathematical model of the rigorous imaging model and involves rotational relationships between multiple coordinate systems.
The three-dimensional points of the planetary surface are mapped to the two-dimensional image plane of the camera through straight-line light propagation and conversions between coordinate systems. The collinearity relation can be written as:

$$\begin{bmatrix} x \\ y \\ -f \end{bmatrix} = \lambda \, R_{fixed}^{sensor} \begin{bmatrix} X - X_S \\ Y - Y_S \\ Z - Z_S \end{bmatrix}$$

where $(x, y)$ are image-plane coordinates, $f$ is the focal length, $(X, Y, Z)$ is the ground point and $(X_S, Y_S, Z_S)$ the projection centre in the planetary fixed coordinate system, and $\lambda$ is a scale factor.

The imaging process of a satellite sensor can be described by a series of rotations $R$ of the spatial coordinate system; the general transformation is shown in Fig. 2. The rotation orthogonal matrix from the planetary fixed coordinate system to the sensor coordinate system can be expressed as the product of the rotation matrices between the intermediate coordinate systems:

$$R_{fixed}^{sensor} = R_{body}^{sensor} \cdot R_{orbit}^{body} \cdot R_{inertial}^{orbit} \cdot R_{fixed}^{inertial}$$

where $R_{fixed}^{sensor}$ is the rotation orthogonal matrix from the planetary fixed coordinate system to the sensor coordinate system, $R_{body}^{sensor}$ is from the satellite body coordinate system to the sensor coordinate system, $R_{orbit}^{body}$ is from the satellite orbit coordinate system to the satellite body coordinate system, $R_{inertial}^{orbit}$ is from the inertial coordinate system to the orbit coordinate system, and $R_{fixed}^{inertial}$ is from the planetary fixed coordinate system to the inertial coordinate system.
The relation between the satellite body coordinate system $XYZ_B$ and the satellite orbit coordinate system $XYZ_O$ is determined by the three attitude angles (roll $\varphi$, pitch $\theta$, yaw $\psi$), i.e. a product of three elementary rotations:

$$R_{orbit}^{body} = R_x(\varphi)\, R_y(\theta)\, R_z(\psi)$$

(the rotation order depends on the adopted convention). Assuming that the position of the satellite in the planetary inertial coordinate system at a certain moment is $\mathbf{p}$ and its velocity is $\mathbf{v}$, the rotation relationship between the satellite orbit coordinate system and the planetary fixed system can be described by the rotation matrix whose columns are the orbit-frame axes:

$$\mathbf{Z}_O = -\frac{\mathbf{p}}{\lVert\mathbf{p}\rVert}, \qquad \mathbf{Y}_O = \frac{\mathbf{Z}_O \times \mathbf{v}}{\lVert\mathbf{Z}_O \times \mathbf{v}\rVert}, \qquad \mathbf{X}_O = \mathbf{Y}_O \times \mathbf{Z}_O, \qquad R = \begin{bmatrix}\mathbf{X}_O & \mathbf{Y}_O & \mathbf{Z}_O\end{bmatrix}$$
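As an illustration, a minimal numpy sketch of this frame construction (with the sign conventions assumed above; the orbit state values are made up, not mission data) might read:

```python
import numpy as np

def orbit_to_fixed_rotation(p, v):
    """Rotation matrix of the local orbital frame built from satellite
    position p and velocity v. Convention assumed here: +Z points to
    nadir, +Y is normal to the orbit plane, +X completes the
    right-handed triad (roughly along-track)."""
    p, v = np.asarray(p, float), np.asarray(v, float)
    z = -p / np.linalg.norm(p)                   # nadir direction
    y = np.cross(z, v)
    y /= np.linalg.norm(y)                       # orbit-plane normal
    x = np.cross(y, z)                           # completes the frame
    return np.column_stack((x, y, z))            # columns are the axes

# Example: an illustrative low Mars orbit state (metres, metres/second)
R = orbit_to_fixed_rotation([3_700_000.0, 0.0, 0.0], [0.0, 3_400.0, 0.0])
print(np.round(R, 3))
```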
next, the three-dimensional surface is mapped to the image space plane through a series of coordinate transformations according to the mapping relationship of the planetary surface three-dimensional coordinates and the image Fang Erwei image coordinates. In the simulation process, the two-dimensional correspondence of the three-dimensional scene is the multiple transformation from the three-dimensional object system to the two-dimensional screen system. These transformations mainly include:
model transformation (Modeling transformation): the object local coordinate system is subjected to rigid transformation and is converted into the world system.
Visual transformation (Viewing transformation): the effect of the visual transformation is also a rigid transformation, the origin changing from the world origin to the point of view, called the eye coordinate system, which is also a local coordinate system. The matrix is the same as above.
Projective transformation (Projection transformation): the camera system is transformed into the normalized device coordinate system (camera system with coordinate scale (-1, 1)) by a non-affine complex transformation. This also involves homogeneous spatial transitions.
Viewport transformation (Viewport mapping): the normalized device coordinate system is converted to a screen two-dimensional coordinate system. For satellite imaging simulation, i.e. an image plane two-dimensional coordinate system.
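A compact sketch of this transform chain for a single point, with purely illustrative camera parameters (none of the numbers below come from the patent), could look like:

```python
import numpy as np

def project_point(X_world, R_view, t_view, f_mm, pix_um, width, height):
    """Sketch of the transform chain: world -> eye (rigid viewing
    transform) -> image plane (perspective projection) -> pixel
    row/column (viewport mapping)."""
    X_eye = R_view @ (np.asarray(X_world, float) - t_view)  # viewing transform
    x = f_mm * X_eye[0] / X_eye[2]     # perspective projection onto the
    y = f_mm * X_eye[1] / X_eye[2]     # focal plane, in millimetres
    pix_mm = pix_um * 1e-3
    col = width / 2 + x / pix_mm       # viewport transform: focal-plane
    row = height / 2 - y / pix_mm      # millimetres to pixel indices
    return row, col

row, col = project_point([100.0, 50.0, 400_000.0],
                         np.eye(3), np.zeros(3),
                         f_mm=4600.0, pix_um=10.0, width=4096, height=4096)
print(row, col)
```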
To ensure dynamic performance and improve computational efficiency, the computer graphics method is used to carry out the imaging simulation of the push-broom sensor's dynamic images.
The method models the camera's optical system and the related imaging chain, simulates the detector's imaging process, generates an image, and adds the degradation effects caused by errors including flutter, solar altitude, orbit parameters, attitude parameters and camera optical parameters.
As shown in fig. 1, the method comprises the following steps:
s1, importing track dynamic parameters:
and simulating the running orbit of the satellite, and acquiring the magnitude of the currently set error parameter, and the position parameter, the attitude parameter and the illumination parameter of the satellite at each moment.
S2, calculating camera orientation parameters and establishing an imaging model:
and calculating the internal and external azimuth elements of the simulated image at the moment through the acquired orbit dynamic parameters, and deducing an imaging model from the ground simulated three-dimensional coordinates to the two-dimensional image plane coordinates according to a collineation equation and a transformation relation among the coordinate systems.
S3, importing the three-dimensional model and the viewpoint under the given position, attitude and illumination parameters:
The solar azimuth parameters of the sub-satellite point are obtained from the ephemeris and the orbit three-dimensional coordinates; day and night conditions within the field of view are judged from these parameters; brightness values within the field of view are weakened for night-time environments during imaging, simulating the planet's day-night alternation.
S4, generating a dynamic simulation image:
and generating a pair of dynamic two-dimensional simulation images in a pixel traversing mode according to the imaging model through the acquired local three-dimensional ground model.
In the final stage of the simulated image generation, the influence of the aperture parameters on the light input amount needs to be simulated. In an image directly calculated using an imaging model, pixel brightness information depends only on the initial orthophoto layer brightness and the illumination in the field of view, while different amounts of incoming light are linear with respect to the brightness of the final image. Therefore, a plurality of aperture parameter options are set, and the brightness of the image derived by the imaging model is adjusted according to different proportionality coefficients, so that the analog image brightness information considering the influence of the light incoming quantity can be obtained. And finally, converting the brightness information into integer data according to the data type of the analog image, and performing the process of analog overexposure on pixels larger than the maximum display range of the data type to generate and display the final analog image.
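The sketch assumes an 8-bit simulated image; the gain factor is an illustrative value, not a calibrated aperture setting:

```python
import numpy as np

def apply_aperture(radiance, aperture_gain=1.5):
    """Scale image brightness by an aperture-dependent gain, then
    quantize to 8-bit; values beyond the data-type range saturate,
    which mimics overexposure."""
    scaled = radiance.astype(np.float64) * aperture_gain
    return np.clip(scaled, 0, 255).astype(np.uint8)  # saturation = overexposure

img = np.linspace(0, 255, 16).reshape(4, 4)
print(apply_aperture(img, aperture_gain=2.0))  # the upper half saturates at 255
```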
S5, generating multispectral and panchromatic images:
The main difference between the imaging spectrometers carried on different satellites lies in the spectral sensing modes of their CCDs; sensors carried on deep-space satellites currently use mainly panchromatic CCDs and multispectral CCDs.
The generated two-dimensional ideal image is divided into three single-channel images containing the red, green and blue bands of the original image respectively; the infrared band is then simulated by an empirical formula relating it to these bands, and the panchromatic sensor is simulated.
In that formula, R, G and B denote the red, green and blue bands respectively, and PAN and NI denote the simulated panchromatic and near-infrared bands.
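The empirical formula itself is not reproduced in the text above; the sketch below therefore uses placeholder coefficients (an equal-weight panchromatic average and an assumed linear NIR regression) purely to illustrate the band-synthesis step, not the patent's actual formula:

```python
import numpy as np

def synthesize_bands(rgb):
    """Split an RGB simulated image into single-channel bands and derive
    panchromatic (PAN) and near-infrared (NI) bands. The coefficients
    are placeholders, NOT the patent's empirical formula."""
    R = rgb[..., 0].astype(float)
    G = rgb[..., 1].astype(float)
    B = rgb[..., 2].astype(float)
    PAN = (R + G + B) / 3.0                # assumed: equal-weight average
    NI = 1.2 * R - 0.1 * G - 0.1 * B       # assumed: linear regression on RGB
    return R, G, B, PAN, np.clip(NI, 0, 255)

rgb = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
R, G, B, PAN, NI = synthesize_bands(rgb)
```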
S6, simulating various degradation effects:
For the generated simulated image, the influence of geometric factors, optical-device degradation factors and transmission distortion factors is considered; an influence model is built for each, the simulated images under the various influences are computed in sequence by pixel-level traversal, and the final simulated degraded image is generated.
The simulated three-dimensional planetary surface scene is projected onto the detector's imaging focal plane through a series of imaging-chain links. The ground-feature information originally consists of continuous radiation information and geometric information, and becomes a series of discrete binary information after the energy conversion from photons to electronic pulses. This process is subject to the combined effects of optics, electronics, information transmission and geometric imaging. These factors are grouped by nature into a geometric influence model, an optical-device influence model and a transmission-distortion influence model, each of which is modelled in the simulation.
Geometric influence: the designed satellite orbit is an ellipse, but deviations arise from external factors such as gravitational perturbations and thermal effects, and from mechanical structure errors. Factors such as attitude-angle disturbance and scan-line desynchronization are therefore simulated.
Optical-device influence: when a CCD sensor operates, the whole chain converting photons into current is affected by various physical factors, such as inconsistent responsivity across spectral bands and internal noise from amplifiers. Meanwhile, camera defocus and similar effects blur the generated image through the corresponding loss of information.
Transmission-distortion influence: during the transmission of binary electronic pulse information, various electronic and informatic factors cause random information loss in the image, i.e. noise.
For the preliminarily generated simulated images, the influence of these three classes of factors is considered; influence models are built for each, and the simulated images under the various influences are computed in sequence by pixel-level traversal, generating the final simulated degraded image.
A degradation effect is the reduction of image quality, relative to an ideal image, caused by objective constraints. Grouped by nature into transmission-link degradation, optical-element degradation and geometric-factor degradation, the factors above are simulated separately as follows.
The geometrical factor degradation effect simulation comprises the following steps:
(1) Attitude angle disturbance simulation
Satellite flutter is assumed to vary sinusoidally about the three attitude-angle directions; that is, at time $t$, the yaw angle $\psi$, pitch angle $\theta$ and roll angle $\Phi$ of the satellite satisfy:

$$\psi = A_\psi \sin(\omega_\psi t + \alpha)$$
$$\theta = A_\theta \sin(\omega_\theta t + \beta)$$
$$\Phi = A_\Phi \sin(\omega_\Phi t + \gamma)$$

The resulting image shifts in the three directions are:

$$\delta_{yaw} = \Delta s \cdot \psi, \qquad \delta_{pitch} = f' \cdot \theta, \qquad \delta_{roll} = f' \cdot \Phi$$

where $\Delta s$ is the distance from the image point to the principal point and $f'$ is the equivalent focal length.
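A small sketch of this disturbance model; the amplitudes, frequencies and phases are illustrative, not values from the patent:

```python
import numpy as np

def jitter_image_shift(t, f_eq=4.6, ds=0.01,
                       A=(1e-5, 1e-5, 1e-5), w=(2.0, 3.0, 5.0),
                       phase=(0.0, 0.5, 1.0)):
    """Return (delta_yaw, delta_pitch, delta_roll) image shifts in
    metres on the focal plane at time t, following the sinusoidal
    attitude-disturbance formulas above. f_eq: equivalent focal
    length [m]; ds: image point to principal point distance [m]."""
    psi   = A[0] * np.sin(w[0] * t + phase[0])   # yaw angle [rad]
    theta = A[1] * np.sin(w[1] * t + phase[1])   # pitch angle [rad]
    phi   = A[2] * np.sin(w[2] * t + phase[2])   # roll angle [rad]
    return ds * psi, f_eq * theta, f_eq * phi

for t in np.linspace(0.0, 1.0, 5):
    print(jitter_image_shift(t))
```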
(2) Desynchronization of scan line frequency
The scan line frequency of a push-broom satellite depends on the satellite's speed relative to the planetary surface. When the line frequency does not match the sub-satellite ground speed, scan-line desynchronization appears, shown in the image as stretching (line frequency too high) or compression (line frequency too low) of the ground scene. Specifically, the line frequency should satisfy:

$$HSF = \frac{SUB \cdot f}{D_\mu \cdot height}$$

where $HSF$ is the line frequency of push-broom satellite imaging, $D_\mu$ is the size of a single pixel of the imaging sensor, $f$ is the equivalent focal length, $SUB$ is the linear velocity of the sub-satellite point, and $height$ is the distance from the sub-satellite point to the satellite. In the simulation, the latter two parameters can be computed to good approximation from the solved satellite orbit position and the planetary parameters. The corresponding frame-type imaging field angle for the ground distance swept by the push-broom satellite per second can be derived in the same way.
A satellite orbit is usually elliptical, with a closest and a farthest point, and the corresponding sub-satellite points are often near the poles, with ground resolution differences of several orders of magnitude. In a whole-orbit simulation it is therefore usually necessary to monitor the orbit height and the sub-satellite point speed and to change the line frequency dynamically to avoid scan-line desynchronization.
The optical element degradation effect simulation includes:
(3) Defocus effect model
When light passing through the imaging centre is focused not on the image plane but before or behind it, the final image exhibits a blur known as the defocus effect. In a passive imaging model, the dense bundle of rays reflected from the object surface can be decomposed, in two dimensions, into a superposition of the light-signal functions of many point sources; the effect can therefore be simulated by a two-dimensional convolution.
Owing to the randomness of the light distribution, the defocus effect is modelled mainly by Gaussian convolution:

$$I = L \otimes k + N$$

where $I$ and $L$ denote the defocus-blurred image and the sharp image respectively, $\otimes$ is the convolution operator, $N$ is random noise and $k$ is the blur kernel. In defocus blur, the kernel is often taken to be Gaussian:

$$k(x, y) = \frac{1}{2\pi\sigma^2} \exp\!\left(-\frac{x^2 + y^2}{2\sigma^2}\right)$$

where $(x, y)$ is the position of an image point and the standard deviation $\sigma$ measures the degree of blur: the smaller $\sigma$, the closer the image is to the ideal image. In real photography $\sigma$ is proportional to the blur-spot radius $R$, i.e. $\sigma = k_0 \cdot R$, where $k_0$ is a constant that can be calibrated in advance for defocus.
In the simulation, the blur kernel is taken as a disc model or a Gaussian model, and the disc radius or Gaussian standard deviation corresponding to each pixel is estimated to obtain the kernel. Setting different $\sigma$ values yields images with different degrees of focus, as in the sketch below.
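A minimal sketch of Gaussian-convolution defocus, assuming a grey-level image and using scipy's Gaussian filter as the convolution:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def defocus(image, sigma, noise_std=0.0):
    """I = L (x) k + N with a Gaussian blur kernel of standard
    deviation sigma; larger sigma = stronger defocus. Optional
    Gaussian noise N can be added."""
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    if noise_std > 0:
        blurred += np.random.normal(0.0, noise_std, image.shape)
    return np.clip(blurred, 0, 255).astype(np.uint8)

sharp = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
for s in (0.5, 1.0, 2.0):          # successively stronger defocus
    blurred = defocus(sharp, sigma=s)
```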
(4) MTF simulation
The modulation transfer function (MTF) is a general degradation indicator of a system. It measures the amplitude response of the system to sinusoidal inputs, is mainly used to analyse specific imaging systems, and determines the resolution of the imaging system for target detail. It is not tied to one particular physical process or transmission module in the chain; rather, it evaluates the effect of the whole medium on the final contrast of the image. Since the remote sensing system can be treated as a spatially invariant linear system, it can be expressed as:

$$G(u, v) = MTF(u, v) \cdot F(u, v)$$

i.e. the spectrum of the final image equals the product of the MTF and the spectrum of the input image. In the simulation we directly simulate the overall MTF degradation of the system.
For an optical system, the Fourier transform of its point spread function (PSF) is called the optical transfer function (OTF), and the MTF is the modulus of the OTF; it reflects the system's ability to transfer frequency-domain information. Describing the light-dark contrast of an image, the modulation degree can be expressed as:

$$M = \frac{I_{max} - I_{min}}{I_{max} + I_{min}}$$

where $M$ is the modulation degree at the given frequency, $I_{max}$ is the highest light intensity and $I_{min}$ the lowest.
The modulation transfer function is the curve obtained by traversing the modulation degrees at all frequency points in the frequency range.
During the simulation we transform the image into Fourier space and then simulate the MTF degradation with Gaussian low-pass filtering.
The Fourier transform of a two-dimensional image is:

$$F(u, v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x, y)\, e^{-j 2\pi (ux/M + vy/N)}$$

where $f(x, y)$ is an image of size $M \times N$ and $F(u, v)$ is the image transformed into Fourier space.
After Gaussian filtering, an inverse Fourier transform yields the image with the simulated MTF degradation effect.
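A sketch of this MTF simulation, assuming the overall system MTF is a Gaussian low-pass in normalized frequency (the cutoff value is illustrative):

```python
import numpy as np

def apply_mtf(image, cutoff=0.25):
    """Simulate overall MTF degradation: transform to Fourier space,
    multiply by a Gaussian low-pass (the assumed system MTF),
    transform back. cutoff is the Gaussian standard deviation in
    cycles/pixel; a lower cutoff means stronger blur."""
    F = np.fft.fftshift(np.fft.fft2(image.astype(float)))
    rows, cols = image.shape
    u = np.fft.fftshift(np.fft.fftfreq(rows))[:, None]   # cycles/pixel
    v = np.fft.fftshift(np.fft.fftfreq(cols))[None, :]
    mtf = np.exp(-(u**2 + v**2) / (2 * cutoff**2))       # Gaussian low-pass
    degraded = np.fft.ifft2(np.fft.ifftshift(F * mtf)).real
    return np.clip(degraded, 0, 255).astype(np.uint8)

img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
out = apply_mtf(img, cutoff=0.15)
```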
(5) Sampling model
Sampling is the process of converting a signal (a continuous function in time or space) into a sequence of values (a discrete function in time or space). According to the sampling theorem, the original signal can be recovered if its highest frequency is less than half the sampling frequency. Because the satellite remote sensing simulation uses a three-dimensional digital model of the ground, the most basic uniform sampling model can be used as the sampling model in the image simulation: the image plane of $f(x, y)$ is divided evenly into $M \times N$ minimal cells, and the signal at the centre of each cell is taken as the sampling result.
To ensure that the sampled image resolution can be traced back to the three-dimensional digital model, the ground pixel resolution of the imaged result must be larger (coarser) than that of the DEM and mosaic images used to build the digital planetary and local high-resolution three-dimensional models. The sensor ground resolution can be calculated as:

$$\sigma = \frac{d_0 \cdot H}{f}$$

where $\sigma$ is the ground resolution, $d_0$ is the pixel diameter, $f$ is the focal length and $H$ is the orbit altitude.
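A one-line sketch of this ground-resolution formula with illustrative camera and orbit values:

```python
def ground_resolution(d0_um, f_m, H_km):
    """sigma = d0 * H / f: ground size of one detector pixel.
    Inputs: pixel diameter [um], focal length [m], altitude [km]."""
    return (d0_um * 1e-6) * (H_km * 1e3) / f_m   # metres

print(ground_resolution(10.0, 4.6, 400.0))   # ~0.87 m per pixel
```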
The transmission link degradation effect simulation includes:
(6) Sensor noise model
While processing the photoelectric signal, the sensor may be affected by broadband Gaussian noise from many natural sources, such as thermal vibration of atoms in conductors (thermal or Johnson-Nyquist noise), shot noise, fluctuation noise and cosmic noise from celestial bodies such as the Sun, as well as by random errors introduced by the sensor's own amplifiers and demodulators. These disturbances to the satellite image simulation fit a Gaussian distribution in probability, so Gaussian noise is used to simulate them. Taking $z$ as the noise value, the noise over the simulated image follows:

$$p(z) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(z - \mu)^2}{2\sigma^2}\right)$$

Setting independent, distinct $\mu$ values for different bands simulates the sensor noise effect in a multispectral image, as sketched below.
(7) Photoelectric conversion and quantization model
A real ground object has irradiance determined by its physical characteristics, and the radiation intensity of the photons entering the sensor by diffuse reflection is a continuous function of the material. Since the simulation conditions do not include texture parameters, it is treated as a constant. After photons enter each cell of the CCD sensor, they activate electrons in the semiconductor, converting energy into a current. The responsivity of the device to different light frequencies can be expressed as:

$$S(\lambda) = \frac{dY}{dX}$$

where $S(\lambda)$ is the sensitivity at wavelength $\lambda$, $dX$ is the photon quantity in one integration, and $dY$ is the output value of one integration, which depends on the light intensity (irradiance) and the size of the detection region.
The current intensity after photoelectric conversion is proportional to the photon quantity detected by the CCD cell at the corresponding wavelength; the value is measured by an internal circuit and finally reflected in the grey level of the generated image.
In the simulation, the grey levels of the simulated image are quantized to 256 levels (8-bit grading): the discretized radiation intensity is uniformly graded, the ground-object radiation intensity is stored per band electronically as grey values of 0-255, and the grey values of each band are then reduced in proportion to the spectral responsivity.
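A sketch of this quantization and responsivity scaling; the responsivity values are illustrative:

```python
import numpy as np

def quantize_with_responsivity(radiance, responsivity):
    """Discretize continuous band radiance to 8-bit grey levels
    (0-255) by uniform grading, then scale each band by its
    spectral responsivity in [0, 1]."""
    norm = radiance / radiance.max()                  # normalize radiance
    grey = np.round(norm * 255.0)                     # uniform 256-level quantization
    grey *= np.asarray(responsivity)[None, None, :]   # per-band responsivity scaling
    return grey.astype(np.uint8)

radiance = np.random.rand(64, 64, 3) * 1000.0         # continuous band radiance
img = quantize_with_responsivity(radiance, responsivity=[0.95, 0.90, 0.80])
```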
(8) Signal conversion transmission noise model
The surface-feature geometric and radiation information acquired by the sensor is encoded into binary values inside the sensor and transmitted and stored as electronic pulses, so error rates in signal conversion and transmission are unavoidable. The symbol error rate (SER) is defined as the ratio of erroneous bits to total bits and reflects the signal loss of the final image data. We assume that bit errors follow a Gaussian distribution and, for the binary data streams in the image simulation, add impulse noise to simulate them. The impulse noise model is:

$$p(z) = \begin{cases} p_a, & z = a \\ p_b, & z = b \\ 0, & \text{otherwise} \end{cases}$$

where $z$ is the noise, $a$ and $b$ are the erroneous grey values, usually 0 and 255 (i.e. salt-and-pepper noise), and $p_a$, $p_b$ follow a Gaussian distribution.
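A sketch of the impulse-noise (salt-and-pepper) simulation with illustrative fixed error probabilities; the patent draws $p_a$, $p_b$ from a Gaussian distribution, while fixed values are used here for brevity:

```python
import numpy as np

def add_bit_error_noise(image, p_a=0.002, p_b=0.002, a=0, b=255):
    """Simulate transmission bit errors as salt-and-pepper impulse
    noise: each pixel becomes a (pepper) with probability p_a and
    b (salt) with probability p_b."""
    out = image.copy()
    r = np.random.rand(*image.shape)
    out[r < p_a] = a                        # pepper: erroneous low value
    out[(r >= p_a) & (r < p_a + p_b)] = b   # salt: erroneous high value
    return out

img = np.full((64, 64), 128, dtype=np.uint8)
noisy = add_bit_error_noise(img)
```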
Examples
The simulation method dynamically simulates push-broom imaging and generates a simulated image from dynamically supplied parameters such as orbit and attitude. The result is mainly affected by several factors: geometric factors such as satellite position and attitude; sensor factors such as the sensor's gating level and spectral response; and random noise effects in the various imaging and transmission processes.
1) Satellite flutter effects
Satellite flutter means that, under factors such as the attitude-control momentum wheels, attitude-control thrusters and external disturbances, the position of a ground target on the imaging device changes. When a spacecraft passes from the shadow region into sunlight, the solar radiation heat flux it receives increases suddenly, easily causing unwanted vibration of the spacecraft, the thermal flutter effect. These flutter effects disturb the satellite's attitude angles and, through their effect on the imaging sensor, produce image shifts. In modelling flutter, these disturbances are usually simulated as simple harmonic vibrations.
To test the simulation of satellite flutter, the roll angle $\Phi$ of the satellite is modelled as a simple harmonic disturbance of the form $\Phi = A_\Phi \sin(\omega_\Phi t + \gamma)$ used above.
The imaging results are shown in Figs. 3 and 4: Fig. 3 shows imaging without disturbance and Fig. 4 imaging after the disturbance. It can be seen that the simple harmonic disturbance of the satellite's roll angle produces left-right image shifts.
2) Line frequency influence
The three simulated imaging results of Fig. 5 show the two cases of scan-line desynchronization. The HSFs of the three panels are 2000, 1000 and 500 respectively; the correct HSF is 1000, the imaging result of the middle panel.
3) Type of imaging spectrometer
Fig. 6 shows a simulation-generated image for a three-line-array CCD spliced from left, middle and right segments together with a panchromatic imager; the true-colour map of the first three bands is shown.
4) Focusing parameters
During satellite operation, the focusing distance sometimes has to be adjusted within a certain range, so short periods of image defocus can occur.
FIG. 7 shows the imaging differences for different defocus cases in images generated by the simulation software: image A is generated without defocus at a focal length of 4.6 m; image B with defocus of ±0.05 m; image C with defocus of ±0.1 m; and image D with defocus of ±0.2 m. The degree of blur can be seen to increase accordingly.
5) Other degradation factors affecting image quality
Other degradation factors that affect image quality include solar altitude, MTF, spectral response, sensor noise, and communication noise.
First, the solar altitude changes with the satellite's operating time, causing an overall change in the light intensity at the sensor. Second, diffraction, aberrations and similar effects in the actual imaging process reduce the image contrast below the ideal value; simulating these imaging-system media is the MTF simulation. In addition, when the sensor absorbs light of different spectral frequencies and converts it into energy, the device responds differently to different bands. There are also random errors, i.e. sensor noise, caused by cosmic noise from celestial bodies such as the Sun or by the sensor's own amplifiers and demodulators. Finally, the surface-feature geometric and radiation information acquired by the sensor is encoded into binary values and transmitted and stored as electronic pulses, facing error rates in signal conversion and transmission, i.e. communication noise. Fig. 8 shows the degraded image after adding these influencing factors.

Claims (10)

1. A dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter, characterized by comprising the following steps:
S1, importing orbit dynamic parameters: simulating the satellite's operational orbit, and acquiring the currently set error parameter magnitudes and the satellite's position, attitude and illumination parameters at each moment;
S2, calculating camera orientation parameters and establishing an imaging model: computing the interior and exterior orientation elements of the simulated image at the given moment from the acquired orbit dynamic parameters, and deriving an imaging model from simulated ground three-dimensional coordinates to two-dimensional image-plane coordinates;
S3, importing the three-dimensional model and the viewpoint under the given position, attitude and illumination parameters;
S4, generating a dynamic simulated image: from the acquired local three-dimensional ground model, generating a dynamic two-dimensional simulated image by pixel-point traversal according to the imaging model;
S5, generating multispectral and panchromatic images based on the generated two-dimensional simulated image;
S6, simulating various degradation effects: for the generated simulated image, considering the influence of geometric factors, optical-device degradation factors and transmission distortion factors, building an influence model for each, computing the simulated images under the various influences in sequence by pixel-level traversal, and generating the final simulated degraded image.
2. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein S3 comprises:
obtaining the solar azimuth parameters of the sub-satellite point from the ephemeris and the orbit three-dimensional coordinates, judging day and night conditions within the field of view from these parameters, weakening the brightness values within the field of view for night-time environments during imaging, and thereby simulating the planet's day-night alternation.
3. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein in step S4 the influence of the aperture parameters on the amount of incoming light is simulated in the final stage of simulated image generation.
4. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein S4 comprises:
setting several aperture parameter options and adjusting the brightness of the image derived from the imaging model by different proportional coefficients to obtain simulated image brightness information accounting for the amount of incoming light; converting the brightness information into integer data according to the simulated image's data type, and applying simulated overexposure to pixels exceeding the maximum display range of that data type, generating the final two-dimensional simulated image.
5. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein S5 comprises: dividing the generated two-dimensional simulated image into three single-channel images containing the red, green and blue bands of the original image respectively, simulating the infrared band, and simulating the panchromatic sensor, thereby generating multispectral and panchromatic images.
6. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein the influence model of the geometric factors comprises: an attitude angle disturbance simulation model and a scan-line frequency desynchronization model.
7. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein the influence model of the optical-device degradation factors comprises: a defocus effect model, a modulation transfer function simulation model and a sampling model.
8. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 1, wherein the influence model of the transmission distortion factors comprises: a sensor noise model, a photoelectric conversion and quantization model, and a signal conversion and transmission noise model.
9. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 7, wherein the defocus effect model simulates the defocus effect by Gaussian convolution, and the sampling model adopts a uniform sampling model.
10. The dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter according to claim 8, wherein the sensor noise model is configured to simulate, using Gaussian noise, the disturbances arising when the sensor processes the photoelectric signal;
the photoelectric conversion and quantization model is configured to quantize the grey levels of the simulated image, uniformly grade the discretized radiation intensity, store the ground-object radiation intensity per band electronically as grey values of 0-255, and then reduce the grey values of each band in proportion to the spectral responsivity;
the signal conversion and transmission noise model is configured to assume that bit errors follow a Gaussian distribution and to add impulse noise to the binary data streams in the image simulation to simulate bit errors.
CN202010480409.3A 2020-05-30 2020-05-30 Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter Active CN111754392B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010480409.3A CN111754392B (en) 2020-05-30 2020-05-30 Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010480409.3A CN111754392B (en) 2020-05-30 2020-05-30 Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter

Publications (2)

Publication Number Publication Date
CN111754392A CN111754392A (en) 2020-10-09
CN111754392B true CN111754392B (en) 2023-08-04

Family

ID=72674461

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010480409.3A Active CN111754392B (en) 2020-05-30 2020-05-30 Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter

Country Status (1)

Country Link
CN (1) CN111754392B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112395741B (en) * 2020-10-27 2023-08-29 北京空间机电研究所 Space-time spectrum integrated optical remote sensing imaging object image mapping method
CN114152267A (en) * 2021-02-26 2022-03-08 武汉大学 Mars orbit camera image simulation method and system
CN113391309B (en) * 2021-06-15 2022-09-09 电子科技大学 Radial downward-looking imaging method for Mars detector radar
CN114742779A (en) * 2022-04-01 2022-07-12 中国科学院光电技术研究所 High-resolution self-adaptive optical image quality evaluation method based on deep learning

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345737A (en) * 2013-06-04 2013-10-09 北京航空航天大学 UAV high resolution image geometric correction method based on error compensation
CN103913148A (en) * 2014-03-26 2014-07-09 中国科学院长春光学精密机械与物理研究所 Full-link numerical simulation method of aerospace TDICCD (Time Delay and Integration Charge Coupled Device) camera

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019062166A1 (en) * 2017-09-30 2019-04-04 中国科学院遥感与数字地球研究所 Method for automatic geometric correction of cross-platform moon-based earth observation imaging

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345737A (en) * 2013-06-04 2013-10-09 北京航空航天大学 UAV high resolution image geometric correction method based on error compensation
CN103913148A (en) * 2014-03-26 2014-07-09 中国科学院长春光学精密机械与物理研究所 Full-link numerical simulation method of aerospace TDICCD (Time Delay and Integration Charge Coupled Device) camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
The High Resolution Stereo Camera (HRSC) of Mars Express and its approach to science analysis and mapping for Mars and its satellites; K. Gwinner et al.; Planetary and Space Science; 2016-07-31; Vol. 126; pp. 93-138 *

Also Published As

Publication number Publication date
CN111754392A (en) 2020-10-09

Similar Documents

Publication Publication Date Title
CN111754392B (en) Dynamic imaging simulation method for a high-resolution optical camera of a Mars orbiter
CN103913148B (en) Space flight TDI CCD camera full link numerical value emulation method
Windhorst et al. The Hubble Space Telescope wide field camera 3 early release science data: panchromatic faint object counts for 0.2–2 μm wavelength
Suzuki et al. Initial inflight calibration for Hayabusa2 optical navigation camera (ONC) for science observations of asteroid Ryugu
Cao et al. The High-Resolution IRAS Galaxy Atlas
CN110849353B (en) Embedded space target astronomical positioning method
CN104977024B (en) A kind of day blind in-orbit modification method of ultraviolet remote sensing camera Absolute Radiometric Calibration Coefficients
Cota et al. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
CN106845024A (en) A kind of in-orbit imaging simulation method of optical satellite based on wavefront inverting
WO2017218098A1 (en) Generating high resolution images
Hueso et al. The planetary laboratory for image analysis (PLIA)
CN104361563A (en) GPS-based (global positioning system based) geometric precision correction method of hyperspectral remote sensing images
CN105547286B (en) A kind of compound three visual fields star sensor star map simulation method
Abolghasemi et al. Design and performance evaluation of the imaging payload for a remote sensing satellite
Jorda et al. OASIS: a simulator to prepare and interpret remote imaging of solar system bodies
CN106126839B (en) A kind of three-linear array stereo mapping satellite imagery emulation mode and system
Laherrere et al. POLDER on-ground stray light analysis, calibration, and correction
LENZ et al. Simulation of Earth Observation Data Utilizing a Virtual Satellite Camera
Iovenitti " Star coverage": a simple tool to schedule an observation when FOV rotation matters
CN112395741B (en) Space-time spectrum integrated optical remote sensing imaging object image mapping method
CN113589318A (en) Satellite-borne infrared staring camera entrance pupil radiation image simulation method
Schläpfer et al. Parametric geocoding of AVIRIS data using a ground control point derived flightpath
Maier et al. Flatfield Calibrations with Astrophysical Sources for the Nancy Grace Roman Space Telescope's Coronagraph Instrument
Li et al. Imaging Chain Modelling and a Scaling Experiment for Optical Remote Sensing of Lunar Surface
Griffith et al. High-Fidelity Electro-Optical Space Domain Awareness Scene Simulator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant