Article

Design of Exterior Orientation Parameters Variation Real-Time Monitoring System in Remote Sensing Cameras

Hongxin Liu, Chunyu Liu, Peng Xie and Shuai Liu
1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2 University of Chinese Academy of Sciences, Beijing 101408, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(21), 3936; https://doi.org/10.3390/rs16213936
Submission received: 23 September 2024 / Revised: 17 October 2024 / Accepted: 18 October 2024 / Published: 23 October 2024

Abstract

The positional accuracy of satellite imagery is essential for remote sensing cameras. However, vibrations and temperature changes during launch and operation can alter the exterior orientation parameters of remote sensing cameras, significantly reducing image positional accuracy. To address this issue, this article proposes an exterior orientation parameter variation real-time monitoring system (EOPV-RTMS). This system employs lasers to establish a full-link active optical monitoring path, which is free from time and space constraints. By simultaneously receiving star and laser signals with the star tracker, the system monitors changes in the exterior orientation parameters of the remote sensing camera in real time. Based on the in-orbit calibration geometric model, a new theoretical model and process for the calibration of exterior orientation parameters are proposed, and the accuracy and effectiveness of the system design are verified by ground experiments. The results indicate that, under the condition of a centroid extraction error of 0.1 pixel for the star tracker, the EOPV-RTMS achieves a measurement accuracy of up to 0.6″ (3σ) for a single image. Displacement variation experiments validate that the measurement error of the system deviates by at most 0.05″ from the theoretical calculation results. The proposed EOPV-RTMS provides a new design solution for improving in-orbit calibration technology and image positional accuracy.

1. Introduction

Remote sensing cameras are the core equipment of optical observation satellites, and with the advancement of technology, their clarity and resolution capabilities have continuously improved [1,2,3]. From the 1 m resolution commercial Earth observation satellite IKONOS to DigitalGlobe's 0.31 m high-resolution third-generation commercial Earth observation satellite WorldView-4, the performance of remote sensing cameras has been consistently enhanced [4,5]. However, beyond capturing high-resolution images, improving the positional accuracy of satellite imagery has become a critical issue for maximizing the performance and application potential of remote sensing cameras. The factors affecting image-positioning accuracy primarily include changes in the geometric parameters of the remote sensing camera (which can be further divided into changes in exterior orientation parameters and changes in interior orientation parameters), in-orbit calibration methods, and image matching [6,7,8]. As reported in the literature, the geometric parameters of the remote sensing camera can undergo significant changes due to environmental variations during launch and in-orbit operation. In particular, the stability of the exterior orientation parameters (the transformation matrix between the remote sensing camera and the star tracker) has a profound impact on high-precision image positioning, which is critical for high-accuracy remote sensing missions [9,10,11]. Currently, researchers primarily ensure the accuracy of exterior orientation parameters through rigorous ground calibration, the use of ultra-low-expansion materials, and high-precision thermal control methods [12,13,14].
Traditional in-orbit calibration methods for remote sensing cameras, such as IKONOS, Pleiades, and SPOT, involve processing images of ground control points captured by the camera and star images acquired by the star tracker to calibrate the exterior orientation parameters of the camera. The calibrated exterior orientation parameters are then used for image georeferencing until the next calibration [15,16,17,18]. GeoEye-1 performs on-orbit calibration using ground calibration fields in several countries, including the United States, Australia, and Japan. To further enhance the accuracy of satellite orbit and attitude measurements, the satellite is equipped with dual-lens star trackers and a sun tracker. By using autonomous calibration and block adjustment, GeoEye-1 calibrates the satellite's three-axis attitude and stability, improving image-positioning accuracy [19]. The ZY-3 satellite matches the images to be calibrated with high-precision reference data from ground-calibration fields (DOM, DEM) to extract abundant ground control points and, combined with satellite attitude data, calibrates the camera's exterior orientation parameters; its image-positioning accuracy is about 15 m (CE90) [20,21]. However, calibration methods that rely entirely on ground-calibration fields can only reflect the camera's state at the time of calibration and cannot account for changes in exterior orientation parameters when ground control points are absent.
To this end, researchers have analyzed the trend of changes in the exterior orientation parameters of remote sensing cameras using mathematical fitting methods. By statistically analyzing these parameters, they have established predictive mathematical models for parameter variation trends. For example, the long-term variation of exterior orientation parameters for the ALOS satellite is predicted using a linear model, while short-term variations are estimated using a second-order Fourier series [22]. However, due to the complexity of actual in-orbit operations, it is difficult to achieve high-precision predictions of changes in exterior orientation parameters using mathematical fitting methods. Additionally, for the Tianhui-1 series satellites, Wang Renxiang and colleagues developed the LMCCD camera and the EFP bundle adjustment method, establishing a mathematical model for photogrammetric calculations based on the uncontrolled positioning of a three-line array CCD camera [23]. This improved the image-positioning accuracy of the Tianhui-1-03 satellite. However, the mathematical model for the three-line array camera is highly complex and lacks general applicability.
The research methods based on mathematical fitting all have certain shortcomings, making it challenging to improve the calibration accuracy of camera exterior orientation parameters and to monitor their changes in real time. As a result, recent literature has proposed several new calibration methods for remote sensing cameras. For example, the ATLAS system aboard NASA's ICESat-2 establishes a correlation between the reference beam, the star tracker, and the laser altimeter system [24]. However, this method relies on mechanical structures to keep the direction of the reference beam consistent and does not directly involve the star tracker, so it is not suitable for most remote sensing cameras. On the other hand, the Gaofen-14 satellite adopted an auto-collimation method to assist the calibration of camera geometric parameters: array detectors placed on both sides of the camera's focal plane receive the auto-collimated light of the system, and the resulting image motion is used to monitor changes in the exterior orientation parameters of the remote sensing camera. Although this method is effective, it relies solely on reference prisms for confirmation without establishing a direct connection, and its measurement accuracy still requires validation [25].
Based on the above analysis and on the geometric model of remote sensing camera in-orbit calibration, this article proposes a high-precision real-time monitoring system for exterior orientation parameter variations in remote sensing cameras. This system can monitor the variations of the exterior orientation parameters between the camera and the star tracker in real time, without the need for ground control points, and correct the camera's exterior orientation parameters to further enhance image-positioning accuracy. By establishing an active optical monitoring path and using lasers to create a real-time link between the star tracker and the remote sensing camera's exterior orientation parameters, the system performs real-time in-orbit calibration of the exterior orientation parameters through simultaneous imaging of the laser and stars by the star tracker. The advantage of this system is that it does not depend on ground control points and can monitor the variations in the exterior orientation parameters of the remote sensing camera around the clock. Real-time data collection not only provides a clear understanding of the variations in the exterior orientation parameters during the camera's in-orbit operation but also further improves image-positioning accuracy, fully leveraging the performance and application potential of the remote sensing camera.

2. EOPV-RTMS Design Principles and Calibration Theoretical Model

2.1. EOPV-RTMS Design Principles

To improve the in-orbit calibration methods of remote sensing cameras and enhance image-positioning accuracy, and to overcome the long-term reliance on ground control points and the inability to monitor exterior orientation parameter variations in real time, this article proposes an exterior orientation parameters variation real-time monitoring system (EOPV-RTMS) for remote sensing cameras. The system establishes a direct connection between the remote sensing camera and the star tracker using a laser. The laser, positioned at the focal plane of the remote sensing camera, is directed into the star tracker. The star tracker simultaneously images the laser and starlight to monitor the variations in the exterior orientation parameters between the remote sensing camera and the star tracker in real time. The overall layout of the EOPV-RTMS is illustrated in Figure 1. The system consists of the remote sensing camera optical system, the laser emission module, the laser relay system, and the star tracker.
Figure 2 illustrates the laser propagation path in the EOPV-RTMS. In this system, the laser emission module is positioned at the focal plane of the remote sensing camera optical system, with a laser wavelength of 850 nm. The laser emitted from the camera's focal plane is collimated into parallel light after passing through the camera's optical system. A portion of the beam is received and reflected by a right-angle cone mirror in the laser relay system into a narrow band-pass filter and a dichroic mirror. The dichroic mirror allows the laser and starlight to be received simultaneously by the focal plane detector of the star tracker. In the images collected by the star tracker, the displacement of the laser spot indicates the variations in the exterior orientation parameters between the remote sensing camera and the star tracker. By monitoring the stars and the laser with the star tracker in real time, the system obtains real-time data on the variations of the exterior orientation parameters of the remote sensing camera, which can be used to correct the image products captured by the camera, thereby improving image-positioning accuracy.
To ensure that the EOPV-RTMS can image both the laser and the stars simultaneously, the laser relay system is a critical component of the design; its layout is shown in Figure 3. The parallel laser beam emerging from the remote sensing camera optical system is reflected by 180° by the right-angle cone mirror. It then passes through the narrow band-pass filter and the dichroic mirror, enters the star tracker together with the starlight, and is finally received by the star tracker detector. The right-angle cone mirror used in the laser relay system ensures that the laser is reflected into the star tracker. Additionally, the right-angle cone mirror is rotationally invariant, ensuring that the emitted light remains parallel to the incident light and is not affected by vibrations or thermal disturbances. This avoids potential impacts on monitoring accuracy due to variations in the mirror.
The narrow band-pass filter (NBPF) and dichroic mirror (DM) in the laser relay system ensure that the star tracker can simultaneously monitor both star and laser. The transmittance and reflectance of the narrow band-pass filter and dichroic mirror are shown in Figure 4. The narrow band-pass filter effectively blocks wavelengths other than the 850 nm laser and helps suppress stray light entering the laser relay system [26]. The dichroic mirror ensures that there is no signal crosstalk between the star and the laser, allowing both to be received simultaneously by the star tracker.

2.2. EOPV-RTMS Calibration Process and Theoretical Model

Based on the geometric model for in-orbit calibration of the remote sensing camera, this article proposes a novel in-orbit calibration process for the EOPV-RTMS and establishes a corresponding theoretical model. As the image-positioning model below shows, the image-positioning accuracy of a remote sensing camera primarily depends on the GPS, the star tracker, and the precise orientation parameters of the remote sensing camera. The position of the target point in the image can be represented in the WGS84 coordinate system as follows:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} = \begin{bmatrix} X_G \\ Y_G \\ Z_G \end{bmatrix} + m\, M_{J2000}^{WGS84} M_{Star}^{J2000} M_{Camera}^{Star} \begin{bmatrix} x - x_0 - \Delta x \\ y - y_0 - \Delta y \\ f_c \end{bmatrix}$$
In this image-positioning mathematical model, $[X\ Y\ Z]_{WGS84}^T$ represents the position of the observed ground target point in the WGS84 coordinate system, and its accuracy determines the image-positioning precision. $[X_G\ Y_G\ Z_G]_{WGS84}^T$ is the satellite's in-orbit position, which can be obtained through the GPS. $m$ is the scale factor. $M_{J2000}^{WGS84}$ denotes the transformation matrix from the J2000 coordinate system to the WGS84 coordinate system. $M_{Star}^{J2000}$ represents the transformation matrix from the star tracker to the J2000 coordinate system, which can be calibrated in orbit by imaging stars with the star tracker. $M_{Camera}^{Star}$ represents the installation matrix between the star tracker and the camera, also known as the exterior orientation parameters. $(x, y)$ are the coordinates of the observed target point in the camera body coordinate system. $(x_0, y_0)$ are the coordinates of the camera's principal point, and $f_c$ is the focal length of the remote sensing camera system; together these are known as the interior orientation parameters. $(\Delta x, \Delta y)$ is the coordinate deviation of the observed target point due to distortion of the camera's optical system.
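As a concrete reading of this model, the following minimal numpy sketch evaluates the positioning equation for a single pixel. The function arguments follow the symbols defined above; the function name is illustrative, and any matrices or values supplied by a caller would be placeholders rather than values from the paper.

```python
import numpy as np

def position_wgs84(sat_pos_wgs84, m, M_j2000_to_wgs84, M_star_to_j2000,
                   M_camera_to_star, xy, principal_point, distortion, f_c):
    """Evaluate the image-positioning model: ground point = satellite position
    plus the scaled, rotated image-space vector of the observed pixel."""
    x, y = xy
    x0, y0 = principal_point
    dx, dy = distortion
    # Image-space vector of the observed target in the camera body frame.
    v_cam = np.array([x - x0 - dx, y - y0 - dy, f_c])
    # Chain of transformations: camera -> star tracker -> J2000 -> WGS84.
    R = M_j2000_to_wgs84 @ M_star_to_j2000 @ M_camera_to_star
    return np.asarray(sat_pos_wgs84, dtype=float) + m * (R @ v_cam)
```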
Figure 5 illustrates the calibration process of the EOPV-RTMS. The first step involves the star tracker observing stars during the early operation of the satellite to complete the geometric parameter calibration of the star tracker [27,28,29]. The second step entails the remote sensing camera collecting ground image information, along with the star tracker’s attitude information, to complete the calibration of the remote sensing camera’s exterior orientation parameters. Subsequently, the results of the exterior orientation parameter calibration are used to calibrate the interior orientation parameters of the remote sensing camera [30,31]. In the third step, after the initial calibration of the geometric parameters of both the star tracker and the remote sensing camera, the EOPV-RTMS establishes a laser transformation matrix from the remote sensing camera to the star tracker by imaging the laser through the star tracker. The fourth step involves the star tracker continuously observing stars to refine the geometric parameters of the star tracker. In the fifth step, during the stable operation of the satellite, environmental variations can cause changes in exterior orientation parameters. The real-time monitoring system collects laser imaging data, and based on the offset in the laser imaging positions, calculates the deviation matrix of the exterior orientation parameters. The sixth step utilizes this data to correct the exterior orientation parameters of the remote sensing camera, integrating the corrected parameters into the image information to improve image-positioning accuracy. The EOPV-RTMS allows for a reduction in reliance on ground control points, enabling clear insights into the variations of the exterior orientation parameters of the remote sensing camera during in-orbit operations.
Based on the in-orbit calibration theory and the calibration process, a theoretical model for the calibration of the exterior orientation parameters is established [32]. The first step is the calibration of the star tracker's attitude matrix. The accurate calibration of the star tracker's attitude matrix serves as the foundation for the EOPV-RTMS and is crucial for the accuracy of image positioning. During the calibration of the star tracker's attitude matrix, the direction vector of any given navigation star $K_i$ in the star tracker coordinate system can be expressed as $\kappa_i$:
$$\kappa_i = M_{Star}^{J2000} K_i$$
In the above equation, $M_{Star}^{J2000}$ represents the transformation matrix between the star tracker body coordinate system and the J2000 coordinate system. The direction vector of any navigation star $K_i$ in the J2000 coordinate system can be expressed as:
$$K_i = \begin{bmatrix} \cos\eta \cos\xi \\ \sin\eta \cos\xi \\ \sin\xi \end{bmatrix}$$
In the above equation, $(\eta, \xi)$ are the stellar coordinates expressed as right ascension and declination in the J2000 coordinate system. The direction vector of the navigation star in the star tracker coordinate system can be written explicitly as:
$$\kappa_i = \frac{1}{\sqrt{(x_{K_i} - u_0)^2 + (y_{K_i} - v_0)^2 + f_s^2}} \begin{bmatrix} x_{K_i} - u_0 \\ y_{K_i} - v_0 \\ f_s \end{bmatrix}$$
where $(x_{K_i}, y_{K_i})$ represents the position of the navigation star $\kappa_i$ in the star tracker coordinate system, $(u_0, v_0)$ is the principal point coordinate of the star tracker, and $f_s$ is the focal length of the star tracker system.
When the star tracker images stars within a star field, the QUEST algorithm can be used to convert the problem of minimizing the Wahba least-squares loss function into a matrix eigenvalue problem, thereby obtaining the optimal attitude transformation matrix $M_{Star}^{J2000^*}$. The corresponding loss function can be expressed as follows:
$$F\left(M_{Star}^{J2000^*}\right) = \frac{1}{2} \sum_{i=1}^{n} a_i \left\| \kappa_i - M_{Star}^{J2000^*} K_i \right\|^2$$
In the equation, $a_i$ represents the weighting coefficient, with $\sum_{i=1}^{n} a_i = 1$.
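For readers who want to see this attitude step concretely, the sketch below solves the same weighted least-squares (Wahba) problem. The paper uses QUEST, a quaternion eigenvalue formulation; the SVD solution used here is a mathematically equivalent alternative chosen only because it is shorter to write down, and the data layout is an assumption of this sketch.

```python
import numpy as np

def wahba_attitude(kappa, K, a=None):
    """Find the rotation M minimizing 0.5 * sum_i a_i * ||kappa_i - M @ K_i||^2.
    kappa: (n, 3) star unit vectors measured in the star tracker frame.
    K:     (n, 3) matching catalog unit vectors in the J2000 frame.
    a:     optional weights summing to 1 (uniform if omitted)."""
    kappa = np.asarray(kappa, dtype=float)
    K = np.asarray(K, dtype=float)
    if a is None:
        a = np.full(len(K), 1.0 / len(K))
    B = (a[:, None] * kappa).T @ K              # attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt      # proper rotation closest to B
```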
After the geometric calibration of the star tracker is completed, ground control points $i$ from the imagery can be used for the initial calibration of the exterior orientation parameters $M_{Camera}^{Star}$ of the remote sensing camera. At this stage, the ground-calibrated values of the interior orientation parameters are treated as the true values, and the initial value of the exterior orientation parameters $M_{Camera}^{Star}$ is used to linearize the calibration model, resulting in the error equation for control point $i$:
$$C_i^{Error} = T_i M^{Error} - L_i^{Error}$$
where $T_i = \begin{bmatrix} \dfrac{\partial X}{\partial R_y(\theta_y)} & \dfrac{\partial X}{\partial R_x(\theta_x)} & \dfrac{\partial X}{\partial R_z(\theta_z)} \\ \dfrac{\partial Y}{\partial R_y(\theta_y)} & \dfrac{\partial Y}{\partial R_x(\theta_x)} & \dfrac{\partial Y}{\partial R_z(\theta_z)} \end{bmatrix}_i$, and $R_x(\theta_x)$, $R_y(\theta_y)$, and $R_z(\theta_z)$ represent the exterior orientation rotations of the remote sensing camera about the three axes. $M^{Error}$ is the correction to the exterior orientation parameters, and $L_i^{Error}$ is the difference vector for control point $i$, calculated using the ground-calibrated interior orientation parameters and the current exterior orientation parameters.
Finally, using the least squares method, the optimal estimate of the correction values for the exterior orientation parameters from each measurement can be obtained. After the initial calibration, the exterior orientation parameters of the remote sensing camera can be expressed as:
$$M_{Camera}^{Star\,\prime} = M^{Error} M_{Camera}^{Star}$$
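The least-squares step described above can be sketched as follows. The parameterization of the correction as three small rotation angles and the shapes of $T_i$ (2 × 3) and $L_i$ (2 × 1) follow the error equation; the function names are placeholders for illustration only.

```python
import numpy as np

def small_rotation(theta):
    """First-order rotation matrix for small corrections (rad) about x, y, z."""
    tx, ty, tz = theta
    return np.array([[1.0, -tz,  ty],
                     [ tz, 1.0, -tx],
                     [-ty,  tx, 1.0]])

def estimate_exterior_correction(T_list, L_list):
    """Least-squares estimate of the exterior orientation correction from the
    per-control-point design matrices T_i (2x3) and residual vectors L_i (2,),
    i.e. minimize sum_i ||T_i @ theta - L_i||^2 over the three angles theta."""
    A = np.vstack(T_list)                # (2n, 3) stacked design matrix
    l = np.concatenate(L_list)           # (2n,) stacked residuals
    theta, *_ = np.linalg.lstsq(A, l, rcond=None)
    return small_rotation(theta)         # plays the role of M_Error
```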
Next, the linear pinhole imaging model is used to calibrate the principal point and focal length of the remote sensing camera, employing the homography matrix to solve for these parameters. The interior orientation parameter matrix of the camera can be expressed as:
$$R_{Camera} = \begin{bmatrix} f_x & \tau & x_0 \\ 0 & f_y & y_0 \\ 0 & 0 & 1 \end{bmatrix}$$
In the equation, $f_x$ and $f_y$ are the scale factors of the camera focal length along the X and Y directions, respectively, while $\tau$ is the skew coefficient that describes the deviation from orthogonality between the x and y axes.
Let the homography matrix be $H = R_{Camera} M_{Camera}^{WGS84} = \begin{bmatrix} h_1 & h_2 & h_3 \end{bmatrix}$, where $M_{Camera}^{WGS84}$ represents the transformation matrix between the WGS84 coordinate system and the camera coordinate system. Using the known coordinates of ground control points in both the camera coordinate system and the WGS84 coordinate system, a homography matrix can be obtained for each ground calibration image.
Since the column vectors of the $M_{Camera}^{WGS84}$ matrix are orthonormal, the following constraints can be obtained:
$$h_1^T \left(R_{Camera}^{-T} R_{Camera}^{-1}\right) h_2 = 0, \qquad h_1^T \left(R_{Camera}^{-T} R_{Camera}^{-1}\right) h_1 = h_2^T \left(R_{Camera}^{-T} R_{Camera}^{-1}\right) h_2$$
Let $A = R_{Camera}^{-T} R_{Camera}^{-1} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}$. Since $A$ is a symmetric matrix, it can be represented by a six-dimensional vector $a = \begin{bmatrix} a_{11} & a_{12} & a_{22} & a_{13} & a_{23} & a_{33} \end{bmatrix}^T$. Therefore, the constraints above can be expressed as:
$$h_i^T A h_j = b_{ij}^T a$$
where $b_{ij} = \begin{bmatrix} h_{i1} h_{j1} & h_{i1} h_{j2} + h_{i2} h_{j1} & h_{i2} h_{j2} & h_{i3} h_{j1} + h_{i1} h_{j3} & h_{i3} h_{j2} + h_{i2} h_{j3} & h_{i3} h_{j3} \end{bmatrix}^T$.
When n ground-calibration images containing m control points are acquired, this relation can be stacked as:
$$B a = \begin{bmatrix} b_{12}^T \\ \left(b_{11} - b_{22}\right)^T \end{bmatrix} a = 0$$
In this stacked system, B is a 2n × 6 matrix. When more than three ground-calibration images are selected, the vector $a$ can be obtained using the least squares method. Further, by applying Cholesky decomposition, the interior orientation parameter matrix $R_{Camera}$ of the remote sensing camera can be determined, thus establishing the new principal point $(x_0^*, y_0^*)$ and camera focal length $f_c^*$. For satellites in low Earth orbit, the interior orientation parameters remain relatively stable during in-orbit operation, and thus, throughout the satellite's lifecycle, they typically only need to be calibrated a few times [33].
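This recovery of the interior orientation parameters follows the same structure as the classical homography-based (Zhang-style) calibration, and a compact sketch of it is given below. The input homographies, the SVD null-space step, and the variable names are illustrative assumptions consistent with the equations above, not the authors' implementation.

```python
import numpy as np

def b_vec(H, i, j):
    """Constraint vector b_ij built from columns i and j of homography H
    (0-based indices), matching the 6-vector parameterization of A."""
    h_i, h_j = H[:, i], H[:, j]
    return np.array([h_i[0] * h_j[0],
                     h_i[0] * h_j[1] + h_i[1] * h_j[0],
                     h_i[1] * h_j[1],
                     h_i[2] * h_j[0] + h_i[0] * h_j[2],
                     h_i[2] * h_j[1] + h_i[1] * h_j[2],
                     h_i[2] * h_j[2]])

def intrinsics_from_homographies(Hs):
    """Recover R_Camera from several homographies: stack the two constraints
    per image, solve B a = 0 in the least-squares sense, rebuild the symmetric
    matrix A, and invert it via Cholesky decomposition."""
    rows = []
    for H in Hs:
        rows.append(b_vec(H, 0, 1))                  # h1^T A h2 = 0
        rows.append(b_vec(H, 0, 0) - b_vec(H, 1, 1)) # h1^T A h1 = h2^T A h2
    B = np.vstack(rows)
    _, _, Vt = np.linalg.svd(B)
    a = Vt[-1]                                       # null-space direction
    A = np.array([[a[0], a[1], a[3]],
                  [a[1], a[2], a[4]],
                  [a[3], a[4], a[5]]])
    if A[0, 0] < 0:                                  # fix the overall sign
        A = -A
    L = np.linalg.cholesky(A)                        # A = L L^T, L = R_Camera^{-T}
    K = np.linalg.inv(L.T)                           # since A = R^{-T} R^{-1}
    return K / K[2, 2]                               # normalize so K[2,2] = 1
```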
Once the attitude of the star tracker and the exterior and interior orientation parameters of the remote sensing camera are determined, the laser transformation matrix between the remote sensing camera and the star tracker can be calibrated. In the EOPV-RTMS, each laser can be represented in vector form in the camera coordinate system as:
$$V_i = \frac{1}{\sqrt{(x_{L_i} - x_0^*)^2 + (y_{L_i} - y_0^*)^2 + f_c^{*2}}} \begin{bmatrix} x_{L_i} - x_0^* \\ y_{L_i} - y_0^* \\ f_c^* \end{bmatrix}$$
Here, $(x_{L_i}, y_{L_i})$ are the coordinates of the different lasers in the camera coordinate system. After passing through the EOPV-RTMS, the corresponding vector in the star tracker coordinate system can be expressed as follows:
$$v_i^k = \frac{1}{\sqrt{\left(x_{L_i}^k - u_0\right)^2 + \left(y_{L_i}^k - v_0\right)^2 + f_s^2}} \begin{bmatrix} x_{L_i}^k - u_0 \\ y_{L_i}^k - v_0 \\ f_s \end{bmatrix}$$
Here, $(x_{L_i}^k, y_{L_i}^k)$ are the coordinates of the laser imaging points in the star tracker coordinate system, obtained with the centroid extraction algorithm from the k-th image.
Therefore, according to the vector relationship between the laser in object space and in image space, the new exterior orientation parameter matrix between the remote sensing camera and the star tracker can be determined as follows:
$$V_i = M_{Camera}^{Star^*} v_i^k$$
Similarly, using the least squares method, the optimal solution of the exterior orientation parameter matrix can be calculated from the object-image vector relationships of the multiple laser channels:
$$\min_{M_{Camera}^{Star^*}} \sum_{k=1}^{m} \sum_{i=1}^{n} \left\| V_i - M_{Camera}^{Star^*} v_i^k \right\|^2$$
In the equation, m is the number of images collected, and n is the number of lasers. To minimize the impact of centroid extraction errors for the laser image points, multiple consecutively acquired laser images are typically used for the calculation.
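This is again a rotation-fitting problem over many vector pairs and, as with the attitude step earlier, an SVD (orthogonal Procrustes) solution can serve as a sketch. The array shapes and function name below are assumptions; accumulating over the m frames is what suppresses the centroid extraction noise mentioned above.

```python
import numpy as np

def estimate_exterior_matrix(V, v_frames):
    """Least-squares rotation solving min_M sum_k sum_i ||V_i - M @ v_i^k||^2.
    V:        (n, 3) laser unit vectors in the camera coordinate system.
    v_frames: (m, n, 3) laser unit vectors observed by the star tracker in m
              consecutive images."""
    V = np.asarray(V, dtype=float)
    v_frames = np.asarray(v_frames, dtype=float)
    # Cross-covariance accumulated over all lasers and all frames.
    B = sum(V.T @ v_k for v_k in v_frames)
    U, _, Wt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U) * np.linalg.det(Wt))
    return U @ np.diag([1.0, 1.0, d]) @ Wt    # estimate of M_Camera->Star*
```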
The exterior orientation parameter matrix between the camera and the star tracker, obtained through the EOPV-RTMS, includes both the initial installation matrix between the camera and the star tracker and the environmental disturbance-induced variation matrix, which can be specifically expressed as:
$$M_{Camera}^{Star^*} = M^{Chg} M^{Error} M_{Camera}^{Star}$$
Here, $M_{Camera}^{Star}$ is the initial transformation matrix between the star tracker coordinate system and the camera coordinate system, obtained through ground measurement and calibration. $M^{Error}$ is the correction to the exterior orientation parameters of the camera determined during the initial in-orbit calibration. $M^{Chg}$ represents the variation of the camera coordinate system relative to the star tracker coordinate system during in-orbit operation, which corresponds to the exterior orientation parameter changes monitored by the system.
Simultaneously, the EOPV-RTMS can continuously monitor the imaging information of star and laser through the star tracker, thereby obtaining real-time changes in the exterior orientation parameters of the remote sensing camera and the star tracker. By applying the corrected orientation parameters to the acquired ground images, higher accuracy in image positioning can be achieved.
Finally, a new mathematical model of remote sensing camera image positioning can be obtained by calibrating the interior and exterior orientation parameters of the remote sensing camera in orbit using star, ground, and laser observations:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} = \begin{bmatrix} X_G \\ Y_G \\ Z_G \end{bmatrix}_{WGS84} + m\, M_{J2000}^{WGS84} M_{Star}^{J2000^*} M_{Camera}^{Star^*} \begin{bmatrix} x - x_0^* - \Delta x \\ y - y_0^* - \Delta y \\ f_c^* \end{bmatrix}$$

3. Simulation Analysis of EOPV-RTMS Measurement Accuracy

In the EOPV-RTMS, the measurement accuracy of the system is influenced by centroid extraction errors from the star tracker, with the precision of centroid extraction directly determining the accuracy of the monitoring system. When there are changes in the exterior orientation parameters between the remote sensing camera and the star tracker, the laser imaging points on the star tracker detector surface will experience shifts in both the X and Y directions. Therefore, this article employs the Monte Carlo analysis method to simulate and analyze the impact of centroid extraction errors on the measurement accuracy of the EOPV-RTMS.
In Section 2, it is difficult to present the changes in the exterior orientation parameters of the remote sensing camera intuitively within the calibration theoretical model of the EOPV-RTMS; therefore, the model of the EOPV-RTMS has been simplified based on the work of Liu et al. [34]. An active optical monitoring path is established using the EOPV-RTMS. By moving the camera's focal plane along the X and Y axes, a movement of the laser points is generated, allowing the star tracker to detect the laser movement. The simplified model is shown in Figure 6, in which the changes in the exterior orientation parameters between the remote sensing camera and the star tracker are equivalent to movements of the camera's focal plane. Of course, changes in the camera's exterior orientation parameters also include movement and rotation along the Z-axis. However, movement along the Z-axis does not affect image-positioning accuracy, and rotation can be decomposed into translations of the remote sensing camera's focal plane along the X and Y axes. Therefore, this article only considers the displacements along these two axes.
In Figure 6, the EOPV-RTMS establishes an active optical monitoring path between the remote sensing camera and the star tracker, where changes in the exterior orientation parameters are equivalent to translations of the remote sensing camera's focal plane along the X and Y axes. Correspondingly, the positions of the laser image points received by the star tracker also shift. For example, the light emitted from point A on the camera's focal plane is imaged at position A′ on the star tracker after passing through the EOPV-RTMS. The initial angles between the laser located at the remote sensing camera's focal plane and the Z-axis in the XOZ and YOZ planes are defined as α and β, respectively. When the exterior orientation parameters between the remote sensing camera and the star tracker change, this is equivalent to the laser translating along the X and Y axes, resulting in changes Δα and Δβ, measured in arcseconds. When the laser moves from position A to B along the Y-axis, the corresponding laser position A′ in the star tracker also shifts to B′. By measuring the distance from B′ to A′, the specific movement of the camera's focal plane along the Y-axis can be calculated, while the movements along the X and Y axes can be expressed as:
$$\Delta\alpha = -\left[\arctan\left(\frac{x_{L_i}}{D} + \tan\alpha\right) - \arctan\left(\frac{x_{L_i} + \Delta x}{D} + \tan\alpha\right)\right]$$
$$\Delta\beta = -\left[\arctan\left(\frac{y_{L_i}}{D} + \tan\beta\right) - \arctan\left(\frac{y_{L_i} + \Delta y}{D} + \tan\beta\right)\right]$$
In the above equations, D represents the pixel size of the remote sensing camera, and $x_{L_i}$ and $y_{L_i}$ denote the positions of the laser points in the camera coordinate system. Δx and Δy represent the displacements of the laser points along the X and Y axes in the camera coordinate system, which also correspond to the movement of the camera's focal plane.
According to geometric principles and error precision analysis, the standard deviation ρ of the EOPV-RTMS can be expressed as:
$$\rho = \sqrt{\frac{1}{N - 1}}\, \arctan\left(\frac{d\, \sigma_{pix}}{f_s}\right)$$
where N represents the number of laser points, d is the pixel size of the star tracker detector, and $\sigma_{pix}$ is the centroid extraction error. Taking a star tracker detector pixel size of 7.5 μm and a system focal length of 50 mm as an example, the relationship between the centroid extraction error $\sigma_{pix}$ and the standard deviation ρ of the EOPV-RTMS is illustrated in Figure 7.
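To make the scaling concrete, the snippet below evaluates the standard-deviation expression as reconstructed above, using the example pixel size and focal length from the text; the number of laser points (N = 4) is our assumption. For sub-pixel centroid errors the arctangent is effectively linear, which matches the linear trend described for Figure 7.

```python
import numpy as np

ARCSEC = 180.0 / np.pi * 3600.0

def rho_arcsec(sigma_pix, d=7.5e-6, f_s=50e-3, n_lasers=4):
    """Standard deviation of the EOPV-RTMS measurement (arcsec) versus the
    star tracker centroid extraction error (pixels), per the expression above.
    d and f_s are the example values from the text; n_lasers is assumed."""
    return np.sqrt(1.0 / (n_lasers - 1)) * np.arctan(d * sigma_pix / f_s) * ARCSEC

# Near-linearity: doubling the centroid error doubles the standard deviation.
print(rho_arcsec(0.05) / rho_arcsec(0.025))   # ~2.0
print(rho_arcsec(0.10) / rho_arcsec(0.05))    # ~2.0
```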
As shown in Figure 7, the simulation results indicate that the measurement error of the EOPV-RTMS increases linearly with the centroid extraction error of the star tracker. However, the centroid extraction accuracy for the laser points is better than that for stars. This is primarily because the laser path is more stable within the camera system, with less external environmental interference than starlight. Additionally, unlike the lasers, the imaging positions of stars move continuously, and the S-curve error and motion-induced blur significantly affect star centroid extraction [35]. The laser profiles are therefore closer to ideal, leading to higher extraction accuracy.
To more accurately represent the impact of the centroid extraction error of a single laser point on the measurement accuracy of the EOPV-RTMS, this article employs the Monte Carlo simulation analysis method. Simulations were conducted when the centroid extraction error of the star tracker was less than 0.1 pixel. As shown in Figure 8, when the centroid extraction error of the star tracker is controlled within 0.1 pixel and follows a normal distribution, the measurement accuracy of the EOPV-RTMS can reach ±0.6″(3σ). Thus, to achieve high-precision calibration parameters, the centroid extraction error of the star tracker should be controlled to within 0.1 pixel. According to reported literature, common image-processing methods such as the centroid method and Gaussian fitting can achieve centroid extraction accuracy better than 0.1 pixel [36,37].
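A schematic version of such a Monte Carlo run is given below. The statistical choices (reading the 0.1-pixel bound as a 3σ limit of a normal distribution, four lasers, the example star tracker optics, and simple averaging of the per-laser angular errors) are assumptions of this sketch, so its output is illustrative and not a reproduction of Figure 8.

```python
import numpy as np

ARCSEC = 180.0 / np.pi * 3600.0

def simulate_3sigma(sigma_pix, n_lasers=4, n_trials=100_000,
                    d=7.5e-6, f_s=50e-3, seed=0):
    """Monte Carlo estimate (arcsec, 3-sigma) of the single-image measurement
    error for a given centroid extraction error sigma_pix (pixels). The
    per-laser angular errors are averaged; this combination rule and the
    default parameters are assumptions of this sketch."""
    rng = np.random.default_rng(seed)
    err_px = rng.normal(0.0, sigma_pix, size=(n_trials, n_lasers))
    err_arcsec = np.arctan(err_px * d / f_s) * ARCSEC
    return 3.0 * err_arcsec.mean(axis=1).std()

# Example: centroid error bounded by 0.1 pixel, taken here as a 3-sigma limit.
print(f"{simulate_3sigma(0.1 / 3.0):.2f} arcsec (3-sigma)")
```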

4. Experimental Testing of Measurement Accuracy for the EOPV-RTMS

Based on the simulation analysis of the measurement accuracy of the EOPV-RTMS presented in Section 3, when the star tracker centroid extraction error is controlled within 0.1 pixel, the measurement accuracy of the monitoring system can reach ±0.6″ (3σ). To validate this monitoring method, we established a verification platform to assess the system's accuracy and effectiveness, shown in Figure 9. As illustrated in Figure 9, an optical system simulating the remote sensing camera is implemented using a telescopic system. Four 850 nm lasers are placed at the focal plane of the system, and the laser beams are received by the star tracker after passing through the telescopic system and the right-angle cone mirror. The four laser sources at the system's focal plane are mounted on a micro-displacement controller, allowing them to be moved piezoelectrically along the X and Y directions with a maximum travel range of 100 μm.
Using the calibration model established in Section 2.2, the image data collected by the star tracker on the verification platform are used to monitor the changes in the exterior orientation parameters between the remote sensing camera and the star tracker. First, we validated the consistency between the measurement accuracy of the EOPV-RTMS and the results of the simulation analysis. Second, in real operating environments, irregular thermal conditions and vibrations lead to complex changes in the exterior orientation parameters between the remote sensing camera and the star tracker, making it difficult to confirm the effectiveness of the real-time monitoring system directly. Therefore, this study equates the changes in exterior orientation parameters between the remote sensing camera and the star tracker to translations of the camera's focal plane for the experimental verification. By using a micro-displacement controller to displace the laser sources, which is equivalent to translating the camera's focal plane, and comparing the measurement results of the real-time monitoring system with theoretical values, the effectiveness of the proposed method is assessed.
Figure 10 shows the image of the spots produced by the four lasers on the star tracker. In the star tracker image, the majority of the laser energy is concentrated within a 5 × 5 pixel area, allowing the laser point coordinates to be determined using the centroid extraction method. Compared with the centroid extraction accuracy for star image points, the extraction accuracy for the laser image points can be kept within 0.1 pixel.
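As an illustration of this step, the routine below computes an intensity-weighted centroid inside a small window around a coarse spot position. It is a generic center-of-mass estimator of the kind the text refers to (a Gaussian fit is a common alternative); the window size and background handling are assumptions.

```python
import numpy as np

def window_centroid(img, seed_rc, half=2):
    """Sub-pixel centroid of a laser spot inside a (2*half+1)^2 window around
    a coarse seed position (row, col)."""
    r0, c0 = seed_rc
    win = img[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1].astype(float)
    win = win - win.min()                       # crude background removal
    total = win.sum() + 1e-12                   # guard against a flat window
    rows, cols = np.mgrid[-half:half + 1, -half:half + 1]
    dr = (rows * win).sum() / total
    dc = (cols * win).sum() / total
    return r0 + dr, c0 + dc                     # (row, col) with sub-pixel offset
```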
Figure 11 shows the measurement results for the exterior orientation parameter variation $M^{Chg}$ between the remote sensing camera and the star tracker obtained with the EOPV-RTMS. During the experiment, one image was collected every 100 ms, resulting in a total of 800 images, with measurements in both the X and Y directions. In each image, the laser coordinate changes were used to obtain the laser vector information. The results indicate that, for single-image measurements, the measurement accuracy of the real-time monitoring system in both the X and Y directions reaches 0.6″ (3σ), which is consistent with the simulation results.
The above measurement results validate the system’s monitoring accuracy, which is consistent with the simulation analysis. However, such results are insufficient to prove the effectiveness of the system’s measurements. Therefore, we propose to evaluate the system by varying the position of the laser sources, measuring the angular change of these positions in real time, and comparing the difference between the measured and reference angular changes. Based on the simulation analysis in Section 3, the exterior orientation parameters between the remote sensing camera and the star tracker can be derived by simulating the displacement of the camera’s focal plane. As such, the laser sources were mounted on a precision displacement controller. By controlling the precise movement of the displacement controller to simulate the focal plane displacement, the monitoring system’s measurements were compared with reference values to assess the system’s design effectiveness.
During the experiment, the precision displacement controller was first returned to its initial position, and the EOPV-RTMS was activated. The platform was then moved along the X and Y axes in 20 μm increments up to a total displacement of 100 μm. During the platform’s movement, the monitoring system continuously captured laser-position data and performed calculations. To verify the system’s design effectiveness, the measured average value at each motion stage was compared with the theoretical value.
As shown in Figure 12, the real-time measurement results of the exterior orientation parameter changes during movement along the X and Y directions are presented. With the displacement of the precision displacement controller in both the X and Y directions, the measurement results of the monitoring system also change accordingly. The detailed results are shown in Table 1, where the displacement amount at each stage is reflected by the precision displacement controller, and the data collected by the monitoring system represents the average value continuously monitored during that stage. As can be seen in Table 1, the difference between the theoretical reference displacement and the actual measurements from the monitoring system is within 0.05″. The monitoring results are in good agreement with the reference displacement data, demonstrating that the EOPV-RTMS can effectively measure variations in the line of sight of the remote sensing camera. By compensating for these changes, the calibration accuracy of the camera’s exterior orientation parameters is improved, leading to enhanced image-positioning accuracy.
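For reference, the conversion from a commanded focal-plane displacement to the expected line-of-sight change can be written as a one-line helper. The 2 m effective focal length used below is not stated in the text; it is our assumption, chosen because it reproduces the reference angle changes listed in Table 1 (for example, 20.26 μm maps to about 2.09″).

```python
import numpy as np

ARCSEC = 180.0 / np.pi * 3600.0

def reference_angle_change(displacement_um, focal_length_m=2.0):
    """Expected line-of-sight change (arcsec) for a focal-plane displacement
    of the laser source. The 2 m effective focal length is an assumption
    inferred from the reference values in Table 1."""
    return np.arctan(displacement_um * 1e-6 / focal_length_m) * ARCSEC

for d_um in (20.26, 40.43, 60.11, 79.99, 99.87):   # X-direction steps in Table 1
    print(f"{d_um:6.2f} um -> {reference_angle_change(d_um):5.2f} arcsec")
```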

5. Discussion

High-precision image positioning is an important technical indicator for evaluating the performance of remote sensing cameras [38]. However, due to environmental changes during satellite operation, there is a significant deviation between the actual state of the remote sensing camera’s exterior orientation parameters and the ground calibration [39]. To address this, this article proposes an exterior orientation parameters variation real-time monitoring system. The overall design of the system is shown in Figure 1, where active optical monitoring is established using lasers between the remote sensing camera and the star tracker. The movement of the laser image point on the star tracker detector characterizes the changes in the exterior orientation parameters of the remote sensing camera. Based on the geometric model of in-orbit calibration for the remote sensing camera and in conjunction with the EOPV-RTMS, a novel theoretical model for the calibration of the exterior orientation parameters of remote sensing cameras is proposed, and an in-orbit calibration process is established.
From Figure 7 and Figure 8, the centroid extraction error of the star tracker affects the measurement of the system. For the star tracker, the centroid extraction error can often be improved to better than 0.1 pixel using methods based on the centroid method and Gaussian fitting [40]. As shown in Figure 11, during the ground validation experiments, this article further verifies the impact of the star tracker’s centroid extraction error on the measurement accuracy of the exterior orientation parameters under conditions where the camera’s exterior orientation parameters remain stable over a short period, which is generally consistent with the simulation results. Additionally, to validate the effectiveness of the system, this article used a micro-displacement controller to simulate changes in the remote sensing camera’s exterior orientation parameters. By monitoring the movement of the laser on the focal plane along the X and Y directions, the effective measurement accuracy of the system was verified. As shown in Figure 12 and Table 1, the actual measurement results of the EOPV-RTMS in the X and Y directions differ from the theoretical change values by a maximum of only 0.05″, fully meeting the precision and accuracy requirements of the system design.
The design advantages of the EOPV-RTMS proposed in this article lie in its ability to eliminate the long-term dependence on ground control points after the remote sensing camera completes its initial in-orbit calibration. This system operates without restrictions of time and location, enabling real-time monitoring of changes in the exterior orientation parameters of the remote sensing camera. This active optical monitoring system can compensate for the errors in the exterior orientation parameters obtained during the initial calibration of the remote sensing camera and correct the camera’s line of sight relative to inertial space. At the same time, the system’s real-time monitoring of changes in exterior orientation parameters enhances the understanding of how the exterior orientation parameters of the remote sensing camera vary in orbit. It also allows for the correction of acquired image information based on the patterns of exterior orientation parameter changes, thereby further improving the positioning accuracy of satellite images.

6. Conclusions

To address the changes in exterior orientation parameters of remote sensing cameras caused by environmental variations during in-orbit operation, this article conducts research on the in-orbit calibration methods for exterior orientation parameters. A novel exterior orientation parameters variation real-time monitoring system is proposed to tackle issues such as the poor timeliness and low measurement accuracy of calibration between the camera and the star tracker. Through simulation analysis and experimental validation, the accuracy and effectiveness of the EOPV-RTMS have been further confirmed. Experimental results indicate that the measurement accuracy of the real-time monitoring system can reach 0.6″ (3σ), which aligns with the measurement accuracy determined by the Monte Carlo simulation method. In experiments simulating changes in exterior orientation parameters using a micro-displacement controller, the actual measurement results for changes in the X and Y directions differ from the theoretical change values by a maximum of only 0.05″, fully meeting the precision and accuracy requirements of the system design. The proposed EOPV-RTMS provides a new method for in-orbit calibration of the exterior orientation parameters of remote sensing cameras, addressing the inability of traditional on-orbit calibration methods to monitor changes in these parameters in real time and their limited calibration accuracy. Additionally, it offers a new auxiliary means to improve the positioning accuracy of satellite images. In future work, we will focus on the actual usage of the system and explore the feasibility of monitoring high-frequency and low-frequency vibrations during the on-orbit operation of remote sensing cameras.

Author Contributions

Conceptualization, H.L. and C.L.; Methodology, H.L.; Software, H.L.; Validation, H.L., P.X. and S.L.; Formal analysis, H.L.; Investigation, H.L.; Resources, C.L.; Data curation, H.L.; Writing—original draft preparation, H.L.; Writing—review and editing, P.X.; Visualization, S.L.; Supervision, C.L.; Project administration, C.L.; Funding acquisition, C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 62175236.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

We would like to express our sincere appreciation to Fei Xing and Xuedi Chen, both from the Department of Precision Instrument, State Key Laboratory of Precision Measurement Technology and Instruments, Tsinghua University, Beijing, China. Their invaluable guidance and assistance in the theoretical modeling, experimental setup, and troubleshooting of issues related to the EOPV-RTMS were critical to this research. We also thank the Department of Precision Instrument, State Key Laboratory of Precision Measurement Technology and Instruments, Tsinghua University, Beijing, China, for their strong support of this work. Their contributions are greatly appreciated.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Li, L.; Li, Z.; Wang, Z.; Jiang, Y.; Shen, X.; Wu, J. On-Orbit Relative Radiometric Calibration of the Bayer Pattern Push-Broom Sensor for Zhuhai-1 Video Satellites. Remote Sens. 2023, 15, 377.
2. Aguilar, M.A.; del Mar Saldaña, M.; Aguilar, F.J. Assessing geometric accuracy of the orthorectification process from GeoEye-1 and WorldView-2 panchromatic images. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 427–435.
3. Zhang, X.; Fang, X.; Li, T.; Gu, G.; Li, H.; Shao, Y.; Jiang, X.; Li, B. Multi-Channel Hyperspectral Imaging Spectrometer Design for Ultraviolet Detection in the Atmosphere of Venus. Remote Sens. 2024, 16, 1099.
4. Akumu, C.E.; Amadi, E.O.; Dennis, S. Application of Drone and WorldView-4 Satellite Data in Mapping and Monitoring Grazing Land Cover and Pasture Quality: Pre- and Post-Flooding. Land 2021, 10, 321.
5. Sefercik, U.G.; Alkan, M.; Atalay, C.; Jacobsen, K.; Büyüksalih, G.; Karakış, S. Optimizing the Achievable Information Content Extraction from WorldView-4 Stereo Imagery. PFG J. Photogramm. Remote Sens. Geoinf. Sci. 2020, 88, 449–461.
6. Guan, Z.; Jiang, Y.; Wang, J.; Zhang, G. Star-Based Calibration of the Installation Between the Camera and Star Sensor of the Luojia 1-01 Satellite. Remote Sens. 2019, 11, 2081.
7. Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M.; Pernechele, C.; Dionisio, C. A new star tracker concept for satellite attitude determination based on a multi-purpose panoramic camera. Acta Astronaut. 2017, 140, 166–175.
8. Tang, X.; Xie, J. Overview of the key technologies for high-resolution satellite mapping. Int. J. Digit. Earth 2012, 5, 228–240.
9. Lee, S.; Shin, D. On-Orbit Camera Misalignment Estimation Framework and Its Application to Earth Observation Satellite. Remote Sens. 2015, 7, 3320–3346.
10. Liu, Q.; He, X.; Guan, F.; Zhao, Y.; Jiang, F.; Tian, F.; Wang, S. Method and Implementation of Improving the Pointing Accuracy of an Optical Remote Sensor Using a Star Sensor. Trait. Du Signal 2019, 36, 311.
11. Pi, Y.; Li, X.; Yang, B. Global iterative geometric calibration of a linear optical satellite based on sparse GCPs. IEEE Trans. Geosci. Remote Sens. 2019, 58, 436–446.
12. Wang, M.; Zhu, Y.; Pan, J.; Yang, B.; Zhu, Q. Satellite jitter detection and compensation using multispectral imagery. Remote Sens. Lett. 2016, 7, 513–522.
13. Wang, M.; Cheng, Y.; Chang, X.; Jin, S.; Zhu, Y. On-orbit geometric calibration and geometric quality assessment for the high-resolution geostationary optical satellite GaoFen4. ISPRS J. Photogramm. Remote Sens. 2017, 125, 63–77.
14. Liu, H.; Liu, C.; Liu, S.; Yong, Q.; Wang, X.; Zhao, Y.; Ding, Y.; Xie, P. Design of a focusing system for micro-nano satellite remote sensing camera based on thermal control technology. J. Therm. Stress. 2024, 47, 1–18.
15. Helder, D.; Coan, M.; Patrick, K.; Gaska, P. IKONOS geometric characterization. Remote Sens. Environ. 2003, 88, 69–79.
16. Ager, T.P. Evaluation of the Geometric Accuracy of Ikonos Imagery. In Proceedings of the Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery IX, Orlando, FL, USA, 21–25 April 2003; Volume 5093, pp. 613–620.
17. Kubik, P.; Lebègue, L.; Fourest, S.; Delvit, J.M.; de Lussy, F.; Greslou, D.; Blanchet, G. First in-Flight Results of Pleiades 1A Innovative Methods for Optical Calibration. In Proceedings of the International Conference on Space Optics—ICSO 2012, Ajaccio, Corsica, France, 20 November 2017; Volume 10564, pp. 54–63.
18. Mhangara, P.; Mapurisa, W.; Mudau, N. Comparison of image fusion techniques using satellite pour l'Observation de la Terre (SPOT) 6 satellite imagery. Appl. Sci. 2020, 10, 1881.
19. Aguilar, M.A.; Aguilar, F.J.; Saldaña, M.; Fernández, I. Geopositioning accuracy assessment of GeoEye-1 panchromatic and multispectral imagery. Photogramm. Eng. Remote Sens. 2012, 78, 247–257.
20. Zhao, Y.; Liu, Y.; Gao, S.; Liu, G.; Wan, Z.; Hu, D. Deep Learning-Based Digital Surface Model Reconstruction of ZY-3 Satellite Imagery. Remote Sens. 2024, 16, 2567.
21. Tang, X.; Xie, J.; Wang, X.; Jiang, W. High-precision attitude post-processing and initial verification for the ZY-3 satellite. Remote Sens. 2014, 7, 111–134.
22. Tadono, T.; Shimada, M.; Watanabe, M.; Mukaida, A.; Kawamoto, S.; Imoto, N.; Yamashita, J. Initial Results of Calibration and Validation for ALOS Optical Sensors. In Proceedings of the 2006 IEEE International Symposium on Geoscience and Remote Sensing, Denver, CO, USA, 31 July–4 August 2006; pp. 1643–1646.
23. Wang, J.; Wang, R.; Hu, X.; Su, Z. The on-orbit calibration of geometric parameters of the Tian-Hui 1 (TH-1) satellite. ISPRS J. Photogramm. Remote Sens. 2017, 124, 144–151.
24. Markus, T.; Neumann, T.; Martino, A.; Abdalati, W.; Brunt, K.; Csatho, B.; Farrell, S.; Fricker, H.; Gardner, A.; Harding, D.; et al. The Ice, Cloud, and land Elevation Satellite-2 (ICESat-2): Science requirements, concept, and implementation. Remote Sens. Environ. 2017, 190, 260–273.
25. Cao, B.; Jianrong, W.; Yan, H.; Yuan, L.; Xiuce, Y.; Xueliang, L.; Gang, L.; Yongqiang, W.; Zhuang, L. On-orbit geometric calibration and preliminary accuracy verification of GaoFen-14 (GF-14) optical two linear-array stereo camera. Eur. J. Remote Sens. 2023, 56, 2289013.
26. Liu, H.; Liu, C.; Xie, P.; Liu, S.; Wang, X.; Zhang, Y.; Song, W.; Zhao, Y. Stray light analysis and suppression of high-resolution camera line-of-sight variation real-time monitoring system (LoS Var RTMS). Opt. Express 2024, 32, 24184–24199.
27. Li, J.; Xiong, K.; Wei, X.; Zhang, G. A star tracker on-orbit calibration method based on vector pattern match. Rev. Sci. Instrum. 2017, 88, 043101.
28. Bao, J.; Zhan, H.; Sun, T.; Fu, S.; Xing, F.; You, Z. A window-adaptive centroiding method based on energy iteration for spot target localization. IEEE Trans. Instrum. Meas. 2022, 71, 1–13.
29. Bao, J.; Zhan, H.; Sun, T.; Xing, F.; You, Z. Adaptive energy filtering method based on time-domain image sequences for high-accuracy spot target localization. Appl. Opt. 2022, 61, 3034–3047.
30. Guan, Z.; Zhang, G.; Jiang, Y.; Shen, X. Low-frequency attitude error compensation for the Jilin-1 satellite based on star observation. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–17.
31. Li, X.; Yang, L.; Su, X.; Hu, Z.; Chen, F. A correction method for thermal deformation positioning error of geostationary optical payloads. IEEE Trans. Geosci. Remote Sens. 2019, 57, 7986–7994.
32. Wang, Y.; Wang, M.; Zhu, Y. On-orbit calibration of installation parameter of multiple star sensors system for optical remote sensing satellite with ground control points. Remote Sens. 2020, 12, 1055.
33. Pi, Y.; Yang, B.; Li, X.; Wang, M. Study of full-link on-orbit geometric calibration using multi-attitude imaging with linear agile optical satellite. Opt. Express 2019, 27, 980–998.
34. Liu, W.; Wang, H.; Jiang, W.; Qian, F.; Zhu, L. Real-Time On-Orbit Calibration of Angles Between Star Sensor and Earth Observation Camera for Optical Surveying and Mapping Satellites. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 4, 583–588.
35. Wei, X.; Xu, J.; Li, J.; Yan, J.; Zhang, G. S-curve centroiding error correction for star sensor. Acta Astronaut. 2014, 99, 231–241.
36. Karaparambil, V.C.; Manjarekar, N.S.; Singru, P.M. Sieve Search Centroiding Algorithm for Star Sensors. Sensors 2023, 23, 3222.
37. Delabie, T.; Schutter, J.D.; Vandenbussche, B. An accurate and efficient Gaussian fit centroiding algorithm for star trackers. J. Astronaut. Sci. 2014, 61, 60–84.
38. Cheng, Y.; Jin, S.; Wang, M.; Zhu, Y.; Dong, Z. A new image mosaicking approach for the multiple camera system of the optical remote sensing satellite GaoFen1. Remote Sens. Lett. 2017, 8, 1042–1051.
39. Wang, M.; Cheng, Y.; Tian, Y.; He, L.; Wang, Y. A new on-orbit geometric self-calibration approach for the high-resolution geostationary optical satellite GaoFen4. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 1670–1683.
40. Wang, H.; Xu, E.; Li, Z.; Li, J.; Qin, T. Gaussian analytic centroiding method of star image of star tracker. Adv. Space Res. 2015, 56, 2196–2205.
Figure 1. Exterior orientation parameters variation real-time monitoring system layout.
Figure 2. Laser propagation path in the EOPV-RTMS.
Figure 3. Laser relay system layout.
Figure 4. Transmission and reflectivity curves of the narrow band-pass filter and dichroic mirror.
Figure 5. EOPV-RTMS calibration process.
Figure 6. Simplification of the exterior orientation parameters variation real-time monitoring system model.
Figure 7. Impact of star tracker centroid extraction errors on the measurement accuracy of the EOPV-RTMS.
Figure 8. Measurement accuracy of the EOPV-RTMS (centroid extraction error of star tracker ≤ 0.1 pixel).
Figure 9. Verification platform for the EOPV-RTMS.
Figure 10. Laser image points from four lasers in the star tracker.
Figure 11. Measurement accuracy of the EOPV-RTMS.
Figure 12. Measurement results of exterior orientation parameters changes during focal plane movement along the X/Y axis.
Table 1. Statistical results of exterior orientation parameters changes during focal plane movement along the X/Y axis.

| Number | X: Position Change (μm) | X: Reference Angle Change (″) | X: Measured Angle Change (″) | X: Error (″) | Y: Position Change (μm) | Y: Reference Angle Change (″) | Y: Measured Angle Change (″) | Y: Error (″) |
|---|---|---|---|---|---|---|---|---|
| 1 | 0 | - | - | - | 0 | - | - | - |
| 2 | 20.26 | 2.09 | 2.13 | 0.04 | 20.36 | 2.1 | 2.09 | 0.01 |
| 3 | 40.43 | 4.17 | 4.2 | 0.03 | 40.72 | 4.2 | 4.15 | 0.05 |
| 4 | 60.11 | 6.2 | 6.19 | 0.01 | 60.31 | 6.22 | 6.19 | 0.03 |
| 5 | 79.99 | 8.25 | 8.24 | 0.01 | 79.8 | 8.23 | 8.25 | 0.02 |
| 6 | 99.87 | 10.3 | 10.34 | 0.04 | 99.96 | 10.31 | 10.28 | 0.03 |