Search Results (55)

Search Parameters:
Keywords = star tracker

19 pages, 937 KiB  
Article
Star Identification Algorithm Based on Dynamic Distance Ratio Matching
by Ya Dai, Chenguang Shi, Liyan Ben, Hua Zhu, Rui Zhang, Shufan Wu, Sixiang Shan, Yu Xu and Wang Zhou
Remote Sens. 2025, 17(1), 62; https://doi.org/10.3390/rs17010062 - 27 Dec 2024
Viewed by 259
Abstract
A star tracker is a celestial sensor widely used in astronomical navigation systems; it computes a spacecraft’s high-precision attitude from star vectors obtained by observing stars in space. Existing star identification algorithms typically select a specific anchor star (e.g., the nearest neighbor star) and use the line connecting the target star and the anchor star as the rotational reference axis to achieve rotation invariance. However, this makes the whole identification process overly dependent on the anchor star, and accuracy suffers under heavy positional noise or a large number of false stars. In this paper, we adopt the angles between neighboring stars and the distance ratios between neighboring stars and the observed star as the initial matching criteria, and then compute a matching score against each navigation star from the accumulated angle in the counterclockwise direction. The navigation star with the highest score is identified. Unlike algorithms that rely on an easily perturbed nearest-neighbor star as the rotational reference axis, our method achieves rotation invariance by leveraging angle information alone. It therefore tolerates positional noise, magnitude noise, and false stars better, and is particularly robust against focal-length variations.
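The rotation-invariance idea described in the abstract, using neighbor angles and distance ratios instead of a single anchor star, can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the star coordinates are made up, and the feature construction (sorting neighbors by polar angle, taking successive angle gaps and distances normalized by the nearest neighbor) is an assumed simplification.

```python
import math

def neighbor_features(center, neighbors):
    """Angle gaps between successive neighbors (counterclockwise) and
    distance ratios relative to the nearest neighbor. Both feature sets
    are invariant to an in-plane rotation about the center star."""
    polar = []
    for (x, y) in neighbors:
        dx, dy = x - center[0], y - center[1]
        polar.append((math.atan2(dy, dx), math.hypot(dx, dy)))
    polar.sort()  # counterclockwise order by polar angle
    n = len(polar)
    angles = [(polar[(i + 1) % n][0] - polar[i][0]) % (2 * math.pi)
              for i in range(n)]
    dmin = min(r for _, r in polar)
    ratios = sorted(r / dmin for _, r in polar)
    return angles, ratios

def rotate(points, theta):
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

center = (0.0, 0.0)
neighbors = [(1.0, 0.2), (-0.5, 1.1), (-0.8, -0.9), (0.3, -1.4)]
a1, r1 = neighbor_features(center, neighbors)
# Rotating the whole field leaves both feature sets unchanged:
a2, r2 = neighbor_features(center, rotate(neighbors, 0.7))
```

The angle gaps always sum to 2π around the center star, and both the gap multiset and the sorted ratios survive an arbitrary field rotation, which is the invariance the matching exploits.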
Show Figures

Figure 1: The imaging principle of a star tracker.
Figure 2: A sketch of feature extraction.
Figure 3: Graphical representation of a typical star identification process.
Figure 4: Schematic diagram of cumulative angle matching.
Figure 5: Distance proportional verification.
Figure 6: Dynamic distance proportional verification.
Figure 7: Schematic diagram of the adjacent star.
Figure 8: The relationship between star spot position noise and star identification rate.
Figure 9: The relationship between the number of pseudo stars and star identification rate.
Figure 10: The relationship between focal length and star identification rate.
Figure 11: The relationship between star image distortion and star identification rate.
Figure 12: Comparison of algorithm computation time.
Figure 13: Supplemental adjacent stars results.
Figure 14: Ideal star spot position. The numbers in (a) are [star id, magnitude].
Figure A1: Algorithm workflow.
18 pages, 12989 KiB  
Article
Design of Exterior Orientation Parameters Variation Real-Time Monitoring System in Remote Sensing Cameras
by Hongxin Liu, Chunyu Liu, Peng Xie and Shuai Liu
Remote Sens. 2024, 16(21), 3936; https://doi.org/10.3390/rs16213936 - 23 Oct 2024
Viewed by 771
Abstract
The positional accuracy of satellite imagery is essential for remote sensing cameras. However, vibrations and temperature changes during launch and operation can alter the exterior orientation parameters of a remote sensing camera, significantly reducing image positional accuracy. To address this issue, this article proposes an exterior orientation parameters variation real-time monitoring system (EOPV-RTMS). The system employs lasers to establish a full-link active optical monitoring path that is free from time and space constraints: by receiving star and laser signals simultaneously with the star tracker, it monitors changes in the exterior orientation parameters of the remote sensing camera in real time. Based on the in-orbit calibration geometric model, a new theoretical model and procedure for calibrating the exterior orientation parameters are proposed, and the accuracy and effectiveness of the system design are verified by ground experiments. The results indicate that, for a star tracker centroid extraction error of 0.1 pixel, the EOPV-RTMS achieves a measurement accuracy of up to 0.6″ (3σ) for a single image. Displacement variation experiments confirm that the system’s measurement error deviates by at most 0.05″ from the theoretical calculation. The proposed EOPV-RTMS provides a new design solution for improving in-orbit calibration technology and image positional accuracy.
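The relation between a centroiding error on the detector and an angular measurement error follows from simple camera geometry. The sketch below is not from the paper: the 5.5 µm pixel pitch and 50 mm focal length are hypothetical values chosen only to illustrate the small-angle conversion.

```python
import math

def centroid_to_angle_arcsec(err_px, pixel_pitch_um, focal_mm):
    """Map a centroid extraction error (in pixels) to an angular error
    (in arcseconds) via the camera's focal length. Pixel pitch and
    focal length here are illustrative assumptions, not paper values."""
    err_rad = math.atan((err_px * pixel_pitch_um * 1e-6) / (focal_mm * 1e-3))
    return math.degrees(err_rad) * 3600.0

# A 0.1-pixel centroid error with an assumed 5.5 um pitch, 50 mm lens:
err = centroid_to_angle_arcsec(0.1, 5.5, 50.0)
```

A longer focal length shrinks the angular error for the same pixel-level centroiding error, which is why measurement accuracy is quoted together with the centroiding assumption.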
Show Figures

Figure 1: Exterior orientation parameters variation real-time monitoring system layout.
Figure 2: Laser propagation path in the EOPV-RTMS.
Figure 3: Laser relay system layout.
Figure 4: Transmission and reflectivity curves of the narrow band-pass filter and dichroic mirror.
Figure 5: EOPV-RTMS calibration process.
Figure 6: Simplification of the exterior orientation parameters variation real-time monitoring system model.
Figure 7: Impact of star tracker centroid extraction errors on the measurement accuracy of the EOPV-RTMS.
Figure 8: Measurement accuracy of the EOPV-RTMS (centroid extraction error of star tracker ≤ 0.1 pixel).
Figure 9: Verification platform for the EOPV-RTMS.
Figure 10: Laser image points from four lasers in the star tracker.
Figure 11: Measurement accuracy of the EOPV-RTMS.
Figure 12: Measurement results of exterior orientation parameters changes during focal plane movement along the X/Y axis.
18 pages, 15800 KiB  
Article
Research on Precise Attitude Measurement Technology for Satellite Extension Booms Based on the Star Tracker
by Peng Sang, Wenbo Liu, Yang Cao, Hongbo Xue and Baoquan Li
Sensors 2024, 24(20), 6671; https://doi.org/10.3390/s24206671 - 16 Oct 2024
Viewed by 831
Abstract
This paper reports the successful application of a self-developed, miniaturized, low-power nano-star tracker for precise attitude measurement of a 5-m-long satellite extension boom. Such extension booms are widely used in space science missions to extend and support payloads like magnetometers. The nano-star tracker, based on a CMOS image sensor, weighs 150 g (including the baffle), has a total power consumption of approximately 0.85 W, and achieves a pointing accuracy of about 5 arcseconds. It is paired with a low-cost, commercial lens and utilizes automated calibration techniques for measurement correction of the collected data. This system has been successfully applied to the precise attitude measurement of the 5-m magnetometer boom on the Chinese Advanced Space Technology Demonstration Satellite (SATech-01). Analysis of the in-orbit measurement data shows that within shadowed regions, the extension boom remains stable relative to the satellite, with a standard deviation of 30″ (1σ). The average Euler angles for the “X-Y-Z” rotation sequence from the extension boom to the satellite are [−89.49°, 0.08°, 90.11°]. In the transition zone from shadow to sunlight, influenced by vibrations and thermal factors during satellite attitude adjustments, the maximum angular fluctuation of the extension boom relative to the satellite is approximately ±2°. These data and the accuracy of the measurements can effectively correct magnetic field vector measurements.
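The boom attitude is quoted as Euler angles for an “X-Y-Z” rotation sequence. As a reminder of what that conversion involves, here is a generic textbook quaternion-to-Euler (intrinsic X-Y-Z) conversion; it is not the authors' flight code, and the example quaternion is made up.

```python
import math

def quat_to_euler_xyz(w, x, y, z):
    """Convert a unit quaternion to intrinsic X-Y-Z Euler angles in
    degrees (roll about X, pitch about Y, yaw about Z). Standard
    textbook formulas derived from the rotation matrix Rx*Ry*Rz."""
    sy = max(-1.0, min(1.0, 2.0 * (x * z + w * y)))  # clamp for asin
    roll = math.degrees(math.atan2(2.0 * (w * x - y * z),
                                   1.0 - 2.0 * (x * x + y * y)))
    pitch = math.degrees(math.asin(sy))
    yaw = math.degrees(math.atan2(2.0 * (w * z - x * y),
                                  1.0 - 2.0 * (y * y + z * z)))
    return roll, pitch, yaw

# A pure 30-degree rotation about the X axis as a sanity check:
r, p, yw = quat_to_euler_xyz(math.cos(math.radians(15.0)),
                             math.sin(math.radians(15.0)), 0.0, 0.0)
```

The clamp on the `asin` argument guards against floating-point values marginally outside [−1, 1] near the pitch singularity.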
(This article belongs to the Section Remote Sensors)
Show Figures

Figure 1: Self-developed nano-star tracker.
Figure 2: Electronic functional block diagram of the star tracker.
Figure 3: Photograph of a star tracker circuit board.
Figure 4: Schematic diagram of the multi-tasking pipeline of the star tracker.
Figure 5: Application layer software thread of the star tracker.
Figure 6: Composition of the automatic calibration system for the star tracker.
Figure 7: Actual picture of the star tracker calibration device: (a) overall view; (b) working state.
Figure 8: The process diagram of residual calibration: (a) image of the marked star points; (b) calibration residual diagram.
Figure 9: Static multi-satellite simulator experimental test diagram: (a) the static multi-satellite simulation test device; (b) the test results.
Figure 10: Ground test diagram: (a) the joint field stargazing experiment device of the probe assembly; (b) measurement results of the star tracker, where Q1, Q2, Q3, and Q4 represent the tracker’s output attitude quaternions.
Figure 11: Measured star point image data from a 100 ms exposure of the star tracker (“Cassiopeia”): (a) image collected by the star sensor; (b) the star point data analyzed in the software, corresponding one-to-one with the identified stars; (c) starry sky image of the Cassiopeia position in the Stellarium software (v1.28).
Figure 12: Remanence test experiment.
Figure 13: Assembly diagram of the star tracker on the Chinese Advanced Space Technology Demonstration Satellite: (a) the probe assembly on the extension boom, with the red light shield covering the precise attitude measurement component of the nano-star tracker developed in this study; (b) assembly diagram of the star tracker and extension rod structure on the entire satellite; (c) the coordinate system relationships of the satellite’s extension boom, where the satellite platform’s boom base is defined as the XY plane and the boom’s extension direction as the Z-axis.
Figure 14: Conversion Euler angles from the NST system to the satellite system; the time range of the data is UTC 13 February 2023 9:09:10 to 12:05:50.
Figure 15: Quaternions collected by the star tracker: (a) data from the star tracker mounted on the satellite body; (b) data from the star tracker on the extension boom. The time range of the data is UTC 13 February 2023 9:09:10 to 12:05:50.
18 pages, 69893 KiB  
Article
Grayscale Iterative Star Spot Extraction Algorithm Based on Image Entropy
by Qing Zhao, Jiawen Liao, Derui Zhang and Jia Feng
Appl. Sci. 2024, 14(20), 9207; https://doi.org/10.3390/app14209207 - 10 Oct 2024
Viewed by 605
Abstract
Star trackers are susceptible to interference from stray light, such as sunlight, moonlight, and Earth atmosphere light, in the space environment, resulting in an overall increase in the star image gray level, poor background uniformity, a low star extraction rate, and a high number of false star spots. To address these challenges, this paper proposes a grayscale iterative star spot extraction algorithm based on image entropy. The algorithm has two main steps: (1) it performs multiple grayscale iterations, effectively exploiting prior information on the local contrast of star spots to filter out the stray light background to a certain extent; (2) using an inner–outer template, it applies an image entropy criterion to identify the real star targets to be extracted, further suppressing background clutter and noise. Numerical simulations and experimental results demonstrate that, compared with traditional detection algorithms, the proposed algorithm effectively suppresses background stray light, improves the star extraction rate, and reduces the number of false star spots, exhibiting superior detection performance across various complex-background scenarios.
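The entropy criterion can be illustrated with a minimal sketch: the Shannon entropy of a window's gray-level histogram is low on flat background and rises when a star spot broadens the histogram. This is not the paper's IEGI algorithm (which combines grayscale iteration with an inner–outer template); the window contents below are invented toy data.

```python
import math
from collections import Counter

def window_entropy(pixels):
    """Shannon entropy (bits) of the gray-level histogram of a pixel
    window. A flat background window has a single gray level and zero
    entropy; a window containing a star spot has a broader histogram
    and therefore higher entropy."""
    n = len(pixels)
    counts = Counter(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [10] * 25                             # uniform background patch
spot = [10] * 20 + [80, 120, 200, 120, 80]   # background plus a star spot
```

Thresholding such a local entropy (computed inside a sliding inner template against its outer surround) is one way to separate genuine spots from uniform stray-light background.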
Show Figures

Figure 1: (a) Inner template; (b) outer template; (c) the combined extraction sliding window of star spots.
Figure 2: Composition of the sliding window for star spot extraction.
Figure 3: Different positions of the star spot extraction window in the star image: (a) partial star spots within the inner template; (b) complete star spots within the inner template; (c) no star spots within the inner template.
Figure 4: Flowchart of the star spot extraction process based on the IEGI algorithm.
Figure 5: Simulated star images: (a) without stray light; (b) with uniformly distributed stray light; (c) with linearly distributed stray light; (d) with Gaussian-distributed stray light.
Figure 6: The correct number of star spot extractions under different stray light conditions.
Figure 7: The number of false star spots under different stray light conditions.
Figure 8: Real star images captured with the camera: (a) stray light condition 1; (b) stray light condition 2; (c) stray light condition 3.
Figure 9: Real star spot distribution corresponding to the captured star images.
Figure 10: The number of correctly extracted star spots by different algorithms under different conditions.
Figure 11: The number of false star spots detected by different algorithms under different conditions.
28 pages, 5098 KiB  
Article
A Robust High-Accuracy Star Map Matching Algorithm for Dense Star Scenes
by Quan Sun, Zhaodong Niu, Yabo Li and Zhuang Wang
Remote Sens. 2024, 16(11), 2035; https://doi.org/10.3390/rs16112035 - 6 Jun 2024
Viewed by 1019
Abstract
The algorithm proposed in this paper addresses star map matching in high-limiting-magnitude astronomical images and is inspired by geometric voting star identification techniques. It is a two-step star map matching algorithm that relies only on angular features and adopts a matching strategy that overcomes the poor real-time performance of the geometric voting algorithm when the number of stars is large. The algorithm targets scenes with a large number of densely distributed stars (limiting magnitude greater than 13, more than 185 stars per square degree on average), which differs from the sparse star identification problem faced by star trackers and places greater demands on robustness and real-time performance. The proposed algorithm handles application scenarios with unreliable brightness information, centroid positioning error, visual axis pointing deviation, and large numbers of false stars, with high accuracy, robustness, and good real-time performance.
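The angular features such algorithms rely on are inter-star angular separations, which are invariant under any attitude rotation of the sensor. A minimal sketch of that invariance (with made-up direction vectors, not data from the paper):

```python
import math

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angle_deg(u, v):
    """Angular separation of two direction vectors in degrees: the
    attitude-independent feature used for star map matching."""
    d = max(-1.0, min(1.0, sum(a * b for a, b in zip(unit(u), unit(v)))))
    return math.degrees(math.acos(d))

def rotate_z(v, theta):
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

s1, s2 = (1.0, 0.0, 0.2), (0.8, 0.5, 0.1)
before = angle_deg(s1, s2)
# Rotating both stars by the same (unknown) attitude leaves the angle unchanged:
after = angle_deg(rotate_z(s1, 1.2), rotate_z(s2, 1.2))
```

Because the separation survives the unknown attitude, catalog angles can be matched directly against observed angles without first solving for the pointing.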
(This article belongs to the Section Remote Sensing Image Processing)
Show Figures

Graphical abstract
Figure 1: Frequency distribution and cumulative counts of stars of different magnitudes. (The numbers in parentheses are the average number of stars per square degree.)
Figure 2: Correspondence between catalog and star image stars. The stars in (a) are located in the catalog; the star points in (b) are extracted from the star image. C1, C2, C3 match with S1, S2, S3; no star point corresponding to C4 was extracted from the star image; S4 is a false star due to noise.
Figure 3: Schematic diagram of the joint sorting operation. The star image angle vector is placed in front and the catalog angle vector behind to form the joint angle vector; N_S and N_C are the numbers of elements of the star image angle vector and the catalog angle vector, respectively.
Figure 4: Schematic diagram of the maximum matching degree calculated from Ḟ_M. L denotes the number of subsegments in Ḟ_M whose elements are all true, and n_k represents the number of true elements in the k-th subsegment.
Figure 5: Rough matching process flow chart.
Figure 6: Block diagram of the data synthesis process.
Figure 7: Typical synthetic star background image. (a) Synthetic image of a sparsely distributed star sky region (visual axis pointing: 0° declination, 0° right ascension). (b) Synthetic image of a densely distributed star sky region (visual axis pointing: 290° declination, 0° right ascension).
Figure 8: Matching rates for different visual axis pointing deviations.
Figure 9: Average matching accuracy for different visual axis pointing deviations (black line: error introduced by the synthetic data).
Figure 10: Average running time for different visual axis pointing deviations.
Figure 11: Matching rates for different numbers of false stars.
Figure 12: Average matching accuracy for different numbers of false stars (black line: error introduced by the synthetic data).
Figure 13: Average running time for different numbers of false stars.
Figure 14: Matching rate for different positioning deviations.
Figure 15: Average matching accuracy for different positioning deviations (black line: error introduced by the synthetic data).
Figure 16: Average running time for different positioning deviations.
Figure 17: Matching rate for different magnitude deviations.
Figure 18: Average matching accuracy for different magnitude deviations (black line: error introduced by the synthetic data).
Figure 19: Average running time for different magnitude deviations.
Figure 20: Matching rate for different tasks.
Figure 21: Average matching accuracy for different tasks.
Figure 22: Average running time for different tasks.
Figure 23: Matching error of RIAV.
Figure 24: Matching error of GMV.
Figure 25: Matching error of the proposed algorithm.
Figure 26: Comprehensive performance radar charts for star map matching algorithms.
23 pages, 5990 KiB  
Article
Technology Demonstration of Space Situational Awareness (SSA) Mission on Stratospheric Balloon Platform
by Randa Qashoa, Vithurshan Suthakar, Gabriel Chianelli, Perushan Kunalakantha and Regina S. K. Lee
Remote Sens. 2024, 16(5), 749; https://doi.org/10.3390/rs16050749 - 21 Feb 2024
Viewed by 2529
Abstract
As the number of resident space objects (RSOs) orbiting Earth increases, the risk of collision increases, and mitigating this risk requires the detection, identification, characterization, and tracking of as many RSOs as possible in view at any given time, an area of research referred to as Space Situational Awareness (SSA). In order to develop algorithms for RSO detection and characterization, starfield images containing RSOs are needed. Such images can be obtained from star trackers, which have traditionally been used for attitude determination. Despite their low resolution, star tracker images have the potential to be useful for SSA. Using star trackers in this dual-purpose manner offers the benefit of leveraging existing star tracker technology already in orbit, eliminating the need for new and costly equipment to be launched into space. In August 2022, we launched a CubeSat-class payload, Resident Space Object Near-space Astrometric Research (RSONAR), on a stratospheric balloon. The primary objective of the payload was to demonstrate a dual-purpose star tracker for imaging and analyzing RSOs from a space-like environment, aiding in the field of SSA. Building on the experience and lessons learned from the 2022 campaign, we developed a next-generation dual-purpose camera in a 4U-inspired CubeSat platform, named RSONAR II. This payload was successfully launched in August 2023. With the RSONAR II payload, we developed a real-time, multi-purpose imaging system with two main cameras of varying cost that can adjust imaging parameters in real time to evaluate the effectiveness of each configuration for RSO imaging. We also performed onboard RSO detection and attitude determination to verify the performance of our algorithms. Additionally, we implemented a downlink capability to verify payload performance during flight. To add a wider variety of images for testing our algorithms, we altered the resolution of one of the cameras throughout the mission. In this paper, we demonstrate a dual-purpose star tracker system for future SSA missions and compare two different sensor options for RSO imaging.
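One way to associate per-frame detections into a single RSO track, as described for the onboard detection algorithm, is a constant-velocity consistency check across three frames: the middle centroid should lie near the midpoint of the outer two when frames are equally spaced. This is an illustrative sketch under that assumption, not the RSONAR II flight software; the centroids and tolerance are invented.

```python
def linear_motion(c1, c2, c3, tol=1.0):
    """Associate three per-frame centroids (x, y) into one RSO
    detection if they are consistent with constant-velocity motion:
    the middle centroid must lie within `tol` pixels of the midpoint
    of the outer two (equal frame spacing assumed)."""
    mx, my = (c1[0] + c3[0]) / 2.0, (c1[1] + c3[1]) / 2.0
    return abs(c2[0] - mx) <= tol and abs(c2[1] - my) <= tol

# A transiting object advances uniformly; sensor noise does not:
track = linear_motion((10.0, 20.0), (14.0, 23.0), (18.0, 26.0))
noise = linear_motion((10.0, 20.0), (10.2, 20.1), (18.0, 26.0))
```

The same test rejects hot pixels (which stay put) and random noise hits (which do not line up), while tolerating small centroiding error via `tol`.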
(This article belongs to the Section Satellite Missions for Earth and Planetary Exploration)
Show Figures

Figure 1: Star tracker prototype launched on a stratospheric balloon in 2022.
Figure 2: RSONAR II model: (a) RSONAR II CAD model; (b) RSONAR II payload integrated on the gondola.
Figure 3: RSONAR II harness diagram.
Figure 4: Example of a sequence of PCO camera images captured from a field campaign. The red circle shows the location of the RSO as it transits. These images have been enhanced with the Zscale algorithm.
Figure 5: Block diagram outlining the closed-loop image acquisition application once the payload is powered on.
Figure 6: Software block diagram of the STARDUST payload.
Figure 7: Sample downlinked image enhanced with the Zscale algorithm.
Figure 8: An illustration of the RSO detection algorithm’s processing steps, taking the lit pixels corresponding to RSOs in 3 different images and associating them together into a detection. The red, green, and blue pixels represent the centroids of the RSO from the first, second, and third image, respectively.
Figure 9: The distribution of time delays between images during high-resolution 2048 × 2048 imaging, with the frequency of occurrences on the Y-axis and the time difference in milliseconds on the X-axis. The mode (460 ms), median (470 ms), and mean (593.28 ms) are indicated by the red dashed, green solid, and orange dash-dotted lines, respectively.
Figure 10: Temperature fluctuations of various components during the mission flight on 22 August 2023, from 4:52 a.m. to 9:34 a.m. (UTC). Both the payload and the environment experienced significant temperature changes in the initial two hours; the temperature of all components then stabilized as the flight coasted at the targeted altitudes, with only minor fluctuations.
Figure 11: Plot of limiting magnitude over integration time for RSONAR II sensors.
Figure 12: Histogram of the frequency of stars detected at different magnitudes (brightness levels) in the Johnson V (visual) band: subpayload 1 in blue and subpayload 2 in red. The x-axis is magnitude (Johnson V), a logarithmic brightness scale; the y-axis is the frequency of stars in each magnitude range.
Figure 13: Parts of the Pisces constellation captured by (a) subpayload 1 and (b) subpayload 2 towards the end of their operational period during flight.
Figure 14: Contrast maps for (a) subpayload 1 and (b) subpayload 2 capturing the same starfield, revealing variations in local contrast across each sensor’s image. Brighter squares indicate areas of higher contrast, likely corresponding to celestial bodies, against the darker background of space.
Figure 15: Two histograms, each in log scale, showing the distribution of pixel intensities from minimum to maximum: (a) a 16-bit image from subpayload 1; (b) an 8-bit image from subpayload 2.
Figure 16: The Life Cycle of Celestial Objects Pts. 1 & 2: (a) RSONAR II payload display; (b) some of the etched messages seen through a magnifying glass.
18 pages, 2515 KiB  
Article
Detection of Degraded Star Observation Using Singular Values for Improved Attitude Determination
by Kiduck Kim
Sensors 2024, 24(2), 593; https://doi.org/10.3390/s24020593 - 17 Jan 2024
Viewed by 846
Abstract
This study introduces an approach for enhancing the accuracy of attitude determination through the computation of star observation quality. The proposed algorithm stems from the invariance of singular values under attitude transformations, assessing error magnitude through the deviation of the singular values. To employ this error magnitude as a weighting factor in the attitude determination process, it must be quantized; to this end, the study applies p-value hypothesis testing to calculate quantized error levels. Simulation results validate that the weights derived from the proposed algorithm yield a discernible enhancement in attitude determination performance.
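The invariance the abstract appeals to is that an attitude transformation is an orthogonal matrix, so rotating all observed star vectors leaves the singular values of the observation matrix unchanged; only measurement error shifts them. A minimal sketch of that property for two star vectors, computed via the 2×2 Gram matrix (pure Python; the vectors are made up and this is not the paper's algorithm):

```python
import math

def singular_values_2col(v1, v2):
    """Singular values of the 3x2 matrix [v1 v2], computed from its
    2x2 Gram matrix B^T B. A rotation R applied to both columns gives
    (RB)^T(RB) = B^T B, so the singular values are unchanged."""
    a = sum(x * x for x in v1)
    b = sum(x * y for x, y in zip(v1, v2))
    c = sum(x * x for x in v2)
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(max(0.0, tr * tr - 4.0 * det))
    return (math.sqrt((tr + disc) / 2.0),
            math.sqrt(max(0.0, (tr - disc) / 2.0)))

def rotate_z(v, t):
    cs, sn = math.cos(t), math.sin(t)
    return (cs * v[0] - sn * v[1], sn * v[0] + cs * v[1], v[2])

v1, v2 = (1.0, 0.0, 0.1), (0.6, 0.7, 0.2)
s_body = singular_values_2col(v1, v2)
s_rot = singular_values_2col(rotate_z(v1, 0.9), rotate_z(v2, 0.9))
```

Any deviation of the measured singular values from their catalog-predicted values therefore cannot come from the (unknown) attitude and must reflect observation error, which is what the quality weighting quantizes.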
Show Figures

Figure 1: A set of two stars seen from two different reference frames: (a) an inertial reference frame; (b) the body reference frame.
Figure 2: Variation of singular values at each noise level.
Figure 3: Quality value calculation results under Scenario 1.
Figure 4: Quality value calculation results under Scenario 2.
Figure 5: Quality value calculation results under Scenario 3.
Figure 6: Quality value trend versus error size.
Figure 7: Quality value calculation results with bias-type noise.
Figure 8: Quality value calculation results under Scenario 4.
Figure 9: Quality value calculation results under Scenario 5.
Figure 10: Comparison of attitude determination accuracy under Scenario 3.
Figure 11: Comparison of attitude determination accuracy.
Figure 12: Comparison of attitude determination accuracy up to 150 arc-seconds.
17 pages, 5035 KiB  
Article
A Dual-Purpose Camera for Attitude Determination and Resident Space Object Detection on a Stratospheric Balloon
by Gabriel Chianelli, Perushan Kunalakantha, Marissa Myhre and Regina S. K. Lee
Sensors 2024, 24(1), 71; https://doi.org/10.3390/s24010071 - 22 Dec 2023
Cited by 1 | Viewed by 1848
Abstract
Space systems play an integral role in every facet of our daily lives, including national security, communications, and resource management. It is therefore critical to protect our valuable assets in space and build resiliency into the space environment. In recent years, we have developed a novel approach to Space Situational Awareness (SSA) in the form of a low-resolution, Wide Field-of-View (WFOV) camera payload for attitude determination and Resident Space Object (RSO) detection. Detection is the first step in the tracking, identification, and characterization of RSOs, which include natural and artificial objects orbiting the Earth. A space-based dual-purpose camera that provides attitude information alongside RSO detection can enhance current SSA technologies, which rely on ground infrastructure. A CubeSat form-factor payload with real-time attitude determination and RSO detection algorithms was developed and flown onboard the CSA/CNES stratospheric balloon platform in August 2023. Sub-degree pointing and multiple RSO detections were demonstrated during operation, and opportunities for improvement are discussed. This paper outlines the hardware and software architecture, system design methodology, on-ground testing, and in-flight results of the dual-purpose camera payload. Full article
(This article belongs to the Special Issue New Trends on Sensor Devices for Space and Defense Applications)
Figure 1. Star tracker payload on the 2022 STRATOS Balloon Platform (a), and the 2023 STRATOS Balloon Platform (b).
Figure 2. IDS UI-3370CP-M-GL camera (a), and Raspberry Pi 4 Model B OBC (b) used on STARDUST.
Figure 3. Software block diagram for STARDUST, visually outlining the functions and logic.
Figure 4. Block diagram outlining the first step, extracting centroids, in the RSO detection algorithm.
Figure 5. Block diagram outlining the second step, detecting RSOs, in the RSO detection algorithm.
Figure 6. Processed starfield image with an RSO captured during the STRATOS 2023 campaign.
Figure 7. Attitude pointing Yaw, Pitch, and Roll errors for the LIS algorithm.
Figure 8. Attitude pointing Yaw, Pitch, and Roll errors for the tracking mode algorithm.
Figure 9. RSO centroids corresponding to three sequential images, plotted as single, colored pixels on a black background. The Euclidean distances calculated by the algorithm are also plotted.
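The detection step described in this abstract — linking centroids across sequential images by Euclidean distance — can be illustrated with a toy constant-velocity check over three frames. This is a simplified, hypothetical stand-in for the flight algorithm; the tolerance and coordinates below are invented.

```python
import numpy as np

def find_moving_centroid(frames, tol=1.5):
    """Flag centroid triples whose frame-to-frame displacement is consistent.

    `frames` is a list of three (N_i, 2) arrays of centroid pixel
    coordinates. A star (after frame alignment) barely moves between
    exposures; an RSO moves by a roughly constant step between equally
    spaced exposures. A simplified illustration, not the flight code.
    """
    hits = []
    for p0 in frames[0]:
        for p1 in frames[1]:
            d01 = p1 - p0
            if np.linalg.norm(d01) < tol:          # stationary -> star
                continue
            for p2 in frames[2]:
                d12 = p2 - p1
                # Constant-velocity check: equal step, same direction
                if np.linalg.norm(d12 - d01) < tol:
                    hits.append((p0, p1, p2))
    return hits

# Toy data: two fixed stars and one RSO streaking at (5, 3) px/frame
stars = np.array([[10.0, 10.0], [40.0, 25.0]])
rso = np.array([20.0, 5.0])
frames = [np.vstack([stars, rso + i * np.array([5.0, 3.0])]) for i in range(3)]
print(len(find_moving_centroid(frames)))           # 1: the single RSO track
```

A real pipeline would first align the frames (or remove matched stars) before applying the displacement test, since platform rotation also moves star centroids.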
11 pages, 6962 KiB  
Technical Note
An Improved In-Flight Calibration Scheme for CSES Magnetic Field Data
by Yanyan Yang, Zeren Zhima, Xuhui Shen, Bin Zhou, Jie Wang, Werner Magnes, Andreas Pollinger, Hengxin Lu, Feng Guo, Roland Lammegger, Na Zhou, Yuanqing Miao, Qiao Tan and Wenjing Li
Remote Sens. 2023, 15(18), 4578; https://doi.org/10.3390/rs15184578 - 17 Sep 2023
Cited by 3 | Viewed by 1677
Abstract
The CSES high precision magnetometer (HPM), consisting of two fluxgate magnetometers (FGM) and one coupled dark state magnetometer (CDSM), has worked successfully for more than 5 years, providing continuous magnetic field measurements since the launch of the CSES in February 2018. After rechecking the data of almost every year, it has become possible to improve the in-flight intrinsic calibration (estimating offsets, scale values, and non-orthogonality) and alignment (estimating the three Euler angles of the rotation between the orthogonalized sensor coordinates and the coordinate system of the star tracker) of the FGM. The following efforts were made to achieve this goal: for the sensor calibration, FGM sensor temperature corrections of offsets and scale values were taken into account to remove seasonal effects. Based on these results, Euler angles were estimated along with global geomagnetic field modeling to improve the alignment of the FGM sensor. With this, a latitudinal effect in the east component of the originally calibrated data could be reduced. Furthermore, it has become possible to prolong the updating period of all calibration parameters from daily to 10 days, without separating dayside and nightside data. The new algorithms optimize routine HPM data processing efficiency and data quality. Full article
(This article belongs to the Special Issue Satellite Missions for Magnetic Field Analysis)
Figure 1. The magnetic field intensity residual from (a) level 2 data products; (b) recalibrated data without temperature correction; (c) recalibrated data with temperature correction.
Figure 2. The temperature variation of the two FGM sensors and the electronics box: (a) electronics box; (b) FGM1; (c) FGM2.
Figure 3. Variation of offsets, scale values, and non-orthogonality angles obtained from recalibration, including the temperature variation of offsets and scale factors. The mean values were subtracted for each parameter. Circle dots denote parameters estimated from the measured datasets, while star dots represent values obtained from interpolation (see text for more details). (a–c) Time variation of offsets, scale values, and non-orthogonality angles, respectively.
Figure 4. Histogram distribution of the residual field for level 2 (left) and recalibrated (right) data.
Figure 5. Flow diagram of the CSES FGM coordinate transformation. Numbers 1 to 4 stand for the four steps of the transformation.
Figure 6. The variation of the three in-orbit-estimated Euler angles for the alignment of the FGM sensor.
Figure 7. Comparison of the three residual fields between (a) the current level 2 data and (b) the recalibrated data. The red, green, and blue dots stand for the north, east, and center components, respectively.
Figure 8. The three residual fields estimated from 19 January to 29 April 2020 for the CDSM missing-data period. The red, green, and blue dots represent the north, east, and center components, respectively.
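The intrinsic calibration parameters named in this abstract (offsets, scale values, non-orthogonality) enter through the standard vector-magnetometer model. The sketch below is a generic round-trip check under an assumed Olsen-style lower-triangular parameterization with invented parameter values; it is not the CSES-specific formulation, and the Euler-angle alignment step is omitted.

```python
import numpy as np

def nonorth(u1, u2, u3):
    """Lower-triangular non-orthogonality matrix (an Olsen-style
    parameterization, used here only as a generic illustration)."""
    return np.array([
        [1.0, 0.0, 0.0],
        [-np.sin(u1), np.cos(u1), 0.0],
        [np.sin(u2), np.sin(u3),
         np.sqrt(1.0 - np.sin(u2) ** 2 - np.sin(u3) ** 2)],
    ])

def calibrate(b_raw, offsets, scales, angles):
    """Invert the forward model  b_raw = S @ P @ b_true + O."""
    P = nonorth(*angles)
    S_inv = np.diag(1.0 / np.asarray(scales))
    return np.linalg.solve(P, S_inv @ (b_raw - offsets))

# Round-trip check with made-up (hypothetical) parameters
b_true = np.array([21000.0, -3500.0, 44000.0])       # nT
offsets = np.array([12.0, -8.0, 5.0])                # nT
scales = np.array([1.001, 0.999, 1.002])
angles = np.radians([0.05, -0.03, 0.08])             # small angles, rad

b_raw = np.diag(scales) @ nonorth(*angles) @ b_true + offsets
print(np.allclose(calibrate(b_raw, offsets, scales, angles), b_true))  # True
```

The in-flight problem is the inverse of this round trip: the parameters themselves are estimated by fitting calibrated magnitudes against a reference (here, a scalar magnetometer and a geomagnetic field model).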
15 pages, 3246 KiB  
Article
Stratospheric Night Sky Imaging Payload for Space Situational Awareness (SSA)
by Perushan Kunalakantha, Andrea Vallecillo Baires, Siddharth Dave, Ryan Clark, Gabriel Chianelli and Regina S. K. Lee
Sensors 2023, 23(14), 6595; https://doi.org/10.3390/s23146595 - 21 Jul 2023
Cited by 3 | Viewed by 1971
Abstract
Space situational awareness (SSA) refers to collecting, analyzing, and maintaining detailed knowledge of resident space objects (RSOs) in the space environment. With the rapidly increasing number of objects in space, the need for SSA grows as well. Traditional methods rely heavily on imaging RSOs with large, narrow field-of-view (FOV), ground-based telescopes. This research outlines the technology demonstration payload Resident Space Object Near-space Astrometric Research (RSONAR): a star-tracker-like, wide-FOV camera combined with commercial off-the-shelf (COTS) hardware to image RSOs from the stratosphere, overcoming the disadvantages of ground-based observations. The hardware components and software algorithm are described and evaluated. The payload's suitability for SSA is demonstrated by the image-processing algorithms, which detect RSOs in the images captured during flight, and by the survival of the COTS components in the near-space environment. The payload features a low-resolution, wide-FOV camera coupled with a Field Programmable Gate Array (FPGA)-based platform that hosts the altitude- and time-based image capture algorithm. The newly developed payload, in a 2U-CubeSat form factor, was flown as a space-ready payload on the CSA/CNES stratospheric balloon research platform to carry out algorithm and functionality tests in August 2022. Full article
(This article belongs to the Section Optical Sensors)
Figure 1. Xilinx PYNQ-Z1 FPGA development board.
Figure 2. RSONAR payload CAD model outlining the structure and some electronics within the payload from two views: (a) isometric view; (b) side view. Note that triangular prism-like segments are added to the main 2U segment.
Figure 3. RSONAR payload fastened to the gondola. (a) CAD model depiction of the mounting scheme; (b) actual integration of the payload to the gondola.
Figure 4. Block diagram outlining the autonomous algorithm used to power on the payload, check the altitude and time, and take images with pre-defined camera parameters corresponding to the mode.
Figure 5. Diagram outlining the connections between the electronics in the RSONAR payload.
Figure 6. Diagram outlining the electrical components used on the power distribution unit.
Figure 7. Pictured in the foreground is a smaller balloon used to keep the attached gondola steady upon launch. In the background is the larger balloon being inflated for launch. The larger balloon expanded even further as it ascended into the stratosphere.
Figure 8. Example of an image created from stacking a sequence of images. Stars appear as bright dots (example in dashed red circle), while RSOs appear as streaks (example in dashed red box). Multiple streaks are visible in the image, corresponding to multiple RSOs.
Figure 9. Plot of the temperatures inside the PCO camera at various locations (sCMOS sensor, camera, and power supply) for the duration that the payload was powered, alongside the environmental temperature for the duration of the flight. Temperature logging of the camera ended at 8:30 AM local time, while environmental temperature logging continued until 1:10 PM local time. This work is based on observations with the CNES temperature sensor under a balloon operated by CNES, within STRATO-SCIENCE 2022 and in the framework of the CNES/CSA Agreement.
29 pages, 36313 KiB  
Article
Interplanetary Student Nanospacecraft: Development of the LEO Demonstrator ESTCube-2
by Janis Dalbins, Kristo Allaje, Hendrik Ehrpais, Iaroslav Iakubivskyi, Erik Ilbis, Pekka Janhunen, Joosep Kivastik, Maido Merisalu, Mart Noorma, Mihkel Pajusalu, Indrek Sünter, Antti Tamm, Hans Teras, Petri Toivanen, Boris Segret and Andris Slavinskis
Aerospace 2023, 10(6), 503; https://doi.org/10.3390/aerospace10060503 - 26 May 2023
Cited by 11 | Viewed by 3355
Abstract
Nanosatellites have established their importance in low-Earth orbit (LEO), and it is common for student teams to build them for educational and technology demonstration purposes. The next challenge is technology maturity for deep-space missions, and LEO serves as a relevant environment for maturing spacecraft design. Here we present the ESTCube-2 mission, which will be launched onboard VEGA-C VV23. The satellite was developed as a technology demonstrator for a future deep-space mission by the Estonian Student Satellite Program, whose ultimate vision is to use electric solar wind sail (E-sail) technology in an interplanetary environment to traverse the solar system with lightweight propulsion. Additional experiments were added to demonstrate all technologies needed to fly the E-sail payload onboard ESTCube-3, the next nanospacecraft, targeting lunar orbit. The E-sail demonstration requires a high-angular-velocity spin-up to deploy a tether, which drove the need for a custom satellite bus. In addition, the satellite includes deep-space prototypes: deployable structures; compact avionics stack electronics (including side panels); star tracker; reaction wheels; and cold–gas propulsion. During development, two additional payloads were added to the ESTCube-2 design: one for Earth observation of the Normalized Difference Vegetation Index and one for in-space corrosion testing of thin-film materials. The ESTCube-2 satellite has been finished and tested in time for delivery to the launcher. Ultimately, the project proved highly complex, forcing the team to lower its ambitions and optimize the development of electronics, software, and mechanical structure. The ESTCube-2 team dealt with budgetary constraints, student management problems during a pandemic, and issues with its documentation approach. Beyond management techniques, the project required leadership that kept the team aware of the big picture and willing to finish a complex satellite platform. The paper discusses the ESTCube-2 design and its development, highlights the team's main technical, management, and leadership issues, and presents suggestions for nanosatellite and nanospacecraft developers. Full article
(This article belongs to the Special Issue Advances in CubeSat Sails and Tethers (2nd Edition))
Figure 1. The Coulomb drag propulsion tether has a redundant structure, making it resistant to micrometeoroid and orbital debris impacts. The wire thickness is 50 μm. A human hair is used for scale below.
Figure 2. Satellite spin rate change with the E-sail tether in the Earth's ionosphere [33]. The tether is charged while moving downstream or upstream of ionospheric plasma to increase or decrease the satellite spin rate, respectively. The change in spin rate would be noticeable in one or a few orbits (depending on the deployed tether length). Artwork credit: Laila Kaasik, Rute Marta Jansone, and Anna Maskava.
Figure 3. Deorbiting with the ionospheric plasma brake in the low-Earth orbit ionosphere [33]. As the satellite moves through the relatively stationary ionosphere, the charged tether creates a braking force to deorbit the satellite. Given the satellite's orbital parameters, the plasma brake deorbiting effect would take weeks to months to be noticeable. Artwork credit: Laila Kaasik, Rute Marta Jansone, and Anna Maskava.
Figure 4. A render of the Ionospheric Plasma Brake module.
Figure 5. A render of the Earth Observation Payload camera modules.
Figure 6. The compact 65 × 41 mm² satellite module used for testing 15 materials in space, which consists of a printed circuit board (PCB) with coated materials and a cover plate with holes to expose the tested materials to atomic oxygen in LEO.
Figure 7. Exploded view of the 3U satellite, showcasing the satellite and its systems' arrangement. The attitude and orbit control system (AOCS) components are distributed throughout the satellite: in the avionics stack (see Figure 8), cold gas propulsion, magnetorquers on the side panels in the X− and Y+ axis directions, and sensors on the six side panel PCBs.
Figure 8. A render of the satellite avionics stack electronics module. System names starting from the top: battery management printed circuit board (PCB) with batteries; electrical power system; communication system; onboard computer; star tracker. The AOCS components in the avionics stack: reaction wheels, star tracker, magnetorquer at the bottom of the stack, and sensors on the onboard computer (OBC) board.
Figure 9. The inter-system electrical and data-connection architecture. SP: side panels, CTS: Corrosion Testing in Space, HSCOM: high-speed communications, CGP: Coulomb drag propulsion (Ionospheric Plasma Brake), EPS: electrical power system, COM: communications system, OBC: onboard computer, ST: star tracker, EOP: Earth observation payload, CGP: cold–gas propulsion.
Figure 10. Side panel locations and naming.
Figure 11. Side panel configuration table. 1. The bottom Z-axis side panel controls the magnetorquer; however, the coil driver components are physically on the ST PCB. 2. Resistor count for deployable structures. 3. Both internal communication protocol (ICP) buses for redundancy. 4. Additional maximum power point trackers (MPPT) for deployable side panels.
Figure 12. Two of the three in-house-developed coreless magnetorquers.
Figure 13. Subsystem software architecture.
Figure 14. In-house-developed circularly polarized S-band patch antenna.
18 pages, 16443 KiB  
Article
AI-Based Real-Time Star Tracker
by Guy Carmeli and Boaz Ben-Moshe
Electronics 2023, 12(9), 2084; https://doi.org/10.3390/electronics12092084 - 2 May 2023
Cited by 1 | Viewed by 7217
Abstract
Many systems on Earth and in space require precise orientation when observing the sky, particularly objects that move at high speed in space, such as satellites, spaceships, and missiles. These systems often rely on star trackers: devices that use star patterns to determine a spacecraft's orientation. However, traditional star trackers are often expensive and limited in accuracy and robustness. To address these challenges, this research develops a high-performance, cost-effective AI-based real-time star tracker system as a basic platform for micro/nanosatellites. The system uses existing hardware, such as FPGAs and cameras, which are already part of many avionics systems, to extract line-of-sight (LOS) vectors from sky images. The algorithm implemented in this research is a "lost-in-space" algorithm that uses a self-organizing map (SOM) neural network for star pattern recognition. SOM is an unsupervised machine learning algorithm usually applied to data visualization, clustering, and dimensionality reduction. Because today's technologies enable star-based navigation, matching a sky image to the star map is an important aspect of navigation. This research addresses the need for reliable, low-cost, high-performance star trackers that can recognize star patterns from sky images with a success rate of about 98% in approximately 870 microseconds. Full article
Figure 1. AI-based real-time star tracker system diagram.
Figure 2. FPGA AI-based star tracker block diagram.
Figure 3. Bayer pattern.
Figure 4. Star tracker GUI.
Figure 5. Star detection report. On the left side of the report is the Linux terminal output, showing that the system detected one group seven times; this effect is due to the star search algorithm, and after filtering out identical results a single star remains. In the center of the image is the star detection and DEP identification. In the top right corner is a reference image of a real star, with an image histogram below it.
Figure 6. Star detection by the FPGA.
Figure 7. Star identification algorithm: choosing stars by intensity.
Figure 8. Camera FOV: star numbers are arranged by intensity.
Figure 9. Kohonen map and indexing to the almanac.
Figure 10. Neighboring stars within the defined spatial angle of R ≤ 7.5 deg.
Figure 11. Kohonen distance map.
Figure 12. Activation frequency map.
Figure 13. Quantization error and topographic error. Quantization error: 0.00014; topographic error: 0.02218.
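The pattern-recognition core named in this abstract is a self-organizing map. The sketch below is a generic, minimal Kohonen training loop on invented 3-D feature vectors; the grid size, decay schedules, and feature encoding are all assumptions, not the paper's configuration or its FPGA implementation.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map (Kohonen) training loop."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    yy, xx = np.mgrid[0:h, 0:w]            # grid coordinates for neighborhoods
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best-matching unit (BMU): nearest weight vector to the sample
            d = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(d), d.shape)
            # Linearly decaying learning rate and neighborhood radius
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 0.5
            g = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

def bmu(weights, x):
    """Grid index of the best-matching unit for a feature vector x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Toy usage: two well-separated feature clusters map to different BMUs
rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=0.1, size=(50, 3))
b = rng.normal(loc=5.0, scale=0.1, size=(50, 3))
W = train_som(np.vstack([a, b]))
print(bmu(W, a[0]) != bmu(W, b[0]))
```

In a star-identification setting, each training vector would encode a rotation-invariant star-pattern feature, and the trained map's best-matching unit would index into the star almanac, as the Kohonen-map figures above suggest.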
22 pages, 6919 KiB  
Article
GMM-Based Adaptive Extended Kalman Filter Design for Satellite Attitude Estimation under Thruster-Induced Disturbances
by Taeho Kim, Natnael S. Zewge, Hyochoong Bang and Hyosang Yoon
Sensors 2023, 23(9), 4212; https://doi.org/10.3390/s23094212 - 23 Apr 2023
Cited by 2 | Viewed by 2003
Abstract
Star images from star trackers are usually defocused so that stars are captured over an exposure time for better centroid measurements. While a satellite maneuvers, the star point on the camera's image plane is smeared by the satellite's motion, degrading centroid measurement accuracy and, in turn, the derived star vectors. Geostationary satellites use onboard thrusters to maintain or change orbit parameters under orbital disturbances. Because of thruster misalignment, and because torque is generated by impulse-shaped command signals, it is difficult to produce the target torque; the impulsive torque also affects the star image, since it causes a sudden change in the satellite's angular velocity. This makes the star image noise non-Gaussian, calling for a method that handles non-Gaussian measurement noise. To this end, this study implements an adaptive extended Kalman filter that predicts measurement vectors from predicted states. A Gaussian mixture model (GMM) is incorporated into this sequence, assigning a weight to each Gaussian density and thereby improving the prediction of measurement vectors. Simulation results show that the GMM-EKF outperforms the EKF for attitude estimation, with a 30% improvement. The GMM-EKF is therefore an attractive approach for geostationary satellites during station-keeping maneuvers. Full article
(This article belongs to the Special Issue Attitude Estimation Based on Data Processing of Sensors)
Figure 1. A pinhole camera model of a star tracker.
Figure 2. Single thruster's setup direction with respect to the body axis.
Figure 3. Example thruster setup for a satellite (featuring six thruster units).
Figure 4. Euler angle error profile.
Figure 5. (a) Three-axis angular velocity profile; (b) angular velocity norm profile.
Figure 6. (a) Thruster force profile of each thruster unit; (b) three-axis torque profile.
Figure 7. Star image under |ω| = 0.0173°/s.
Figure 8. Star image under |ω| = 0.3°/s.
Figure 9. GMM distribution of non-Gaussian noise of the star vector measurement.
Figure 10. (a) Euler estimate error profile of GMM-EKF; (b) bias estimate error profile of GMM-EKF.
Figure 11. (a) Euler estimate error of GMM-EKF (Monte Carlo simulation); (b) bias estimate error profile of GMM-EKF (Monte Carlo simulation).
Figure 12. (a) Euler estimate error profile of EKF; (b) bias estimate error profile of EKF.
Figure 13. Euler estimate error with GMM-EKF (solid line) and EKF (dashed line).
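The GMM's role, as described in the abstract, is to weight each Gaussian density when predicting measurement vectors. Below is a minimal sketch of just the responsibility (weight) computation for a measurement residual under a two-component mixture with invented parameters, representing nominal noise versus heavy thruster-induced noise; the surrounding adaptive EKF is not reproduced.

```python
import numpy as np

def gmm_responsibilities(residual, means, covs, priors):
    """Posterior weight of each Gaussian component for one residual.

    Each component's responsibility would scale its contribution to the
    filter update. Component parameters here are illustrative only.
    """
    like = []
    k = residual.size
    for m, C, p in zip(means, covs, priors):
        d = residual - m
        norm = 1.0 / np.sqrt((2 * np.pi) ** k * np.linalg.det(C))
        like.append(p * norm * np.exp(-0.5 * d @ np.linalg.solve(C, d)))
    like = np.array(like)
    return like / like.sum()

# Two-component mixture: tight nominal noise vs. broad thruster-induced noise
means = [np.zeros(2), np.zeros(2)]
covs = [np.eye(2) * 0.01, np.eye(2) * 1.0]
priors = [0.9, 0.1]

small = gmm_responsibilities(np.array([0.05, -0.02]), means, covs, priors)
large = gmm_responsibilities(np.array([1.5, -2.0]), means, covs, priors)
print(small[0] > 0.5, large[1] > 0.5)   # True True
```

A small residual is attributed almost entirely to the nominal component, while a large thruster-epoch residual shifts its weight to the broad component, which is how the mixture de-emphasizes corrupted star vectors.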
9 pages, 2097 KiB  
Communication
Variable Angular Rate Measurement for a Spacecraft Based on the Rolling Shutter Mode of a Star Tracker
by Shuo Zhang, Fei Xing, Ting Sun and Zheng You
Electronics 2023, 12(8), 1875; https://doi.org/10.3390/electronics12081875 - 16 Apr 2023
Cited by 2 | Viewed by 1517
Abstract
Angular rate is useful information for spacecraft attitude control. A star tracker, as a space optical sensor, can be used to measure a spacecraft's angular rate. This paper proposes a novel approach to improve the measurement accuracy of the angular rate during spacecraft rotation. The electronic rolling shutter (RS) imaging mode of the complementary metal-oxide-semiconductor (CMOS) image sensor in a star tracker is used to obtain a much higher sampling frequency, reducing the change in angular rate within the sampling interval. The optic flow vector on the imaging plane is approximated to second order using three successive star images, capturing the nonlinear effect of the variable angular rate. An experiment demonstrates the advantage of the new approach for variable angular rate measurement. Full article
(This article belongs to the Section Industrial Electronics)
Figure 1. Vector observation model of a star tracker.
Figure 2. Operations of GS and RS modes.
Figure 3. Laboratory experiment system.
Figure 4. Star tracker used in the experiment.
Figure 5. True angular rate.
Figure 6. Measurement errors of angular rate.
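The second-order approximation over three successive star images can be illustrated with finite differences. The sketch below treats only rotation about the boresight and uses a planar Procrustes angle between centroid sets as a stand-in for the paper's optic-flow formulation; all rates and data are invented.

```python
import numpy as np

def rotation_angle(p, q):
    """2D rotation angle (about the boresight) that best maps centroid
    set p onto q, via the planar Procrustes/Kabsch solution."""
    p0 = p - p.mean(axis=0)
    q0 = q - q.mean(axis=0)
    num = np.sum(p0[:, 0] * q0[:, 1] - p0[:, 1] * q0[:, 0])
    den = np.sum(p0[:, 0] * q0[:, 0] + p0[:, 1] * q0[:, 1])
    return np.arctan2(num, den)

def rate_and_accel(frames, dt):
    """Second-order estimate from three successive star images:
    central difference for the rate at the middle frame, and a
    second difference for its change (the nonlinear term)."""
    th01 = rotation_angle(frames[0], frames[1])
    th12 = rotation_angle(frames[1], frames[2])
    rate = (th01 + th12) / (2 * dt)
    accel = (th12 - th01) / dt ** 2
    return rate, accel

def rot(th):
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s], [s, c]])

# Toy data: centroids rotating with a linearly increasing rate
rng = np.random.default_rng(2)
base = rng.uniform(-100, 100, size=(6, 2))           # pixel coordinates
dt = 0.05                                            # frame interval, s
w0, a0 = 0.02, 0.4                                   # rad/s, rad/s^2
thetas = [w0 * t + 0.5 * a0 * t ** 2 for t in (0.0, dt, 2 * dt)]
frames = [base @ rot(th).T for th in thetas]

rate, accel = rate_and_accel(frames, dt)
print(round(rate, 3), round(accel, 2))               # 0.04 0.4
```

The recovered rate matches the true rate at the middle frame (w0 + a0·dt) and the second difference recovers the rate change, which is the nonlinear effect a first-order (two-frame) scheme would miss.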
17 pages, 3873 KiB  
Article
Payload Camera Breadboard for Space Surveillance—Part I: Breadboard Design and Implementation
by Joel Filho, Paulo Gordo, Nuno Peixinho, Rui Melicio and Ricardo Gafeira
Appl. Sci. 2023, 13(6), 3682; https://doi.org/10.3390/app13063682 - 14 Mar 2023
Cited by 2 | Viewed by 2128
Abstract
The rapid increase of space debris poses a risk to space activities, so it is vital to develop space surveillance countermeasures against possible threats. The current Space Surveillance Network is composed mainly of radar and optical telescopes that regularly observe and track space objects. However, these measures are limited by object size and can detect only a small fraction of the debris, so alternative solutions are essential for securing the future of space activities. This paper therefore proposes the design of a payload camera breadboard for space surveillance to increase the information on debris, particularly under-catalogued objects. The device was designed with characteristics similar to the star trackers of small satellites and CubeSats. Star trackers are devices commonly used on satellites for attitude determination and therefore have wide potential as a major tool for space debris detection. The breadboard was built with commercial off-the-shelf components representative of current space-camera resolution and field of view. The image sensor was characterized to compute the sensitivity of the camera and evaluate its detectability performance in several simulated positions. Furthermore, the payload camera concept was tested by taking images of the night sky using satellites as proxies for space debris, and a photometric analysis was performed to validate the simulated detectability performance. Full article
(This article belongs to the Special Issue Cutting Edge Advances in Image Information Processing)
Figure 1. The architecture of the breadboard.
Figure 2. Mechanical design and assembled breadboard.
Figure 3. Variance-versus-mean linear relation for several images captured by the MT9J001 with the laser beam switched on.
Figure 4. Mean values of each set of measurements at different integration times.
Figure 5. Camera in the interferometer (a); image of the light beam focused on the sensor, with a horizontal per-pixel intensity cut shown (b).
Figure 6. Schematic illustration of the Sun-object-observer geometry.
Figure 7. Debris size as a function of the distance from a ground-based sensor and the corresponding orbital velocity, assumed to be circular, for phase angles of 120 deg and 20 deg.
Figure 8. Breadboard connected to the computer and positioned for the observations.
Figure 9. Some detected objects.
Figure 10. Data reduction process of the images of the tests.
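The variance-versus-mean characterization mentioned in Figure 3 is a photon-transfer measurement: in the shot-noise-limited regime, a straight-line fit of variance against mean signal yields the conversion gain as the reciprocal of the slope. A sketch with synthetic flat-field statistics follows (the gain value is invented, not the MT9J001's).

```python
import numpy as np

def ptc_gain(means, variances):
    """Photon-transfer estimate of conversion gain K (e-/DN).

    With Poisson shot noise, variance [DN^2] = mean [DN] / K + const,
    so K is the reciprocal of the fitted slope.
    """
    slope, _ = np.polyfit(means, variances, 1)
    return 1.0 / slope

# Synthetic flat-field statistics for a sensor with K = 2.0 e-/DN
K = 2.0
electrons = np.array([200.0, 800.0, 3200.0, 12800.0])   # mean signal, e-
means = electrons / K                                   # signal in DN
variances = electrons / K ** 2 + 1.5                    # shot + read noise, DN^2
print(round(ptc_gain(means, variances), 2))             # 2.0
```

The same fit's intercept estimates the signal-independent (read) noise floor, which is what sets the faintest debris magnitude such a camera can detect.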