CN118765012A - Airport illumination control method and system

Info

Publication number: CN118765012A
Application number: CN202411038607.9A
Priority application: CN202411038607.9A
Authority: CN (China)
Prior art keywords: aircraft, data, airport, model, algorithm
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 戴爱鹏, 丁柏平, 杨锋, 黄阳彪, 龚政
Current assignee: Shenzhen Zhongfuneng Electric Equipment Co Ltd
Original assignee: Shenzhen Zhongfuneng Electric Equipment Co Ltd
Application filed by: Shenzhen Zhongfuneng Electric Equipment Co Ltd

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention is applicable to the technical field of aviation illumination and provides an airport illumination control method and system. The method comprises the steps of collecting the state data of each aircraft, airport environment data and obstacle data in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information; constructing a real-time airport scene model based on the airport scene information, and dynamically simulating the motion trail of each aircraft with a prediction algorithm based on the aircraft state data; analyzing the airport scene model with a machine learning algorithm, predicting the behavior of each aircraft, performing a priority assessment on each aircraft, and formulating a target lighting scheme based on the behavior prediction results and the priority assessment results; and automatically adjusting the brightness, color, flicker frequency and indication direction of the lights according to the formulated target lighting scheme. The invention solves the problems of response lag, difficult coordination and potential safety hazards that existing airport lighting systems exhibit when handling multiple types of aircraft simultaneously.

Description

Airport illumination control method and system
Technical Field
The invention relates to the technical field of aviation illumination, in particular to an airport illumination control method and system.
Background
In recent years, with the rapid development of unmanned aerial vehicle technology, the application fields of unmanned aerial vehicles have continuously expanded and now include logistics distribution, agricultural monitoring, rescue and many other areas. At the same time, piloted aircraft remain important in commercial aviation, private flight and military applications. However, conventional airport lighting systems are designed mainly for piloted aircraft and struggle to meet the requirements of joint operation of piloted aircraft and unmanned aerial vehicles.
Conventional airfield lighting systems are typically based on a fixed light pattern and cannot adapt in real time to changes in the aircraft or in environmental conditions. Light designed around a fixed pattern cannot be flexibly adjusted to the take-off, landing and navigation requirements of different types of aircraft in an environment where unmanned aerial vehicles and manned aircraft operate together. For example, at night or under low-visibility conditions, conventional lighting systems cannot be adjusted to the specific needs of unmanned aerial vehicles, which may increase the safety risks of their operation.
In addition, existing systems suffer from response lag, coordination difficulty, potential safety hazards and similar problems when handling multiple aircraft concurrently. As the number of unmanned aerial vehicles at airports increases, the regulation capability of conventional lighting systems becomes insufficient for complex flight environments. Particularly in emergency situations, for example when unmanned aerial vehicles and manned aircraft take off and land at the same time, or when unmanned aerial vehicle clusters are operating, the conventional system cannot respond quickly, mutual interference among aircraft is easily caused, and the risk of flight accidents increases.
Disclosure of Invention
Based on the above, the invention aims to provide an airport illumination control method and system, so as to fundamentally solve the problems of response lag, difficult coordination and potential safety hazards that existing airport lighting systems exhibit when handling multiple types of aircraft simultaneously.
An airport lighting control method according to an embodiment of the present invention includes:
acquiring state data, airport environment data and obstacle data of each aircraft in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information;
Constructing a real-time airport scene model based on airport scene information, and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on aircraft state data;
analyzing an airport scene model according to a machine learning algorithm, predicting the behaviors of the aircrafts, carrying out priority assessment on each aircraft, and formulating a target lighting scheme based on the behavior prediction result and the priority assessment result;
and automatically adjusting the brightness, the color, the flicker frequency and the indication direction of the lamplight according to the formulated target lighting scheme.
In addition, the airport lighting control method according to the above embodiment of the present invention may further have the following additional technical features:
Further, the step of integrating the data by the data fusion algorithm to generate comprehensive airport scene information includes:
performing data cleaning, data calibration and standardization processing on the state data, airport environment data and obstacle data of each aircraft;
and carrying out data integration on the state data of each aircraft, the airport environment data and the obstacle data by adopting a Kalman filtering or multi-source information fusion method to generate comprehensive airport scene information.
Further, the step of constructing a real-time airport scene model based on the airport scene information and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on the aircraft state data comprises the following steps:
Based on geographic information and a structure diagram of an airport, constructing an airport scene model by using a three-dimensional modeling tool, and dynamically updating the environment state and the obstacle position in the airport scene model according to airport environment data and obstacle data collected in real time;
According to the motion characteristics of the aircraft, a kinematic model of the aircraft is constructed, and a preset prediction algorithm is selected to predict the motion trail of the aircraft;
taking the real-time collected aircraft state data as input, and predicting the motion trail of the aircraft by using the constructed kinematic model and the selected prediction algorithm, and dynamically simulating the motion trail of the aircraft in the airport scene model;
Visually displaying the predicted motion trail of the aircraft in an airport scene model, and displaying the current position of the aircraft and the predicted motion trail;
and comparing the motion trail obtained by simulation with the actually collected aircraft state data, evaluating the accuracy of a prediction algorithm, and adjusting and optimizing a kinematic model and the prediction algorithm according to the evaluation result.
Further, the steps of analyzing the airport scene model according to the machine learning algorithm, predicting the behaviors of the aircrafts, evaluating the priorities of the aircrafts, and formulating the target lighting scheme based on the behavior prediction result and the priority evaluation result comprise:
Extracting key features from the airport scene model, and training the machine learning model by using a preset machine learning algorithm based on the extracted key features;
Inputting the collected aircraft state data into a trained machine learning model, predicting future behaviors of the aircraft, and evaluating the priority of each aircraft;
And according to the behavior prediction result and the priority evaluation result of the aircraft, a corresponding target lighting scheme is formulated.
Further, the method further comprises:
Optimizing a running path by adopting a collaborative operation strategy optimization algorithm based on the behavior prediction result of each aircraft;
Correcting the optimized running path according to the real-time collected aircraft state data;
And transmitting the corrected and optimized running path to the aircraft through a communication link so that the aircraft dynamically adjusts the route according to the received running path.
It is also an object of another embodiment of the present invention to provide an airport lighting control system, the system comprising:
The data integration module is used for collecting the state data of each aircraft, the airport environment data and the obstacle data in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information;
the model construction module is used for constructing a real-time airport scene model based on airport scene information and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on aircraft state data;
The illumination scheme making module is used for analyzing the airport scene model according to the machine learning algorithm, predicting the behaviors of the aircrafts, carrying out priority assessment on each aircraft, and making a target illumination scheme based on the behavior prediction result and the priority assessment result;
And the illumination adjusting module is used for automatically adjusting the brightness, the color, the flicker frequency and the indication direction of the lamplight according to the formulated target illumination scheme.
Further, the data integration module includes:
the data processing unit is used for carrying out data cleaning, data calibration and standardization processing on the state data of each aircraft, the airport environment data and the obstacle data;
And the data integration unit is used for integrating the data of each aircraft state data, the airport environment data and the obstacle data by adopting a Kalman filtering or multi-source information fusion method to generate comprehensive airport scene information.
Further, the model building module includes:
the first model building unit is used for building an airport scene model by utilizing a three-dimensional modeling tool based on geographic information and a structural diagram of an airport, and dynamically updating the environment state and the obstacle position in the airport scene model according to airport environment data and obstacle data collected in real time;
The second model construction unit is used for constructing a kinematic model of the aircraft according to the motion characteristics of the aircraft, and selecting a preset prediction algorithm to predict the motion trail of the aircraft;
The track simulation unit is used for taking the real-time acquired aircraft state data as input, predicting the motion track of the aircraft by using the constructed kinematic model and the selected prediction algorithm, and dynamically simulating the motion track of the aircraft in the airport scene model;
The track display unit is used for visually displaying the predicted motion track of the aircraft in the airport scene model and displaying the current position of the aircraft and the predicted motion track;
The optimizing unit is used for comparing the motion trail obtained by simulation with the actually collected aircraft state data, evaluating the accuracy of the prediction algorithm, and adjusting and optimizing the kinematic model and the prediction algorithm according to the evaluation result.
Further, the lighting scheme formulation module includes:
the model training unit is used for extracting key features from the airport scene model and training the machine learning model by using a preset machine learning algorithm based on the extracted key features;
The prediction evaluation unit is used for inputting the acquired aircraft state data into the trained machine learning model, predicting the future behaviors of the aircraft and evaluating the priority of each aircraft;
And the lighting scheme making unit is used for making a corresponding target lighting scheme according to the behavior prediction result and the priority evaluation result of the aircraft.
Further, the system further comprises:
The running path optimization module is used for optimizing the running path by adopting a collaborative operation strategy optimization algorithm based on the behavior prediction result of each aircraft;
The running path correction module is used for correcting the optimized running path according to the real-time acquired aircraft state data;
And the data transmission module is used for transmitting the corrected and optimized running path to the aircraft through the communication link so as to enable the aircraft to dynamically adjust the route according to the received running path.
According to the airport illumination control method provided by the embodiment of the invention, the comprehensive airport scene information can be generated by collecting and fusing the state data, the airport environment data and the obstacle data of each aircraft in real time, and the real-time motion trail prediction and the behavior prediction of the aircraft can be performed based on the information. The machine learning and intelligent decision algorithm are combined to make a target lighting scheme, so that the airport lighting system can automatically adjust the brightness, color, flicker frequency and indication direction of the lamplight according to actual requirements, and the safety of the aircraft in the take-off and landing and ground sliding processes is improved; by utilizing a data fusion algorithm, a prediction algorithm and a machine learning algorithm, the real-time monitoring and dynamic adjustment can be carried out on various types of aircrafts, the problems of response lag and difficult coordination when the traditional illumination system processes the concurrency of various aircrafts are solved, various complex flight environments can be flexibly dealt with by carrying out priority evaluation and behavior prediction on the aircrafts, and the flexibility and the response speed of the system are obviously improved. The optimization and dynamic adjustment of the running path of the aircraft can be realized by the optimized collaborative operation strategy and the data exchange with the communication system of the aircraft, so that the mutual interference and waiting time of the aircraft in an airport scene are reduced, and the overall running efficiency of the airport is improved; the problems of response lag, difficult coordination and potential safety hazard existing in the existing airport lighting system when various aircrafts are processed simultaneously are solved.
Drawings
FIG. 1 is a flow chart of a method of controlling airfield lighting in a first embodiment of the present invention;
FIG. 2 is a block diagram of an airfield lighting control system in a second embodiment of the present invention.
The following detailed description will further illustrate the invention with reference to the above-described drawings.
Detailed Description
In order that the invention may be readily understood, a more complete description of the invention will be rendered by reference to the appended drawings. Several embodiments of the invention are presented in the figures. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "mounted" on another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Example 1
Referring to FIG. 1, an airport lighting control method according to a first embodiment of the present invention is shown; for convenience of explanation, only the portions related to the embodiment of the present invention are shown. The airport lighting control method provided in the embodiment of the present invention includes:
Step S10, acquiring state data, airport environment data and obstacle data of each aircraft in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information;
In one embodiment of the present invention, the step of generating the integrated airport scene information by integrating data through the data fusion algorithm includes:
performing data cleaning, data calibration and standardization processing on the state data, airport environment data and obstacle data of each aircraft;
and carrying out data integration on the state data of each aircraft, the airport environment data and the obstacle data by adopting a Kalman filtering or multi-source information fusion method to generate comprehensive airport scene information.
Specifically, the data acquisition includes aircraft state data acquisition, airport environment data acquisition and obstacle data acquisition. Aircraft state data acquisition uses sensors on the aircraft (such as GPS, an Inertial Measurement Unit (IMU), an altimeter and the like), a ground radar system, a vision system, an ADS-B system, thermal imaging equipment and the like to acquire the position information, speed, heading, type (manned or unmanned), task information and other data of each aircraft in real time, so as to provide the dynamic information of the aircraft. Airport environment data acquisition uses a weather station, environmental sensors (such as temperature, humidity, air pressure, wind speed and wind direction, and illumination sensors) and a monitoring system to acquire the current weather conditions (such as wind speed, wind direction, temperature, humidity, visibility, rainfall and the like) and the airport environmental conditions (such as runway conditions and illumination conditions), so as to provide real-time environmental data. Obstacle data acquisition detects fixed and movable obstacles (such as vehicles, personnel and the like) in the airport through a radar system, a laser radar (LiDAR), a vision system and other detection equipment, so as to provide the position information of the obstacles.
In particular, sensors onboard an aircraft include the Global Positioning System (GPS), the Inertial Measurement Unit (IMU), the Flight Management System (FMS), weather sensors, communication systems and aircraft electronic systems. The longitude and latitude position and ground speed of the aircraft are acquired in real time through the GPS receiver, and the absolute altitude of the aircraft is obtained using GPS altitude measurement (such as differential GPS). The roll angle, pitch angle and yaw angle of the aircraft are measured by accelerometers and gyroscopes in the Inertial Measurement Unit (IMU), and heading information is provided by its magnetometers or gyroscopes. Information on the flight plan, range, fuel consumption and the like of the aircraft is provided by the Flight Management System (FMS). Meteorological conditions around the aircraft are measured by on-board meteorological sensors (such as temperature probes and barometric pressure sensors). Status data is transmitted to the ground control center through the aircraft's communication system (such as satellite communication or VHF radio communication). Various operating parameters and status data of the aircraft are recorded by aircraft electronic systems, such as the Flight Data Recorder (FDR), for real-time monitoring and subsequent analysis. These data are typically aggregated over the internal network of the aircraft (such as the aircraft bus system) and sent via the communication system to a ground control center or other monitoring system. The ground radar system comprises a primary radar (Primary Surveillance Radar, PSR) and a secondary radar (Secondary Surveillance Radar, SSR). The primary radar measures the distance and azimuth of an aircraft by transmitting and receiving reflected signals; because it does not depend on equipment carried by the aircraft, it can detect aircraft without transponders. The secondary radar communicates with the transponder on the aircraft to acquire information such as the altitude and identification code of the aircraft, and can therefore provide more detailed and accurate data. The vision system is used for identifying the type and flight state of the aircraft: cameras installed around the airport capture real-time images of the aircraft, and computer vision techniques analyze the captured images to identify the type of aircraft (such as a commercial airplane or an unmanned aerial vehicle) and its flight state (such as take-off, landing or taxiing). The ADS-B system is configured to receive the position, velocity, heading and flight plan data broadcast by the aircraft: ADS-B devices on board the aircraft periodically broadcast their position, velocity, flight plan and other status data, and ground stations receive this data for airspace monitoring; ground stations and other aircraft may also receive these broadcast signals to achieve coordination of air traffic and collision avoidance.
The thermal imaging device is used for detecting the position and the flight state of the aircraft under the low-visibility condition, specifically, the thermal imaging device detects the infrared radiation of the aircraft, generates a thermal imaging image, and detects the position and the flight state of the aircraft by analyzing the thermal imaging image, so that the position and the flight state of the aircraft are helped to be identified, and particularly, under the night or severe weather condition. Further, data from different sensors are fused and processed, and specifically, data from a radar system, a vision system, an ADS-B system and a thermal imaging device are fused by using a data fusion algorithm (such as Kalman filtering), redundant and noise information is removed, and accuracy and reliability of the data are ensured. And then extracting the key state data of the aircraft from the fused data, including type, position, speed, task information and the like, so as to generate real-time aircraft state data.
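As a concrete illustration of the Kalman-filter fusion step mentioned above, the following sketch tracks one aircraft from noisy position fixes (for example from radar or ADS-B). It assumes a two-dimensional constant-velocity state; the matrices, noise levels and sample measurements are illustrative assumptions, not values taken from the patent.

```python
# Minimal sketch of Kalman-filter fusion of position fixes for one aircraft.
# State is [x, y, vx, vy] under a constant-velocity model; all noise levels
# and sample measurements are illustrative only.
import numpy as np

class ConstantVelocityKF:
    def __init__(self, dt=1.0, process_var=0.1, meas_var=5.0):
        self.x = np.zeros(4)                      # state: x, y, vx, vy
        self.P = np.eye(4) * 1e3                  # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], float)  # constant-velocity motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)  # only position is observed
        self.Q = np.eye(4) * process_var          # process noise
        self.R = np.eye(2) * meas_var             # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z):
        """Fuse one position measurement (e.g. a radar or ADS-B fix)."""
        y = np.asarray(z, float) - self.H @ self.x      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x

# Usage: feed fixes from the different sensors in turn to obtain a fused track.
kf = ConstantVelocityKF(dt=1.0)
for z in [(100.0, 200.0), (103.2, 201.1), (106.8, 202.3)]:
    kf.predict()
    fused_state = kf.update(z)
```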
For aircraft type identification, automated identification techniques, such as Radio Frequency Identification (RFID), radar systems, or optical identification (e.g., camera identification) may be used to identify the type of aircraft. Or may communicate its type information (e.g., ADS-B system, remote control signals) in a communication protocol using the aircraft. And sources of mission information may include airline scheduling systems that provide information on flight plans, flight times, mission types, and the like, air traffic control systems, and aircraft own systems. The air traffic control system provides information such as flight instructions, take-off and landing permissions, route adjustments, and the like for the aircraft. The aircraft's own systems include aircraft's voyage computers, mission management systems, etc. for providing real-time mission status and updates. The mission information at this point typically includes flight plans including departure time, destination, route, expected arrival time, etc., flight status, priority instructions, operational requirements, and the like. The flight status includes the current status of the aircraft, mission progress, critical phases (e.g. takeoff, cruise, landing). The priority instructions are from priority instructions for air traffic control, such as handling of emergency, priority landing, etc. The operational requirements include special operational requirements such as weather changes, runway restrictions, etc. The task information extraction process includes: and establishing a data interface with an airline dispatching system, an air traffic control system and an aircraft system to acquire relevant task information. Or acquire real-time data using an API call interface, where an API (application programming interface) can extract data from an external system and integrate it into an internal system. Specific examples are: by connecting to the airline scheduling system, flight plan information, including flight number, departure time, destination, etc., is obtained using the API. Or acquiring the instruction from the air traffic control system, acquiring the real-time flight instruction and the priority information at the moment, and analyzing the current operation requirement and the priority adjustment. Or from the aircraft system, at which time current state information of the aircraft, such as current position, speed, flight phase, etc., is obtained.
The airport environment data comprises meteorological data, illumination intensity and runway state data, and can be realized by the following technologies and equipment: for weather data, weather stations can be used to monitor and record weather conditions at airports in real time. Where weather stations are typically equipped with a variety of sensors including temperature sensors, humidity sensors, anemometers, wind vanes, barometers, and rain sensors. At this time, the weather station continuously collects and records the data of the sensors to form real-time weather data. Alternatively, laser anemometers (LIDAR) may be used to measure wind speed and direction around an airport. The laser anemometer measures the moving speed of aerosol and dust in the atmosphere by emitting laser beams and receiving reflected signals, so as to calculate the wind speed and the wind direction. The laser anemometer can provide more accurate local wind speed and wind direction data, and is particularly suitable for monitoring phenomena such as wind shear. Or weather radar may also be used to monitor the intensity and type of precipitation. The weather radar measures the echo intensity of the rainfall by transmitting and receiving microwave signals, and calculates the intensity and type of the rainfall (such as rain, snow, hail and the like). The weather radar is used for monitoring the precipitation conditions around the airport at this time and providing real-time precipitation data. As regards the illumination intensity, it is possible to measure the illumination intensity of the airport with an illumination sensor that measures the illumination intensity in the area of the airport in real time and transmits the data to a central control system. Alternatively, a solar meter may be used to measure the time and intensity of sunlight. The sunshine meter records the sunshine hours and the intensity change by detecting the time and the intensity of the direct irradiation of the sunshine. The solar meter is used for evaluating the daytime illumination condition and adjusting the lighting system. For runway status data, runway surface sensors may be used to monitor the status of the runway surface, including humidity, temperature, coefficient of friction, and the like. Specific runway surface sensors may include humidity sensors, temperature sensors, friction measuring devices. The sensors are arranged on the surface of the runway, and data such as humidity, temperature and friction coefficient of the surface of the runway are collected in real time. Or the runway monitoring system can be used for comprehensively monitoring the use state and the safety of the runway. A particular runway monitoring system may include a floor sensor, a camera, a pressure sensor. At this time, the runway monitoring system collects the service conditions of the runway in real time, such as whether foreign matters exist or not, whether the surface of the runway is damaged or not, and the like.
Wherein the acquisition of obstacle data involves the monitoring of fixed and moving obstacles within and around the airport. The collection of such obstacle data may be accomplished by a variety of devices and systems. The following specific acquisition modes are as follows: for fixed obstacles, it may be detected using radar systems, laser radar (LIDAR), high resolution aerial photography, and satellite images. In particular, radar systems (e.g., airport surveillance radar, ground radar) detect the presence and location of objects by transmitting and receiving radio waves so as to detect and track the location of fixed obstacles (e.g., buildings, towers, airport equipment). While the lidar measures the distance of the object using laser beam reflection, thereby creating a detailed three-dimensional model, so that the three-dimensional position and height of the fixed obstacle are measured with high accuracy. High resolution aerial and satellite images are captured in the air and satellite scanned using aerial photography, satellite imaging systems, and the like to obtain ground images and analyze such that images of airport areas and their surroundings are captured, identifying and recording fixed obstructions. For moving obstructions, it may be detected using ground based radar, vision systems, etc. In particular, ground radars use radar waves to detect the position and velocity of moving objects, allowing real-time detection and tracking of moving obstructions (e.g., airport service vehicles, other aircraft). The video system identifies the position and the moving path of the obstacle through video analysis, so that the airport area is monitored in real time, and the moving obstacle is identified and tracked.
Further, wireless communication technology (such as Wi-Fi or 4G/5G) is used to transmit the acquired data to the central control system in real time. In the central control system, a high-performance database system is used to store the real-time data for subsequent processing and analysis. Each data stream then undergoes data cleaning, data calibration and standardization. Data cleaning removes noise and abnormal values to ensure the accuracy and reliability of the data; for example, Kalman filtering is applied to smooth the laser radar data. Data calibration aligns the data from different sensors to ensure consistency and accuracy; for example, the data of the different sensors are aligned through time synchronization and spatial calibration. Standardization converts data from different sources into a uniform format and units.
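A minimal sketch of this cleaning, calibration and standardization step could look as follows; the column names, plausibility thresholds and one-second time grid are assumptions made for the example, and pandas is used purely for illustration.

```python
# Illustrative sketch of cleaning, calibrating and standardizing sensor records.
# Column names and thresholds are assumptions for the example only.
import pandas as pd

def clean_and_standardize(df: pd.DataFrame) -> pd.DataFrame:
    # Data cleaning: drop rows with missing values and implausible speeds.
    df = df.dropna(subset=["timestamp", "speed_mps", "altitude_m"])
    df = df[df["speed_mps"].between(0, 350)]

    # Data calibration: align all records to a common one-second time grid.
    df["timestamp"] = pd.to_datetime(df["timestamp"]).dt.round("1s")

    # One possible standardization: zero mean, unit variance per numeric column.
    for col in ["speed_mps", "altitude_m"]:
        df[col] = (df[col] - df[col].mean()) / df[col].std()
    return df
```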
Furthermore, the data of each aircraft state, airport environment data and obstacle data are integrated to generate comprehensive airport scene information, specifically, data from different sensors can be fused by adopting data fusion algorithms such as Kalman filtering, particle filtering and Bayesian network, so as to obtain more accurate and reliable aircraft state, environment condition and obstacle information. Or combining the aircraft state data, the airport environment data and the obstacle data, and generating comprehensive airport scene information through an advanced data fusion algorithm (such as a multi-sensor data fusion and deep learning algorithm). At this time, real-time airport scene information is generated based on the fused data, including the position and motion state of the aircraft, the environmental condition of the airport and the distribution condition of the obstacles. The information can be displayed in real time in a central control system through a three-dimensional visualization technology and used by airport management personnel and an automatic control system.
Step S20, constructing a real-time airport scene model based on airport scene information, and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on aircraft state data;
In one embodiment of the present invention, the step of constructing a real-time airport scene model based on airport scene information and dynamically simulating the motion trail of the aircraft according to the aircraft state data by using a prediction algorithm includes:
Based on geographic information and a structure diagram of an airport, constructing an airport scene model by using a three-dimensional modeling tool, and dynamically updating the environment state and the obstacle position in the airport scene model according to airport environment data and obstacle data collected in real time;
According to the motion characteristics of the aircraft, a kinematic model of the aircraft is constructed, and a preset prediction algorithm is selected to predict the motion trail of the aircraft;
taking the real-time collected aircraft state data as input, and predicting the motion trail of the aircraft by using the constructed kinematic model and the selected prediction algorithm, and dynamically simulating the motion trail of the aircraft in the airport scene model;
Visually displaying the predicted motion trail of the aircraft in an airport scene model, and displaying the current position of the aircraft and the predicted motion trail;
and comparing the motion trail obtained by simulation with the actually collected aircraft state data, evaluating the accuracy of a prediction algorithm, and adjusting and optimizing a kinematic model and the prediction algorithm according to the evaluation result.
Specifically, the aircraft state data, the airport environment data and the obstacle data are integrated through data acquisition and a data fusion algorithm, so that comprehensive airport scene information is generated. The integrated data is managed and processed using a database or a real-time data stream processing platform (such as Apache Kafka or Apache Flink), ensuring that the data remains current and consistent. A three-dimensional modeling tool (such as Unity, Unreal Engine or Blender) is then used to construct a three-dimensional model of the airport (i.e. the airport scene model) based on the geographic information and structural drawings of the airport (such as runway layout, taxiways and buildings). The environmental state (such as weather conditions) and the obstacle positions in the three-dimensional model are dynamically updated according to the environment data and obstacle data collected in real time. The aircraft state data, such as position information, speed and acceleration, are then filtered and denoised to ensure the accuracy and stability of the data, and the aircraft state data are time-synchronized to ensure consistency. The three-dimensional model is calibrated accurately so that the simulation result matches the actual situation, and the model is adjusted according to real-time data so that it accurately reflects the current airport environment.
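The dynamic updating of environment state and obstacle positions described above can be pictured with a small in-memory structure; the classes and fields below are illustrative placeholders and are not meant to stand in for the three-dimensional modeling tool itself.

```python
# Simplified sketch of keeping a scene model in sync with real-time
# environment and obstacle feeds; classes and field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Obstacle:
    obstacle_id: str
    position: tuple        # (x, y, z) in the airport reference frame
    moving: bool = False

@dataclass
class AirportSceneModel:
    weather: dict = field(default_factory=dict)
    obstacles: dict = field(default_factory=dict)   # id -> Obstacle

    def update_environment(self, weather_sample: dict) -> None:
        self.weather.update(weather_sample)          # e.g. wind, visibility

    def update_obstacle(self, obs: Obstacle) -> None:
        self.obstacles[obs.obstacle_id] = obs        # insert or overwrite

scene = AirportSceneModel()
scene.update_environment({"wind_mps": 6.2, "visibility_m": 800})
scene.update_obstacle(Obstacle("veh-07", (512.0, 1040.0, 0.0), moving=True))
```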
Further, a kinematic model of the aircraft is constructed according to the motion characteristics (such as speed, acceleration, heading and the like) of the aircraft, and the motion rule of the aircraft in the three-dimensional space is described. The model parameters need to take into account the dynamics of the aircraft, such as aerodynamic properties, engine thrust, control inputs, etc. And then selecting a proper prediction algorithm (such as Kalman filtering, particle filtering, machine learning algorithm and the like) for predicting the motion trail of the aircraft, and configuring algorithm parameters at the same time to ensure that the dynamic change of the aircraft can be processed. Specifically, the Kalman filtering is to estimate and predict the state of the aircraft by using the Kalman filter, and obtain a more accurate motion trail by iteratively updating the error between the predicted value and the actual observed value. Particle filtering is the process of representing the state of an aircraft by generating a large number of samples (particles) and weighting and resampling the particles by a particle filter to obtain the motion profile of the aircraft. The machine learning algorithm is to train a machine learning model (such as a neural network, a support vector machine, etc.) by using historical data of the aircraft, and predict a future track of the aircraft according to the current state data.
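To make the kinematic-model idea concrete, the following sketch propagates an aircraft state forward under a simple constant-acceleration assumption; the patent leaves the choice of model and prediction algorithm (Kalman filtering, particle filtering, machine learning) open, so the parameters here are purely illustrative.

```python
# Minimal kinematic prediction: propagate the current state forward under a
# constant-acceleration assumption. Values are illustrative only.
import numpy as np

def predict_trajectory(pos, vel, acc, horizon_s=30, dt=1.0):
    """Return predicted positions over the horizon for a simple kinematic model."""
    pos, vel, acc = map(np.asarray, (pos, vel, acc))
    steps = int(horizon_s / dt)
    points = []
    for k in range(1, steps + 1):
        t = k * dt
        points.append(pos + vel * t + 0.5 * acc * t ** 2)   # s = s0 + v*t + a*t^2/2
    return np.array(points)

track = predict_trajectory(pos=(0.0, 0.0, 300.0),
                           vel=(70.0, 0.0, -3.0),
                           acc=(0.0, 0.0, -0.2))
```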
Further, real-time collected aircraft state data (e.g., position, speed, heading) is used as input. And predicting the motion trail of the aircraft by using the constructed kinematic model and the selected prediction algorithm. Dynamically simulating the motion trail of the aircraft in the airport scene model to generate an expected flight path. The predicted motion trajectories of the aircraft are then visualized in an airport scene model, displaying the current position of the aircraft and the predicted motion trajectories. At this time, the visualized result is updated in real time, reflecting the change of the position of the aircraft and the track adjustment. Meanwhile, the dynamic simulation result is integrated into an airport navigation management system, and is provided for an air traffic controller to carry out decision support, and real-time data and prediction information are provided for airport management personnel and pilots to help the airport management personnel and pilots to optimize flight operation and scheduling. And then comparing the motion track obtained by simulation with the track actually observed, and evaluating the accuracy and reliability of the prediction algorithm. Finally, according to the evaluation result, the kinematic model and the prediction algorithm are continuously adjusted and optimized, and the prediction precision and the instantaneity are improved. At the moment, through the steps, real-time scene model construction based on airport scene information and dynamic simulation of the motion trail of the aircraft can be realized, accurate reference and decision support are provided for intelligent illumination control of the airport and other automatic systems, and the safety and efficiency of airport operation are improved.
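The comparison between simulated and observed trajectories can be quantified with a simple error metric such as the root-mean-square position error; the function below is one possible way to compute it and is not prescribed by the patent.

```python
# Accuracy evaluation sketch: RMSE between matching trajectory samples.
import numpy as np

def trajectory_rmse(predicted: np.ndarray, observed: np.ndarray) -> float:
    """Root-mean-square position error between predicted and observed tracks."""
    err = predicted - observed
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))
```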
Step S30, analyzing an airport scene model according to a machine learning algorithm, predicting the behaviors of the aircrafts, carrying out priority assessment on each aircraft, and formulating a target lighting scheme based on the behavior prediction result and the priority assessment result;
In one embodiment of the present invention, the steps of analyzing the airport scene model according to the machine learning algorithm, predicting the behavior of the aircraft and performing priority assessment on each aircraft, and formulating the target lighting scheme based on the behavior prediction result and the priority assessment result include:
Extracting key features from the airport scene model, and training the machine learning model by using a preset machine learning algorithm based on the extracted key features;
Inputting the collected aircraft state data into a trained machine learning model, predicting future behaviors of the aircraft, and evaluating the priority of each aircraft;
And according to the behavior prediction result and the priority evaluation result of the aircraft, a corresponding target lighting scheme is formulated.
Specifically, key features are extracted from the airport scene model, including aircraft state data, airport environment data, obstacle information, and the like. Wherein the aircraft status data includes real-time status of the aircraft (e.g., position, speed, heading), mission information of the aircraft (e.g., mission type, priority), etc. Further, the key features extracted by the method can be selected from features important for the prediction and priority assessment of the behavior of the aircraft, such as the current position, speed change rate, mission information, airport environment states (such as weather and obstacle positions) and the like of the aircraft. An appropriate machine learning model (e.g., regression model, classification model, deep learning model, etc.) is then selected for predicting future behavior of the aircraft and evaluating the priority of the aircraft. Further based on the extracted key features, a preset machine learning algorithm (such as regression analysis, decision tree, random forest, support vector machine, neural network and the like) is used for training the machine learning model to conduct behavior prediction and priority assessment. The extracted key features and marked historical data are used in the training process to ensure that the model can accurately capture the mode of the aircraft behavior and the standard of priority evaluation. And simultaneously, the accuracy and generalization capability of the model are verified by using independent data sets, and the model parameters are adjusted to improve the performance. And meanwhile, the cross verification and super parameter optimization are carried out on the model, so that the prediction precision of the model is improved, and the generalization capability and accuracy of the model are ensured. Further, real-time collected aircraft state data (including aircraft position, speed, mission information, etc.) is input into a trained machine learning model. Future behaviors of the aircraft (such as motion trajectories, mission states, etc.) are predicted using the model, and each aircraft is prioritized according to preset priority criteria (such as mission importance, urgency, proximity of position relative to other aircraft, and other relevant factors). The trained models are used to rank or rank the aircraft at this time, and the priority of each aircraft is determined. At this time, according to the predicted behavior result and the estimated priority result of the aircraft, a corresponding target lighting scheme is formulated, specifically, the predicted behavior result of the aircraft is analyzed, which areas need special lighting (such as landing areas, taxiways, the vicinity of obstacles, etc.) or the required lighting areas and the required lighting types (such as landing lights and navigation lights) are determined. And determining which aircraft lighting needs are more urgent or important according to the priority evaluation result of the aircraft so as to decide which aircraft are provided with higher priority lighting support. The higher priority aircraft will therefore be prioritized in the lighting scheme to ensure safe and efficient operation thereof. Meanwhile, under the condition that a plurality of aircrafts run simultaneously, the lighting scheme is dynamically adjusted, so that the requirements of all aircrafts can be met, and meanwhile, the energy consumption and the lamplight use efficiency are optimized. 
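A toy illustration of the training, prediction and priority-assessment loop is sketched below using a random-forest classifier from scikit-learn; the feature set, behavior labels and priority rule are invented for the example and do not reflect the actual features or criteria used by the method.

```python
# Illustrative behavior prediction + priority assessment with a random forest.
# Features: [speed, altitude, distance_to_runway, is_manned]; labels and the
# priority rule are assumptions made for this example.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

X_train = np.array([[70, 300, 2000, 1],
                    [12,  80,  400, 0],
                    [65, 250, 1500, 1],
                    [10,  60,  300, 0]])
y_train = ["landing", "hover", "landing", "hover"]      # behavior labels

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

def assess(aircraft_state):
    behavior = model.predict([aircraft_state])[0]
    # Simple priority rule: manned aircraft and imminent landings rank higher.
    priority = 2.0 * aircraft_state[3] + (1.0 if behavior == "landing" else 0.0)
    return behavior, priority

behavior, priority = assess([68, 280, 1800, 1])
```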
And subsequently dynamically adjusting the lighting scheme based on the real-time changing aircraft status and priority. At this time, specific light control parameters including brightness, color, flicker frequency and indication direction are set to ensure to meet the actual requirements and priority requirements of the aircraft. The brightness is used for setting the brightness of the lamplight so as to meet the lighting requirements of different areas. The colors are used to select light colors (e.g., red, green, white) to indicate different states (e.g., alert, navigate, guide). The flicker frequency is used to set the flicker frequency of the light to draw attention of the aircraft. The indication direction is used for adjusting the indication direction of the light to ensure that the path of the aircraft can be correctly guided.
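One way to represent the four light-control parameters named above (brightness, color, flicker frequency, indication direction) as a single command object is sketched below; field names and value ranges are assumptions for illustration.

```python
# Possible representation of a light-control command; fields are illustrative.
from dataclasses import dataclass

@dataclass
class LightCommand:
    zone: str               # e.g. "runway-09-approach"
    brightness_pct: int     # 0-100
    color: str              # "red", "green", "white", ...
    flicker_hz: float       # 0 means steady light
    heading_deg: float      # indication direction

cmd = LightCommand(zone="runway-09-approach", brightness_pct=85,
                   color="green", flicker_hz=0.0, heading_deg=92.5)
```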
Step S40, automatically adjusting the brightness, color, flicker frequency and indication direction of the lamplight according to the formulated target lighting scheme;
Wherein in one embodiment of the invention, the formulated target lighting scheme is transmitted to the airport lighting control system to automatically adjust the lighting settings. The lighting effect and the state of the aircraft are monitored in real time, and are adjusted and optimized according to actual conditions, so that a closed-loop control system is formed, and effective implementation and timely adjustment of a lighting scheme are ensured. Specifically, a central control system is provided in the method and is responsible for receiving and processing the target lighting schemes and sending instructions to the individual light control units. Meanwhile, light control units are arranged on runways, taxiways and other key areas, and each unit can independently adjust the brightness, color, flicker frequency and direction of light. The control center is connected with each light control unit through a wired (such as optical fiber) or wireless (such as Wi-Fi and 5G) network, so that the command can be transmitted in real time. The reliability and real-time performance of data transmission are ensured by using a proper communication protocol (such as Modbus, zigbee, MQTT). And a control algorithm is realized in the central control system, and a specific lamplight adjusting instruction is generated according to the target lighting scheme. The central control system generates specific instructions according to the target lighting scheme, including settings of brightness, color, flashing frequency and direction, and sends the specific instructions to the corresponding light control units through the communication network. After receiving the instruction, the light control unit analyzes the instruction content and extracts the setting values of brightness, color, flicker frequency and direction. And controlling the intelligent lamp to perform corresponding adjustment according to the analyzed instruction. For example: the brightness adjustment is to adjust the current of the LED lamp by PWM (pulse width modulation) technology, thereby controlling the brightness. The color adjustment is realized by adjusting the current proportion of the red, green and blue colors of the LED lamp. The flicker frequency adjustment is to set the flicker frequency of the light by controlling the switching frequency of the lamp. The indication direction adjustment is to change the illumination direction of the lamp by a motor or other mechanical means if the lamp has a direction adjustable function. Sensors (such as light sensors and environment sensors) are integrated in the light control unit, and the state of the light and the surrounding environment are monitored in real time. And feeding the monitored data back to a central control system, and carrying out real-time adjustment and optimization by the system according to actual conditions to ensure that the light adjusting effect meets the expectations. At the moment, the brightness, the color, the flicker frequency and the indication direction of the lamplight can be automatically adjusted according to the formulated target lighting scheme through the steps, so that the intelligent and automatic level of an airport lighting system is improved, and the safe operation and the high-efficiency management of an aircraft are ensured.
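As an illustration of how the central control system might push such a command to a light control unit over one of the protocols mentioned above (MQTT), the following sketch publishes the command as JSON using the paho-mqtt client library; the broker address, topic layout and payload schema are assumptions, and the brightness value would be realized by the unit, for example, as a PWM duty cycle.

```python
# Sketch of sending a light-adjustment instruction over MQTT.
# Broker address, topic layout and payload schema are assumptions.
import json
from paho.mqtt import publish

def send_light_command(cmd: dict, broker="192.0.2.10"):
    topic = f"airfield/lights/{cmd['zone']}/set"
    # The light control unit parses the JSON and applies brightness via PWM,
    # adjusts RGB current ratios for color, and so on.
    publish.single(topic, payload=json.dumps(cmd), hostname=broker, qos=1)

send_light_command({"zone": "runway-09-approach", "brightness_pct": 85,
                    "color": "green", "flicker_hz": 0.0, "heading_deg": 92.5})
```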
Specifically, in this embodiment, consider an airport where one passenger aircraft and multiple unmanned aerial vehicles take off and land at the same time. First, data are acquired: environment sensors detect, for example, an increase in wind speed, while aircraft identification and positioning sensors locate the passenger aircraft and the unmanned aerial vehicles. The sensor data are then fused and a 3D scene model is constructed, displaying the runway, the obstacles and the aircraft. Trajectory simulation is then performed, predicting the landing trajectory of the passenger aircraft and the flight paths of the unmanned aerial vehicles. The future behaviors of the passenger aircraft and the unmanned aerial vehicles are further predicted through the airport scene model and the prediction algorithm. A priority assessment is then carried out, in which the passenger aircraft is given priority over the unmanned aerial vehicles, and the risk assessment determines that it lands first. A lighting scheme is then generated: the brightness and color of the approach lights are automatically adjusted to guide the passenger aircraft to a safe landing, and obstacle-avoidance lights are set for the unmanned aerial vehicles to ensure that they keep clear of the passenger aircraft. Finally, control is executed: control instructions are sent in real time and the light configuration is adjusted to ensure the optimal lighting effect.
Further, in an embodiment of the present invention, step S40 further includes:
Optimizing a running path by adopting a collaborative operation strategy optimization algorithm based on the behavior prediction result of each aircraft;
Correcting the optimized running path according to the real-time collected aircraft state data;
And transmitting the corrected and optimized running path to the aircraft through a communication link so that the aircraft dynamically adjusts the route according to the received running path.
Specifically, real-time status data of each aircraft, including position, speed, direction and the like, are extracted from the airport scene model. Future behaviors and motion trajectories of the aircraft are predicted using machine learning algorithms (such as neural networks or support vector machines). A suitable collaborative operation strategy optimization algorithm is then selected, such as an ant colony algorithm, a genetic algorithm or a particle swarm optimization algorithm, and multiple optimization targets are set, such as the shortest path, obstacle avoidance and reduced power consumption. The optimal running path of each aircraft is calculated with the optimization algorithm, ensuring that the path is optimal while satisfying safety and efficiency requirements. Data exchange is then carried out with the communication system of the aircraft. First a reliable data communication link is established, usually using technologies such as ADS-B (Automatic Dependent Surveillance-Broadcast) and CPDLC (Controller-Pilot Data Link Communications), together with standardized network protocols such as ACARS (Aircraft Communications Addressing and Reporting System) and the future ATN (Aeronautical Telecommunication Network). The aircraft then periodically transmits real-time status and position data to the ground control center via the communication system. The ground control center receives and processes the status data sent by the aircraft and updates the real-time running state. According to the optimal running path calculated by the optimization algorithm, the path is corrected as necessary based on the real-time collected aircraft state data, and a specific running instruction is generated. The corrected and optimized running path instruction, including waypoints, speed, altitude and other information, is sent to the aircraft through the data link. The navigation computer on the aircraft receives and parses the path instruction sent from the ground, and the aircraft automatically adjusts its route, including speed, altitude and direction, according to the received instruction. Real-time monitoring and feedback are then performed: the ground control center monitors the state and path execution of the aircraft in real time, ensuring that the aircraft follows the preset path, and abnormal conditions during operation, such as deviation from the path or excessive speed, are detected from the monitoring data. Based on the real-time monitoring data, the ground control center can send new path adjustment instructions to the aircraft, and according to the real-time state fed back by the aircraft, its running path is dynamically adjusted and optimized to ensure safety and efficiency.
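The path-correction step can be pictured with a small helper that checks how far the live position has drifted from the optimized path and, if the drift exceeds a tolerance, rejoins the path at the next waypoint; the tolerance and the rejoin strategy are simplifying assumptions for the example.

```python
# Simplified correction of an optimized path against the live aircraft position.
# Tolerance value and rejoin strategy are illustrative assumptions.
import numpy as np

def correct_path(planned_waypoints, live_position, tolerance_m=50.0):
    """Return the waypoint list the aircraft should follow from now on."""
    planned = [np.asarray(w, float) for w in planned_waypoints]
    drift = np.linalg.norm(np.asarray(live_position, float) - planned[0])
    if drift <= tolerance_m:
        return planned                                   # on track: keep the path
    # Off track: start from the live position and rejoin at the next waypoint.
    return [np.asarray(live_position, float)] + planned[1:]
```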
Specifically, the collaborative operation policy optimization algorithm comprises a multi-objective optimization algorithm and a genetic algorithm, wherein the multi-objective optimization algorithm comprises: the target definition, such as defining a plurality of optimization targets, such as safe distance of the aircraft, navigation path optimization, energy consumption minimization, and the like. And giving weights to the targets according to actual demands so as to balance the relation among different targets. The solving method is to solve the optimization problem by using a multi-objective optimization algorithm (such as a Pareto front algorithm) and find the optimal balance point among a plurality of optimization objectives.
The genetic algorithm includes initial population generation that generates an initial population comprising a plurality of aircraft paths. And evaluating fitness of each individual, wherein the fitness comprises indexes such as safety, path efficiency, energy consumption and the like. Selecting, crossing and mutating, and generating a new population by the operations of selecting, crossing and mutating so as to improve the overall fitness. Iterative optimization is carried out for a plurality of times until the optimization target is met or the maximum iteration times are reached.
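A minimal sketch of the genetic-algorithm loop described above (initial population, fitness evaluation, selection, crossover, mutation, iteration) is given below for a single aircraft's lateral waypoint offsets; the fitness function, obstacle position and parameters are illustrative assumptions.

```python
# Minimal genetic-algorithm loop for one aircraft path; fitness, obstacle
# position and all parameters are illustrative assumptions.
import random

def fitness(path):
    # Shorter paths with larger clearance from a hypothetical obstacle score higher.
    length = sum(abs(path[i + 1] - path[i]) for i in range(len(path) - 1))
    clearance = min(abs(p - 5.0) for p in path)        # obstacle at lateral offset 5.0
    return clearance - 0.1 * length

def evolve(pop_size=30, genes=6, generations=50):
    # Initial population: random lateral offsets for each waypoint.
    population = [[random.uniform(-10, 10) for _ in range(genes)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, genes)            # crossover point
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                   # mutation
                child[random.randrange(genes)] += random.gauss(0, 1)
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

best_path = evolve()
```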
The specific optimization process is real-time data input, and the state and environment data of the aircraft are obtained in real time through the sensor. An initial scenario generation, based on the current data, generates an initial light configuration and an aircraft path scenario. And (3) optimizing the algorithm, namely optimizing the initial scheme by using a multi-objective optimization algorithm and a genetic algorithm, and generating a plurality of candidate schemes. And selecting an optimal scheme, namely selecting an optimal aircraft path scheme according to the comprehensive evaluation index. And (3) implementing control, generating a specific control instruction, and adjusting and optimizing the navigation path of the aircraft. Therefore, through exchanging the position and state data of the aircraft in real time, the route and the lamplight configuration of the aircraft are calculated and optimized, and the safety distance between the unmanned aerial vehicle and the man-machine is ensured. While an optimization algorithm is used to determine the optimal lighting configuration to ensure that proper guidance and warning is provided at critical locations on the aircraft route.
In summary, according to the airport lighting control method of the above embodiment of the present invention, comprehensive airport scene information can be generated by collecting and fusing the state data of each aircraft, the airport environment data and the obstacle data in real time, and real-time motion trajectory prediction and behavior prediction of the aircraft can be performed based on this information. A target lighting scheme is formulated by combining machine learning with intelligent decision algorithms, so that the airport lighting system can automatically adjust the brightness, color, flicker frequency and indication direction of the lights according to actual requirements, improving the safety of aircraft during take-off, landing and ground taxiing. By using the data fusion algorithm, the prediction algorithm and the machine learning algorithm, various types of aircraft can be monitored and dynamically adjusted in real time, solving the problems of response lag and difficult coordination when a conventional lighting system handles multiple aircraft concurrently; through priority evaluation and behavior prediction of the aircraft, various complex flight environments can be handled flexibly, and the flexibility and response speed of the system are significantly improved. Through the optimized collaborative operation strategy and data exchange with the communication systems of the aircraft, the running paths of the aircraft can be optimized and dynamically adjusted, reducing mutual interference and waiting time of aircraft in the airport scene and improving the overall operating efficiency of the airport. The problems of response lag, difficult coordination and potential safety hazards that exist when the existing airport lighting system handles multiple types of aircraft simultaneously are thereby solved.
Example two
Referring to fig. 2, which is a schematic structural diagram of an airport lighting control system according to a second embodiment of the present invention; for convenience of explanation, only the portions related to the embodiment of the present invention are shown. The airport lighting control system comprises:
the data integration module 11 is used for collecting the state data of each aircraft, the airport environment data and the obstacle data in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information;
The model construction module 12 is used for constructing a real-time airport scene model based on airport scene information and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on aircraft state data;
The lighting scheme making module 13 is used for analyzing the airport scene model according to a machine learning algorithm, predicting the behavior of the aircraft, carrying out priority assessment on each aircraft, and formulating a target lighting scheme based on the behavior prediction result and the priority assessment result;
the illumination adjustment module 14 is used for automatically adjusting the brightness, color, flicker frequency and indication direction of the lamplight according to the formulated target illumination scheme.
Further, in one embodiment of the present invention, the data integration module 11 includes:
the data processing unit is used for carrying out data cleaning, data calibration and standardization processing on the state data of each aircraft, the airport environment data and the obstacle data;
And the data integration unit is used for carrying out data integration on the state data of each aircraft, the airport environment data and the obstacle data by adopting a Kalman filtering or multi-source information fusion method to generate comprehensive airport scene information (a minimal fusion sketch follows below).
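In its simplest linear form, the Kalman-filter branch of this unit could look like the sketch below; the two measurement sources, their noise levels and the 2-D position-only state are assumptions for illustration, not the specific filter of the embodiment.

```python
# Illustrative linear Kalman measurement update fusing two position sources.
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard measurement update: x is the state estimate, P its covariance,
    z a measurement from one source, H the measurement model, R its noise."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Fuse a radar report and an ADS-B report into one scene-level position estimate.
x = np.array([0.0, 0.0])               # [x, y] position estimate
P = np.eye(2) * 100.0                  # initial uncertainty
H = np.eye(2)                          # both sources report position directly
for z, sigma in [(np.array([12.0, 3.0]), 5.0),    # radar report, assumed noise
                 (np.array([11.2, 2.6]), 2.0)]:   # ADS-B report, assumed noise
    x, P = kalman_update(x, P, z, H, np.eye(2) * sigma ** 2)
```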
Further, in one embodiment of the present invention, the model building module 12 includes:
the first model building unit is used for building an airport scene model by utilizing a three-dimensional modeling tool based on geographic information and a structural diagram of an airport, and dynamically updating the environment state and the obstacle position in the airport scene model according to airport environment data and obstacle data collected in real time;
The second model construction unit is used for constructing a kinematic model of the aircraft according to the motion characteristics of the aircraft, and selecting a preset prediction algorithm to predict the motion trail of the aircraft;
The track simulation unit is used for taking the real-time acquired aircraft state data as input, predicting the motion track of the aircraft by using the constructed kinematic model and the selected prediction algorithm, and dynamically simulating the motion track of the aircraft in the airport scene model;
The track display unit is used for visually displaying the predicted motion track of the aircraft in the airport scene model and displaying the current position of the aircraft and the predicted motion track;
The optimizing unit is used for comparing the motion trail obtained by simulation with the actually collected aircraft state data, evaluating the accuracy of the prediction algorithm, and adjusting and optimizing the kinematic model and the prediction algorithm according to the evaluation result.
Further, in one embodiment of the present invention, the lighting scheme formulation module 13 includes:
the model training unit is used for extracting key features from the airport scene model and training the machine learning model by using a preset machine learning algorithm based on the extracted key features;
The prediction evaluation unit is used for inputting the acquired aircraft state data into the trained machine learning model, predicting the future behaviors of the aircraft and evaluating the priority of each aircraft;
And the lighting scheme making unit is used for making a corresponding target lighting scheme according to the behavior prediction result and the priority evaluation result of the aircraft.
Further, in one embodiment of the present invention, the system further comprises:
The running path optimization module is used for optimizing the running path by adopting a collaborative operation strategy optimization algorithm based on the behavior prediction result of each aircraft;
The running path correction module is used for correcting the optimized running path according to the real-time acquired aircraft state data;
And the data transmission module is used for transmitting the corrected and optimized running path to the aircraft through the communication link so as to enable the aircraft to dynamically adjust the route according to the received running path.
The airport lighting control system provided by the embodiment of the invention has the same implementation principle and technical effects as those of the embodiment of the method, and for the sake of brevity, reference may be made to the corresponding contents of the embodiment of the method.
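For orientation, the module division of this embodiment could be organized in code roughly as follows. The class and method names are assumptions, and each placeholder body merely stands in for the corresponding module described above.

```python
# Illustrative skeleton of the system modules; names and bodies are placeholders.
class DataIntegrationModule:                       # module 11
    def collect_and_fuse(self, aircraft_data, environment_data, obstacle_data):
        """Clean, calibrate, standardize and fuse the three streams into scene info."""
        return {"aircraft": aircraft_data, "environment": environment_data,
                "obstacles": obstacle_data}

class ModelConstructionModule:                     # module 12
    def build_scene_model(self, scene_info):
        return {"scene": scene_info, "trajectories": {}}

    def simulate_trajectories(self, scene_model, aircraft_data):
        scene_model["trajectories"] = {a["id"]: [] for a in aircraft_data}

class LightingSchemeModule:                        # module 13
    def predict_and_prioritize(self, scene_model, aircraft_data):
        predictions = {a["id"]: "taxi" for a in aircraft_data}  # placeholder behaviour
        priorities = {a["id"]: 1 for a in aircraft_data}        # placeholder priority
        return predictions, priorities

    def make_scheme(self, predictions, priorities):
        return {"brightness": 0.8, "color": "white", "flash_hz": 0.0, "direction": "north"}

class LightingAdjustmentModule:                    # module 14
    def apply(self, scheme):
        print("applying lighting scheme:", scheme)

class AirportLightingControlSystem:
    def __init__(self):
        self.data = DataIntegrationModule()
        self.model = ModelConstructionModule()
        self.scheme = LightingSchemeModule()
        self.lights = LightingAdjustmentModule()

    def step(self, aircraft_data, environment_data, obstacle_data):
        scene_info = self.data.collect_and_fuse(aircraft_data, environment_data, obstacle_data)
        scene_model = self.model.build_scene_model(scene_info)
        self.model.simulate_trajectories(scene_model, aircraft_data)
        predictions, priorities = self.scheme.predict_and_prioritize(scene_model, aircraft_data)
        self.lights.apply(self.scheme.make_scheme(predictions, priorities))
```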
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above embodiments merely represent several implementations of the present invention, which are described specifically and in detail, but they are not to be construed as limiting the scope of the invention. It should be noted that several variations and improvements can be made by those skilled in the art without departing from the concept of the invention, all of which fall within the scope of protection of the invention. Accordingly, the scope of protection of the invention shall be subject to the appended claims.

Claims (10)

1. An airport lighting control method, the method comprising:
acquiring state data, airport environment data and obstacle data of each aircraft in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information;
Constructing a real-time airport scene model based on airport scene information, and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on aircraft state data;
analyzing an airport scene model according to a machine learning algorithm, predicting the behavior of the aircraft, carrying out priority assessment on each aircraft, and formulating a target lighting scheme based on the behavior prediction result and the priority assessment result;
and automatically adjusting the brightness, the color, the flicker frequency and the indication direction of the lamplight according to the formulated target lighting scheme.
2. The airport lighting control method of claim 1, wherein the step of integrating the data through a data fusion algorithm to generate comprehensive airport scene information comprises:
performing data cleaning, data calibration and standardization processing on the state data, airport environment data and obstacle data of each aircraft;
and carrying out data integration on the state data of each aircraft, the airport environment data and the obstacle data by adopting a Kalman filtering or multi-source information fusion method to generate comprehensive airport scene information.
3. The method of airport lighting control of claim 1, wherein the step of constructing a real-time airport scene model based on airport scene information and dynamically simulating the motion profile of the aircraft using a predictive algorithm based on aircraft state data comprises:
Based on geographic information and a structure diagram of an airport, constructing an airport scene model by using a three-dimensional modeling tool, and dynamically updating the environment state and the obstacle position in the airport scene model according to airport environment data and obstacle data collected in real time;
According to the motion characteristics of the aircraft, a kinematic model of the aircraft is constructed, and a preset prediction algorithm is selected to predict the motion trail of the aircraft;
taking the real-time collected aircraft state data as input, predicting the motion trail of the aircraft by using the constructed kinematic model and the selected prediction algorithm, and dynamically simulating the motion trail of the aircraft in the airport scene model;
Visually displaying the predicted motion trail of the aircraft in an airport scene model, and displaying the current position of the aircraft and the predicted motion trail;
and comparing the motion trail obtained by simulation with the actually collected aircraft state data, evaluating the accuracy of a prediction algorithm, and adjusting and optimizing a kinematic model and the prediction algorithm according to the evaluation result.
4. The method of airport lighting control of claim 1, wherein the steps of analyzing the airport scene model, predicting the behavior of the aircraft and prioritizing each aircraft according to a machine learning algorithm, and formulating a target lighting scheme based on the behavior prediction result and the prioritization result comprise:
Extracting key features from the airport scene model, and training the machine learning model by using a preset machine learning algorithm based on the extracted key features;
Inputting the collected aircraft state data into a trained machine learning model, predicting future behaviors of the aircraft, and evaluating the priority of each aircraft;
And according to the behavior prediction result and the priority evaluation result of the aircraft, a corresponding target lighting scheme is formulated.
5. The airport lighting control method of claim 1, wherein the method further comprises:
Optimizing a running path by adopting a collaborative operation strategy optimization algorithm based on the behavior prediction result of each aircraft;
Correcting the optimized running path according to the real-time collected aircraft state data;
And transmitting the corrected and optimized running path to the aircraft through a communication link so that the aircraft dynamically adjusts the route according to the received running path.
6. An airport lighting control system, the system comprising:
The data integration module is used for collecting the state data of each aircraft, the airport environment data and the obstacle data in real time, and integrating the data through a data fusion algorithm to generate comprehensive airport scene information;
the model construction module is used for constructing a real-time airport scene model based on airport scene information and dynamically simulating the motion trail of the aircraft by using a prediction algorithm based on aircraft state data;
The illumination scheme making module is used for analyzing the airport scene model according to the machine learning algorithm, predicting the behavior of the aircraft, carrying out priority assessment on each aircraft, and making a target illumination scheme based on the behavior prediction result and the priority assessment result;
And the illumination adjusting module is used for automatically adjusting the brightness, the color, the flicker frequency and the indication direction of the lamplight according to the formulated target illumination scheme.
7. The airport lighting control system of claim 6, wherein the data integration module comprises:
the data processing unit is used for carrying out data cleaning, data calibration and standardization processing on the state data of each aircraft, the airport environment data and the obstacle data;
And the data integration unit is used for carrying out data integration on the state data of each aircraft, the airport environment data and the obstacle data by adopting a Kalman filtering or multi-source information fusion method to generate comprehensive airport scene information.
8. The airport lighting control system of claim 6, wherein the model building module comprises:
the first model building unit is used for building an airport scene model by utilizing a three-dimensional modeling tool based on geographic information and a structural diagram of an airport, and dynamically updating the environment state and the obstacle position in the airport scene model according to airport environment data and obstacle data collected in real time;
The second model construction unit is used for constructing a kinematic model of the aircraft according to the motion characteristics of the aircraft, and selecting a preset prediction algorithm to predict the motion trail of the aircraft;
The track simulation unit is used for taking the real-time acquired aircraft state data as input, predicting the motion track of the aircraft by using the constructed kinematic model and the selected prediction algorithm, and dynamically simulating the motion track of the aircraft in the airport scene model;
The track display unit is used for visually displaying the predicted motion track of the aircraft in the airport scene model and displaying the current position of the aircraft and the predicted motion track;
The optimizing unit is used for comparing the motion trail obtained by simulation with the actually collected aircraft state data, evaluating the accuracy of the prediction algorithm, and adjusting and optimizing the kinematic model and the prediction algorithm according to the evaluation result.
9. The airport lighting control system of claim 6, wherein the lighting scheme formulation module comprises:
the model training unit is used for extracting key features from the airport scene model and training the machine learning model by using a preset machine learning algorithm based on the extracted key features;
The prediction evaluation unit is used for inputting the acquired aircraft state data into the trained machine learning model, predicting the future behaviors of the aircraft and evaluating the priority of each aircraft;
And the lighting scheme making unit is used for making a corresponding target lighting scheme according to the behavior prediction result and the priority evaluation result of the aircraft.
10. The airport lighting control system of claim 6, wherein the system further comprises:
The running path optimization module is used for optimizing the running path by adopting a collaborative operation strategy optimization algorithm based on the behavior prediction result of each aircraft;
The running path correction module is used for correcting the optimized running path according to the real-time acquired aircraft state data;
And the data transmission module is used for transmitting the corrected and optimized running path to the aircraft through the communication link so as to enable the aircraft to dynamically adjust the route according to the received running path.