CN113946641B - Forest fire area judging method - Google Patents
- Publication number
- CN113946641B (application CN202111101112.2A)
- Authority
- CN
- China
- Prior art keywords
- fire
- fire point
- sensor
- latitude
- longitude
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/0014—Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation from gases, flames
- G01J5/0018—Flames, plasma or welding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/38—Information transfer, e.g. on bus
- G06F13/42—Bus transfer protocol, e.g. handshake; Synchronisation
- G06F13/4282—Bus transfer protocol, e.g. handshake; Synchronisation on a serial bus, e.g. I2C bus, SPI bus
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
- Y02A40/28—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming
Abstract
The invention discloses a forest fire area judging method, belonging to the technical field of satellite attitude control. The method serves a novel satellite-borne fire point detection sensor system: while fire point detection is performed, the longitude and latitude of each fire point are calculated and it is determined whether the fire point is a forest fire point, providing the most direct fire information to national forest fire prevention units. First, the data interaction between the novel fire point sensor and an upper computer is specified. Second, the fire point longitude and latitude calculated by the upper computer are matched against the global forest areas preloaded by the system, to judge whether the fire point lies in a forest area and whether it lies inside or outside the national border (domestic or overseas).
Description
Technical Field
The invention relates to a forest fire area judging method, and belongs to the technical field of satellite attitude control.
Background
The fire point detection sensor is a novel satellite-borne sensor. The system calculates the longitude and latitude of each fire point from the sensor's output, but what a sensor user cares about most is whether a fire point is a forest fire point, so the system must judge the forest fire area. No satellite-borne real-time fire area judging method currently exists, and to make the judgment in real time, technical difficulties such as data interaction between the sensor and the upper computer and matching of fire point information against forest areas urgently need to be solved.
Disclosure of Invention
The invention aims to overcome the above defects and provide a forest fire area judging method. Through interaction between an upper computer system and a fire point sensor, the method receives the fire point azimuth information output by the sensor, converts it into the geographic longitude and latitude of the fire point, matches the fire point against a number of forest areas preloaded by the system, and informs the user whether the fire point is a domestic forest fire point or an overseas forest fire point.
In order to achieve the above purpose, the present invention provides the following technical solutions:
A forest fire area judging method comprises the following steps:
(1) The upper computer system obtains, from the map information of N forest areas, the upper and lower geographic latitude limits of each area; for the i-th forest area the upper latitude limit is denoted λ_max,i and the lower latitude limit λ_min,i, with 1 ≤ i ≤ N;
(2) The upper computer system divides the i-th forest area into j_max latitude strips at an equal latitude interval Δλ_i, and obtains the upper geographic longitude limit L_max,ij and lower geographic longitude limit L_min,ij of the forest area on each latitude strip;
(3) The upper computer system interacts with the fire point sensor, receives the fire point azimuth information output by the sensor, and converts it into the geographic longitude and latitude (L_m, λ_m) of the fire point;
(4) The upper computer system judges the occurrence area of the fire point according to its geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij.
Further, in step (4), the method for judging the occurrence area of the fire point from its geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij is as follows:
(411) i = 1;
(412) judge whether the fire point latitude λ_m is within the latitude range (λ_min,i, λ_max,i) of the i-th forest area;
if not, go to step (417);
if so, go to step (413);
(413) j = 1;
(414) judge whether the fire point longitude L_m is within the longitude range (L_min,ij, L_max,ij) of the j-th latitude strip of the i-th forest area;
if so, go to step (415);
if not, go to step (416);
(415) if i ∈ [1, N], output the prompt "the fire point is in a forest area" together with the longitude and latitude information of the j-th latitude strip of the i-th forest area, and end the loop;
(416) if j > j_max, go to step (417); otherwise j = j + 1 and return to step (414);
(417) if i ≤ N, set i = i + 1 and return to step (412); if i > N, end the judging flow, output "the fire point is not in a forest area", and end the loop.
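The nested area/strip search in steps (411)-(417) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the data layout (per-area latitude bounds plus a list of per-strip longitude bounds) and all names are assumptions.

```python
# Each forest area i is (lat_min, lat_max, strips), where strips holds the
# per-latitude-strip longitude bounds (L_min,ij, L_max,ij).
def locate_fire(lam_m, L_m, areas):
    """Return (i, j) of the forest area and strip containing the fire, or None."""
    for i, (lat_min, lat_max, strips) in enumerate(areas):
        if not (lat_min <= lam_m <= lat_max):   # step (412): latitude check
            continue                            # step (417): try the next area
        for j, (lon_min, lon_max) in enumerate(strips):
            if lon_min <= L_m <= lon_max:       # step (414): longitude check
                return i, j                     # step (415): fire is in this strip
    return None                                 # steps (416)/(417): not in any area

# Usage: one area spanning latitudes 30-40 with two overlapping strips
areas = [(30.0, 40.0, [(100.0, 110.0), (102.0, 112.0)])]
print(locate_fire(35.0, 105.0, areas))  # -> (0, 0)
print(locate_fire(50.0, 105.0, areas))  # -> None
```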
Further, N = N1 + N2, where N1 is the number of domestic forest fire areas and N2 is the number of overseas forest fire areas.
The method for judging the occurrence area of the fire point from its geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij is then as follows:
(421) i = 1;
(422) judge whether the fire point latitude λ_m is within the latitude range (λ_min,i, λ_max,i) of the i-th forest area;
if not, go to step (427);
if so, go to step (423);
(423) j = 1;
(424) judge whether the fire point longitude L_m is within the longitude range (L_min,ij, L_max,ij) of the j-th latitude strip of the i-th forest area;
if so, go to step (425);
if not, go to step (426);
(425) if i ∈ [1, N1], output "the fire point is a domestic forest fire point" and end the loop; otherwise output "the fire point is an overseas forest fire point" and end the loop;
(426) if j > j_max, go to step (427); otherwise j = j + 1 and return to step (424);
(427) if i ≤ N, set i = i + 1 and return to step (422); if i > N, end the judging flow, output "the fire point is not in a forest area", and end the loop.
Further, in step (3), the geographic longitude and latitude information of the fire point is the confirmed geographic longitude and latitude information; the interaction method between the upper computer system and the fire point sensor is as follows:
(31) the fire point sensor outputs the acquired fire point information to the upper computer in the form of a TM_FIRE data packet; the fire point information comprises the fire point extraction number, the real fire point flag, and the fire point azimuth information;
(32) the upper computer receives the TM_FIRE data packet from the fire point sensor, judges the fire point extraction number and the real fire point flag, and according to the result either sends no instruction, sends a TC_FIREDATA instruction, or sends a TC_PRE_FIREDATA instruction to the fire point sensor; both the TC_FIREDATA and TC_PRE_FIREDATA instructions contain the fire point extraction number, the real fire point flag, and the fire point longitude and latitude information; TC_FIREDATA corresponds to confirmed fire points and TC_PRE_FIREDATA to pre-confirmation fire points;
(33) the fire point sensor receives the TC_FIREDATA instruction from the upper computer and stores the fire point information it contains into an independent area reserved inside the sensor;
the upper computer then moves the fire point information stored by the sensor into its own memory area over the 1553B bus, obtaining the confirmed geographic longitude and latitude information of the fire point.
Further, in step (31), the fire point sensor outputs a first TM_FIRE and a second TM_FIRE in sequence each cycle, corresponding respectively to the pre-confirmation fire point information of the sensor's first identification and the confirmed fire point information of its second identification;
when the fire point sensor identifies a fire point the first time, the fire point extraction number in the first TM_FIRE is greater than 0 and the real fire point flag is 0; if the second identification confirms the fire point, the fire point extraction number in the second TM_FIRE is greater than 0 and the real fire point flag is 1; if it does not, the fire point extraction number in the second TM_FIRE is 0 and the real fire point flag is 0;
when the fire point sensor identifies no fire point the first time, the fire point extraction numbers in both the first and second TM_FIRE outputs are 0 and the real fire point flags are 0.
Further, in step (32), the method by which the upper computer receives the fire point information from the fire point sensor and judges the fire point extraction number and the real fire point flag is as follows:
if the fire point extraction number is 0, the fire point longitude and latitude are not calculated and the upper computer replies with neither a TC_FIREDATA nor a TC_PRE_FIREDATA instruction;
if the fire point extraction number is greater than 0 and the real fire point flag is 0, the upper computer calculates the longitude and latitude of all extracted fire points and sends a TC_PRE_FIREDATA instruction to the fire point sensor;
if the fire point extraction number is greater than 0 and the real fire point flag is 1, the upper computer calculates the longitude and latitude of all extracted fire points and sends a TC_FIREDATA instruction to the fire point sensor.
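The reply rule just described maps each received packet to at most one instruction. A minimal sketch (the packet and instruction names TM_FIRE, TC_FIREDATA and TC_PRE_FIREDATA follow the patent; the function and parameter names are assumptions):

```python
def choose_reply(num_extracted, real_fire_flag):
    """Map a received TM_FIRE packet to the instruction the upper computer sends back."""
    if num_extracted == 0:
        return None                 # no fire points: no lat/lon computed, no reply
    if real_fire_flag == 0:
        return "TC_PRE_FIREDATA"    # pre-confirmation fire points
    return "TC_FIREDATA"            # confirmed fire points

print(choose_reply(0, 0))   # -> None
print(choose_reply(3, 0))   # -> TC_PRE_FIREDATA
print(choose_reply(3, 1))   # -> TC_FIREDATA
```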
Further, the fire point azimuth information is the unit-vector representation (x, y, z) of the fire point vector in the sensor's own reference mirror coordinate system, satisfying x^2 + y^2 + z^2 = 1. The sensor's reference mirror coordinate system is defined as follows: the origin is the centre of the reference mirror; the +Z axis is the direction of the sensor's optical axis; the +X axis is the direction from the centre of the reference mirror toward the connector that fixes the sensor to the satellite body; and the +Y axis completes a right-handed rectangular coordinate system with the +X and +Z axes.
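A minimal check of the unit-vector constraint above (the function name and tolerance are assumptions, not part of the patent):

```python
def is_unit_vector(x, y, z, tol=1e-6):
    """True if (x, y, z) satisfies x^2 + y^2 + z^2 = 1 within tolerance."""
    return abs(x * x + y * y + z * z - 1.0) <= tol

print(is_unit_vector(0.0, 0.0, 1.0))   # boresight direction (+Z axis) -> True
print(is_unit_vector(0.6, 0.8, 0.0))   # -> True
print(is_unit_vector(1.0, 1.0, 0.0))   # -> False
```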
Further, in the interaction method between the upper computer system and the fire point sensor, the fire point sensor and the upper computer are time-synchronised;
the upper computer periodically sends a time calibration instruction to the fire point sensor; upon receiving it, the sensor immediately latches its own time and compensates it by the difference between the upper computer time carried in the instruction and its own time, so that the sensor and upper computer times agree;
when the fire point sensor receives no time calibration instruction from the upper computer within a certain period, it keeps time by its own internal clock.
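The compensation rule can be sketched as follows, assuming the time is carried as a seconds/microseconds pair (as the calibration instruction in the embodiment does); the function name and integer representation are illustrative assumptions:

```python
def compensate(sensor_s, sensor_us, host_s, host_us):
    """Apply the host-minus-sensor offset to the latched sensor time.

    Returns the compensated sensor time as (seconds, microseconds).
    """
    offset = (host_s - sensor_s) * 1_000_000 + (host_us - sensor_us)
    total = sensor_s * 1_000_000 + sensor_us + offset   # now equals host time
    return total // 1_000_000, total % 1_000_000

print(compensate(100, 500000, 100, 750000))  # -> (100, 750000)
print(compensate(99, 900000, 100, 100000))   # -> (100, 100000)
```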
Further, the TM_FIRE data packet, the TC_FIREDATA instruction and the TC_PRE_FIREDATA instruction also contain fire point area and feature information, and the upper computer system judges the occurrence area of the fire point and outputs the fire point area and feature information according to the fire point's geographic longitude and latitude (L_m, λ_m), the latitude range determined by λ_max,i and λ_min,i, and the longitude range determined by L_max,ij and L_min,ij.
Compared with the prior art, the invention has the following beneficial effects:
(1) the invention makes the data interaction between the upper computer and the fire point sensor explicit: it receives the fire point azimuth information output by the sensor and converts it into the geographic longitude and latitude of the fire point; it specifies the exact content of the sensor's fire point information output; and it designs the interaction content separately for the two cases in which the sensor does and does not detect a fire point, improving the sensor's practicality and intelligence;
(2) the invention delimits each forest map area precisely by the upper and lower geographic latitude limits and the upper and lower geographic longitude limits of each latitude strip, judges whether the fire point longitude and latitude fall inside each forest area, and thereby determines the occurrence area of the fire point accurately;
(3) the invention can inform the user in real time whether a fire point is a domestic or overseas forest fire point, providing the most direct fire information to the national forest fire prevention department.
Drawings
FIG. 1 is a schematic diagram of a prior art fire sensor;
FIG. 2 is a schematic diagram of a rotary encoder of a fire sensor in the prior art;
FIG. 3 is a schematic diagram showing data interaction between a fire sensor and an upper computer in the present invention;
FIG. 4 is a schematic diagram of cloud data output according to the present invention;
FIG. 5 is a schematic view of the longitude and latitude area division of the forest area according to the present invention;
FIG. 6 is a general flow chart of a forest fire area judging method of the present invention;
FIG. 7 is a flowchart of determining the fire occurrence area in the forest fire area judging method of the present invention.
Detailed Description
The features and advantages of the present invention will become more apparent and clear from the following detailed description of the invention.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration". Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
As shown in figs. 1 and 2, the fire point detection sensor comprises an optical lens, a motor, a code disc carrying various optical filters, an opaque correction baffle, and a full-pass element. The motor is connected to a speed reducer, the code disc is fixed on the reducer's output shaft, and the various optical filters (for example different long-wave and mid-wave infrared filters), the correction baffle and the full-pass element are arranged on the rotary disc around the reducer's rotation axis. Driven by the reducer, the disc rotates so that the required filter sits in the optical path to perform fire point detection, and the azimuth information of the fire point in the sensor's own reference mirror coordinate system is sent to the upper computer system.
As shown in fig. 3 and fig. 4, the data interaction method between the fire point sensor and the upper computer in the invention comprises the following steps:
First, fire point information output: as shown in fig. 2, the filter-wheel code disc inside the fire point sensor completes one revolution in about 1.5-2 s and outputs two valid TM_FIRE data packets (with changing frame counts) per revolution:
the fire point sensor outputs a first TM_FIRE and a second TM_FIRE in sequence each cycle, corresponding respectively to the pre-confirmation fire point information of the first identification and the confirmed fire point information of the second identification;
when the fire point sensor identifies a fire point the first time, the fire point extraction number in the first TM_FIRE is greater than 0 and the real fire point flag is 0; if the second identification confirms the fire point, the fire point extraction number in the second TM_FIRE is greater than 0 and the real fire point flag is 1; if it does not, the fire point extraction number in the second TM_FIRE is 0 and the real fire point flag is 0;
if no fire point is identified, the fire point extraction numbers in both the first and second TM_FIRE outputs are 0;
Then, fire point information input: after receiving a TM_FIRE data packet, the upper computer judges the "numFireExtracted" fire point extraction number n and the real fire point flag in the packet;
if the fire point extraction number n is 0, no longitude or latitude needs to be calculated and no TC_FIREDATA or TC_PRE_FIREDATA instruction is returned to the sensor;
if n > 0 and the real fire point flag is 0, the TM_FIRE packet corresponds to pre-confirmation fire points: the upper computer calculates the longitude and latitude of all extracted fire points and informs the sensor through a TC_PRE_FIREDATA instruction, and the sensor uses the data in the instruction for windowing;
if n > 0 and the real fire point flag is 1, the TM_FIRE packet corresponds to confirmed fire points: the upper computer calculates the longitude and latitude of all extracted fire points, eliminates pseudo fire points against the preloaded pseudo-fire-point data, rearranges the remaining fire points, and returns them to the sensor through a TC_FIREDATA instruction.
During the interaction, time consistency between the upper computer and the sensor must be guaranteed. A preferred method is for the upper computer to send time calibration instructions to the sensor, specifically:
the upper computer sends a time calibration instruction containing its own time to the sensor every 1 s; on receiving it, the sensor immediately latches its own time and compensates it by the difference between the received upper computer time and its own, keeping the two times consistent.
Further, to improve sensor reliability and fire point prediction accuracy, the upper computer sends the required auxiliary data to the sensor (longitude and latitude of the sensor's field-of-view centre, attributes of the field-of-view centre's ground projection point, velocity of that projection point, distance from the line-of-sight ground projection point to the satellite, sensor angular velocity, and so on).
Further, the sensor outputs cloud judgment data: it detects cloud-layer targets in the field of view in real time and can output cloud judgment results for 32 × 8 sub-regions;
Further, the sensor autonomously stores fire point information and outputs it on demand: an independent area in the sensor is reserved for storing fire point information, and the upper computer moves the stored information into the upper computer software memory area over the 1553B bus.
Further, the on-orbit maintenance of the sensor is also included: the upper computer performs on-orbit maintenance on parameters and software of the sensor through an on-orbit maintenance instruction.
Example 1
As shown in fig. 4, in this embodiment, the content of the data interaction method mainly includes:
(1) Time calibration instruction: the upper computer sends the time calibration instruction TC_SYNC_TIMEABS to the sensor every 1 s; its content is shown in Table 1:
TABLE 1 Time calibration instruction TC_SYNC_TIMEABS content
Content | Bytes | Format | Equivalent |
Time_s | 4 | Unsigned integer | 1 s |
Time_us | 4 | Unsigned integer | 1 µs |
On receiving the instruction, the sensor immediately latches its own time and compensates it by the difference between the received upper computer time and its own, keeping the two consistent. If the upper computer fails to send the instruction periodically, i.e. the sensor receives no upper computer time within a certain period, the sensor must still keep time by its internal clock. Time unification between the sensor and the upper computer is thereby achieved.
(2) Sensor auxiliary data: to improve sensor reliability and fire point prediction accuracy, the upper computer sends the required auxiliary data to the sensor (longitude and latitude of the field-of-view centre, attributes of the field-of-view centre's ground projection point, velocity of that projection point, distance from the line-of-sight ground projection point to the satellite, and so on). The field-of-view centre longitude and latitude give the position of the sensor boresight's ground projection point; the ground projection point attribute describes that point, e.g. land or sea, on the Earth or off it. The sensor uses these auxiliary data to assist fire judgment: for example, a fire identified in an ocean area may be an offshore drilling platform rather than a forest fire, and a "fire" identified off the Earth may be solar illumination.
(3) Fire point information output: the code disc inside the fire point sensor completes one revolution in about 1.5-2 s and outputs a valid TM_FIRE data packet twice per revolution; the packet content is shown in Table 2. The real fire point flag is 1 or 0: 1 denotes a confirmed real fire point, 0 a pre-confirmation fire point or no fire point. The fire point azimuth is the unit-vector representation (X, Y, Z) of the fire point vector in the sensor's own reference mirror coordinate system, satisfying X^2 + Y^2 + Z^2 = 1:
Table 2 TM_FIRE data packet content
If the fire point sensor identifies a fire point, the first TM_FIRE output of the cycle is the "pre-confirmation fire point data" and the second is the "confirmed fire point data" (if a real fire point exists); if no real fire point is confirmed, the fire point extraction number in the second TM_FIRE output is 0;
if the fire point sensor identifies no fire point, the fire point extraction numbers in both the first and second TM_FIRE outputs are 0;
(4) Fire point information input: after receiving a TM_FIRE data packet, the upper computer judges the fire point extraction number and the real fire point flag in the packet:
if the fire point extraction number is 0, no fire point longitude or latitude needs to be calculated and no TC_FIREDATA or TC_PRE_FIREDATA instruction is returned to the sensor;
if the fire point extraction number is greater than 0 and the real fire point flag is 0, the TM_FIRE packet corresponds to pre-confirmation fire points: the upper computer calculates the longitude and latitude of all extracted fire points and sends a TC_PRE_FIREDATA instruction to the sensor. The longitude and latitude are computed from the azimuth information output by the sensor, the sensor's installation information within the whole satellite, and the satellite orbit and attitude information at the moment corresponding to the fire point; for the specific calculation, the "remote sensing image geometric positioning" method can be referred to;
if the fire point extraction number is greater than 0 and the real fire point flag is 1, the TM_FIRE packet corresponds to confirmed fire points: the upper computer calculates the longitude and latitude of all extracted fire points and sends a TC_FIREDATA instruction to the sensor;
the content of the TC_FIREDATA and TC_PRE_FIREDATA instructions is shown in Table 3; the real fire point flag is 1 in TC_FIREDATA and 0 in TC_PRE_FIREDATA:
Table 3 TC_FIREDATA / TC_PRE_FIREDATA instruction content
Sequence number | Instruction content |
1 | Sensor time_s |
2 | Sensor time_us |
3 | Fire point extraction number |
4 | Real fire point flag |
5 | Fire point 1 longitude |
6 | Fire point 1 latitude |
7 | Fire point 1 area and feature information |
… | … |
… | Fire point n longitude |
… | Fire point n latitude |
… | Fire point n area and feature information |
(5) Cloud judgment data output: as shown in fig. 4, the sensor detects cloud targets in the field of view in real time and can output cloud judgment results for 32 × 8 sub-regions. A symmetric total cloud judgment area centred on the sensor's field-of-view centre is intercepted and divided equally into 32 × 8 sub-regions; a cloud judgment result is output to the upper computer for each sub-region, and the upper computer performs data selection or other operations according to the per-sub-region results. Each sub-region's cloud judgment is one of the results in Table 4:
TABLE 4 possible cloud determination results for each sub-region
Sequence number | Cloud judgment result |
1 | Determining clear sky |
2 | Suspected clear sky |
3 | Suspected cloud |
4 | Determining cloud |
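The 32 × 8 sub-region grid and its four-valued judgment can be sketched as follows. This is a minimal illustration: the numeric codes reuse the sequence numbers of Table 4, but the constant and function names are assumptions.

```python
# Cloud judgment results per Table 4 (codes 1-4)
CLEAR, SUSPECT_CLEAR, SUSPECT_CLOUD, CLOUD = 1, 2, 3, 4

def make_grid(default=CLEAR):
    """Build the 32 x 8 cloud-judgment grid (8 rows of 32 sub-regions)."""
    return [[default] * 32 for _ in range(8)]

grid = make_grid()
grid[3][15] = CLOUD   # mark one sub-region as definitely cloudy
cloudy = sum(cell == CLOUD for row in grid for cell in row)
print(len(grid) * len(grid[0]), cloudy)  # -> 256 1
```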
(6) Output of sensor-stored fire point information: an independent area in the sensor is reserved for storing fire point information; the sensor stores the fire point data autonomously, and the ground can obtain it whenever needed. Specifically, the upper computer moves the fire point information stored by the sensor into the upper computer software memory area over the 1553B bus. To improve data usability, the sensor is required to store only the fire point information from TC_FIREDATA instruction packets.
(7) And (3) on-orbit maintenance of the sensor: the upper computer performs on-orbit maintenance on parameters and software of the sensor through an on-orbit maintenance instruction.
As shown in fig. 5, 6 and 7, the forest fire area judging method of the present invention comprises:
S1, the system preloads map information for N1 forest areas in China and N2 forest areas elsewhere in the world:
(1) For each forest area, give the upper latitude limit λ_maxi and the lower latitude limit λ_mini of the area;
(2) Divide the area into j latitude strips at an equal latitude interval Δλ_i; each strip records the upper geographic longitude limit L_maxij and the lower geographic longitude limit L_minij of the forest area, as shown in fig. 3.
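One way the preloaded map of step S1 might be held in memory. This is a sketch under stated assumptions: the class names, fields, and the strip-lookup helper are illustrative, not the document's data layout.

```python
from dataclasses import dataclass

@dataclass
class LatStrip:
    lon_min: float  # L_minij: lower geographic longitude limit of this strip
    lon_max: float  # L_maxij: upper geographic longitude limit of this strip

@dataclass
class ForestArea:
    lat_min: float         # λ_mini: lower latitude limit of the area
    lat_max: float         # λ_maxi: upper latitude limit of the area
    dlat: float            # Δλ_i: equal latitude interval between strips
    strips: list           # j_max strips, ordered from lat_min upward
    domestic: bool = True  # the first N1 areas are domestic, the next N2 foreign

def strip_index(area: ForestArea, lat: float) -> int:
    """Return the 0-based latitude strip of `area` containing latitude `lat`,
    clamped to the last strip."""
    return min(int((lat - area.lat_min) / area.dlat), len(area.strips) - 1)
```

Because the strips are cut at an equal interval Δλ_i, the strip containing a given latitude can be computed directly instead of searched.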
S2, judging forest fire area
According to the geographic longitude and latitude information of the fire point, the system matches it against the preloaded global forest areas and judges whether the fire point lies in a forest area and whether that area is inside or outside the country, specifically as follows:
(1) i = 1;
(2) Judge, from the input fire point latitude λ_m, whether the fire point is in the latitude range (λ_mini, λ_maxi) of the i-th forest area:
if not, the fire point is not in the i-th forest area; go to step (7);
if so, go to the next step;
(3) j = 1;
(4) Judge, from the input fire point longitude L_m, whether the fire point is in the longitude range (L_minij, L_maxij) of the j-th latitude strip of this forest area:
if so, go to step (5);
if not, go to step (6);
(5) If i ∈ [1, N1], output that the fire point is an internal forest fire and end the cycle; otherwise output that the fire point is an external forest fire and end the cycle;
(6) If j > j_max, go to step (7); otherwise j = j+1 and return to step (4);
(7) If i ≤ N1+N2, i = i+1 and return to step (2); if i > N1+N2, end the judging flow, output "the fire point is not in a forest area", and end the cycle.
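The judging loop of steps (1) to (7) can be sketched as follows. The dict layout used for a preloaded forest area is an assumption for illustration; the control flow mirrors the numbered steps.

```python
def judge_fire_area(lon_m: float, lat_m: float, areas: list, n1: int) -> str:
    """Sketch of steps (1)-(7). Each area is a hypothetical dict:
    {'lat': (lat_min, lat_max), 'strips': [(lon_min, lon_max), ...]}.
    `areas` holds the N1 domestic areas first, then the N2 foreign ones."""
    for i, area in enumerate(areas):                  # steps (1), (2), (7): area loop
        lat_min, lat_max = area['lat']
        if not (lat_min < lat_m < lat_max):
            continue                                  # fire point not in area i
        for lon_min, lon_max in area['strips']:       # steps (3), (4), (6): strip loop
            if lon_min < lon_m < lon_max:             # step (5): longitude match
                return ("the fire point is an internal forest fire"
                        if i < n1 else
                        "the fire point is an external forest fire")
    return "the fire point is not in a forest area"   # step (7): all areas exhausted
```

Since the area list is ordered with the N1 domestic areas first, the internal/external verdict falls out of the matched index alone, exactly as the i ∈ [1, N1] test in step (5) does.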
The invention has been described in detail in connection with the specific embodiments and exemplary examples thereof, but such description is not to be construed as limiting the invention. It will be understood by those skilled in the art that various equivalent substitutions, modifications or improvements may be made to the technical solution of the present invention and its embodiments without departing from the spirit and scope of the present invention, and these fall within the scope of the present invention. The scope of the invention is defined by the appended claims.
What is not described in detail in the present specification is a well known technology to those skilled in the art.
Claims (7)
1. A forest fire area judging method, characterized by comprising the following steps:
(1) The upper computer system obtains, from the map information of N forest areas, the upper and lower geographic latitude limits of each area, the upper geographic latitude limit of the i-th forest area being denoted λ_maxi and the lower geographic latitude limit λ_mini, with 1 ≤ i ≤ N;
(2) The upper computer system divides the i-th forest area into j_max latitude strips at an equal latitude interval Δλ_i, and obtains the upper geographic longitude limit L_maxij and the lower geographic longitude limit L_minij of the forest area on each latitude strip;
(3) The upper computer system interacts with the fire point sensor, receives the fire point azimuth information input by the fire point sensor, and converts it into the geographic longitude and latitude information (L_m, λ_m) of the fire point;
(4) The upper computer system judges the area where the fire point occurred according to the geographic longitude and latitude information (L_m, λ_m) of the fire point, the latitude range determined by λ_maxi and λ_mini, and the longitude range determined by L_maxij and L_minij;
N = N1 + N2, where N1 is the number of internal forest fire areas and N2 is the number of external forest fire areas;
In step (4), the method for judging the area where the fire point occurred according to the geographic longitude and latitude information (L_m, λ_m) of the fire point, the latitude range determined by λ_maxi and λ_mini, and the longitude range determined by L_maxij and L_minij is as follows:
(411) i = 1;
(412) Judge whether the fire point latitude λ_m is in the latitude range (λ_mini, λ_maxi) of the i-th forest area;
if not, go to step (417);
if so, go to step (413);
(413) j = 1;
(414) Judge whether the fire point longitude L_m is in the longitude range (L_minij, L_maxij) of the j-th latitude strip of the i-th forest area;
if so, go to step (415);
if not, go to step (416);
(415) If i ∈ [1, N], output a prompt that the fire point is in a forest area, output the longitude and latitude information of the j-th latitude strip of the i-th forest area, and end the cycle;
(416) If j > j_max, go to step (417); otherwise j = j+1 and return to step (414);
(417) If i ≤ N, i = i+1 and return to step (412); if i > N, end the judging flow, output "the fire point is not in a forest area", and end the cycle;
or alternatively,
in step (4), the method for judging the area where the fire point occurred according to the geographic longitude and latitude information (L_m, λ_m) of the fire point, the latitude range determined by λ_maxi and λ_mini, and the longitude range determined by L_maxij and L_minij is as follows:
(421) i = 1;
(422) Judge whether the fire point latitude λ_m is in the latitude range (λ_mini, λ_maxi) of the i-th forest area;
if not, go to step (427);
if so, go to step (423);
(423) j = 1;
(424) Judge whether the fire point longitude L_m is in the longitude range (L_minij, L_maxij) of the j-th latitude strip of the i-th forest area;
if so, go to step (425);
if not, go to step (426);
(425) If i ∈ [1, N1], output that the fire point is an internal forest fire and end the cycle; otherwise output that the fire point is an external forest fire and end the cycle;
(426) If j > j_max, go to step (427); otherwise j = j+1 and return to step (424);
(427) If i ≤ N, i = i+1 and return to step (422); if i > N, end the judging flow, output "the fire point is not in a forest area", and end the cycle.
2. The forest fire area judging method according to claim 1, wherein in step (3) the geographic longitude and latitude information of the fire point is that of the fire point after confirmation; the interaction method between the upper computer system and the fire point sensor is as follows:
(31) The fire point sensor outputs the acquired fire point information to the upper computer in the form of a TM_FIRE data packet; the fire point information comprises the fire point extraction number, the real fire point flag, and the fire point azimuth information;
(32) The upper computer receives the TM_FIRE data packet input by the fire point sensor, makes a judgment according to the fire point extraction number and the real fire point flag, and according to the result either sends no instruction to the fire point sensor, sends a TC_FIREDATA instruction, or sends a TC_PRE_FIREDATA instruction; both the TC_FIREDATA and TC_PRE_FIREDATA instructions comprise the fire point extraction number, the real fire point flag, and the fire point longitude and latitude information; TC_FIREDATA corresponds to the fire point after confirmation, and TC_PRE_FIREDATA to the fire point before confirmation;
(33) The fire point sensor receives the TC_FIREDATA instruction input by the upper computer and stores the fire point information contained in it into the independent area set aside by the fire point sensor;
and the upper computer extracts the fire point information stored by the fire point sensor into the memory area of the upper computer through the 1553B bus, obtaining the geographic longitude and latitude information of the fire point after confirmation.
3. The forest fire area judging method according to claim 2, wherein in step (31) the fire point sensor outputs a first TM_FIRE and a second TM_FIRE in sequence each cycle, corresponding respectively to the fire point information from the fire point sensor's first identification and from its second identification;
when the fire point sensor identifies a fire point the first time, the fire point extraction number in the first TM_FIRE is greater than 0 and the real fire point flag is 0; if the fire point is identified and confirmed the second time, the fire point extraction number in the second TM_FIRE is greater than 0 and the real fire point flag is 1; if it is not identified and confirmed, the fire point extraction number in the second TM_FIRE is 0 and the real fire point flag is 0;
when the fire point sensor does not identify a fire point the first time, the fire point extraction number in both the first and the second TM_FIRE outputs is 0 and the real fire point flag is 0.
4. The forest fire area judging method according to claim 2, wherein in step (32) the upper computer receives the fire point information input by the fire point sensor and judges according to the fire point extraction number and the real fire point flag as follows:
if the fire point extraction number is 0, the fire point longitude and latitude are not calculated, and the upper computer does not reply with a TC_FIREDATA or TC_PRE_FIREDATA instruction to the fire point sensor;
if the fire point extraction number is greater than 0 and the real fire point flag is 0, the upper computer calculates the longitude and latitude information of all extracted fire points and sends a TC_PRE_FIREDATA instruction to the fire point sensor;
if the fire point extraction number is greater than 0 and the real fire point flag is 1, the upper computer calculates the longitude and latitude of all extracted fire points and sends a TC_FIREDATA instruction to the fire point sensor.
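The three-way decision of claim 4 can be sketched as a small dispatch function. The instruction names come from the document; the function itself and its return convention are illustrative assumptions.

```python
def choose_reply(n_points: int, real_fire_flag: int):
    """Which instruction (if any) the upper computer returns for a
    received TM_FIRE packet, per the claim-4 decision table."""
    if n_points == 0:
        return None               # no fire points extracted: reply nothing
    if real_fire_flag == 1:
        return "TC_FIREDATA"      # confirmed fire: store-worthy data
    return "TC_PRE_FIREDATA"      # extracted but unconfirmed fire
```

The extraction count gates everything, so longitude/latitude computation can also be skipped whenever `choose_reply` returns `None`.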
5. The forest fire area judging method according to claim 1 or 2, wherein the fire point azimuth information is the unit-vector representation (x, y, z) of the fire point vector in the sensor's own reference mirror coordinate system, satisfying x² + y² + z² = 1; the sensor's own reference mirror coordinate system is defined as follows: the center of the reference mirror is the coordinate origin, the direction of the sensor optical axis is the +Z axis, the direction from the center of the reference mirror toward the connector fixing the sensor to the satellite is the +X axis, and the +Y axis forms a right-handed rectangular coordinate system with the +X and +Z axes.
6. The forest fire area judging method according to claim 2, wherein, in the interaction method between the upper computer system and the fire point sensor, the time of the fire point sensor and of the upper computer is synchronized;
the upper computer periodically sends a time calibration instruction to the fire point sensor; on receiving the instruction, the fire point sensor immediately latches its own time and compensates it by the difference between the upper computer time contained in the instruction and its own time, keeping the fire point sensor time consistent with the upper computer time;
when the fire point sensor receives no time calibration instruction from the upper computer within a certain period, it maintains its time according to its own internal clock.
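The latch-and-compensate calibration of claim 6 can be sketched as follows. The microsecond units and the `SensorClock` class are assumptions for illustration; the key point is that the sensor latches its own time first and then applies the host-minus-sensor difference.

```python
class SensorClock:
    """Sketch of the claim-6 time calibration on the sensor side."""

    def __init__(self, time_us: int):
        self.time_us = time_us   # sensor's own free-running time
        self.offset_us = 0       # accumulated calibration correction

    def now(self) -> int:
        return self.time_us + self.offset_us

    def on_calibration(self, host_time_us: int) -> None:
        latched = self.now()                     # latch own time immediately
        self.offset_us += host_time_us - latched # compensate by the difference
```

Between calibration instructions the clock simply free-runs on `time_us`, which matches the claim's fallback to the internal clock when no instruction arrives.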
7. The forest fire area judging method according to claim 2, wherein the TM_FIRE data packet, the TC_FIREDATA instruction, and the TC_PRE_FIREDATA instruction further comprise fire point area and characteristic information; the upper computer system judges the area where the fire point occurred according to the geographic longitude and latitude information (L_m, λ_m) of the fire point, the latitude range determined by λ_maxi and λ_mini, and the longitude range determined by L_maxij and L_minij, and outputs the fire point area and characteristic information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111101112.2A CN113946641B (en) | 2021-09-18 | 2021-09-18 | Forest fire area judging method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113946641A CN113946641A (en) | 2022-01-18 |
CN113946641B true CN113946641B (en) | 2024-07-09 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106991681A (en) * | 2017-04-11 | 2017-07-28 | 福州大学 | A kind of fire boundary vector information extract real-time and method for visualizing and system |
CN107633637A (en) * | 2017-10-25 | 2018-01-26 | 国网湖南省电力公司 | A kind of power network mountain fire satellite monitoring alarm localization method based on bivariate table interpolation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||