
CN112969977A - Capture assisting method, ground command platform, unmanned aerial vehicle, system and storage medium - Google Patents

Info

Publication number
CN112969977A
CN112969977A
Authority
CN
China
Prior art keywords
target object
capture
unmanned aerial vehicle
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080005974.2A
Other languages
Chinese (zh)
Inventor
翁松伟
陈秀秀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112969977A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 — Simultaneous control of position or course in three dimensions
    • G05D1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a capture assisting method, a ground command platform, an unmanned aerial vehicle, a system and a storage medium. The method includes: controlling a plurality of unmanned aerial vehicles to monitor a capture area (S101); acquiring a plurality of pieces of position information of a target object, collected by the unmanned aerial vehicle that detects the target object in the capture area (S102); and drawing a movement trajectory of the target object from the plurality of pieces of position information and displaying the trajectory (S103). The method and device can assist commanders in capturing the target.

Description

Capture assisting method, ground command platform, unmanned aerial vehicle, system and storage medium
Technical Field
The present application relates to the technical field of target tracking, and in particular to a capture assisting method, a ground command platform, an unmanned aerial vehicle, a system and a storage medium.
Background
With the rapid development of the unmanned aerial vehicle industry in China, drones are being applied in more and more scenarios, such as surveying and mapping, power-line inspection, oil-pipeline inspection, deep-forest fire prevention, emergency rescue and disaster relief, and searching for targets at night. However, a pursued target keeps moving. If only the target's current position is displayed, search personnel cannot anticipate the target's next move, the target cannot be captured accurately, and it may escape.
Disclosure of Invention
Based on this, the present application provides a capture assisting method, a ground command platform, an unmanned aerial vehicle, a system and a storage medium, which aim to assist commanders and search-and-capture personnel in capturing a target, reducing both the difficulty of the search and the probability that the target escapes the encirclement.
In a first aspect, the present application provides a capture assisting method, including:
controlling a plurality of unmanned aerial vehicles to monitor a capture area through respective carried shooting devices;
acquiring a plurality of position information of the target object, which is acquired when the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on the target object;
and drawing the moving track of the target object according to the position information, and displaying the moving track for a commander to view.
In a second aspect, the present application further provides a capture assisting method, which is applied to an unmanned aerial vehicle, where the unmanned aerial vehicle is equipped with a shooting device, and the method includes:
controlling the shooting device to monitor a capture area;
when a target object in the capture area is monitored, controlling the unmanned aerial vehicle to carry out flight tracking on the target object;
drawing a moving track of the target object in the process of carrying out flight tracking on the target object;
and sending the moving track to a ground command platform so that the ground command platform can display the moving track, and a commander can check the moving track.
In a third aspect, the present application further provides a ground command platform, which includes a memory and a processor; the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the steps of the capture assist method as described above.
In a fourth aspect, the present application further provides a drone, the drone comprising a memory and a processor; the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the steps of the capture assist method as described above.
In a fifth aspect, the present application further provides a capture assist system, including a ground command platform, at least one unmanned aerial vehicle, and at least one control terminal. The ground command platform is connected to the at least one unmanned aerial vehicle through a network, the control terminal is used for controlling the flight of the unmanned aerial vehicle, and the unmanned aerial vehicle carries a shooting device, wherein:
the ground command platform is used for acquiring capture task data, wherein the capture task data comprises capture task information of a capture sub-region corresponding to each unmanned aerial vehicle participating in a capture task;
the ground command platform is further used for sending the capture task information of the capture sub-area corresponding to each unmanned aerial vehicle to the corresponding control terminal;
the control terminal is used for sending the capture task information to the corresponding unmanned aerial vehicle;
the unmanned aerial vehicle is used for controlling the shooting device to monitor the corresponding capture subarea according to the capture task information;
when the unmanned aerial vehicle monitors a target object in the capture sub-area, the unmanned aerial vehicle carries out flight tracking on the target object;
the unmanned aerial vehicle is further used for acquiring a plurality of position information of the target object in the process of carrying out flight tracking on the target object and sending the position information to the ground command platform;
and the ground command platform is also used for drawing the moving track of the target object according to the position information and displaying the moving track for the command staff to check.
In a sixth aspect, the present application further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to carry out the steps of the capture assist method as described above.
The embodiments of the application provide a capture assisting method, a ground command platform, an unmanned aerial vehicle, a system and a storage medium. A plurality of unmanned aerial vehicles are controlled to monitor a capture area through the shooting devices they carry. During monitoring, a plurality of pieces of position information of a target object are acquired while the unmanned aerial vehicle that detects the target object in the capture area tracks it in flight. A movement trajectory of the target object is then drawn from this position information and displayed for the commander to view. Because the commander can anticipate the target's next move from the trajectory, he or she can direct the search-and-capture personnel to intercept it. This effectively assists commanders and search-and-capture personnel in capturing the target, reduces both the difficulty of the search and the probability that the target escapes the encirclement, and greatly improves user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for the description of the embodiments are briefly introduced below. The drawings described below illustrate only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of a scenario for implementing the capture assisting method provided in the present application;
fig. 2 is a flowchart illustrating the steps of a capture assisting method according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow diagram of a sub-step of the capture assist method of FIG. 2;
FIG. 4 is a schematic flow diagram of another sub-step of the capture assist method of FIG. 2;
fig. 5 is a flowchart illustrating steps of another capture assist method provided in an embodiment of the present application;
fig. 6 is a flowchart illustrating steps of another capture assist method provided in an embodiment of the present application;
FIG. 7 is a schematic flow diagram of a sub-step of the capture assist method of FIG. 6;
fig. 8 is a schematic block diagram of a structure of a ground command platform according to an embodiment of the present disclosure;
fig. 9 is a schematic block diagram of a structure of an unmanned aerial vehicle provided in an embodiment of the present application;
fig. 10 is a schematic block diagram of a structure of a capture assist system provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
The present application provides a capture assisting method, a ground command platform, an unmanned aerial vehicle, a system and a storage medium. Referring to fig. 1, fig. 1 is a schematic diagram of a scenario for implementing the capture assisting method provided by the present application. As shown in fig. 1, the scenario includes a ground command platform 100, at least one control terminal 200, and at least one unmanned aerial vehicle 300. The ground command platform 100 is in communication connection with each control terminal 200, the control terminal 200 is in communication connection with the unmanned aerial vehicle 300, and the ground command platform 100 controls the flight of the unmanned aerial vehicle 300 through the control terminal 200. In an embodiment, the scenario may include only the ground command platform 100 and at least one unmanned aerial vehicle 300; in that case the ground command platform 100 is in communication connection with each unmanned aerial vehicle and directly controls the flight of the unmanned aerial vehicle 300.
In an embodiment, the unmanned aerial vehicle 300 includes a camera 310, which may be an infrared camera or another type of camera. The ground command platform 100 controls the multiple unmanned aerial vehicles 300, through the control terminals 200, to monitor the capture area with their respective cameras 310, acquires multiple pieces of position information of the target object collected while the unmanned aerial vehicle 300 that detects the target object in the capture area tracks it in flight, then draws a movement track of the target object from this position information and displays the track for a commander to view.
In an embodiment, as shown in fig. 1, the ground command platform includes a server 110 and a display device 120, and the server 110 is in communication connection with the control terminals 200 or the unmanned aerial vehicles 300 through a network. The server 110 can control the multiple unmanned aerial vehicles 300 to monitor the capture area through their respective cameras 310, acquire multiple pieces of position information of the target object collected while the unmanned aerial vehicle 300 that detects the target object tracks it in flight, draw a movement track of the target object from this position information, and send the track to the display device 120, which displays it.
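As an illustration of how the server 110 might accumulate position reports and produce the displayed movement track, the sketch below stores timestamped (longitude, latitude) points and returns them as a time-ordered polyline. The class and field names are assumptions for illustration only; the patent does not prescribe any data format.

```python
from dataclasses import dataclass, field

@dataclass
class TargetTrack:
    """Accumulates position reports and exposes them as a drawable polyline.

    Illustrative sketch only: the (timestamp, lon, lat) tuple layout is an
    assumption, not part of the patent disclosure.
    """
    positions: list = field(default_factory=list)  # (timestamp, lon, lat)

    def add_position(self, timestamp: float, lon: float, lat: float) -> None:
        self.positions.append((timestamp, lon, lat))
        self.positions.sort(key=lambda p: p[0])  # keep reports in time order

    def polyline(self) -> list:
        """Return the movement track as an ordered list of (lon, lat) points."""
        return [(lon, lat) for _, lon, lat in self.positions]

track = TargetTrack()
track.add_position(2.0, 113.95, 22.54)  # reports may arrive out of order
track.add_position(1.0, 113.94, 22.53)
print(track.polyline())  # points come back ordered by timestamp
```

Sorting on each insertion keeps the track correct even when position reports from the drone arrive out of order over the network.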
The control terminal 200 may be a remote controller, a smartphone, a tablet computer, a notebook computer, a PC, or the like, and the drone 300 may have one or more propulsion units that allow it to fly in the air. The one or more propulsion units may move the drone 300 with one or more, two or more, three or more, four or more, five or more, or six or more degrees of freedom. In some cases, the drone 300 may rotate about one, two, three, or more axes of rotation. The axes of rotation may be perpendicular to each other and may remain perpendicular to each other throughout the flight of the drone 300. The axes of rotation may include a pitch axis, a roll axis, and/or a yaw axis. The drone 300 may move in one or more dimensions. For example, the drone 300 can move upward due to the lift generated by one or more rotors. In some cases, the drone 300 may move along a Z-axis (which may be upward with respect to the orientation of the drone 300), an X-axis, and/or a Y-axis (which may be lateral). The drone 300 may move along one, two, or three axes that are perpendicular to each other.
The drone 300 may be a rotorcraft. In some cases, the drone 300 may be a multi-rotor aircraft that includes multiple rotors. The plurality of rotors may rotate to generate lift for the drone 300. The rotors are propulsion units that allow the drone 300 to move freely in the air. The rotors may rotate at the same rate and/or produce the same amount of lift or thrust, or they may rotate at different rates, generate different amounts of lift or thrust, and/or cause the drone 300 to rotate. In some cases, one, two, three, four, five, six, seven, eight, nine, ten, or more rotors may be provided on the drone 300. The rotors may be arranged with their axes of rotation parallel to each other. In some cases, the axes of rotation of the rotors may be at any angle relative to each other, which may affect the motion of the drone 300.
The drone 300 may have multiple rotors. The rotor may be connected to the body of the drone 300, which may include a control unit, an Inertial Measurement Unit (IMU), a processor, a battery, a power source, and/or other sensors. The rotor may be connected to the body by one or more arms or extensions that branch off from a central portion of the body. For example, one or more arms may extend radially from the central body of the drone 300 and may have rotors at or near the ends of the arms.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating the steps of a capture assisting method according to an embodiment of the present application. The method is applied to a ground command platform and, as shown in fig. 2, includes steps S101 to S103.
S101, controlling a plurality of unmanned aerial vehicles to monitor a capture area through respective carried shooting devices.
The capture area is an area planned by the commander and/or search-and-capture personnel before capturing a target object, so that unmanned aerial vehicles and personnel can be deployed within it in advance. The shooting device of the unmanned aerial vehicle may be an infrared camera or another camera, for example a camera that includes a wide-angle lens and a zoom lens.
In one embodiment, the ground command platform is in communication connection with each unmanned aerial vehicle participating in the capture task, so that it can control each such unmanned aerial vehicle to monitor the capture area through the shooting device it carries. Alternatively, the ground command platform is in communication connection with one or more control terminals of the unmanned aerial vehicles participating in the capture task, and each control terminal is in communication connection with its unmanned aerial vehicle, so that the ground command platform can control the unmanned aerial vehicles to monitor the capture area through the shooting devices they carry.
In one embodiment, as shown in fig. 3, step S101 includes sub-steps S1011 to S1013.
S1011, acquiring capture task data, wherein the capture task data comprises capture task information of capture sub-areas corresponding to each unmanned aerial vehicle participating in the capture task.
The capture area includes a plurality of capture sub-areas, and each unmanned aerial vehicle is responsible for monitoring one capture sub-area. The capture task data may be acquired from an external storage device of the ground command platform, from local memory, or from the cloud; this application does not specifically limit the source. The capture task data includes the number of unmanned aerial vehicles participating in the capture task, the identity IDs of those unmanned aerial vehicles, and the capture task information of the capture sub-area corresponding to each unmanned aerial vehicle. The capture task information includes the hover height and the shooting attitude parameters of the unmanned aerial vehicle in its capture sub-area: the hover height is used to control the unmanned aerial vehicle to hover within the capture sub-area, the shooting attitude parameters are used to control the unmanned aerial vehicle to adjust the attitude of its shooting device, and together the hover height and shooting attitude parameters enable the monitoring range of the unmanned aerial vehicle to cover the corresponding capture sub-area.
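The capture task data described above could be represented, for example, as follows. All class and field names here are illustrative assumptions, not a format prescribed by the patent.

```python
from dataclasses import dataclass

@dataclass
class CaptureTaskInfo:
    """Per-drone task info for one capture sub-area (names are assumptions)."""
    sub_area_id: str
    hover_height_m: float     # hover height within the capture sub-area
    shooting_attitude: dict   # e.g. gimbal pitch/yaw used to aim the camera
    hover_point: tuple        # (lon, lat) of the hover position point

@dataclass
class CaptureTaskData:
    """The overall capture task data held by the ground command platform."""
    drone_ids: list   # identity IDs of drones participating in the capture task
    tasks: dict       # drone ID -> CaptureTaskInfo (the ID/sub-area correspondence)

    @property
    def drone_count(self) -> int:
        return len(self.drone_ids)

data = CaptureTaskData(
    drone_ids=["uav-1", "uav-2"],
    tasks={
        "uav-1": CaptureTaskInfo("A1", 80.0, {"pitch": -90.0}, (113.94, 22.53)),
        "uav-2": CaptureTaskInfo("A2", 95.0, {"pitch": -90.0}, (113.95, 22.54)),
    },
)
print(data.drone_count)  # 2
```

The `tasks` mapping plays the role of the stored correspondence between each drone's identity ID and its capture sub-area.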
In one embodiment, a capture area map of a capture area is obtained, and the capture area is split into a plurality of capture sub-areas according to the capture area map and the area which can be monitored by an unmanned aerial vehicle, wherein the capture area map is obtained by mapping the capture area by the unmanned aerial vehicle; acquiring altitude information, area and geographical position coordinate range of each capturing sub-region, and determining capturing task information such as hovering height, shooting attitude parameters and the like of each unmanned aerial vehicle in the corresponding capturing sub-region according to the altitude information, the area and the geographical position coordinate range of each capturing sub-region; and distributing an unmanned aerial vehicle to each capture subarea, establishing a corresponding relation between the identity ID of the distributed unmanned aerial vehicle and the capture subarea, and storing capture task information such as the identity ID of the distributed unmanned aerial vehicle, the corresponding relation between the identity ID of the unmanned aerial vehicle and the capture subarea, the hovering height and shooting attitude parameters of the unmanned aerial vehicle in the corresponding capture subarea, so as to obtain capture task data. The capture area map of the capture area obtained through surveying and mapping can automatically generate capture task data, so that a plurality of unmanned aerial vehicles can be controlled to monitor the capture area based on the capture task data.
In an embodiment, the manner of determining the hovering height and shooting attitude parameters of the unmanned aerial vehicle in the capture sub-area may be: determining the geographical position coordinates of the hovering position point of the unmanned aerial vehicle in the capturing sub-area according to the geographical position coordinate range of the capturing sub-area; determining shooting attitude parameters of the unmanned aerial vehicle in the capture sub-area according to the geographical position coordinates of the hovering position point of the unmanned aerial vehicle and the geographical position coordinates of the boundary position point of the capture sub-area; determining the initial hovering height of the unmanned aerial vehicle in the capturing sub-area according to the altitude information of the capturing sub-area; determining the current monitoring range of the unmanned aerial vehicle according to the shooting attitude parameters and the initial hovering height of the unmanned aerial vehicle in the capture sub-area; when the area of the current monitoring range is smaller than that of the capture sub-region, determining an adjustment value of the hovering height based on the area of the current monitoring range and the area of the capture sub-region, and taking the sum of the adjustment value and the initial hovering height as the hovering height of the unmanned aerial vehicle in the capture sub-region; and when the area of the current monitoring range is larger than or equal to the area of the capture sub-region, taking the initial hovering height as the hovering height of the unmanned aerial vehicle in the capture sub-region. Through the altitude information, the area and the geographical position coordinate range of the capture sub-region, the hovering height and the shooting attitude parameter of the unmanned aerial vehicle in the capture sub-region can be automatically and accurately determined.
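The height-adjustment rule above can be sketched under an assumed camera model: a shooting device looking straight down with a square field of view, so that the monitored ground footprint is a square of side 2·h·tan(fov/2). The patent does not specify this geometry; it is an illustration only.

```python
import math

def monitored_area(hover_height_m: float, fov_deg: float) -> float:
    """Ground area covered by an assumed square-FOV, nadir-pointing camera."""
    side = 2.0 * hover_height_m * math.tan(math.radians(fov_deg) / 2.0)
    return side * side

def hover_height(initial_height_m: float, sub_area_m2: float, fov_deg: float) -> float:
    """Keep the initial hover height if its footprint already covers the
    sub-area; otherwise add the adjustment needed to make it cover."""
    if monitored_area(initial_height_m, fov_deg) >= sub_area_m2:
        return initial_height_m  # current monitoring range is large enough
    # Height whose square footprint exactly matches the sub-area's area.
    required = math.sqrt(sub_area_m2) / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    adjustment = required - initial_height_m
    return initial_height_m + adjustment

# With a 90-degree FOV, hovering at 50 m covers roughly 100 m x 100 m.
print(hover_height(50.0, 9_000.0, 90.0))              # footprint already covers
print(round(hover_height(30.0, 10_000.0, 90.0), 1))   # raised to cover 10,000 m^2
```

Because footprint area grows with the square of height, the required height scales with the square root of the sub-area's area; real deployments would also cap the height by regulation and camera resolution.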
And S1012, controlling each unmanned aerial vehicle to fly to the corresponding capturing sub-region according to the capturing task information of the corresponding capturing sub-region of each unmanned aerial vehicle.
The capture task information includes the geographical position coordinates of the hover position point of the unmanned aerial vehicle within its capture sub-area. Each unmanned aerial vehicle can therefore be controlled to fly to its corresponding capture sub-area according to these coordinates: a flight control instruction is generated for each unmanned aerial vehicle from the geographical position coordinates of its hover position point, and the instruction is sent to the unmanned aerial vehicle either directly or through its control terminal, so that each unmanned aerial vehicle flies to its corresponding capture sub-area according to its flight control instruction.
In one embodiment, the geographical position coordinates of each unmanned aerial vehicle's hover position point in its capture sub-area and the current geographical position coordinates of each unmanned aerial vehicle are obtained; a flight route from each unmanned aerial vehicle's current position to its capture sub-area is planned from these two sets of coordinates; and each unmanned aerial vehicle is controlled to fly toward its capture sub-area along its flight route. By planning a route for each unmanned aerial vehicle, the vehicle can be automatically controlled to fly to its capture sub-area along that route.
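A minimal route-planning sketch is shown below: a straight-line route from the drone's current coordinates to its hover position point, sampled into waypoints. Real route planning would also account for terrain, obstacles, and no-fly zones; this simplification is an assumption, not part of the patent.

```python
def plan_route(current, hover_point, n_waypoints=5):
    """Return n_waypoints (lon, lat) points from current to hover_point,
    inclusive of both endpoints, evenly spaced along the straight line."""
    (x0, y0), (x1, y1) = current, hover_point
    return [
        (x0 + (x1 - x0) * i / (n_waypoints - 1),
         y0 + (y1 - y0) * i / (n_waypoints - 1))
        for i in range(n_waypoints)
    ]

route = plan_route((113.90, 22.50), (113.94, 22.54))
print(route[0], route[-1])  # starts at the current position, ends at the hover point
```

Each intermediate waypoint could then be packaged into the flight control instruction sent to the drone or its control terminal.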
And S1013, after each unmanned aerial vehicle flies to the corresponding capture sub-area, controlling each unmanned aerial vehicle to monitor the corresponding capture sub-area through the carried shooting device.
While the unmanned aerial vehicles fly to their corresponding capture sub-areas, each one that arrives sends arrival confirmation information to the ground command platform, either through its associated control terminal or directly, to notify the platform that it has reached its capture sub-area. When the ground command platform determines that an unmanned aerial vehicle has reached its capture sub-area, it sends that vehicle a start instruction for the auxiliary capture mode, so that the vehicle enters the auxiliary capture mode based on the instruction and monitors its capture sub-area through the shooting device it carries. Alternatively, when an unmanned aerial vehicle itself determines that it has reached its capture sub-area, it automatically enters the auxiliary capture mode and monitors the sub-area through its shooting device. While an unmanned aerial vehicle is in the auxiliary capture mode, its indicator light remains off and its buzzer remains silent.
In an embodiment, the capture task information includes the hover height and shooting attitude parameters of the unmanned aerial vehicle in its capture sub-area; the hover height is used to control the unmanned aerial vehicle to hover within the capture sub-area, the shooting attitude parameters are used to control the unmanned aerial vehicle to adjust the attitude of its shooting device, and together they enable the monitoring range of the unmanned aerial vehicle to cover the corresponding capture sub-area. Therefore, after each unmanned aerial vehicle flies to its capture sub-area, it is controlled to hover at its hover height for that sub-area and to adjust the attitude of its shooting device according to its shooting attitude parameters; once it is hovering and the attitude of the shooting device has been adjusted, it is controlled to monitor its capture sub-area through the shooting device. Controlled in this way, the monitoring range of every unmanned aerial vehicle can completely cover its capture sub-area, so that the entire capture area is monitored.
In an embodiment, the manner of controlling each unmanned aerial vehicle to adjust the attitude of its shooting device according to its shooting attitude parameters may be: controlling each unmanned aerial vehicle to adjust the attitude of its gimbal according to its shooting attitude parameters, the attitude of the shooting device changing with the attitude of the gimbal; and/or adjusting the hover attitude of each unmanned aerial vehicle according to its shooting attitude parameters, the attitude of the shooting device changing with the hover attitude of the vehicle. Because the shooting device of the unmanned aerial vehicle is mounted on the gimbal, its attitude can be adjusted by adjusting the attitude of the gimbal and/or the hover attitude of the unmanned aerial vehicle.
S102, acquiring a plurality of position information of the target object, which is acquired when the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on the target object.
The target object may be a fleeing person, an animal, or the like.
While each unmanned aerial vehicle monitors its capture sub-area, it mainly uses infrared imaging to check whether an object whose temperature is close to a set temperature is present in the sub-area. If such an object exists, it can be determined to be the target object, so the unmanned aerial vehicle that detects the target object in its capture sub-area automatically tracks it in flight, collects a plurality of pieces of position information of the target object during the tracking, and sends the collected position information to the ground command platform. The ground command platform can thus obtain the position information of the target object and conveniently draw the movement track from it. The set temperature may be chosen based on the actual situation and is not specifically limited in this application; for example, the set temperature is body temperature, that is, 37.5 °C.
In one embodiment, as shown in fig. 4, step S102 includes sub-steps S1021 to S1022.
And S1021, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object after monitoring the target object in the capture area.
Infrared thermal imaging of the sub-capture area corresponding to each unmanned aerial vehicle, acquired by its infrared camera, is obtained, and whether an object with a temperature close to the set temperature exists in the capture area is determined according to the infrared thermal imaging of each sub-capture area. If such an object exists in the capture area, the object is determined to be a target object, the unmanned aerial vehicle corresponding to the sub-capture area where the target object is located is taken as the target unmanned aerial vehicle, and a flight tracking instruction is sent to the target unmanned aerial vehicle so that the target unmanned aerial vehicle carries out flight tracking on the target object according to the flight tracking instruction. When the absolute value of the difference between the temperature of the object in the capture area and the set temperature is smaller than or equal to a preset threshold, it is determined that the temperature of the object is close to the set temperature. The preset threshold may be set based on an actual situation, which is not specifically limited in this application; for example, the preset threshold is 3 ℃.
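The temperature-proximity test described above can be sketched as follows. This is an illustrative Python fragment, not code from any actual drone platform; the function and constant names are assumptions, and the example values (37.5 ℃ set temperature, 3 ℃ preset threshold) are taken from the text's examples:

```python
# Hypothetical sketch of the temperature-proximity detection rule.
SET_TEMP_C = 37.5         # set temperature, e.g. human body temperature
PRESET_THRESHOLD_C = 3.0  # preset threshold from the example in the text

def is_candidate_target(object_temp_c: float) -> bool:
    """An object is treated as a target when |T - T_set| <= threshold."""
    return abs(object_temp_c - SET_TEMP_C) <= PRESET_THRESHOLD_C

def find_targets(thermal_readings: dict) -> list:
    """Return IDs of detected objects whose temperature is close enough.
    thermal_readings maps an object ID to its measured temperature in ℃."""
    return [obj_id for obj_id, temp in thermal_readings.items()
            if is_candidate_target(temp)]
```

A reading of 36 ℃ would pass the test (|36 − 37.5| = 1.5 ≤ 3), while a 20 ℃ object would be rejected.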
In an embodiment, the manner of controlling the target unmanned aerial vehicle to perform flight tracking on the target object may be: when a target object in the capture area is monitored, adjusting the lens orientation of the shooting device of the target unmanned aerial vehicle so that the lens direction of the shooting device is approximately the same as the gravity direction; and after the lens direction of the shooting device is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object. When the included angle between the lens direction and the gravity direction is smaller than or equal to a preset included angle, the lens direction of the shooting device is determined to be approximately the same as the gravity direction. The preset included angle may be set based on actual conditions, which is not specifically limited in this application; for example, the preset included angle is 10°. By adjusting the lens orientation of the shooting device so that the lens direction is approximately the same as the gravity direction, the unmanned aerial vehicle can conveniently carry out flight tracking on the target object through the shooting device.
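The "approximately the same as the gravity direction" criterion amounts to an included-angle check. The sketch below assumes, purely for illustration, that the lens direction is available as a 3-D vector in a body frame where gravity points along (0, 0, −1); the 10° preset included angle is the example value from the text:

```python
import math

PRESET_ANGLE_DEG = 10.0  # preset included angle (example value from the text)

def angle_to_gravity_deg(lens_dir) -> float:
    """Included angle, in degrees, between the lens direction vector and
    the gravity direction (0, 0, -1) in an assumed body frame."""
    gx, gy, gz = 0.0, 0.0, -1.0
    lx, ly, lz = lens_dir
    norm = math.sqrt(lx * lx + ly * ly + lz * lz)
    cos_a = max(-1.0, min(1.0, (lx * gx + ly * gy + lz * gz) / norm))
    return math.degrees(math.acos(cos_a))

def lens_roughly_downward(lens_dir) -> bool:
    """True when the included angle is within the preset included angle."""
    return angle_to_gravity_deg(lens_dir) <= PRESET_ANGLE_DEG
```

A lens pointing straight down gives an angle of 0° and passes; a horizontally pointing lens gives 90° and fails.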
In one embodiment, after the lens direction of the shooting device is approximately the same as the gravity direction, the target unmanned aerial vehicle is controlled to fly to a position directly above the target object; and after it is determined that the target unmanned aerial vehicle is directly above the target object, the target unmanned aerial vehicle is controlled to carry out flight tracking on the target object. The target unmanned aerial vehicle and the target object are separated by a preset height. It can be understood that the target unmanned aerial vehicle may also be controlled to fly above and behind the target object, so that the horizontal distance between the target unmanned aerial vehicle and the target object is a preset distance and the two are separated by the preset height. The preset distance and the preset height may be set based on actual conditions, which is not specifically limited in the present application; for example, the preset distance is 1 meter and the preset height is 5 meters. By adjusting the lens orientation of the shooting device so that the lens direction is approximately the same as the gravity direction, and simultaneously adjusting the position of the unmanned aerial vehicle so that it is located directly above the target object, the unmanned aerial vehicle can conveniently carry out flight tracking on the target object through the shooting device and collect the position information of the target object.
S1022, acquiring a plurality of position information of the target object, which is acquired when the target unmanned aerial vehicle carries out flight tracking on the target object.
In the process of carrying out flight tracking on the target object, the target unmanned aerial vehicle collects the position information of the target object at intervals of a preset time through its positioning device, and sends the collected position information to the ground command platform, so that the ground command platform stores the position information of the target object at different moments. When the ground command platform needs to draw the moving track of the target object, it acquires the plurality of position information of the target object collected during the flight tracking. The positioning device includes any one of a GPS (Global Positioning System) device and an RTK (Real-Time Kinematic) device, and the preset time may be set based on an actual situation, which is not specifically limited in this application; for example, the preset time is 0.5 seconds.
In one embodiment, when the target unmanned aerial vehicle is located directly above the target object, the longitude and latitude coordinates acquired by the positioning device of the target unmanned aerial vehicle are used as the position information of the target object; when the target unmanned aerial vehicle is not located directly above the target object, the longitude and latitude coordinates acquired by the positioning device of the target unmanned aerial vehicle and the orientation information of the target object relative to the target unmanned aerial vehicle are acquired, and the position information of the target object is determined according to the acquired longitude and latitude coordinates and orientation information.
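The two cases above can be sketched with a small-displacement approximation: when the drone is directly above the target its own GPS/RTK fix is used, and otherwise the fix is offset by the target's bearing and horizontal distance relative to the drone. The function name and signature are hypothetical, and the flat-earth offset is an illustrative approximation valid only over short distances:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, metres

def target_position(drone_lat, drone_lon, directly_above,
                    bearing_deg=None, horiz_dist_m=None):
    """Estimate the target's (lat, lon) from the drone's positioning fix.

    Directly above: the drone's fix is used as-is. Otherwise the fix is
    offset by the target's bearing (degrees clockwise from north) and
    horizontal distance, using a small-displacement approximation."""
    if directly_above:
        return drone_lat, drone_lon
    d_north = horiz_dist_m * math.cos(math.radians(bearing_deg))
    d_east = horiz_dist_m * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(drone_lat))))
    return drone_lat + dlat, drone_lon + dlon
```

For example, a target about 111 m due north of the drone shifts the latitude by roughly 0.001°.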
In an embodiment, before step S102 is executed, current position information of the target object sent by the unmanned aerial vehicle that monitors the target object in the capture area is acquired; an identification icon of the target object is displayed according to the current position information, and locking prompt information is output, wherein the locking prompt information is used for prompting a commander or a searcher whether to lock the target object; when a locking instruction for the target object is obtained, the target object is locked according to the locking instruction; and after the target object is locked, the unmanned aerial vehicle monitoring the target object in the capture area is controlled to perform flight tracking on the target object, and step S102 is executed, that is, a plurality of position information of the target object collected during the flight tracking is obtained. The manner of outputting the locking prompt information may be: displaying a locking prompt text; and/or broadcasting a locking prompt tone; and/or controlling a locking prompt lamp to flash. By outputting the locking prompt information, the commander or searcher can conveniently be prompted whether to lock the target object. When the commander or searcher determines to lock the target object, the target object is locked, so that the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on it, which prevents the real target object from being lost because the unmanned aerial vehicle tracks a false target.
In an embodiment, the manner of displaying the identification icon of the target object according to the current position information may be: displaying a preset capture area map, wherein the preset capture area map is obtained by drawing and measuring a capture area in advance; and displaying the identification icon of the target object in a preset capture area map according to the current position information. The identification icon of the target object may be set based on actual conditions, which is not specifically limited in the present application, for example, the identification icon of the target object is a red solid round ball. Through the display of the identification icon of the target object in the capture area map, commanders can know the position of the target object in the capture area conveniently, and the commanders can command the capture personnel to capture the target object conveniently.
In an embodiment, when the ground command platform acquires the current position information of the target object, the current position information is sent to the associated control terminals, so that the control terminals can share the current position information of the target object, and meanwhile, each control terminal displays an identification icon of the target object in a preset capture area map based on the current position information of the target object. Because the control terminal is held by the search personnel, the search personnel participating in the capture task can know the current position of the target object conveniently and capture the target object conveniently by displaying the identification icon of the target object in the capture area map of the control terminal.
In one embodiment, the identification icon of each searcher is displayed simultaneously within the preset capture area map in which the identification icon of the target object is displayed. The identification icon of a searcher is different from the identification icon of the target object, so as to distinguish the searchers from the target object; for example, the identification icon of a searcher differs from the identification icon of the target object in color, shape and/or size. By simultaneously displaying the identification icon of the target object and the identification icon of each searcher in the preset capture area map, the commander can conveniently distinguish the searchers from the target object, know the orientation information of the searchers and the target object, and command the searchers to capture the target object.
S103, drawing the moving track of the target object according to the position information, and displaying the moving track for a commander to check.
After the plurality of position information of the target object is acquired, the moving track of the target object is drawn based on the plurality of position information, and the moving track of the target object is displayed for a commander to check, so that the commander can command a search catcher to catch the target object through the moving track.
In an embodiment, the moving trajectory of the target object may be displayed by: displaying a preset capture area map, wherein the preset capture area map is obtained by drawing and measuring a capture area in advance; and displaying the moving track in a preset capture area map. The displayed movement track of the target object comprises an identification icon of the target object, the identification icon is used for representing the position of the target object in a preset capture area map, the shade of the color of the identification icon is determined according to the temperature of the target object sensed by a shooting device of the unmanned aerial vehicle, the higher the temperature of the target object is, the darker the color of the identification icon is, and the lower the temperature of the target object is, the lighter the color of the identification icon is. The moving track of the target object is displayed on the capture area map, so that the commander can know the moving condition of the target object in the capture area conveniently, and the commander can be better assisted to command the capture personnel to capture the target object.
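The mapping from sensed temperature to icon shade described above might be implemented as a simple normalization over the track's samples. The following is an illustrative sketch with assumed names and data shapes (a list of time-ordered latitude/longitude/temperature samples), not the actual display logic of any platform:

```python
def build_trajectory(samples):
    """samples: list of (lat, lon, temp_c) tuples in time order.
    Returns polyline points plus an icon shade per point, where a higher
    sensed temperature maps to a darker shade (0.0 = lightest, 1.0 = darkest),
    matching the rule that hotter readings yield darker icons."""
    if not samples:
        return []
    temps = [t for _, _, t in samples]
    lo, hi = min(temps), max(temps)
    span = (hi - lo) or 1.0  # avoid division by zero when all temps match
    return [{"lat": lat, "lon": lon, "shade": (t - lo) / span}
            for lat, lon, t in samples]
```

The hottest sample on the track receives shade 1.0 (darkest) and the coolest receives 0.0 (lightest).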
In an embodiment, the ground command platform sends the moving track of the target object to the associated control terminals, so that the control terminals can synchronously display the moving track, and the search personnel can check the moving track. Because the control terminal is held by the search personnel, the search personnel can know the moving condition of the target object conveniently by displaying the moving track of the target object at the control terminal, and the search personnel can be better assisted to capture the target object.
In an embodiment, when there are a plurality of target objects, the movement tracks of different target objects are displayed in different display manners, for example, the movement tracks of different target objects are different in color, and for example, the identification icons on the movement tracks of different target objects are different. When a plurality of target objects exist, the moving tracks of different target objects are displayed in different display modes, so that different target objects can be distinguished conveniently, commanders can know the moving conditions of different target objects, and commanders can be better assisted to command searchers to capture different target objects.
In one embodiment, when it is monitored that the target object is lost, a plurality of identification icons of the target object are displayed on the moving track, wherein the position of each identification icon on the moving track is determined according to the position information of the target object, and the shade of the color of each identification icon is determined according to the temperature of the target object sensed by the shooting device of the unmanned aerial vehicle. When the target object is lost, displaying the plurality of identification icons of the target object on the moving track makes it convenient for the commander to determine the place where the target object was lost according to the color depth of the identification icons, thereby assisting the commander in directing searchers to search that place.
In the capture assisting method provided by this embodiment, a plurality of unmanned aerial vehicles are controlled to monitor the capture area through their respective shooting devices. In the monitoring process, a plurality of position information of the target object, collected when the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on the target object, is acquired; the moving track of the target object is then drawn according to the plurality of position information and displayed for the commander to check. The commander can thus anticipate the next movement of the target object according to the moving track and command the searchers to capture it, which effectively assists the commander and the searchers in capturing the target, reduces the difficulty of searching for the target object and the probability of the target object escaping from the encirclement, and greatly improves the user experience.
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating steps of another capture assisting method according to an embodiment of the present application. As shown in fig. 5, the capture assist method includes steps S201 to S205.
S201, controlling a plurality of unmanned aerial vehicles to monitor a capture area through respective carried shooting devices.
S202, acquiring a plurality of position information of the target object, which is acquired when the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on the target object.
S203, drawing the moving track of the target object according to the position information, and displaying the moving track for a commander to check.
For the detailed description of the steps S201 to S203, reference may be made to the foregoing embodiments, which are not described in detail in this embodiment.
S204, determining whether the target object moves to a preset alert area or not according to the moving track of the target object.
The preset alert area is an area calibrated in the capture area in advance. The preset alert area includes at least one of: a first alert area whose temperature is close to the temperature of the target object, a second alert area in which the temperature of the target object cannot be monitored, and a third alert area in which a mobile vehicle is placed. The first alert area includes at least one of a densely populated area and a black area whose temperature is close to the temperature of the target object; the second alert area includes at least one of a water area, a sheltered area and a strong light reflection area; and the mobile vehicle includes at least one of an automobile, a ship, a yacht, a motorcycle and a helicopter.
In an embodiment, according to the moving track of the target object, the manner of determining whether the target object is moving to the preset alert area may be: determining the moving direction of the target object according to the moving track of the target object, and acquiring the position information of a preset warning area; determining whether the target object moves to the preset alert region according to the moving direction of the target object and the position information of the preset alert region, namely determining whether the preset alert region is located in the moving direction of the target object based on the position information of the preset alert region, determining that the target object moves to the preset alert region when determining that the preset alert region is located in the moving direction of the target object, and determining that the target object does not move to the preset alert region when determining that the preset alert region is not located in the moving direction of the target object. The position information of the preset warning area is calibrated in advance.
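The "is the alert area located in the moving direction" test can be sketched as a bearing comparison between the target's current heading (from its last two fixes) and the direction to the alert area. The tolerance half-angle and all names below are assumptions for illustration, using a flat-earth approximation suitable only for small areas:

```python
import math

def moving_toward_zone(track, zone_pos, half_angle_deg=30.0):
    """track: list of (lat, lon) fixes in time order (at least two);
    zone_pos: (lat, lon) of the preset alert area. The alert area counts
    as 'in the moving direction' when the bearing to it lies within
    half_angle_deg of the target's current heading."""
    (lat0, lon0), (lat1, lon1) = track[-2], track[-1]
    heading = math.atan2(lon1 - lon0, lat1 - lat0)          # east over north
    to_zone = math.atan2(zone_pos[1] - lon1, zone_pos[0] - lat1)
    # Wrap the signed angular difference into [-pi, pi) before comparing.
    diff = abs((heading - to_zone + math.pi) % (2 * math.pi) - math.pi)
    return math.degrees(diff) <= half_angle_deg
```

A target heading due north toward a zone due north of it returns True; a zone due east of that same target returns False.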
S205, when the target object is determined to move to the preset alert area, outputting alert information to remind a commander and/or a catcher that the target object moves to the preset alert area.
And when the target object is determined to move to the preset warning area, outputting warning information so as to remind the commander and/or the catcher that the target object moves to the preset warning area. Wherein, the output mode of the alarm information can be as follows: displaying an alarm prompt text; and/or broadcast the warning prompt tone; and/or controlling an alarm prompting lamp to flash.
In one embodiment, when the target object is determined to move to the preset alert area, acquiring the current position information of each searching and capturing person and the position information of the preset alert area, and determining the distance between each searching and capturing person and the preset alert area according to the current position information of each searching and capturing person and the position information of the preset alert area; and determining target searching personnel according to the distance between each searching personnel and a preset warning area, and displaying the identification icons of the target searching personnel on a map of the preset capturing area. And taking the searching and capturing personnel closest to the preset warning area as target searching and capturing personnel. By displaying the identification icons of the target searching personnel on the map of the preset capturing area, the command personnel can quickly command the target searching personnel to intercept and capture the target object moving towards the preset warning area, and the target object is prevented from escaping.
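Selecting the target searcher closest to the preset alert area reduces to a distance comparison over the searchers' current fixes. The sketch below uses the haversine great-circle distance; the function names and data shapes are illustrative assumptions, not part of any actual command-platform API:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(a))

def pick_target_searcher(searchers, zone_pos):
    """searchers: {name: (lat, lon)} current fixes of each searcher.
    Returns the name of the searcher closest to the preset alert area,
    i.e. the one chosen as the target searcher."""
    return min(searchers, key=lambda name: haversine_m(searchers[name], zone_pos))
```

The same distance helper could also serve the threshold check of the following embodiment (for example, against the 1 km preset threshold).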
In one embodiment, when it is determined that the target object is moving toward the preset alert area, the distance between the target object and the preset alert area is determined; when the distance between the target object and the preset alert area is smaller than or equal to a preset threshold, warning information is output, and when the distance is greater than the preset threshold, no processing is performed. The manner of determining the distance between the target object and the preset alert area may be: acquiring the current position information of the target object and the position information of the preset alert area; and determining the distance between the two according to the current position information of the target object and the position information of the preset alert area. The preset threshold may be set based on actual conditions, which is not specifically limited in this application; for example, the preset threshold is 1 km. When it is determined that the target object is moving toward the preset alert area and the distance between the target object and the preset alert area is smaller than or equal to the preset threshold, warning information is output, so that the commander can command the searchers to intercept and capture the target object in time, preventing it from escaping.
In the capture assisting method provided in the above embodiment, it is determined whether the target object moves to the preset alert region according to the moving track of the target object, and when it is determined that the target object moves to the preset alert region, the warning information is output to remind a commander and/or a searcher that the target object moves to the preset alert region, so that the commander can command the searcher to intercept and capture the target object in time, and the target object is prevented from escaping.
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating steps of another capture assisting method according to an embodiment of the present application. The auxiliary capturing method is applied to the unmanned aerial vehicle, and the unmanned aerial vehicle is provided with the shooting device. As shown in fig. 6, the capture assist method includes steps S301 to S304.
And S301, controlling the shooting device to monitor the capture area.
The capture area is an area planned by a commander and/or searchers before capturing a target object, so that the unmanned aerial vehicles and the searchers can be deployed in the capture area in advance. The shooting device of the unmanned aerial vehicle may be an infrared camera or another camera, for example, a camera comprising a wide-angle lens and a zoom lens.
In an embodiment, as shown in fig. 7, step S301 includes sub-steps S3011 to S3013.
S3011, acquiring capture task information of the unmanned aerial vehicle, wherein the capture task information comprises position information of the capture sub-area corresponding to the unmanned aerial vehicle.
The ground command platform is in communication connection with each unmanned aerial vehicle participating in the capture task, or the ground command platform is in communication connection with one or more control terminals, the control terminals are in communication connection with the unmanned aerial vehicles participating in the capture task, and the ground command platform acquires capture task data, wherein the capture task data comprises capture task information of capture sub-areas corresponding to each unmanned aerial vehicle participating in the capture task; the capture task information of the capture sub-region corresponding to each unmanned aerial vehicle is sent to the corresponding unmanned aerial vehicle, or the capture task information of the capture sub-region corresponding to each unmanned aerial vehicle is sent to the corresponding unmanned aerial vehicle through the control terminal, and the unmanned aerial vehicle acquires the capture task information sent by the ground command platform or the control terminal.
S3012, controlling the unmanned aerial vehicle to fly to the corresponding capture sub-area according to the position information.
And the capture task information comprises position information of a capture sub-area corresponding to the unmanned aerial vehicle. Acquiring current position information of the unmanned aerial vehicle, and planning a flight path of the unmanned aerial vehicle flying to the corresponding capture subarea according to the current position information of the unmanned aerial vehicle and the position information of the capture subarea corresponding to the unmanned aerial vehicle to obtain a flight path of the unmanned aerial vehicle; and controlling the unmanned aerial vehicle to fly to the corresponding capture sub-region based on the flight route. By planning the air route of the unmanned aerial vehicle flying to the corresponding capture sub-area, the unmanned aerial vehicle can fly to the corresponding capture sub-area automatically according to the air route.
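Route planning from the drone's current fix to its capture sub-area might, in the simplest case, be a straight-line leg sampled into evenly spaced waypoints. The sketch below is a deliberately minimal illustration (names assumed); a real planner would also account for no-fly zones, terrain and battery constraints:

```python
def plan_route(current_pos, subarea_pos, n_waypoints=5):
    """Plan a straight-line flight route from the drone's current
    (lat, lon) fix to the centre of its capture sub-area, returned as
    n_waypoints + 1 evenly spaced (lat, lon) waypoints including both
    endpoints."""
    (lat0, lon0), (lat1, lon1) = current_pos, subarea_pos
    return [(lat0 + (lat1 - lat0) * i / n_waypoints,
             lon0 + (lon1 - lon0) * i / n_waypoints)
            for i in range(n_waypoints + 1)]
```

The drone would then fly the waypoints in order and begin monitoring once the final waypoint, the sub-area centre, is reached.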
S3013, after the unmanned aerial vehicle flies to the corresponding capture sub-area, controlling the shooting device to monitor the corresponding capture sub-area.
And when the unmanned aerial vehicle is determined to fly to the corresponding capture subarea, controlling a shooting device of the unmanned aerial vehicle to monitor the corresponding capture subarea. In an embodiment, the capture task information further includes a hovering height and a shooting attitude parameter of the unmanned aerial vehicle in the corresponding capture sub-region, before the shooting device is controlled to monitor the corresponding capture sub-region, the unmanned aerial vehicle is controlled to hover according to the hovering height, the attitude of the shooting device is adjusted according to the shooting attitude parameter, and after the unmanned aerial vehicle hovers and the attitude of the shooting device is adjusted, the shooting device of the unmanned aerial vehicle is controlled to monitor the corresponding capture sub-region. Through the hovering height of the unmanned aerial vehicle in the corresponding capturing subarea, the unmanned aerial vehicle is controlled to hover and the gesture of the shooting device is controlled to be adjusted according to the shooting gesture parameters corresponding to the unmanned aerial vehicle, so that the monitoring range of the unmanned aerial vehicle can completely cover the corresponding capturing subarea.
In an embodiment, the mode that the unmanned aerial vehicle adjusts the attitude of the shooting device according to the shooting attitude parameter may be: adjusting the attitude of the tripod head of the unmanned aerial vehicle according to the shooting attitude parameters, wherein the attitude of the shooting device changes along with the change of the attitude of the tripod head of the unmanned aerial vehicle; and/or adjusting the hovering posture of the unmanned aerial vehicle according to the shooting posture parameter, wherein the posture of the shooting device changes along with the change of the hovering posture of the unmanned aerial vehicle. Because the shooting device of the unmanned aerial vehicle is carried on the tripod head, the posture of the shooting device of the unmanned aerial vehicle can be adjusted by adjusting the posture of the tripod head of the unmanned aerial vehicle and/or the hovering posture of the unmanned aerial vehicle.
S302, after the target object in the capture area is monitored, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
The unmanned aerial vehicle acquires infrared thermal imaging of its corresponding capture sub-area through the infrared camera, and determines, according to the infrared thermal imaging, whether an object with a temperature close to the set temperature exists in the capture sub-area. If such an object exists, the object can be determined to be a target object, that is, a target object in the capture area is monitored, and the unmanned aerial vehicle is controlled to carry out flight tracking on the target object. When the absolute value of the difference between the temperature of the object in the capture sub-area and the set temperature is smaller than or equal to a preset threshold, it is determined that the temperature of the object is close to the set temperature. The preset threshold may be set based on an actual situation, which is not specifically limited in this application; for example, the preset threshold is 3 ℃.
In one embodiment, after a target object in the capture area is monitored, the lens orientation of the shooting device is adjusted so that the lens direction of the shooting device is approximately the same as the gravity direction; after the lens direction of the shooting device is approximately the same as the gravity direction, the unmanned aerial vehicle is controlled to carry out flight tracking on the target object. When the included angle between the lens orientation of the shooting device and the gravity direction is smaller than or equal to a preset included angle, the lens orientation of the shooting device is determined to be approximately the same as the gravity direction. The preset included angle may be set based on actual conditions, which is not specifically limited in this application; for example, the preset included angle is 10°. By adjusting the lens orientation of the shooting device so that the lens direction is approximately the same as the gravity direction, the unmanned aerial vehicle can conveniently carry out flight tracking on the target object through the shooting device.
In one embodiment, after the lens direction of the shooting device is approximately the same as the gravity direction, the unmanned aerial vehicle is controlled to fly to a position directly above the target object; and after the unmanned aerial vehicle is directly above the target object, the unmanned aerial vehicle is controlled to carry out flight tracking on the target object. The unmanned aerial vehicle and the target object are separated by a preset height. It can be understood that the unmanned aerial vehicle may also be controlled to fly above and behind the target object, so that the horizontal distance between the unmanned aerial vehicle and the target object is a preset distance and the two are separated by the preset height. The preset distance and the preset height may be set based on actual conditions, which is not specifically limited in this application; for example, the preset distance is 1 meter and the preset height is 5 meters. By adjusting the lens orientation of the shooting device so that the lens direction is approximately the same as the gravity direction, and simultaneously adjusting the position of the unmanned aerial vehicle so that it is located directly above the target object, the unmanned aerial vehicle can conveniently carry out flight tracking on the target object through the shooting device and collect the position information of the target object.
S303, drawing the moving track of the target object in the process of carrying out flight tracking on the target object.
During flight tracking of the target object, the unmanned aerial vehicle collects the position information of the target object through the positioning device at preset time intervals, thereby obtaining the position information of the target object at different moments, and then draws the moving track of the target object based on the position information at those moments. The positioning device includes any one of a GPS (Global Positioning System) device and an RTK (Real-Time Kinematic) device, and the preset time may be set based on actual conditions, which is not specifically limited in this application; for example, the preset time is 0.5 seconds.
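The sampling scheme above — one positioning fix per preset time interval, accumulated into a moving track — can be sketched as follows. `MovementTrack`, `TrackPoint`, and the 0.5-second default interval are hypothetical names chosen for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class TrackPoint:
    t: float     # timestamp in seconds
    lat: float   # latitude in degrees
    lon: float   # longitude in degrees

@dataclass
class MovementTrack:
    sample_interval: float = 0.5          # the preset time, e.g. 0.5 s
    points: list = field(default_factory=list)

    def add_fix(self, t, lat, lon):
        """Record a positioning fix only once the preset interval has elapsed
        since the previously recorded point."""
        if not self.points or t - self.points[-1].t >= self.sample_interval:
            self.points.append(TrackPoint(t, lat, lon))
```

Fixes arriving faster than the preset interval are simply dropped, so the stored points form an evenly spaced track ready for drawing.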
In one embodiment, when the unmanned aerial vehicle is located directly above the target object, the longitude and latitude coordinates acquired by the positioning device of the unmanned aerial vehicle are used as the position information of the target object. When the unmanned aerial vehicle is not located directly above the target object, the longitude and latitude coordinates acquired by the positioning device of the unmanned aerial vehicle and the orientation information of the target object relative to the unmanned aerial vehicle are acquired, and the position information of the target object is determined from the acquired longitude and latitude coordinates and the orientation information.
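Determining the target's position from the drone's latitude/longitude fix plus the target's orientation relative to the drone can be approximated with a small flat-earth offset, as sketched below. The helper name `target_position`, the bearing convention (degrees clockwise from north), and the spherical-earth radius are all illustrative assumptions, not the patent's formula:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # assumed mean Earth radius

def target_position(drone_lat, drone_lon, bearing_deg, horiz_dist_m):
    """Estimate the target's lat/lon from the drone's fix plus the target's
    bearing and horizontal distance relative to the drone.

    Small-offset flat-earth approximation: valid for the short distances
    involved in overhead tracking, not for long ranges.
    """
    north = horiz_dist_m * math.cos(math.radians(bearing_deg))
    east = horiz_dist_m * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(drone_lat))))
    return drone_lat + dlat, drone_lon + dlon
```

When the drone sits directly overhead, the offset is zero and the drone's own fix is returned unchanged, matching the first case in the embodiment.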
In an embodiment, before executing step S303, current position information of the target object is sent to a control terminal associated with the unmanned aerial vehicle, so that the control terminal displays an identification icon of the target object in a preset capture area map according to the current position information of the target object and outputs locking prompt information, where the locking prompt information is used for prompting search-and-capture personnel whether to lock the target object. A locking instruction for the target object sent by the control terminal is acquired, and the target object is locked according to the locking instruction. After the target object is locked, step S303 is executed, namely, during flight tracking of the target object, the movement track of the target object is drawn. The identification icon of the target object may be set based on actual conditions, which is not specifically limited in this application; for example, the identification icon of the target object is a solid red ball.
The preset capture area map is obtained by mapping and surveying the capture area in advance, and the manner in which the control terminal outputs the locking prompt information includes at least one of displaying a locking prompt text, broadcasting a locking prompt tone, and controlling a locking prompt lamp to flash. By outputting the locking prompt information, search-and-capture personnel are conveniently prompted to decide whether to lock the target object; the target object is locked only when the search-and-capture personnel confirm the lock, so that the unmanned aerial vehicle performs flight tracking on the confirmed target object, and the real target object is prevented from being lost due to the unmanned aerial vehicle performing flight tracking on a false object.
In an embodiment, before step S303 is executed, current position information of the target object is sent to a control terminal associated with the unmanned aerial vehicle, the control terminal forwards the current position information to a ground command platform, the ground command platform displays an identification icon of the target object according to the current position information, and outputs locking prompt information, where the locking prompt information is used to prompt a commander whether to lock the target object; when a locking instruction of a target object is acquired, the locking instruction is sent to the unmanned aerial vehicle through the control terminal, the unmanned aerial vehicle locks the target object according to the locking instruction, and after the target object is locked, step S303 is executed, namely, in the process of performing flight tracking on the target object, the moving track of the target object is drawn.
In an embodiment, an identification icon of each search-and-capture person is further displayed in the preset capture area map, the identification icons of different search-and-capture persons are different, and the identification icon of a search-and-capture person is different from the identification icon of a target object, so as to distinguish the search-and-capture person from the target object. By simultaneously displaying the identification icon of the target object and the identification icon of each searching and capturing person in the preset capturing area map, the command personnel or the searching and capturing persons can conveniently distinguish the searching and capturing persons from the target object, so that the command personnel or the searching and capturing persons can know the azimuth information of the searching and capturing persons and the target object, and the target object can be conveniently captured.
S304, the moving track is sent to a ground command platform, so that the ground command platform can display the moving track, and a commander can check the moving track.
After the unmanned aerial vehicle draws the moving track of the target object, it sends the moving track to the ground command platform; after receiving the moving track of the target object, the ground command platform displays it so that a commander can view the moving track and, based on it, direct search-and-capture personnel to capture the target object.
In an embodiment, the manner of displaying the moving track of the target object by the ground command platform may be: the ground command platform displays a preset capture area map, wherein the preset capture area map is obtained by drawing and measuring a capture area in advance; and displaying the moving track in a preset capture area map. The displayed movement track of the target object comprises an identification icon of the target object, the identification icon is used for representing the position of the target object in a preset capture area map, the shade of the color of the identification icon is determined according to the temperature of the target object sensed by a shooting device of the unmanned aerial vehicle, the higher the temperature of the target object is, the darker the color of the identification icon is, and the lower the temperature of the target object is, the lighter the color of the identification icon is. The moving track of the target object is displayed on the capture area map, so that the commander can know the moving condition of the target object in the capture area conveniently, and the commander can be better assisted to command the capture personnel to capture the target object.
In one embodiment, the movement track of the target object includes an identification icon of the target object, the identification icon being used for representing the position of the target object in the preset capture area map, with the shade of the color of the identification icon determined according to the temperature of the target object sensed by the shooting device. A plurality of identification icons of the target object may be displayed on the moving track, with the position of each identification icon on the moving track determined according to the position information of the target object. By displaying a plurality of identification icons of the target object on the moving track, a commander can conveniently determine, from the color depth of the identification icons, the place where the target object disappeared, thereby assisting the commander in directing search-and-capture personnel to search that place.
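The temperature-to-shade mapping described above (the hotter the sensed target, the darker the icon) might look like the following sketch; the temperature range and the red-channel RGB encoding are illustrative assumptions, not values from the patent:

```python
def temperature_to_shade(temp_c, t_min=20.0, t_max=40.0):
    """Map the sensed target temperature to a red shade for the marker icon:
    a hotter target gives a darker red, a cooler target a lighter red.

    t_min/t_max are assumed bounds of the expected body-temperature range.
    Returns an (R, G, B) tuple.
    """
    frac = max(0.0, min(1.0, (temp_c - t_min) / (t_max - t_min)))
    # Darker red = lower green/blue channels; red channel stays at 255.
    gb = int(round(200 * (1.0 - frac)))
    return (255, gb, gb)
```

Sampling the icon shades along the track then lets a commander read off where the thermal signature faded, i.e. the likely place of disappearance.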
In an embodiment, when there are a plurality of target objects, the ground command platform displays the movement tracks of different target objects in different display manners, for example, the movement tracks of different target objects have different colors, and for example, the identification icons on the movement tracks of different target objects are different. When a plurality of target objects exist, the moving tracks of different target objects are displayed in different display modes, so that different target objects can be distinguished conveniently, commanders can know the moving conditions of different target objects, and commanders can be better assisted to command searchers to capture different target objects.
In one embodiment, whether the target object moves to a preset alert area is determined according to the moving track of the target object; when it is determined that the target object is moving to the preset alert area, an alarm instruction is sent to a control terminal associated with the unmanned aerial vehicle, so that the control terminal outputs alarm information based on the alarm instruction to remind search-and-capture personnel that the target object is moving to the preset alert area. By having the control terminal output alarm information when the target object is determined to be moving to the preset alert area, search-and-capture personnel can intercept the target object in time and prevent it from escaping.
In an embodiment, when receiving the alarm instruction, the control terminal sends the alarm instruction to the ground command platform, so that the ground command platform outputs alarm information based on the alarm instruction to remind a commander that the target object is moving to the preset alert area. When it is determined that the target object is moving to the preset alert area, the ground command platform outputs alarm information to remind the commander, so that the commander can direct search-and-capture personnel to intercept the target object in time and prevent it from escaping.
The preset alert area is an area calibrated in the capture area in advance. The preset alert area includes at least one of a first alert area whose temperature is close to the temperature of the target object, a second alert area in which the temperature of the target object cannot be monitored, and a third alert area in which a mobile vehicle is placed. The first alert area includes at least one of a densely crowded area and a dark area whose temperature is close to the temperature of the target object; the second alert area includes at least one of a water area, a sheltered area, and a strongly light-reflective area; and the mobile vehicle includes at least one of an automobile, a ship, a yacht, a motorcycle, and a helicopter.
In an embodiment, whether the target object is moving to the preset alert area may be determined according to the moving track of the target object as follows: the moving direction of the target object is determined according to the moving track, and the position information of the preset alert area is acquired; whether the target object moves to the preset alert area is then determined according to the moving direction of the target object and the position information of the preset alert area. That is, based on the position information of the preset alert area, it is determined whether the preset alert area is located in the moving direction of the target object: when the preset alert area is located in the moving direction of the target object, it is determined that the target object is moving to the preset alert area, and when it is not, it is determined that the target object is not moving to the preset alert area. The position information of the preset alert area is calibrated in advance.
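The direction test above — is the preset alert area located in the target's moving direction? — can be sketched by comparing the heading implied by the last two track points with the bearing from the target to the alert area. The 45-degree angular tolerance and the flat-earth approximation are assumptions for illustration only:

```python
import math

def moving_toward_zone(track, zone_lat, zone_lon, tol_deg=45.0):
    """Decide whether the target is moving toward an alert area.

    track: list of (lat, lon) tuples, oldest first (at least two points).
    tol_deg: assumed angular tolerance between the movement heading and
             the bearing to the zone for them to count as 'aligned'.
    """
    (lat0, lon0), (lat1, lon1) = track[-2], track[-1]
    heading = math.atan2(lon1 - lon0, lat1 - lat0)          # movement direction
    bearing = math.atan2(zone_lon - lon1, zone_lat - lat1)  # direction to zone
    diff = math.degrees(abs(heading - bearing)) % 360.0
    diff = min(diff, 360.0 - diff)                          # wrap to [0, 180]
    return diff <= tol_deg
```

A target heading north with the alert area due north of it is flagged as moving toward the area; the same target with the area due south of it is not.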
In one embodiment, when it is determined that the target object is moving toward the preset alert area, the distance between the target object and the preset alert area is determined; when the distance between the target object and the preset alert area is smaller than or equal to a preset threshold value, alarm information is output, and when the distance is greater than the preset threshold value, no processing is performed. By outputting alarm information only when the target object is determined to be moving toward the preset alert area and the distance between them is smaller than or equal to the preset threshold value, search-and-capture personnel can intercept the target object in time and prevent it from escaping.
The method for determining the distance between the target object and the preset alert area may be as follows: acquiring current position information of a target object and position information of a preset warning area; and determining the distance between the target object and the preset warning region according to the current position information of the target object and the position information of the preset warning region. The preset threshold may be set based on actual conditions, which is not specifically limited in this application, for example, the preset threshold is 1 km.
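The distance check above can be sketched with the standard great-circle (haversine) distance and the 1 km example threshold; `should_alarm` is a hypothetical helper name, not the patent's implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon fixes
    (spherical Earth, radius 6 371 km assumed)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_alarm(target_fix, zone_fix, threshold_m=1000.0):
    """Output alarm information only when the target is within the preset
    threshold (1 km in the example) of the alert area."""
    lat1, lon1 = target_fix
    lat2, lon2 = zone_fix
    return haversine_m(lat1, lon1, lat2, lon2) <= threshold_m
```

One degree of latitude is roughly 111 km, so a target a full degree away from the alert area stays well outside the 1 km trigger.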
According to the capture assisting method provided by this embodiment, the unmanned aerial vehicle monitors the capture area by controlling the shooting device; after a target object in the capture area is monitored, the unmanned aerial vehicle performs flight tracking on the target object, draws the moving track of the target object during flight tracking, and sends the moving track to the ground command platform so that the ground command platform displays it for a commander to view. The commander can infer the next action of the target object from the moving track and direct search-and-capture personnel to capture it. This effectively assists commanders and search-and-capture personnel in capturing the target, reduces both the difficulty of searching for the target object and the probability of the target object escaping the encirclement, and greatly improves the user experience.
Referring to fig. 8, fig. 8 is a schematic block diagram of a structure of a ground command platform according to an embodiment of the present disclosure. As shown in fig. 8, the ground command platform 400 includes a processor 401 and a memory 402, and the processor 401 and the memory 402 are connected by a bus 403, such as an I2C (Inter-integrated Circuit) bus 403.
Specifically, the Processor 401 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the Memory 402 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB flash disk, or a removable hard disk. Optionally, the ground command platform may further include a transceiver for executing the communication or data transmission between the ground command platform and other devices in the above embodiments. The ground command platform may further include a display device for displaying the movement track and the prompt information mentioned in the above embodiments.
wherein the processor 401 is configured to run a computer program stored in the memory 402, and when executing the computer program, implement the following steps:
controlling a plurality of unmanned aerial vehicles to monitor a capture area through respective carried shooting devices;
acquiring a plurality of position information of the target object, which is acquired when the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on the target object;
and drawing the moving track of the target object according to the position information, and displaying the moving track for a commander to view.
In an embodiment, the capture area comprises a plurality of capture sub-areas; control many unmanned aerial vehicles monitor the area of arresting through the shooting device that carries on separately, include:
acquiring capture task data, wherein the capture task data comprises capture task information of a capture sub-region corresponding to each unmanned aerial vehicle participating in a capture task;
controlling each unmanned aerial vehicle to fly to the corresponding capturing sub-region according to the capturing task information of the corresponding capturing sub-region of each unmanned aerial vehicle;
and after each unmanned aerial vehicle flies to the corresponding capture sub-area, controlling each unmanned aerial vehicle to monitor the corresponding capture sub-area through the carried shooting device.
In one embodiment, the capture task information includes a hovering height and shooting attitude parameters of an unmanned aerial vehicle in a corresponding capture area; before controlling each unmanned aerial vehicle to monitor the corresponding capture subarea through the respective carried shooting devices, the method further comprises the following steps:
controlling each unmanned aerial vehicle to hover according to the hovering height of each unmanned aerial vehicle in the corresponding capturing sub-area;
and controlling each unmanned aerial vehicle to adjust the attitude of each shooting device according to the shooting attitude parameter corresponding to each unmanned aerial vehicle.
In an embodiment, the controlling each of the drones to adjust the attitude of the respective shooting device according to the shooting attitude parameter corresponding to each of the drones includes:
controlling each unmanned aerial vehicle to adjust the attitude of the respective cradle head according to the shooting attitude parameter corresponding to each unmanned aerial vehicle, wherein the attitude of a shooting device of the unmanned aerial vehicle changes along with the attitude change of the cradle head of the unmanned aerial vehicle; and/or
And adjusting the hovering gesture of each unmanned aerial vehicle according to the shooting gesture parameter corresponding to each unmanned aerial vehicle, wherein the gesture of the shooting device of the unmanned aerial vehicle changes along with the change of the hovering gesture of the unmanned aerial vehicle.
In an embodiment, the acquiring a plurality of position information of the target object, which is acquired when the unmanned aerial vehicle monitoring the target object in the capture area performs flight tracking on the target object, includes:
when a target object in the capture area is monitored, controlling a target unmanned aerial vehicle to carry out flight tracking on the target object, wherein the target unmanned aerial vehicle is an unmanned aerial vehicle monitoring the target object;
acquiring a plurality of position information of the target object acquired by the target unmanned aerial vehicle when the target object is subjected to flight tracking.
In an embodiment, after the target object in the capture area is monitored, controlling the target drone to perform flight tracking on the target object includes:
when the target object in the capture area is monitored, adjusting the direction of a lens of a shooting device of the target unmanned aerial vehicle so that the direction of the lens is approximately the same as the gravity direction;
and after the lens direction is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object.
In an embodiment, said controlling the target drone to perform flight tracking on the target object after the lens direction is substantially the same as the gravity direction includes:
after the lens direction is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to fly right above the target object;
and after the target unmanned aerial vehicle flies right above the target object, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object.
In an embodiment, the target drone is spaced from the target object by a preset height.
In an embodiment, before the acquiring a plurality of position information of the target object, which is acquired when the drone monitoring the target object in the capture area performs flight tracking on the target object, the method further includes:
acquiring current position information of the target object sent by the unmanned aerial vehicle monitoring the target object in the capture area;
displaying an identification icon of the target object according to the current position information, and outputting locking prompt information, wherein the locking prompt information is used for prompting a commander or a search catcher whether to lock the target object;
and when the locking instruction of the target object is acquired, locking the target object according to the locking instruction.
In an embodiment, the displaying the identification icon of the target object according to the current position information includes:
displaying a preset capture area map, wherein the preset capture area map is obtained by drawing and measuring the capture area in advance;
and displaying the identification icon of the target object in the preset capture area map according to the current position information.
In one embodiment, the outputting the lock prompt information includes:
displaying a locking prompt text; and/or
Broadcasting a locking prompt tone; and/or
And controlling the locking prompt lamp to flash.
In one embodiment, the processor is further configured to implement the steps of:
and displaying an identification icon of each search-capture person in a preset capture area map, wherein the identification icon of each search-capture person is different from the identification icon of the target object.
In an embodiment, the identification icon of the search catcher is different from the identification icon of the target object.
In an embodiment, the marker icon of the search catcher and the marker icon of the target object are different in color.
In an embodiment, the displaying the movement trajectory includes:
displaying a preset capture area map, wherein the preset capture area map is obtained by drawing and measuring the capture area in advance;
and displaying the moving track in the preset capture area map.
In an embodiment, the movement trajectory includes an identification icon of the target object, and the identification icon is used for representing the position of the target object in the preset capture area map.
In an embodiment, the shade of the color of the identification icon is determined according to the temperature of the target object.
In an embodiment, when a plurality of target objects are provided, the movement trajectories of different target objects are displayed in different display modes.
In one embodiment, the moving tracks of different target objects are different in color.
In one embodiment, the identification icons on the movement trajectories of different target objects are different.
In an embodiment, the camera comprises an infrared camera.
In one embodiment, the processor is further configured to implement the steps of:
determining whether the target object moves to a preset alert area or not according to the moving track of the target object;
and when the target object is determined to move to the preset warning area, outputting warning information to remind the commander and/or the catcher that the target object moves to the preset warning area.
In an embodiment, the preset warning region is a region previously defined in the capture region.
In one embodiment, the preset alert zone includes at least one of a first alert zone having a temperature close to the temperature of the target object, a second alert zone in which the temperature of the target object is not monitored, and a third alert zone in which the mobile vehicle is placed.
In an embodiment, the first warning area includes at least one of a crowded area and a dark area having a temperature close to a temperature of the target object, the second warning area includes at least one of a water area, a shelter area, and a highly reflective area, and the mobile vehicle includes at least one of an automobile, a ship, a yacht, a motorcycle, and a helicopter.
In an embodiment, the determining whether the target object moves to a preset alert area according to the moving track of the target object includes:
determining the moving direction of the target object according to the moving track of the target object, and acquiring the position information of the preset warning area;
and determining whether the target object moves to a preset alert area or not according to the moving direction and the position information of the preset alert area.
In an embodiment, before outputting the alarm information, the method further includes:
when the target object is determined to move to a preset warning area, determining the distance between the target object and the preset warning area;
and when the distance between the target object and the preset warning area is smaller than or equal to a preset threshold value, outputting warning information.
In one embodiment, the determining the distance between the target object and the preset alert zone includes:
acquiring current position information of the target object and position information of the preset warning area;
and determining the distance between the target object and the preset warning region according to the current position information and the position information of the preset warning region.
In one embodiment, the processor is further configured to implement the steps of:
and sending the moving track to an associated control terminal so that the control terminal can synchronously display the moving track, and searching personnel can check the moving track.
In an embodiment, after the displaying the movement trajectory, the method further includes:
when the target object is monitored to be lost, displaying an identification icon of the target object on the moving track, wherein the shade of the color of the identification icon is determined according to the temperature of the target object.
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working process of the ground control platform described above may refer to the corresponding process in the foregoing capture auxiliary method embodiment, and details are not described herein again.
Please refer to fig. 9, fig. 9 is a schematic block diagram of a structure of an unmanned aerial vehicle according to an embodiment of the present application.
As shown in fig. 9, the drone 500 includes a processor 501, a memory 502, and a camera 503, and the processor 501, the memory 502, and the camera 503 are connected by a bus 504, such as an I2C (Inter-Integrated Circuit) bus 504. The unmanned aerial vehicle may be a rotor-type unmanned aerial vehicle, for example a quadrotor, hexarotor, or octorotor unmanned aerial vehicle; it may also be a fixed-wing unmanned aerial vehicle, or a combination of rotor-type and fixed-wing, which is not limited herein.
Specifically, the Processor 501 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the Memory 502 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB flash disk, or a removable hard disk. Optionally, the drone may further include a transceiver for communication or data transmission between the drone and other devices, such as a ground command platform or a control terminal.
The processor 501 is configured to run a computer program stored in the memory 502, and when executing the computer program, implement the following steps:
controlling the shooting device to monitor a capture area;
when a target object in the capture area is monitored, controlling the unmanned aerial vehicle to carry out flight tracking on the target object;
drawing a moving track of the target object in the process of carrying out flight tracking on the target object;
and sending the moving track to a ground command platform so that the ground command platform can display the moving track, and a commander can check the moving track.
In an embodiment, the capture area comprises a plurality of capture sub-areas, and the drone corresponds to one of the capture sub-areas; the control the shooting device monitors a capture area, and comprises the following steps:
acquiring capture task information of the unmanned aerial vehicle, wherein the capture task information comprises position information of the capture subarea corresponding to the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to fly to the corresponding capture subarea according to the position information;
and after the unmanned aerial vehicle flies to the corresponding capture subarea, controlling the shooting device to monitor the corresponding capture subarea.
In an embodiment, the capture task information includes a hovering height and shooting attitude parameters of the unmanned aerial vehicle in the corresponding capture sub-area; before the controlling the shooting device to monitor the corresponding capture sub-region, the method further includes:
controlling the unmanned aerial vehicle to hover according to the hovering height;
and adjusting the posture of the shooting device according to the shooting posture parameters.
In one embodiment, the adjusting the posture of the shooting device according to the shooting posture parameter includes:
adjusting the attitude of the tripod head of the unmanned aerial vehicle according to the shooting attitude parameters, wherein the attitude of the shooting device changes along with the change of the attitude of the tripod head of the unmanned aerial vehicle; and/or
Adjusting the hovering gesture of the unmanned aerial vehicle according to the shooting gesture parameters, wherein the gesture of the shooting device changes along with the change of the hovering gesture of the unmanned aerial vehicle.
In an embodiment, the controlling the unmanned aerial vehicle to perform flight tracking on the target object after the target object in the capture area is monitored includes:
when a target object in the capture area is monitored, adjusting the direction of a lens of the shooting device so that the direction of the lens is approximately the same as the gravity direction;
and after the lens direction is approximately the same as the gravity direction, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
In an embodiment, when an included angle between a lens orientation of the photographing device and a gravity direction is smaller than or equal to a preset included angle, it is determined that the lens orientation of the photographing device is substantially the same as the gravity direction.
In an embodiment, said controlling said drone to perform flight tracking on said target object after said lens direction is substantially the same as the direction of gravity includes:
after the lens direction is approximately the same as the gravity direction, controlling the unmanned aerial vehicle to fly right above the target object;
and after the unmanned aerial vehicle flies right above the target object, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
In an embodiment, the drone is spaced from the target object by a preset height.
In an embodiment, before the drawing the moving trajectory of the target object in the process of performing flight tracking on the target object, the method further includes:
sending current position information of the target object to a control terminal associated with the unmanned aerial vehicle, so that the control terminal can display an identification icon of the target object in a preset capture area map according to the current position information and output locking prompt information, wherein the locking prompt information is used for prompting the search-capture person to confirm whether to lock the target object;
and acquiring a locking instruction of the target object sent by the control terminal, and locking the target object according to the locking instruction.
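The report-and-lock exchange described above can be sketched as a simple handshake; the message fields and the `control_link` object are hypothetical illustrations, not an interface defined by this disclosure:

```python
def handle_target_detected(target_id, position, control_link):
    """Report the target's current position to the control terminal and
    lock onto the target only if the terminal replies with a lock
    instruction for that target (hypothetical message format)."""
    control_link.send({"type": "target_position", "id": target_id, "pos": position})
    reply = control_link.receive()  # wait for the operator's decision
    return reply.get("type") == "lock" and reply.get("id") == target_id
```

The return value would gate whether the unmanned aerial vehicle proceeds to flight tracking of the locked target.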
In an embodiment, the preset capture area map is obtained by mapping the capture area in advance.
In an embodiment, the manner of outputting the locking prompt information by the control terminal includes at least one of displaying a locking prompt text, broadcasting a locking prompt tone, and controlling a locking prompt lamp to flash.
In an embodiment, an identification icon of each search-capture person is further displayed in the preset capture area map, the identification icons of different search-capture persons are different, and the identification icon of each search-capture person is different from the identification icon of the target object.
In an embodiment, the identification icon of the search-capture person and the identification icon of the target object are different in color.
In an embodiment, the movement trajectory includes an identification icon of the target object, and the identification icon is used for representing the position of the target object in a preset capture area map.
In one embodiment, the shade of the color of the identification icon is determined according to the temperature of the target object sensed by the photographing device.
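One possible way to derive the icon shade from the sensed temperature is a linear mapping; the temperature range and the 0-255 shade scale below are assumptions for illustration only:

```python
def icon_shade(temperature_c, t_min=20.0, t_max=40.0):
    """Map the target's sensed temperature to a color shade in [0, 255]:
    warmer targets get a deeper (larger) shade value."""
    # Clamp to the assumed sensing range before scaling.
    t = max(t_min, min(t_max, temperature_c))
    return round(255 * (t - t_min) / (t_max - t_min))
```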
In an embodiment, when a plurality of target objects are provided, the ground command platform displays the movement tracks of different target objects in different display modes.
In one embodiment, the moving tracks of different target objects are different in color.
In one embodiment, the identification icons on the movement trajectories of different target objects are different.
In an embodiment, the camera comprises an infrared camera.
In one embodiment, the processor is further configured to implement the steps of:
determining whether the target object moves to a preset alert area or not according to the moving track of the target object;
when it is determined that the target object moves to a preset warning area, sending a warning instruction to a control terminal associated with the unmanned aerial vehicle, so that the control terminal can output warning information based on the warning instruction to remind the search-capture person that the target object is moving to the preset warning area.
In an embodiment, the preset warning region is a region previously defined in the capture region.
In one embodiment, the preset alert zone includes at least one of a first alert zone having a temperature close to the temperature of the target object, a second alert zone in which the temperature of the target object cannot be monitored, and a third alert zone in which a mobile vehicle is parked.
In an embodiment, the first warning area includes at least one of a crowded area and a dark area having a temperature close to a temperature of the target object, the second warning area includes at least one of a water area, a shelter area, and a highly reflective area, and the mobile vehicle includes at least one of an automobile, a ship, a yacht, a motorcycle, and a helicopter.
In an embodiment, the determining whether the target object moves to a preset alert area according to the moving track of the target object includes:
determining the moving direction of the target object according to the moving track of the target object, and acquiring the position information of the preset warning area;
and determining whether the target object moves to a preset alert area or not according to the moving direction and the position information of the preset alert area.
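A minimal sketch of this direction-based check, assuming planar coordinates and a hypothetical bearing tolerance (neither of which is specified by this disclosure):

```python
import math

def moving_toward_zone(track, zone_center, max_bearing_diff_deg=30.0):
    """Decide from the last two track points whether the target's moving
    direction points toward the alert zone center."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    move = (x1 - x0, y1 - y0)
    to_zone = (zone_center[0] - x1, zone_center[1] - y1)
    move_norm = math.hypot(*move)
    zone_norm = math.hypot(*to_zone)
    if move_norm == 0 or zone_norm == 0:
        return False  # stationary target, or already at the zone center
    cos_a = (move[0] * to_zone[0] + move[1] * to_zone[1]) / (move_norm * zone_norm)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= max_bearing_diff_deg
```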
In an embodiment, before sending the warning instruction to the control terminal associated with the drone, the method further includes:
when the target object is determined to move to a preset warning area, determining the distance between the target object and the preset warning area;
and when the distance between the target object and the preset warning area is smaller than or equal to a preset threshold value, sending an alarm instruction to a control terminal associated with the unmanned aerial vehicle.
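The distance gate applied before sending the alarm instruction can be sketched as follows (planar coordinates and the 50 m threshold are illustrative assumptions):

```python
import math

def should_send_alarm(target_pos, zone_pos, threshold_m=50.0):
    """Send an alarm instruction only when the target is within the preset
    threshold distance of the warning area."""
    dist = math.hypot(target_pos[0] - zone_pos[0], target_pos[1] - zone_pos[1])
    return dist <= threshold_m
```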
It should be noted that, as can be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working process of the above-described unmanned aerial vehicle may refer to the corresponding process in the aforementioned capture assisting method embodiment, and is not described herein again.
Referring to fig. 10, fig. 10 is a schematic block diagram illustrating a structure of a capture assist system according to an embodiment of the present application. As shown in fig. 10, the capture assist system 600 includes a ground command platform 610, at least one unmanned aerial vehicle 620, and at least one control terminal 630. The ground command platform 610 is connected with the at least one unmanned aerial vehicle 620 and the at least one control terminal 630 through a network, the control terminal 630 is used for controlling the flight of the unmanned aerial vehicle 620, and the unmanned aerial vehicle 620 is equipped with a shooting device, wherein:
the ground command platform 610 is configured to acquire capture task data, where the capture task data includes capture task information of a capture sub-area corresponding to each unmanned aerial vehicle participating in a capture task;
the ground command platform 610 is further configured to send capture task information of a capture sub-area corresponding to each unmanned aerial vehicle to the corresponding control terminal 630;
the control terminal 630 is configured to send the capture task information to the corresponding unmanned aerial vehicle 620;
the unmanned aerial vehicle 620 is configured to control the shooting device to monitor a corresponding capture sub-area according to the capture task information;
when the unmanned aerial vehicle 620 monitors a target object in the capture sub-area, the unmanned aerial vehicle 620 performs flight tracking on the target object;
the unmanned aerial vehicle 620 is further configured to acquire a plurality of pieces of position information of the target object during a flight tracking process of the target object, and send the plurality of pieces of position information to the ground command platform 610;
the ground command platform 610 is further configured to draw a movement trajectory of the target object according to the plurality of position information, and display the movement trajectory for a commander to view.
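A minimal sketch of how the ground command platform might assemble the moving track from the reported position information (the report format with `timestamp`/`x`/`y` fields is an assumption for the example, not a format defined by this disclosure):

```python
def build_trajectory(position_reports):
    """Order the drone's position reports by timestamp and return the
    polyline the command platform would render as the moving track."""
    ordered = sorted(position_reports, key=lambda r: r["timestamp"])
    return [(r["x"], r["y"]) for r in ordered]
```

Sorting by timestamp keeps the track correct even when reports arrive over the network out of order.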
In an embodiment, the ground command platform 610 is further configured to send the capture task information of the capture sub-area corresponding to each drone to the corresponding drone 620.
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working process of the capture assisting system described above may refer to the corresponding process in the foregoing capture assisting method embodiment, and is not described herein again.
In an embodiment of the present application, a computer-readable storage medium is further provided. The computer-readable storage medium stores a computer program, the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the capture assist method provided in the foregoing embodiments.
The computer readable storage medium may be an internal storage unit of the ground command platform or the drone described in any of the foregoing embodiments, for example, a hard disk or a memory of the ground command platform or the drone. The computer readable storage medium may also be an external storage device of the ground command platform or the drone, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the ground command platform or the drone.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (110)

1. A capture assist method, comprising:
controlling a plurality of unmanned aerial vehicles to monitor a capture area through respective carried shooting devices;
acquiring a plurality of pieces of position information of a target object, collected by an unmanned aerial vehicle that monitors the target object in the capture area while performing flight tracking on the target object;
and drawing a moving track of the target object according to the plurality of pieces of position information, and displaying the moving track for a commander to view.
2. The capture assist method according to claim 1, wherein the capture area includes a plurality of capture sub-areas; and the controlling a plurality of unmanned aerial vehicles to monitor the capture area through respective carried shooting devices includes:
acquiring capture task data, wherein the capture task data comprises capture task information of a capture sub-region corresponding to each unmanned aerial vehicle participating in a capture task;
controlling each unmanned aerial vehicle to fly to the corresponding capturing sub-region according to the capturing task information of the corresponding capturing sub-region of each unmanned aerial vehicle;
and after each unmanned aerial vehicle flies to the corresponding capture sub-area, controlling each unmanned aerial vehicle to monitor the corresponding capture sub-area through the carried shooting device.
3. The capture assist method according to claim 2, wherein the capture task information comprises a hovering height and shooting attitude parameters of each unmanned aerial vehicle in the corresponding capture sub-area; before controlling each unmanned aerial vehicle to monitor the corresponding capture sub-area through the respective carried shooting device, the method further comprises:
controlling each unmanned aerial vehicle to hover according to the hovering height of each unmanned aerial vehicle in the corresponding capturing sub-area;
and controlling each unmanned aerial vehicle to adjust the attitude of each shooting device according to the shooting attitude parameter corresponding to each unmanned aerial vehicle.
4. The capture assisting method according to claim 3, wherein the controlling each unmanned aerial vehicle to adjust the attitude of the respective camera according to the shooting attitude parameter corresponding to each unmanned aerial vehicle comprises:
controlling each unmanned aerial vehicle to adjust the attitude of the respective gimbal according to the shooting attitude parameter corresponding to each unmanned aerial vehicle, wherein the attitude of a shooting device of the unmanned aerial vehicle changes along with the attitude change of the gimbal of the unmanned aerial vehicle; and/or
And adjusting the hovering gesture of each unmanned aerial vehicle according to the shooting gesture parameter corresponding to each unmanned aerial vehicle, wherein the gesture of the shooting device of the unmanned aerial vehicle changes along with the change of the hovering gesture of the unmanned aerial vehicle.
5. The capture assisting method according to claim 1, wherein the acquiring of the plurality of position information of the target object acquired by the unmanned aerial vehicle monitoring the target object in the capture area during flight tracking of the target object comprises:
when a target object in the capture area is monitored, controlling a target unmanned aerial vehicle to carry out flight tracking on the target object, wherein the target unmanned aerial vehicle is an unmanned aerial vehicle monitoring the target object;
acquiring a plurality of position information of the target object acquired by the target unmanned aerial vehicle when the target object is subjected to flight tracking.
6. The capture assisting method according to claim 5, wherein the controlling a target unmanned aerial vehicle to perform flight tracking on a target object after the target object in the capture area is monitored comprises:
when the target object in the capture area is monitored, adjusting the direction of a lens of a shooting device of the target unmanned aerial vehicle so that the direction of the lens is approximately the same as the gravity direction;
and after the lens direction is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object.
7. The capture assisting method according to claim 6, wherein the controlling the target unmanned aerial vehicle to perform flight tracking on the target object after the lens direction is substantially the same as the gravity direction comprises:
after the lens direction is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to fly right above the target object;
and after the target unmanned aerial vehicle flies right above the target object, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object.
8. The capture assist method according to claim 7, wherein the target drone is spaced from the target object by a preset height.
9. The capture assisting method according to claim 1, wherein before acquiring a plurality of position information of the target object acquired by the unmanned aerial vehicle monitoring the target object in the capture area during flight tracking of the target object, the method further comprises:
acquiring current position information of the target object sent by the unmanned aerial vehicle monitoring the target object in the capture area;
displaying an identification icon of the target object according to the current position information, and outputting locking prompt information, wherein the locking prompt information is used for prompting a commander or a search-capture person whether to lock the target object;
and when the locking instruction of the target object is acquired, locking the target object according to the locking instruction.
10. The capture assist method according to claim 9, wherein the displaying the identification icon of the target object according to the current position information includes:
displaying a preset capture area map, wherein the preset capture area map is obtained by surveying and mapping the capture area in advance;
and displaying the identification icon of the target object in the preset capture area map according to the current position information.
11. The capture assist method according to claim 9, wherein the outputting of the lock prompt information includes:
displaying a locking prompt text; and/or
Broadcasting a locking prompt tone; and/or
And controlling the locking prompt lamp to flash.
12. The capture assist method according to claim 10, characterized in that the method further comprises:
and displaying an identification icon of each search-capture person in a preset capture area map, wherein the identification icon of each search-capture person is different from the identification icon of the target object.
13. The capture assist method according to claim 12, wherein the identification icon of the search-capture person is different in color from the identification icon of the target object.
14. The capture assist method according to any one of claims 1 to 13, wherein the displaying the movement trajectory includes:
displaying a preset capture area map, wherein the preset capture area map is obtained by surveying and mapping the capture area in advance;
and displaying the moving track in the preset capture area map.
15. The capture assist method according to claim 14, wherein the movement trajectory includes an identification icon of the target object, the identification icon being used to represent a position of the target object within the preset capture area map.
16. The capture assist method according to claim 15, wherein the shade of the color of the identification icon is determined according to the temperature of the target object.
17. The capture assist method according to any one of claims 1 to 13, wherein when a plurality of target objects are provided, movement trajectories of different target objects are displayed in different display manners.
18. The capture assist method according to claim 17, wherein the moving trajectories of different target objects are different in color.
19. The capture assist method according to claim 17, wherein the identification icon on the movement trajectory of different target objects is different.
20. The capture assist method according to any one of claims 1 to 13, wherein the photographing device includes an infrared camera.
21. The capture assist method according to any one of claims 1 to 13, characterized in that the method further comprises:
determining whether the target object moves to a preset alert area or not according to the moving track of the target object;
and when it is determined that the target object moves to the preset warning area, outputting warning information to remind the commander and/or the search-capture person that the target object is moving to the preset warning area.
22. The capture assist method according to claim 21, wherein the preset warning region is a region previously set within the capture region.
23. The capture assist method according to claim 21, wherein the preset alert zone includes at least one of a first alert zone having a temperature close to a temperature of the target object, a second alert zone in which the temperature of the target object is not monitored, and a third alert zone in which a mobile vehicle is placed.
24. The capture assist method according to claim 23, wherein the first warning area includes at least one of a crowded area and a dark area having a temperature close to the temperature of the target object, the second warning area includes at least one of a water area, a shelter area, and a highly reflective area, and the mobile vehicle includes at least one of an automobile, a ship, a yacht, a motorcycle, and a helicopter.
25. The capture assist method according to claim 21, wherein the determining whether the target object is moving to a preset alert zone according to the movement trajectory of the target object includes:
determining the moving direction of the target object according to the moving track of the target object, and acquiring the position information of the preset warning area;
and determining whether the target object moves to a preset alert area or not according to the moving direction and the position information of the preset alert area.
26. The capture assist method according to claim 21, wherein before outputting the warning information, the method further comprises:
when the target object is determined to move to a preset warning area, determining the distance between the target object and the preset warning area;
and when the distance between the target object and the preset warning area is smaller than or equal to a preset threshold value, outputting warning information.
27. The capture assist method of claim 26, wherein the determining the distance between the target object and the preset warning area comprises:
acquiring current position information of the target object and position information of the preset warning area;
and determining the distance between the target object and the preset warning region according to the current position information and the position information of the preset warning region.
28. The capture assist method according to any one of claims 1 to 13, characterized in that the method further comprises:
and sending the moving track to an associated control terminal, so that the control terminal can synchronously display the moving track for the search-capture personnel to view.
29. The capture assist method according to any one of claims 1 to 13, wherein after the displaying the movement trajectory, further comprising:
when the target object is monitored to be lost, displaying an identification icon of the target object on the moving track, wherein the shade of the color of the identification icon is determined according to the temperature of the target object.
30. A capture assist method, applied to an unmanned aerial vehicle equipped with a shooting device, the method comprising:
controlling the shooting device to monitor a capture area;
when a target object in the capture area is monitored, controlling the unmanned aerial vehicle to carry out flight tracking on the target object;
drawing a moving track of the target object in the process of carrying out flight tracking on the target object;
and sending the moving track to a ground command platform so that the ground command platform can display the moving track, and a commander can check the moving track.
31. The capture assist method of claim 30, wherein the capture area comprises a plurality of capture sub-areas, and the unmanned aerial vehicle corresponds to one of the capture sub-areas; the controlling the shooting device to monitor a capture area comprises:
acquiring capture task information of the unmanned aerial vehicle, wherein the capture task information comprises position information of the capture sub-area corresponding to the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to fly to the corresponding capture sub-area according to the position information;
and after the unmanned aerial vehicle flies to the corresponding capture sub-area, controlling the shooting device to monitor the corresponding capture sub-area.
32. The capture assist method according to claim 31, wherein the capture task information includes a hovering height and shooting attitude parameters of the unmanned aerial vehicle within the corresponding capture sub-area; before the controlling the shooting device to monitor the corresponding capture sub-area, the method further includes:
controlling the unmanned aerial vehicle to hover according to the hovering height;
and adjusting the posture of the shooting device according to the shooting posture parameters.
33. The capture assist method according to claim 32, wherein the adjusting the attitude of the imaging device according to the imaging attitude parameter includes:
adjusting the attitude of the gimbal of the unmanned aerial vehicle according to the shooting attitude parameters, wherein the attitude of the shooting device changes along with the change of the attitude of the gimbal of the unmanned aerial vehicle; and/or
Adjusting the hovering gesture of the unmanned aerial vehicle according to the shooting gesture parameters, wherein the gesture of the shooting device changes along with the change of the hovering gesture of the unmanned aerial vehicle.
34. The capture assisting method according to claim 30, wherein the controlling the unmanned aerial vehicle to perform flight tracking on the target object after the target object in the capture area is monitored comprises:
when a target object in the capture area is monitored, adjusting the direction of a lens of the shooting device so that the direction of the lens is approximately the same as the gravity direction;
and after the lens direction is approximately the same as the gravity direction, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
35. The capture assist method according to claim 34, wherein when an angle between a lens orientation of the camera and a direction of gravity is smaller than or equal to a preset angle, it is determined that the lens orientation of the camera is substantially the same as the direction of gravity.
36. The capture assist method according to claim 34, wherein the controlling the drone to perform flight tracking on the target object after the lens direction is substantially the same as the gravity direction comprises:
after the lens direction is approximately the same as the gravity direction, controlling the unmanned aerial vehicle to fly right above the target object;
and after the unmanned aerial vehicle flies right above the target object, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
37. The capture assist method of claim 36, wherein the drone is spaced from the target object by a preset height.
38. The capture assisting method according to claim 30, wherein before the drawing the moving track of the target object in the process of flight tracking of the target object, the method further comprises:
sending current position information of the target object to a control terminal associated with the unmanned aerial vehicle, so that the control terminal can display an identification icon of the target object in a preset capture area map according to the current position information and output locking prompt information, wherein the locking prompt information is used for prompting the search-capture person to confirm whether to lock the target object;
and acquiring a locking instruction of the target object sent by the control terminal, and locking the target object according to the locking instruction.
39. The capture assist method according to claim 38, wherein the preset capture area map is obtained by plotting the capture area in advance.
40. The capture assisting method according to claim 38, wherein the manner in which the control terminal outputs the lock prompting message includes at least one of displaying a lock prompting text, broadcasting a lock prompting tone, and controlling a lock prompting lamp to blink.
41. The capture assist method according to claim 38, wherein an identification icon of each search-capture person is further displayed in the preset capture area map, wherein the identification icons of different search-capture persons are different, and the identification icon of each search-capture person is different from the identification icon of the target object.
42. The capture assist method according to claim 41, wherein the identification icon of the search-capture person is different in color from the identification icon of the target object.
43. The capture assist method according to any one of claims 30 to 42, wherein the movement trajectory includes an identification icon of the target object, the identification icon being used to indicate a position of the target object within a preset capture area map.
44. The capture assist method according to claim 43, wherein the shade of the color of the identification icon is determined in accordance with the temperature of the target object sensed by the camera.
45. The capture assisting method according to any one of claims 30 to 42, wherein when the target objects are multiple, the ground command platform displays the moving tracks of different target objects in different display modes.
46. The capture assist method according to claim 45, wherein the moving trajectories of different target objects are different in color.
47. The capture assist method according to claim 45, wherein the identification icons on the movement trajectories of different target objects are different.
48. The capture assist method according to any one of claims 30 to 42, wherein the photographing device includes an infrared camera.
49. The capture assist method according to any one of claims 30 to 42, further comprising:
determining whether the target object moves to a preset alert area or not according to the moving track of the target object;
when it is determined that the target object moves to a preset warning area, sending a warning instruction to a control terminal associated with the unmanned aerial vehicle, so that the control terminal can output warning information based on the warning instruction to remind the search-capture person that the target object is moving to the preset warning area.
50. The capture assist method according to claim 49, wherein the preset warning region is a region previously set within the capture region.
51. The capture assist method according to claim 49, wherein the preset alert zone includes at least one of a first alert zone having a temperature close to that of the target object, a second alert zone in which the temperature of the target object is not monitored, and a third alert zone in which a mobile vehicle is placed.
52. The capture assist method according to claim 51, wherein the first warning area includes at least one of a crowded area and a dark area having a temperature close to a temperature of the target object, the second warning area includes at least one of a water area, a shelter area, and a highly reflective area, and the mobile vehicle includes at least one of an automobile, a ship, a yacht, a motorcycle, and a helicopter.
53. The capture assist method according to claim 49, wherein the determining whether the target object is moving to a preset alert zone according to the moving track of the target object comprises:
determining the moving direction of the target object according to the moving track of the target object, and acquiring the position information of the preset warning area;
and determining whether the target object moves to a preset alert area or not according to the moving direction and the position information of the preset alert area.
54. The capture assist method according to claim 49, wherein before sending the warning instruction to the control terminal associated with the drone, the method further comprises:
when the target object is determined to move to a preset warning area, determining the distance between the target object and the preset warning area;
and when the distance between the target object and the preset warning area is smaller than or equal to a preset threshold value, sending an alarm instruction to a control terminal associated with the unmanned aerial vehicle.
55. A ground command platform, comprising a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
controlling a plurality of unmanned aerial vehicles to monitor a capture area through respective carried shooting devices;
acquiring a plurality of position information of the target object, which is acquired when the unmanned aerial vehicle monitoring the target object in the capture area carries out flight tracking on the target object;
and drawing the moving track of the target object according to the position information, and displaying the moving track for a commander to view.
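The track-drawing step of claim 55 is not specified further in the claims; a minimal sketch, assuming each unmanned aerial vehicle reports timestamped WGS-84 fixes that the platform orders and deduplicates before display, might look like this (the `PositionFix` schema and function name are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class PositionFix:
    """One position report sent by a UAV tracking the target (hypothetical schema)."""
    timestamp: float   # seconds since epoch
    lat: float         # degrees, WGS-84
    lon: float         # degrees, WGS-84

def build_moving_track(fixes):
    """Sort position reports by time and drop consecutive duplicate
    points, yielding an ordered polyline suitable for map display."""
    ordered = sorted(fixes, key=lambda f: f.timestamp)
    track = []
    for f in ordered:
        point = (f.lat, f.lon)
        if not track or track[-1] != point:
            track.append(point)
    return track

fixes = [
    PositionFix(2.0, 22.5432, 113.9512),
    PositionFix(1.0, 22.5430, 113.9510),
    PositionFix(3.0, 22.5432, 113.9512),  # duplicate of the t=2.0 point
]
track = build_moving_track(fixes)
assert track == [(22.5430, 113.9510), (22.5432, 113.9512)]
```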
56. The ground command platform of claim 55, wherein the capture area comprises a plurality of capture sub-areas, and the controlling of the plurality of unmanned aerial vehicles to monitor the capture area through the respective carried shooting devices comprises:
acquiring capture task data, wherein the capture task data comprises capture task information of a capture sub-region corresponding to each unmanned aerial vehicle participating in a capture task;
controlling each unmanned aerial vehicle to fly to the corresponding capturing sub-region according to the capturing task information of the corresponding capturing sub-region of each unmanned aerial vehicle;
and after each unmanned aerial vehicle flies to the corresponding capture sub-area, controlling each unmanned aerial vehicle to monitor the corresponding capture sub-area through the carried shooting device.
57. The ground command platform of claim 56, wherein the capture task information includes a hovering height and a shooting attitude parameter of each unmanned aerial vehicle in the corresponding capture sub-area; and before the controlling of each unmanned aerial vehicle to monitor the corresponding capture sub-area through the respective carried shooting devices, the method further comprises:
controlling each unmanned aerial vehicle to hover according to the hovering height of each unmanned aerial vehicle in the corresponding capturing sub-area;
and controlling each unmanned aerial vehicle to adjust the attitude of each shooting device according to the shooting attitude parameter corresponding to each unmanned aerial vehicle.
58. The ground command platform of claim 57, wherein the controlling of each unmanned aerial vehicle to adjust the attitude of its respective shooting device according to its respective shooting attitude parameter comprises:
controlling each unmanned aerial vehicle to adjust the attitude of its respective gimbal according to the shooting attitude parameter corresponding to each unmanned aerial vehicle, wherein the attitude of the shooting device of the unmanned aerial vehicle changes with the attitude of the gimbal of the unmanned aerial vehicle; and/or
adjusting the hovering attitude of each unmanned aerial vehicle according to the shooting attitude parameter corresponding to each unmanned aerial vehicle, wherein the attitude of the shooting device of the unmanned aerial vehicle changes with the hovering attitude of the unmanned aerial vehicle.
59. The ground command platform of claim 55, wherein the obtaining of the plurality of position information of the target object acquired by the drone monitoring the target object within the capture area while performing flight tracking on the target object comprises:
when a target object in the capture area is monitored, controlling a target unmanned aerial vehicle to carry out flight tracking on the target object, wherein the target unmanned aerial vehicle is an unmanned aerial vehicle monitoring the target object;
acquiring a plurality of position information of the target object acquired by the target unmanned aerial vehicle when the target object is subjected to flight tracking.
60. The ground command platform of claim 59, wherein the controlling the target drone to perform flight tracking on the target object after the target object in the capture area is monitored comprises:
when the target object in the capture area is monitored, adjusting the direction of a lens of a shooting device of the target unmanned aerial vehicle so that the direction of the lens is approximately the same as the gravity direction;
and after the lens direction is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object.
61. The ground command platform of claim 60, wherein the controlling the target drone to perform flight tracking on the target object after the lens direction is substantially the same as the direction of gravity comprises:
after the lens direction is approximately the same as the gravity direction, controlling the target unmanned aerial vehicle to fly right above the target object;
and after the target unmanned aerial vehicle flies right above the target object, controlling the target unmanned aerial vehicle to carry out flight tracking on the target object.
62. The ground command platform of claim 61, wherein the target drone is spaced from the target object by a preset height.
63. The ground command platform of claim 55, wherein the acquiring of the plurality of position information of the target object acquired by the drone monitoring the target object within the capture area during flight tracking of the target object further comprises:
acquiring current position information of the target object sent by the unmanned aerial vehicle monitoring the target object in the capture area;
displaying an identification icon of the target object according to the current position information, and outputting locking prompt information, wherein the locking prompt information is used for prompting a commander or the search-and-capture personnel whether to lock the target object;
and when the locking instruction of the target object is acquired, locking the target object according to the locking instruction.
64. The ground command platform of claim 63, wherein the displaying the identification icon of the target object according to the current location information comprises:
displaying a preset capture area map, wherein the preset capture area map is obtained by surveying and mapping the capture area in advance;
and displaying the identification icon of the target object in the preset capture area map according to the current position information.
65. The ground command platform of claim 63, wherein the outputting of the locking prompt comprises:
displaying a locking prompt text; and/or
Broadcasting a locking prompt tone; and/or
And controlling the locking prompt lamp to flash.
66. The ground command platform of claim 63, wherein the processor is further configured to perform the steps of:
and displaying an identification icon of each search-and-capture person in a preset capture area map, wherein the identification icon of the search-and-capture person is different from the identification icon of the target object.
67. The ground command platform of claim 66, wherein the identification icon of the search-and-capture person is a different color from the identification icon of the target object.
68. The ground command platform of any one of claims 55 to 67, wherein the displaying of the moving track comprises:
displaying a preset capture area map, wherein the preset capture area map is obtained by surveying and mapping the capture area in advance; and
displaying the moving track in the preset capture area map.
69. The ground command platform of claim 68, wherein the moving track comprises an identification icon of the target object, the identification icon being used to indicate the position of the target object within the preset capture area map.
70. The ground command platform of claim 69, wherein the color of the identification icon is determined according to the temperature of the target object.
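Claim 70 requires only that the icon color depend on the target's temperature. One possible mapping, assuming an infrared-derived surface temperature and a linear red ramp (both the temperature window and the ramp are assumptions, not part of the claims), is:

```python
def icon_color(temp_c, t_min=20.0, t_max=40.0):
    """Map a monitored surface temperature to an RGB shade of red:
    hotter targets get a more saturated icon."""
    # Clamp into the assumed human-surface-temperature window.
    t = min(max(temp_c, t_min), t_max)
    ratio = (t - t_min) / (t_max - t_min)   # 0.0 (cool) .. 1.0 (hot)
    red = int(128 + 127 * ratio)            # 128..255
    return (red, 0, 0)

assert icon_color(20.0) == (128, 0, 0)   # coolest: dim red
assert icon_color(30.0) == (191, 0, 0)   # midpoint
assert icon_color(40.0) == (255, 0, 0)   # hottest: full red
```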
71. The ground command platform of any one of claims 55 to 67, wherein when there are a plurality of target objects, the movement tracks of different target objects are displayed in different display manners.
72. The ground command platform of claim 71, wherein the movement tracks of different target objects are different colors.
73. The ground command platform of claim 71, wherein the identification icons on the movement tracks of different target objects are different.
74. The ground command platform of any one of claims 55 to 67, wherein the shooting device comprises an infrared camera.
75. The ground command platform of any one of claims 55 to 67, wherein the processor is further configured to perform the steps of:
determining whether the target object moves to a preset warning area according to the moving track of the target object; and
when it is determined that the target object moves to the preset warning area, outputting alarm information to remind the commander and/or the search-and-capture personnel that the target object moves to the preset warning area.
76. The ground command platform of claim 75, wherein the preset warning area is an area defined in advance within the capture area.
77. The ground command platform of claim 75, wherein the preset warning area includes at least one of a first warning area having a temperature close to the temperature of the target object, a second warning area in which the temperature of the target object cannot be monitored, and a third warning area in which a mobile vehicle is placed.
78. The ground command platform of claim 77, wherein the first warning area includes at least one of a crowded area and a blackbody area having a temperature close to the temperature of the target object, the second warning area includes at least one of a water area, a shelter area, and a highly reflective area, and the mobile vehicle includes at least one of an automobile, a boat, a yacht, a motorcycle, and a helicopter.
79. The ground command platform of claim 75, wherein the determining whether the target object moves to the preset warning area according to the moving track of the target object comprises:
determining the moving direction of the target object according to the moving track of the target object, and acquiring position information of the preset warning area; and
determining whether the target object moves to the preset warning area according to the moving direction and the position information of the preset warning area.
80. The ground command platform of claim 75, wherein before the outputting of the alarm information, the processor is further configured to perform the steps of:
when the target object is determined to move to a preset warning area, determining the distance between the target object and the preset warning area;
and when the distance between the target object and the preset warning area is smaller than or equal to a preset threshold value, outputting warning information.
81. The ground command platform of claim 80, wherein the determining the distance between the target object and the preset alert zone comprises:
acquiring current position information of the target object and position information of the preset warning area;
and determining the distance between the target object and the preset warning region according to the current position information and the position information of the preset warning region.
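Claim 81 leaves the distance metric unspecified; assuming both positions are WGS-84 latitude/longitude fixes, a great-circle (haversine) distance is one reasonable choice:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 fixes; the
    claims only require 'determining the distance', so both the metric
    and the coordinate format are assumptions."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# One degree of latitude is roughly 111 km.
d = haversine_m(22.0, 113.0, 23.0, 113.0)
assert 110_000 < d < 112_000
```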
82. The ground command platform of any one of claims 55 to 67, wherein the processor is further configured to perform the steps of:
and sending the moving track to an associated control terminal, so that the control terminal synchronously displays the moving track for the search-and-capture personnel to view.
83. The ground command platform of any one of claims 55 to 67, wherein after the displaying of the moving track, the processor is further configured to:
when it is monitored that the target object is lost, display an identification icon of the target object on the moving track, wherein the shade of the color of the identification icon is determined according to the temperature of the target object.
84. An unmanned aerial vehicle, comprising a camera, a memory, and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
controlling the shooting device to monitor a capture area;
when a target object in the capture area is monitored, controlling the unmanned aerial vehicle to carry out flight tracking on the target object;
drawing a moving track of the target object in the process of carrying out flight tracking on the target object;
and sending the moving track to a ground command platform so that the ground command platform can display the moving track, and a commander can check the moving track.
85. The unmanned aerial vehicle of claim 84, wherein the capture area includes a plurality of capture sub-areas, the unmanned aerial vehicle corresponding to one of the capture sub-areas; and the controlling of the shooting device to monitor the capture area comprises:
acquiring capture task information of the unmanned aerial vehicle, wherein the capture task information comprises position information of the capture subarea corresponding to the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to fly to the corresponding capture subarea according to the position information;
and after the unmanned aerial vehicle flies to the corresponding capture subarea, controlling the shooting device to monitor the corresponding capture subarea.
86. The unmanned aerial vehicle of claim 85, wherein the capture task information includes a hovering height and a shooting attitude parameter of the unmanned aerial vehicle within the corresponding capture sub-area; and before the controlling of the shooting device to monitor the corresponding capture sub-area, the method further comprises:
controlling the unmanned aerial vehicle to hover according to the hovering height; and
adjusting the attitude of the shooting device according to the shooting attitude parameter.
87. The unmanned aerial vehicle of claim 86, wherein the adjusting of the attitude of the shooting device according to the shooting attitude parameter comprises:
adjusting the attitude of the gimbal of the unmanned aerial vehicle according to the shooting attitude parameter, wherein the attitude of the shooting device changes with the attitude of the gimbal of the unmanned aerial vehicle; and/or
adjusting the hovering attitude of the unmanned aerial vehicle according to the shooting attitude parameter, wherein the attitude of the shooting device changes with the hovering attitude of the unmanned aerial vehicle.
88. A drone as claimed in claim 84, wherein the control of the drone to perform flight tracking of the target object after monitoring the target object within the capture area includes:
when a target object in the capture area is monitored, adjusting the direction of a lens of the shooting device so that the direction of the lens is approximately the same as the gravity direction;
and after the lens direction is approximately the same as the gravity direction, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
89. An unmanned aerial vehicle according to claim 88, wherein the orientation of the lens of the camera device is determined to be substantially the same as the direction of gravity when the angle between the orientation of the lens of the camera device and the direction of gravity is less than or equal to a predetermined angle.
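Claim 89 defines "substantially the same as the direction of gravity" by an angle threshold. A sketch of that check, assuming 3-D direction vectors in a world frame where "down" is the negative z axis and a 5-degree preset angle (both are assumptions, as the claim does not fix the frame or the angle):

```python
import math

def angle_between_deg(v1, v2):
    """Angle in degrees between two 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp against rounding
    return math.degrees(math.acos(cos_t))

def lens_aligned_with_gravity(lens_dir, max_angle_deg=5.0):
    """Claim 89: the lens direction counts as substantially the same as
    the direction of gravity when the angle between them is at or below
    the preset angle (5 degrees here is an assumed threshold)."""
    gravity = (0.0, 0.0, -1.0)  # 'down' in an ENU-style world frame (assumption)
    return angle_between_deg(lens_dir, gravity) <= max_angle_deg

assert lens_aligned_with_gravity((0.0, 0.0, -1.0))      # pointing straight down
assert not lens_aligned_with_gravity((1.0, 0.0, -1.0))  # 45 degrees off
assert lens_aligned_with_gravity((0.05, 0.0, -1.0))     # about 2.9 degrees off
```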
90. A drone as claimed in claim 88, wherein the controlling the drone to perform flight tracking on the target object after the lens direction is substantially the same as the direction of gravity includes:
after the lens direction is approximately the same as the gravity direction, controlling the unmanned aerial vehicle to fly right above the target object;
and after the unmanned aerial vehicle flies right above the target object, controlling the unmanned aerial vehicle to carry out flight tracking on the target object.
91. A drone according to claim 90, wherein the drone is spaced from the target object by a preset height.
92. The unmanned aerial vehicle of claim 84, wherein before the drawing of the moving track of the target object during the flight tracking of the target object, the method further comprises:
sending current position information of the target object to a control terminal associated with the unmanned aerial vehicle, so that the control terminal can display an identification icon of the target object in a preset capture area map according to the current position information and output locking prompt information, wherein the locking prompt information is used for prompting the search-and-capture personnel whether to lock the target object;
and acquiring a locking instruction of the target object sent by the control terminal, and locking the target object according to the locking instruction.
93. The unmanned aerial vehicle of claim 92, wherein the preset capture area map is obtained by surveying and mapping the capture area in advance.
94. The unmanned aerial vehicle of claim 92, wherein the manner in which the control terminal outputs the locking prompt message comprises at least one of displaying a locking prompt text, broadcasting a locking prompt tone, and controlling a locking prompt lamp to blink.
95. The unmanned aerial vehicle of claim 92, wherein an identification icon of each search-and-capture person is also displayed within the preset capture area map, the identification icons of different search-and-capture persons are different, and the identification icon of a search-and-capture person is different from the identification icon of the target object.
96. The unmanned aerial vehicle of claim 95, wherein the identification icon of the search-and-capture person is a different color from the identification icon of the target object.
97. A drone as claimed in any one of claims 84 to 96, wherein the movement trajectory includes an identifying icon of the target object, the identifying icon being for representing the position of the target object within a preset capture area map.
98. A drone as claimed in claim 97, wherein the shade of the colour of the identifying icon is determined from the temperature of the target object sensed by the camera.
99. A drone as claimed in any one of claims 84 to 96, wherein when there are a plurality of target objects, the ground command platform displays the movement trajectories of the different target objects in different display manners.
100. A drone as claimed in claim 99, wherein the movement trajectories of different target objects are different in colour.
101. A drone as claimed in claim 99, wherein the identification icons on the movement trajectories of different target objects are different.
102. A drone as claimed in any of claims 84 to 96, wherein the camera includes an infrared camera.
103. A drone as claimed in any one of claims 84 to 96, wherein the processor is further configured to implement the steps of:
determining whether the target object moves to a preset warning area according to the moving track of the target object; and
when it is determined that the target object moves to the preset warning area, sending an alarm instruction to a control terminal associated with the unmanned aerial vehicle, so that the control terminal outputs alarm information based on the alarm instruction to remind the search-and-capture personnel that the target object moves to the preset warning area.
104. The unmanned aerial vehicle of claim 103, wherein the preset warning area is an area defined in advance within the capture area.
105. The unmanned aerial vehicle of claim 103, wherein the preset warning area includes at least one of a first warning area having a temperature close to the temperature of the target object, a second warning area in which the temperature of the target object cannot be monitored, and a third warning area in which a mobile vehicle is placed.
106. The unmanned aerial vehicle of claim 105, wherein the first warning area includes at least one of a crowded area and a blackbody area having a temperature close to the temperature of the target object, the second warning area includes at least one of a water area, a shelter area, and a highly reflective area, and the mobile vehicle includes at least one of an automobile, a boat, a yacht, a motorcycle, and a helicopter.
107. The unmanned aerial vehicle of claim 103, wherein the determining whether the target object moves to the preset warning area according to the moving track of the target object comprises:
determining the moving direction of the target object according to the moving track of the target object, and acquiring position information of the preset warning area; and
determining whether the target object moves to the preset warning area according to the moving direction and the position information of the preset warning area.
108. The unmanned aerial vehicle of claim 103, wherein before the sending of the alarm instruction to the control terminal associated with the unmanned aerial vehicle, the method further comprises:
when the target object is determined to move to a preset warning area, determining the distance between the target object and the preset warning area;
and when the distance between the target object and the preset warning area is smaller than or equal to a preset threshold value, sending an alarm instruction to a control terminal associated with the unmanned aerial vehicle.
109. A capture assist system, comprising a ground command platform, at least one unmanned aerial vehicle, and at least one control terminal, wherein the ground command platform is connected to the at least one unmanned aerial vehicle through a network, the control terminal is configured to control flight of the unmanned aerial vehicle, and the unmanned aerial vehicle carries a shooting device, wherein:
the ground command platform is used for acquiring capture task data, wherein the capture task data comprises capture task information of a capture sub-region corresponding to each unmanned aerial vehicle participating in a capture task;
the ground command platform is further used for sending the capture task information of the capture sub-area corresponding to each unmanned aerial vehicle to the corresponding control terminal;
the control terminal is used for sending the capture task information to the corresponding unmanned aerial vehicle;
the unmanned aerial vehicle is used for controlling the shooting device to monitor the corresponding capture subarea according to the capture task information;
when the unmanned aerial vehicle monitors a target object in the capture sub-area, the unmanned aerial vehicle carries out flight tracking on the target object;
the unmanned aerial vehicle is further used for acquiring a plurality of position information of the target object in the process of carrying out flight tracking on the target object and sending the position information to the ground command platform;
and the ground command platform is further used for drawing the moving track of the target object according to the plurality of position information, and displaying the moving track for the commander to view.
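The routing recited in claim 109 (the ground command platform sends each unmanned aerial vehicle's capture task information to its corresponding control terminal) reduces to keying task records by vehicle. A minimal sketch with a hypothetical schema, since the claims do not define the data format:

```python
from dataclasses import dataclass

@dataclass
class CaptureTaskInfo:
    """Per-UAV task record (hypothetical schema; the claims only say the
    capture task data holds one entry per participating vehicle)."""
    drone_id: str
    sub_area: str

def dispatch(capture_task_data):
    """Route each vehicle's task info to its control terminal by keying
    the records on the vehicle identifier."""
    return {task.drone_id: task for task in capture_task_data}

tasks = [CaptureTaskInfo("uav-1", "sub-area-A"), CaptureTaskInfo("uav-2", "sub-area-B")]
routed = dispatch(tasks)
assert routed["uav-1"].sub_area == "sub-area-A"
assert set(routed) == {"uav-1", "uav-2"}
```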
110. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to carry out the steps of the capture assist method according to any one of claims 1 to 54.
CN202080005974.2A 2020-05-28 2020-05-28 Catching auxiliary method, ground command platform, unmanned aerial vehicle, system and storage medium Pending CN112969977A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/093037 WO2021237618A1 (en) 2020-05-28 2020-05-28 Capture assistance method, ground command platform, unmanned aerial vehicle, system, and storage medium

Publications (1)

Publication Number Publication Date
CN112969977A true CN112969977A (en) 2021-06-15

Family

ID=76271525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080005974.2A Pending CN112969977A (en) 2020-05-28 2020-05-28 Catching auxiliary method, ground command platform, unmanned aerial vehicle, system and storage medium

Country Status (2)

Country Link
CN (1) CN112969977A (en)
WO (1) WO2021237618A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114355985B (en) * 2022-03-18 2022-07-08 北京卓翼智能科技有限公司 Path planning method and device for unmanned aerial vehicle cluster, controller and storage medium
CN115883969B (en) * 2023-02-16 2023-05-05 北京万和汇通通信科技有限公司 Unmanned aerial vehicle shooting method, unmanned aerial vehicle shooting device, unmanned aerial vehicle shooting equipment and unmanned aerial vehicle shooting medium
CN116449714B (en) * 2023-04-20 2024-01-23 四川大学 Multi-spacecraft pursuit game track control method
CN119148542B (en) * 2024-11-19 2025-02-14 福建迈威信息工程有限公司 Multi-terminal self-adaptive data processing method, system, equipment and storage medium
CN119397928B (en) * 2025-01-03 2025-04-25 广东翼景信息科技有限公司 A method, device and readable storage medium for deploying unmanned aerial vehicle (UAV) intelligent airport

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107293125A (en) * 2017-07-03 2017-10-24 武汉理工大学 A kind of escape vehicle recognition and tracking system based on unmanned plane
CN109787679A (en) * 2019-03-15 2019-05-21 郭欣 Police infrared arrest system and method based on multi-rotor unmanned aerial vehicle
US10642272B1 (en) * 2016-09-28 2020-05-05 Amazon Technologies, Inc. Vehicle navigation with image-aided global positioning system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102419598B (en) * 2011-12-08 2013-11-06 南京航空航天大学 Method for cooperatively detecting moving target by using multiple unmanned aerial vehicles
CN108958297A (en) * 2018-08-03 2018-12-07 南京航空航天大学 A kind of multiple no-manned plane collaboration target following earth station

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10642272B1 (en) * 2016-09-28 2020-05-05 Amazon Technologies, Inc. Vehicle navigation with image-aided global positioning system
CN107293125A (en) * 2017-07-03 2017-10-24 武汉理工大学 A kind of escape vehicle recognition and tracking system based on unmanned plane
CN109787679A (en) * 2019-03-15 2019-05-21 郭欣 Police infrared arrest system and method based on multi-rotor unmanned aerial vehicle

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023060405A1 (en) * 2021-10-11 2023-04-20 深圳市大疆创新科技有限公司 Unmanned aerial vehicle monitoring method and apparatus, and unmanned aerial vehicle and monitoring device
CN118654664A (en) * 2024-06-24 2024-09-17 中国地质大学(北京) Emergency navigation method and system based on remote sensing and unmanned aerial vehicle
CN118654664B (en) * 2024-06-24 2025-03-28 中国地质大学(北京) Emergency navigation method and system based on remote sensing and unmanned aerial vehicle

Also Published As

Publication number Publication date
WO2021237618A1 (en) 2021-12-02

Similar Documents

Publication Publication Date Title
CN112969977A (en) Catching auxiliary method, ground command platform, unmanned aerial vehicle, system and storage medium
US11794890B2 (en) Unmanned aerial vehicle inspection system
US11361665B2 (en) Unmanned aerial vehicle privacy controls
US11814173B2 (en) Systems and methods for unmanned aerial vehicles
JP6609833B2 (en) Method and system for controlling the flight of an unmanned aerial vehicle
CN107531322B (en) Aerial capture platform
US11531340B2 (en) Flying body, living body detection system, living body detection method, program and recording medium
US20150321758A1 (en) UAV deployment and control system
CN104808680A (en) Multi-rotor flight shooting device
CN107089319A (en) Storage tank fire detection system
US12254779B2 (en) Unmanned aerial vehicle privacy controls
CN105511495A (en) Power line UAV intelligent inspection control method and system
CN115220475A (en) System and method for UAV flight control
US20180144644A1 (en) Method and system for managing flight plan for unmanned aerial vehicle
CN104118561B (en) A method for monitoring large-scale endangered wild animals based on drone technology
US20230419843A1 (en) Unmanned aerial vehicle dispatching method, server, base station, system, and readable storage medium
WO2017139282A1 (en) Unmanned aerial vehicle privacy controls
WO2020062178A1 (en) Map-based method for identifying target object, and control terminal
CN110316376A (en) It is a kind of for detecting the unmanned plane of mine fire
CN113268075A (en) Unmanned aerial vehicle control method and system
CN115580708A (en) A method for unmanned aerial vehicle inspection of optical cable lines
WO2022234574A1 (en) Multi-drone beyond visual line of sight (bvlos) operation
WO2022188151A1 (en) Image photographing method, control apparatus, movable platform, and computer storage medium
CN114020000A (en) A UAV control system for concealed fire inspection on the surface of coal mines
JP6730764B1 (en) Flight route display method and information processing apparatus

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210615