WO2018020656A1 - Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program - Google Patents
- Publication number
- WO2018020656A1 (PCT/JP2016/072307)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging device
- image
- unit
- moving body
- uav
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
Definitions
- The present invention relates to a technique for a moving body, and more specifically to image processing performed by a moving body.
- Patent Document 1 discloses a technique for removing reflection of an object in an image.
- Patent Document 1 discloses a technique using a tablet computer with a camera function. The tablet computer determines whether an unnecessary reflection is present in the captured image by analyzing the spectrum of the image. When an unnecessary reflection is present, the tablet computer interpolates the affected area using pixels from nearby areas.
- A moving body includes an airframe, an imaging device, a determination unit that determines whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a processing unit that performs processing for removing the object when the object is determined to exist.
- the moving body determines whether or not the object is present in the image based on the relationship between the posture of the imaging device and the posture of the airframe.
- the moving body performs processing for removing the object from the image captured by the imaging device. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- the determination unit may perform the determination based on an angle formed by the inclination of the imaging device and the inclination of the airframe.
- When this angle exceeds a predetermined angle, the determination unit may determine that an object exists.
- The predetermined angle may be set based on information including the shape of the airframe, the size of the airframe, the positional relationship between the imaging device and the airframe, and the angle of view of the imaging device. According to such a configuration, the posture in which an object may appear in the image can be specified from known information about the airframe.
- the moving body may further include a rotor blade.
- The predetermined angle may be set based on information including the shape of the rotor blades, the size of the rotor blades, the positional relationship between the rotor blades and the airframe, the positional relationship between the imaging device and the airframe, and the angle of view of the imaging device. According to this configuration, the posture in which an object may appear in the image can be specified from known information about the airframe and the rotor blades.
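The geometric derivation of the predetermined angle described in the bullets above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the function name, parameters, and all dimensions are assumptions.

```python
import math

def propeller_visibility_threshold_deg(
    forward_offset_m: float,   # assumed: horizontal distance from camera to rotor tip
    vertical_offset_m: float,  # assumed: rotor tip height above the camera's optical axis
    half_fov_deg: float,       # vertical half angle of view of the camera
) -> float:
    """Relative tilt (degrees) beyond which the rotor tip enters the frame."""
    # Angle from the optical axis up to the rotor tip, as seen from the camera.
    tip_angle = math.degrees(math.atan2(vertical_offset_m, forward_offset_m))
    # The rotor becomes visible once the relative tilt closes the gap between
    # the tip angle and the edge of the field of view.
    return max(0.0, tip_angle - half_fov_deg)
```

With a wider field of view or a rotor mounted closer to the optical axis, the threshold shrinks toward zero, meaning even small tilts could bring the rotor into view.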
- the moving body may further include an acquisition unit that acquires the attitude information of the aircraft and the attitude information of the imaging device.
- the determination unit may perform determination using the attitude information of the aircraft and the attitude information of the imaging device.
- the posture acquisition unit may acquire posture information measured by a gyro sensor.
- A moving body includes an imaging device, a determination unit that determines whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a processing unit that performs processing for removing the object when it is determined that the object exists.
- the moving body determines whether or not the object exists based on the first value indicating the inclination of the aircraft.
- the moving body performs processing for removing the object from the image captured by the imaging device. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- The first value may include at least one of the wind speed, a second value operated on the operation terminal that operates the moving body, and the current value of a drive motor that drives the rotor blades of the moving body.
- the moving body may further include a first acquisition unit that acquires the wind speed.
- the determination unit may determine that the object exists when the acquired wind speed exceeds a predetermined threshold.
- the wind speed may be obtained using an anemometer.
- the mobile object may further include a second acquisition unit that acquires a second value operated on the operation terminal.
- the determination unit may determine that the object exists when the acquired second value exceeds a predetermined threshold.
- the moving body may further include a plurality of rotor blades and a third acquisition unit that acquires current values of the rotor blades.
- The determination unit may determine that the object exists when the current value of the rotor blade on the rear side relative to the traveling direction of the airframe is larger than the current value of the rotor blade on the front side.
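The three "first value" indicators above (wind speed, stick deflection, motor current imbalance) can be combined into a single cheap check. This is a hedged sketch; the thresholds, names, and the idea of OR-ing the indicators are assumptions for illustration.

```python
# Assumed thresholds; real values would come from flight testing.
WIND_SPEED_LIMIT_MPS = 8.0
STICK_TILT_LIMIT = 0.7  # normalized stick deflection, 0..1

def object_expected(wind_speed_mps: float, stick_tilt: float,
                    rear_motor_current_a: float,
                    front_motor_current_a: float) -> bool:
    """The airframe is likely tilted (so a rotor may enter the frame) when
    any indirect indicator of tilt exceeds its threshold."""
    if wind_speed_mps > WIND_SPEED_LIMIT_MPS:
        return True
    if stick_tilt > STICK_TILT_LIMIT:
        return True
    # A rear rotor working harder than a front rotor implies a forward tilt.
    return rear_motor_current_a > front_motor_current_a
```

Each branch mirrors one of the claims: the anemometer threshold, the second-value threshold, and the front/rear current comparison.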
- a gimbal may be further provided between the airframe and the imaging device.
- The gimbal may perform control to keep the posture of the imaging device constant.
- the processing unit may extract a region where an object can be included in an image.
- the processing unit may detect an object from the extracted area.
- The processing unit may detect the object by converting the extracted region into feature amount information and comparing it with prestored feature amount information of the object.
- the information indicating the feature amount may include a histogram or a spatial frequency.
- the processing unit may interpolate the object detected by the processing unit.
- the processing unit may interpolate pixels corresponding to the object detected by the processing unit using pixels in a region near the object.
- The processing unit may interpolate the pixels corresponding to the object detected by the processing unit using temporally adjacent images (for example, preceding or following frames).
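The detect-then-interpolate step described in the bullets above can be sketched with NumPy: compare a candidate region's grey-level histogram against a stored reference histogram of the object, and if it matches, fill the region from a neighbouring clean row. The histogram bin count, distance threshold, and row-copy interpolation are simplifying assumptions, not the patent's method.

```python
import numpy as np

def histogram(region: np.ndarray, bins: int = 16) -> np.ndarray:
    """Normalized grey-level histogram, so regions of different sizes compare."""
    h, _ = np.histogram(region, bins=bins, range=(0, 256))
    return h / max(h.sum(), 1)

def looks_like_object(region: np.ndarray, reference_hist: np.ndarray,
                      max_distance: float = 0.5) -> bool:
    # L1 distance between normalized histograms as a crude similarity measure.
    return float(np.abs(histogram(region) - reference_hist).sum()) < max_distance

def interpolate_rows(image: np.ndarray, top: int, bottom: int) -> np.ndarray:
    """Replace the detected rows with the nearest clean row below the region."""
    out = image.copy()
    out[top:bottom] = image[bottom]
    return out
```

A spatial-frequency signature (e.g. an FFT magnitude profile) could replace the histogram, matching the alternative feature amount the description mentions.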
- the moving body may further include a storage unit.
- the processing unit may output at least one of an image before being processed by the processing unit and an image after being processed by the processing unit to the storage unit.
- the mobile body may further include a communication unit.
- the processing unit may output the image processed by the processing unit to the outside via the communication unit.
- the object may be a part of the aircraft.
- a control method is a control method for a moving body having a body and an imaging device.
- This moving body control method includes a step of determining, based on the relationship between the attitude of the imaging device and the attitude of the airframe, whether an object is present in an image captured by the imaging device, and a step of removing the object when it is determined that the object exists.
- processing for determining whether or not an object exists in the image is performed based on the relationship between the posture of the imaging device and the posture of the aircraft.
- processing for removing the object from the image captured by the imaging device is performed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- a control method is a control method for a moving body having an imaging device.
- This moving body control method includes a step of determining, based on a first value indicating the inclination of the airframe, whether or not an object is present in an image captured by the imaging device, and a step of removing the object when it is determined that the object exists.
- processing for determining whether or not an object exists is performed based on the first value indicating the inclination of the aircraft.
- processing for removing the object from the image captured by the imaging device is performed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- a storage medium is a computer-readable storage medium that stores a control program.
- This control program is a control program for a moving body having an airframe and an imaging device.
- The control program causes the computer to execute a step of determining whether or not an object exists in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object exists.
- processing for determining whether or not an object exists in the image captured by the imaging device is executed.
- processing for removing the object is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- a storage medium is a computer-readable storage medium that stores a control program.
- This control program is a control program that causes a computer to control a moving body having an imaging device.
- The control program causes the computer to execute a step of determining whether or not an object exists in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object exists.
- processing for determining whether or not an object exists is executed based on the first value indicating the inclination of the aircraft.
- processing for removing the object from the image captured by the imaging device is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- a control program is a control program for a moving body having an airframe and an imaging device.
- The control program causes the computer to execute a step of determining whether or not an object exists in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object exists.
- processing for determining whether or not an object exists in the image captured by the imaging device is executed.
- processing for removing the object is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- a control program is a control program for a moving object having an imaging device.
- This control program includes a step of determining whether or not an object exists in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object exists.
- processing for determining whether or not an object exists is executed based on the first value indicating the inclination of the aircraft.
- processing for removing the object from the image captured by the imaging device is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
- the processing load when an object is removed by image processing can be reduced.
- FIG. 1 is a diagram illustrating an example of an image captured using an imaging device mounted on a multicopter that is an example of a moving object.
- The image 11 does not show any part of the multicopter airframe.
- the image 12 includes multi-copter propellers 13 and 14 as objects. Some areas in the image 12 are obscured by the propellers 13 and 14 of the multicopter.
- the moving body performs processing for removing the object by image processing on the image determined that the object exists.
- the moving body does not perform processing for removing the object by image processing on the image determined that the object does not exist. Since processing for analyzing an image and removing an object is not performed on all images, the processing load can be reduced.
- Whether or not the object exists in the image (whether or not the object may be reflected in the image) can be determined based on the relative posture between the moving body and the imaging device.
- the determination process based on the relative posture between the moving body and the imaging device has a lighter processing load than the process of determining whether there is a reflection by analyzing the contents of the image. Therefore, it is possible to reduce the processing load when the object is removed by image processing.
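The gating idea in the bullets above — run the expensive removal step only when a cheap attitude comparison says the object may be in frame — can be sketched as follows. All names and the threshold value are illustrative assumptions.

```python
# Assumed threshold derived from airframe geometry (see the description of
# the "predetermined angle"); the value here is purely illustrative.
PROPELLER_VISIBLE_ANGLE_DEG = 20.0

def object_may_be_visible(camera_pitch_deg: float, body_pitch_deg: float) -> bool:
    """Cheap check: relative tilt between the imaging device and the airframe."""
    return abs(camera_pitch_deg - body_pitch_deg) > PROPELLER_VISIBLE_ANGLE_DEG

def process_frame(image, camera_pitch_deg, body_pitch_deg, remove_object):
    """Run the costly analysis/removal only when the attitude check fires;
    otherwise pass the frame through untouched."""
    if object_may_be_visible(camera_pitch_deg, body_pitch_deg):
        return remove_object(image)
    return image
```

The attitude comparison is a few arithmetic operations per frame, whereas `remove_object` would analyze pixels, which is why the gate reduces overall processing load.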
- the moving body may be a manned aircraft.
- The moving body is a concept that includes other aircraft that move in the air, vehicles that move on the ground, ships that move on the water, robots, and the like.
- FIG. 2 is a diagram showing an example of the appearance of the UAV 101 according to the present embodiment.
- the UAV 101 includes a UAV body 210, a plurality of rotor blades 220, a gimbal 230, an imaging device 240, and a camera 250.
- the flight of the UAV body 210 is controlled by controlling the rotation of the plurality of rotor blades 220.
- the UAV 101 can be configured to have four rotor blades.
- the number of rotor blades is not limited to four.
- the number of rotor blades may be any number.
- The UAV 101 may be a fixed-wing type of UAV that does not have rotary wings.
- the UAV 101 may be a type of UAV having both a rotary wing and a fixed wing.
- the gimbal 230 is provided in the UAV body 210.
- the gimbal 230 supports the imaging device 240 in a rotatable manner.
- the gimbal 230 can control the rotation of the imaging device 240 around the yaw axis, the pitch axis, and the roll axis.
- the imaging device 240 captures a subject around the UAV body 210 and obtains image data.
- the imaging device 240 is controlled to be rotatable by the gimbal 230.
- the imaging device 240 can be configured to include at least a lens and an imaging sensor.
- the plurality of cameras 250 can be sensing cameras for controlling the flight of the UAV 101.
- Two cameras 250 may be provided on the front surface, which is the nose, of the UAV body 210.
- Two cameras 250 may be provided on the bottom surface of the UAV body 210.
- Cameras 250 may be provided on at least one of the nose, the tail, the side surfaces, the bottom surface, and the top surface.
- the camera 250 can be configured to include at least a lens and an imaging sensor.
- FIG. 3 is a diagram showing an example of a block diagram of the configuration of the UAV 101 according to the present embodiment.
- the UAV 101 includes a UAV control unit 310 that controls the entire UAV, a memory 320, and a communication interface 330.
- the UAV control unit 310 can control the rotary blade mechanism 340, the gimbal 230, the imaging device 240, and the camera 250.
- the UAV control unit 310 controls the entire UAV in accordance with a software program stored in the memory (storage unit) 320, for example.
- the UAV control unit 310 controls the entire UAV in accordance with an instruction received from a remote controller terminal (operation terminal) through the communication interface (communication unit) 330.
- the UAV control unit 310 controls the flight of the UAV and performs the imaging control of the imaging device 240.
- the UAV control unit 310 can be configured by, for example, a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
- the memory 320 may store a software program that controls the entire UAV.
- the memory 320 may store various data such as image data captured by the imaging device 240 and the camera 250, various information of the UAV 101, and the like.
- a computer-readable storage medium can be used as the memory.
- For example, SRAM, DRAM, EEPROM, or flash memory such as a USB memory can be used.
- the memory 320 may be provided in the housing of the UAV 101.
- the memory 320 may be removable from the UAV 101.
- the communication interface 330 can receive an instruction from the remote controller terminal by wireless communication, and can transmit various data and information stored in the memory of the UAV 101.
- the communication interface 330 can also receive a signal from a GNSS (Global Navigation Satellite System) positioning system.
- the rotating blade mechanism 340 can include a plurality of rotating blades 220 and a plurality of drive motors that rotate the plurality of rotating blades 220.
- The UAV 101 may have various sensors such as a barometer, a laser sensor, an acceleration sensor, and a gyro sensor.
- the UAV 101 can include other devices and mechanisms.
- FIG. 4 is a diagram illustrating an example of a remote controller terminal 400 that controls the UAV 101.
- the remote controller terminal 400 includes an operation unit 410, a display unit 420, and a support unit 430.
- the operation unit 410 receives an operation instruction input from the user.
- An operation instruction from the user is transmitted from the remote controller terminal 400 to the UAV 101 through a communication interface (not shown).
- the display unit 420 displays various information related to the UAV 101.
- the display unit 420 may display an image captured by the UAV 101 with the imaging device 240.
- the display unit 420 may display an image captured by the UAV 101 with the camera 250.
- the display unit 420 may be removable from the remote controller terminal 400.
- the display unit 420 may be various portable information terminal devices such as a tablet terminal and a smartphone terminal.
- the support unit 430 supports the display unit 420.
- the support part 430 may be adjustable to an arbitrary angle.
- the display unit 420 may be attached to or removed from the support unit 430.
- the display unit 420 may be integrated with the support unit 430.
- FIG. 5 is a diagram schematically showing the UAV 101 shown in FIG.
- FIG. 5 is a diagram for explaining the angle of view of the imaging device 240 provided in the UAV 101.
- a gimbal 230 is provided between the UAV body 210 and the imaging device 240.
- the gimbal 230 can keep the posture of the imaging device 240 constant.
- the gimbal 230 controls the imaging device 240 so that it does not vibrate even if vibration occurs in the UAV body 210.
- the gimbal 230 can keep the angle (posture) of the imaging device 240 at an arbitrary angle (posture). Accordingly, the UAV body 210 and the imaging device 240 can maintain different angles (attitudes).
- FIG. 6 is a diagram showing an example in which the UAV body 210 is tilted forward in the traveling direction from the posture shown in FIG.
- the imaging device 240 maintains the same posture as shown in FIG.
- a part 610 of the rotary blade 220 is included in the angle of view 600 of the imaging device 240. That is, a part 610 of the rotor blade 220 is reflected in an image captured by the imaging device 240.
- Whether the relationship shown in FIG. 6 occurs depends on the attitude of the UAV body 210, the positional relationship between the rotor blades 220 and the UAV body 210, the attitude of the imaging device 240, and the positional relationship between the imaging device 240 and the UAV body 210.
- a part of the UAV body 210 (for example, a frame) may be included in an image captured by the imaging device 240.
- the object in this embodiment is a part of the UAV body 210.
- a part of the UAV body 210 may include a frame of the UAV body 210.
- A part of the UAV body 210 may include the rotor blades 220.
- other mechanisms (not shown) provided in the UAV 101 may be included in a part of the UAV body 210.
- The UAV body 210 tilts forward in the traveling direction as shown in FIG. 6. This inclination is caused by making the rotational force of the rotor blades 220 on the rear side in the traveling direction larger than that of the rotor blades 220 on the front side. This operation gives the UAV 101 a force that moves it in the traveling direction.
- the relative attitude between the UAV body 210 and the imaging device 240 can change dynamically.
- Because the relative attitude between the UAV body 210 and the imaging device 240 can change dynamically, whether or not an object appears in the image also changes dynamically.
- the field of view obtained from the image is substantially narrowed compared to a normal case where the UAV 101 is not moved at a high speed.
- Because the field of view obtained from the image narrows when moving at high speed, the narrowed view may hinder an operator who operates the UAV 101 while viewing the image on the display unit 420 of the remote controller terminal 400.
- the object may exist in the image.
- The UAV body 210 may be greatly inclined.
- Because the gimbal 230 keeps the imaging device 240 in a constant posture, the object may exist in the image. That is, a part of the rotor blades 220, a part of the frame of the UAV body 210, or a part of another mechanism may be reflected in the image.
- an operator who operates the UAV 101 while viewing the image on the display unit 420 of the remote controller terminal 400 confirms the image and quickly performs various operations. At this time, if the object exists in the image, there may be a delay in checking the current status of the UAV 101.
- the UAV 101 performs processing for analyzing an image and determining whether an object exists. If the object exists, the UAV 101 performs processing for removing the object from the image by image processing. However, if these processes are performed on all the images obtained by the imaging device 240, the processing load on the UAV 101 increases.
- processing for determining whether an object exists is performed on an image captured when a predetermined condition is satisfied.
- the object is removed from the image determined that the object exists.
- image processing is not performed on all images obtained by imaging by the imaging device 240. Therefore, the processing load can be reduced as compared with a case where image processing is performed on all images obtained by the imaging device 240.
- FIG. 7 is a block diagram showing an example of the configuration of the UAV control unit 310 according to the present embodiment.
- the UAV control unit 310 includes an attitude acquisition unit 710, a wind speed acquisition unit 720, an operation instruction acquisition unit 730, a current value acquisition unit 740, an image acquisition unit 750, a determination unit 760, an image processing unit 770, and an output unit 780.
- the image processing unit 770 includes a candidate area extraction unit 771 and an interpolation unit 772.
- FIG. 7 is merely an example, and other configurations may be included.
- the attitude acquisition unit 710 acquires the attitude information of the UAV body 210.
- the posture information is information indicating the rotation angle of the yaw axis, pitch axis, and roll axis, for example.
- the UAV 101 may include an inertial measurement device (IMU: Inertial Measurement Unit).
- the IMU may include three gyro sensors and three acceleration sensors with respect to three orthogonal axes.
- the UAV 101 may include a triaxial geomagnetic sensor.
- the attitude acquisition unit 710 may acquire the attitude information of the UAV body 210 using an acceleration sensor, a gyro sensor, a triaxial geomagnetic sensor, and the like.
- the posture information acquired by the posture acquisition unit 710 is sent to the determination unit 760 and the candidate area extraction unit 771.
- the posture acquisition unit 710 may acquire posture information of the imaging device 240. That is, the posture acquisition unit 710 may acquire posture information of the imaging device 240 controlled by the gimbal 230. The posture acquisition unit 710 may acquire the posture information of the imaging device 240 based on the control information of the gimbal 230.
- The imaging device 240 may include an acceleration sensor, a gyro sensor, and a triaxial geomagnetic sensor, and the posture acquisition unit 710 may acquire posture information of the imaging device 240 using information from these sensors.
- the attitude of the imaging device 240 may be changed to an arbitrary angle by control from the remote controller terminal 400.
- the wind speed acquisition unit (first acquisition unit) 720 acquires the wind speed.
- the wind speed acquisition unit 720 sends the acquired wind speed to the determination unit 760.
- the UAV 101 may include an anemometer.
- the wind speed acquisition unit 720 may acquire a value obtained by an anemometer.
- The wind speed acquisition unit 720 may calculate the wind speed based on the acceleration and angular velocity obtained by the IMU. For example, suppose that under the control of the UAV control unit 310 the UAV 101 should move to a position A, but the calculation based on the values obtained by the IMU indicates that it has moved to a position A'. The wind speed acquisition unit 720 may calculate the wind speed based on the relationship between the position A' and the position A.
- the wind speed acquisition unit 720 may acquire information indicating the wind speed of the route from the outside through the communication interface 330.
- the wind speed acquisition unit 720 may acquire the wind speed based on the difference between the ground speed and the air speed of the UAV 101.
- the ground speed may be acquired using GNSS or the like, and the air speed may be acquired using an air speed measuring device.
- the wind speed acquisition unit 720 may acquire the wind speed based on these differences.
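Two of the wind-speed estimates described above reduce to simple differences. The sketch below is illustrative only: the sign conventions, names, and the single-axis simplification are assumptions.

```python
def wind_from_speed_difference(ground_speed_mps: float, air_speed_mps: float) -> float:
    """Head/tail wind component along the direction of travel, from the
    difference between airspeed and ground speed (single-axis sketch)."""
    return air_speed_mps - ground_speed_mps

def wind_from_position_error(commanded_m: float, estimated_m: float,
                             interval_s: float) -> float:
    """Rough wind estimate from how far the IMU-derived position A' drifted
    from the commanded position A over one control interval."""
    return (estimated_m - commanded_m) / interval_s
```

In practice both quantities are vectors, so the real computation would subtract 2D or 3D velocity/position vectors rather than scalars.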
- the operation instruction acquisition unit (second acquisition unit) 730 acquires the operation instruction received by the operation unit 410 of the remote controller terminal 400 via the communication interface 330.
- The operation instruction may be a value (second value) indicating the inclination of the controller stick of the operation unit 410. The larger the inclination of the controller stick, the higher the speed the UAV 101 may be instructed to fly at.
- the UAV control unit 310 may control the flight of the UAV 101 based on the acquired operation instruction.
- the current value acquisition unit (third acquisition unit) 740 acquires a current value that flows through the drive motor of the rotary blade mechanism 340.
- the current value acquisition unit 740 acquires a current value that flows through the drive motor of the rotary blade mechanism 340 corresponding to each of the multiple rotary blades 220.
- the image acquisition unit 750 acquires an image captured using the imaging device 240.
- the image acquisition unit 750 outputs the acquired image to the image processing unit 770.
- the determination unit 760 determines whether an object exists in the image input to the image processing unit 770.
- The determination unit 760 may make a determination based on at least one of the posture information acquired by the posture acquisition unit 710, the wind speed acquired by the wind speed acquisition unit 720, the operation instruction acquired by the operation instruction acquisition unit 730, and the current value acquired by the current value acquisition unit 740.
- the result determined by the determination unit 760 is output to the image processing unit 770. Details will be described later.
- the image processing unit 770 performs various image processes on the input image.
- the image processing unit 770 includes a candidate area extraction unit 771 and an interpolation unit 772.
- the image processing unit 770 may execute another process in addition to the candidate area extraction process by the candidate area extraction unit 771 and the interpolation process by the interpolation unit 772.
- the image processing unit 770 outputs an image subjected to various types of processing to the output unit 780.
- the candidate area extraction unit 771 extracts candidate areas of the input image.
- a candidate area is an area where an object may exist.
- the candidate area extraction unit 771 may extract candidate areas using known information.
- the known information may include, for example, the shape and size of the UAV airframe 210 of the UAV 101.
- the known information may include the shape and size of the rotor blade 220.
- the known information may include a positional relationship between the rotor wing 220 and the UAV airframe 210.
- the known information may include a positional relationship between the imaging device 240 and the UAV body 210.
- the known information may include the angle of view of the imaging device 240 (horizontal angle of view and vertical angle of view with respect to the optical axis).
- the candidate area extraction unit 771 may extract candidate areas in the image by using such known information.
- the candidate area extraction unit 771 may further extract the candidate area using the posture information acquired by the posture acquisition unit 710. Details will be described later.
- the interpolation unit 772 performs an interpolation process on the candidate region extracted by the candidate region extraction unit 771.
- the interpolation unit 772 performs processing for detecting an object from the candidate area.
- the interpolation unit 772 performs a process of removing the detected object.
- as the process for removing the object, a known process called inpainting may be applied.
- the output unit 780 outputs the image output by the image processing unit 770.
- the output unit 780 may output an image to the memory 320.
- the output unit 780 may transmit an image to the remote controller terminal 400 through the communication interface 330.
- the output unit 780 may output the image to the memory 320 and transmit it to the remote controller terminal 400.
- the output unit 780 may transmit an image to another device (for example, a server on the cloud) through the communication interface 330.
- FIG. 8 is a flowchart showing an example of processing according to the present embodiment.
- the process shown in the figure is executed by the UAV control unit 310. This process is executed at the timing when the image acquisition unit 750 acquires an image. In the case of a moving image, it may be executed at the acquisition timing of each frame (still image) constituting the moving image.
- the process shown in FIG. 8 is an example of a control method executed by the UAV control unit 310.
- the determination unit 760 can determine whether or not the first parameter indicated by the relationship between the attitude of the imaging device 240 and the attitude of the UAV body 210 exceeds a predetermined threshold.
- in step S810, the determination unit 760 may determine whether the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240, as indicated by the posture information acquired by the posture acquisition unit 710, is greater than a predetermined angle.
- if that angle is larger than the predetermined angle, an object may exist in the image, so the UAV control unit 310 advances the process to step S820. It can be said that this angle indicates the relative relationship between the posture of the imaging device 240 and the posture of the UAV body 210.
- the determination unit 760 may instead determine whether the difference between the inclination of the optical axis of the imaging device 240 with respect to a reference line and the inclination of the UAV body 210 with respect to the same reference line is greater than a predetermined angle. For example, if the reference line is the horizon, it may be determined whether the difference between the elevation angle of the imaging device 240 and the elevation angle of the UAV body 210 exceeds the predetermined angle. For these tilts, information indicating the pitch-axis rotation angle obtained from the gyro sensors of the UAV body 210 and the imaging device 240 may be used.
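A minimal sketch of this tilt-difference check follows; restricting the comparison to the pitch axis, the function names, and the threshold value are all illustrative assumptions, not the disclosed implementation:

```python
def object_may_exist(camera_pitch_deg, body_pitch_deg, threshold_deg):
    """Return True when the angle between the imaging device's optical
    axis and the airframe, taken from pitch-axis gyro readings relative
    to a common reference line (the horizon), exceeds the threshold.
    Using pitch alone is an illustrative simplification.
    """
    angle = abs(camera_pitch_deg - body_pitch_deg)
    return angle > threshold_deg

# The gimbal holds the camera level (0 deg) while the airframe pitches
# forward by 25 deg; with a 20 deg threshold, step S820 would follow.
print(object_may_exist(0.0, 25.0, 20.0))  # True
print(object_may_exist(0.0, 10.0, 20.0))  # False
```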
- FIG. 9 is a diagram showing the traveling direction 901 of the UAV body 210 and the optical axis direction 902 of the imaging device 240.
- the known information may include the shape and size of the UAV airframe 210.
- the known information may include the shape and size of the rotor blade 220a.
- the known information may include a positional relationship between the rotor 220a and the UAV body 210.
- in FIG. 9, an example is shown in which information on the rotary blade 220a on the traveling direction side is included as known information, but information on the rotary blade 220b on the side opposite to the traveling direction may also be included.
- the known information may include a positional relationship between the imaging device 240 and the UAV body 210.
- the known information may include the angle of view of the imaging device 240 (horizontal angle of view and vertical angle of view with respect to the optical axis).
- the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240 when an object exists in the image can be specified.
- when the angle θ formed by the orientation 951 of the UAV body 210 with respect to the optical axis direction 952 of the imaging device 240 is larger than a predetermined angle, it may be determined that an object exists in the image.
- the predetermined angle used as the threshold value may be set in advance from the known information described above.
- depending on the angle of view and mounting position of the imaging device 240, it may be determined that an object exists in the image even if the inclination of the UAV body 210 relative to the optical axis direction of the imaging device 240 is smaller than in the example shown in FIG. 9.
- conversely, it may be determined that no object exists in the image even if the inclination of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in the example shown in FIG. 9.
- for example, when the imaging device 240 is attached to the lower side of the UAV body 210 in the vertical direction, it may be determined that no object exists in the image even if the inclination of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in the example shown in FIG. 9.
- in this way, the threshold corresponding to the posture information is set based on the known information. If the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240 exceeds the threshold (the predetermined angle), an object may exist in the image, so the UAV control unit 310 advances the process to step S820.
- the determination unit 760 may determine whether or not the second parameter (first value) indicating the inclination of the aircraft exceeds a predetermined threshold value.
- the parameter may be at least one of the wind speed acquired by the wind speed acquisition unit 720, the operation instruction acquired by the operation instruction acquisition unit 730, and the current value acquired by the current value acquisition unit 740.
- a corresponding threshold value may be set in advance for each parameter.
- the determination unit 760 may determine whether or not the wind speed acquired by the wind speed acquisition unit 720 exceeds a predetermined wind speed.
- the determination unit 760 may determine whether or not the value indicated by the operation instruction acquired by the operation instruction acquisition unit 730 exceeds a predetermined input value.
- the determination unit 760 may determine whether or not the current value acquired by the current value acquisition unit 740 exceeds a predetermined current value.
- when any of these values exceeds its corresponding threshold, the determination unit 760 may advance the process to step S820.
- the determination unit 760 determines whether or not the wind speed acquired by the wind speed acquisition unit 720 exceeds a predetermined wind speed. For example, it is assumed that 15 m / s is set as a predetermined wind speed threshold. The determination unit 760 determines that a value exceeding a predetermined threshold has been detected when the wind speed acquired by the wind speed acquisition unit 720 exceeds 15 m / s. It is highly possible that the UAV body 210 is tilted when the wind speed exceeding the predetermined threshold is measured. Therefore, since there may be an object in the image, the UAV control unit 310 advances the process to step S820.
- the predetermined input value may be a value indicated by an operation instruction from the remote controller terminal 400.
- for example, when the operation unit 410 of the remote controller terminal 400 is tilted beyond a predetermined position, the determination unit 760 determines that the input value exceeds the predetermined threshold.
- the UAV control unit 310 advances the process to step S820.
- the determination unit 760 determines whether or not the current value acquired by the current value acquisition unit 740 exceeds a predetermined current value.
- the predetermined current value may be a current value flowing through the motor of the rotary blade mechanism 340.
- for example, when the value of the current flowing through the two motors on the side opposite to the traveling direction of the UAV 101 exceeds a predetermined threshold, it is highly likely that the UAV 101 is flying at high speed.
- the predetermined current value may be the current value of the motor on the traveling direction side, the current value of the motor on the opposite side, or a combination of both. For example, when the current flowing through the motor on the side opposite to the traveling direction of the UAV 101 is relatively larger than the current flowing through the motor on the traveling direction side, it may be determined that the threshold is exceeded. In either case, the UAV 101 is likely flying at high speed, and when it flies at high speed the UAV body 210 is likely tilted. Therefore, since an object may exist in the image, the UAV control unit 310 advances the process to step S820.
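The several step-S810-style checks above (wind speed, stick input, motor currents) could be combined roughly as follows; every threshold value, the normalized stick representation, and the 1.5× front/rear ratio are illustrative assumptions:

```python
def tilt_likely(wind_speed, stick_value, current_rear, current_front,
                wind_threshold=15.0, stick_threshold=0.8,
                current_threshold=5.0):
    """Any one parameter exceeding its threshold suggests the airframe
    is (or will be) tilted, so an object may appear in the image.
    All thresholds here are illustrative, not from the disclosure.
    """
    if wind_speed > wind_threshold:        # e.g. 15 m/s
        return True
    if stick_value > stick_threshold:      # normalized stick tilt
        return True
    if current_rear > current_threshold:   # rear-motor current (A)
        return True
    if current_rear > current_front * 1.5: # rear markedly above front
        return True
    return False

print(tilt_likely(16.0, 0.2, 1.0, 1.0))  # True (wind over threshold)
print(tilt_likely(3.0, 0.2, 1.0, 1.0))   # False (all below thresholds)
```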
- in step S810, it is not necessary to search the image to confirm whether an object actually exists in it.
- instead, a process is performed to determine whether there is a possibility that an object exists in the image. Only when that possibility exists does the process proceed to a later step in which the image processing unit 770 searches the image and detects the object.
- it would be a heavy load for the image processing unit 770 to perform object detection and removal processing on every image acquired by the image acquisition unit 750.
- therefore, the determination process in step S810 is performed first. With this determination, whether an object is likely to exist can be judged without performing image processing. Step S810 can be said to be a process for determining whether the UAV body 210 is tilted or is likely to tilt.
- the determination unit 760 preferably performs the determination process using the posture information, the wind speed, the operation instruction, and the current value when the image acquisition unit 750 acquires the image, but is not limited thereto.
- within a predetermined time window, the determination may be made using common posture information, wind speed, operation instruction, and current values. For example, for a movie of 30 fps (frames per second) and a predetermined time of 1 second, the posture information, wind speed, operation instruction, and current value sampled at the first of the 30 frames may be applied to the remaining 29 frames.
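The frame-to-sample mapping just described can be sketched as below; the function name and defaults are illustrative:

```python
def sample_index_for_frame(frame_index, fps=30, window_seconds=1.0):
    """For a movie, sensor values (posture, wind speed, operation
    instruction, current) sampled at the first frame of each window may
    be reused for the remaining frames of that window. Returns the
    frame index whose sample applies.
    """
    frames_per_window = int(fps * window_seconds)
    return (frame_index // frames_per_window) * frames_per_window

# At 30 fps with a 1 s window, frames 0..29 all reuse frame 0's sample.
print(sample_index_for_frame(0))   # 0
print(sample_index_for_frame(29))  # 0
print(sample_index_for_frame(30))  # 30
```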
- when the determination unit 760 detects a parameter that exceeds its threshold in step S810, the process proceeds to step S820. If no parameter exceeding a threshold is detected in step S810, the object is considered not to exist in the image, so the image processing described below is not performed and the processing load can be reduced.
- the predetermined threshold value used in the determination in step S810 may be set to an arbitrary value according to an instruction from the remote controller terminal 400.
- the predetermined threshold value may be changed by an instruction from the remote controller terminal 400 during the flight.
- the candidate area extraction unit 771 extracts candidate areas.
- the structure of the UAV 101 is known. Using these pieces of known information, it is possible to specify in which region in the image captured by the imaging device 240 there is a possibility that the object will appear. In other words, the candidate area extraction unit 771 can extract candidate areas using known information.
- the candidate area extraction unit 771 may extract a candidate area using the posture information acquired by the posture acquisition unit 710 in addition to the known information.
- candidate regions may be extracted from the tilt of the UAV body 210 and the tilt of the imaging device 240. In the situation of FIG. 9 described above, using the tilt (posture) of the UAV body 210, the tilt (posture) of the imaging device 240, and the known information, a candidate area is extracted in the upper part of the image captured by the imaging device 240.
- FIG. 10 is a diagram illustrating an image 1000 captured by the imaging device 240.
- the image 1000 includes candidate areas 1010 and 1020.
- Candidate areas 1010 and 1020 are areas where objects can exist.
- note that the candidate areas 1010 and 1020 are merely areas where an object can exist; an object does not necessarily appear in them.
- the rotary blade 220 rotates around its rotation axis, so even when the UAV fuselage 210 is tilted enough that part of the rotor blade 220 can appear in the image, depending on the imaging timing either an image that includes the rotor blade 220 or an image that does not include it may be captured.
- Candidate regions 1010 and 1020 may be regions including a circular range that can be taken in the image around the rotation axis by the rotary blade 220.
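The geometric reasoning behind candidate-area extraction can be illustrated with a deliberately simplified sketch; the pinhole-style row mapping, the 60-degree field of view, the 720-pixel image height, and all names are assumptions rather than the embodiment's actual projection:

```python
def candidate_rows(camera_pitch_deg, rotor_elevation_deg,
                   vertical_fov_deg=60.0, image_height=720):
    """Given the elevation angle at which the rotor tip sits relative
    to the horizon (computable from the known airframe geometry and the
    current postures), return the range of image rows the rotor could
    occupy, or None when it falls outside the vertical field of view.
    """
    half_fov = vertical_fov_deg / 2.0
    # Angle of the rotor above the camera's optical axis.
    angle = rotor_elevation_deg - camera_pitch_deg
    if angle > half_fov or angle < -half_fov:
        return None  # the rotor cannot appear in the frame
    # Map [-half_fov, +half_fov] onto rows [image_height-1, 0]:
    # objects above the axis land in the upper part of the image.
    frac = (half_fov - angle) / vertical_fov_deg
    row = int(frac * (image_height - 1))
    return (0, row)  # candidate region: top of the image down to 'row'

print(candidate_rows(0.0, 40.0))  # None: 40 deg is outside +/-30 deg
print(candidate_rows(0.0, 20.0))  # a region near the top of the image
```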
- in step S830, the candidate area extraction unit 771 determines whether a candidate area has been extracted. If no candidate area is extracted, the subsequent processing is unnecessary and the process ends; image processing such as object detection and removal need not be performed. When a candidate area is extracted, the process proceeds to step S840.
- in step S840, the interpolation unit 772 analyzes the candidate area extracted in step S820 to detect an object, and determines whether an object exists in the candidate area.
- FIG. 11 is a diagram illustrating an example in which the interpolation unit 772 detects the objects 1110 and 1120. Examples of the object include a part of the UAV body 210 and a part of the rotor wing 220 as described above. Appearance textures such as the shape and color of the UAV fuselage 210 and rotor 220 are known. Therefore, the interpolation unit 772 may determine whether or not these known objects are detected from the candidate area. The interpolation unit 772 may compare the object information stored in the memory 320 in advance with the object information of the image captured by the imaging device 240.
- feature amount information such as an object color histogram and spatial frequency is stored in the memory 320 in advance.
- the interpolation unit 772 converts the candidate area of the image captured by the imaging device 240 into information on feature amounts such as a color histogram and spatial frequency.
- the interpolation unit 772 may detect the objects 1110 and 1120 from the candidate areas 1010 and 1020 by comparing the feature amount information obtained by the conversion with the feature amount information stored in advance.
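In outline, the feature-amount comparison described above might look like the following; the coarse gray-level quantization, the L1 distance, and the tolerance value are illustrative assumptions standing in for the stored color-histogram features:

```python
def color_histogram(pixels, bins=4):
    """Quantize 8-bit gray values into a coarse normalized histogram
    (an illustrative stand-in for the features stored in memory 320)."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = float(len(pixels))
    return [h / total for h in hist]

def histograms_match(h1, h2, tolerance=0.2):
    """A small L1 distance between normalized histograms suggests the
    candidate area shows the known object. Threshold is an assumption."""
    return sum(abs(a - b) for a, b in zip(h1, h2)) < tolerance

dark_rotor = color_histogram([10, 20, 15, 30])      # known rotor: dark
candidate  = color_histogram([12, 25, 18, 22])      # candidate pixels
sky        = color_histogram([240, 250, 245, 235])  # bright sky
print(histograms_match(dark_rotor, candidate))  # True
print(histograms_match(dark_rotor, sky))        # False
```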
- if an object is detected in the candidate area, the process proceeds to step S850; if no object is detected, the process ends.
- the interpolation unit 772 interpolates the detected object.
- the interpolation unit 772 treats the detected pixels of the objects 1110 and 1120 as a missing area.
- the interpolation unit 772 performs an interpolation process for interpolating the missing area.
- as an interpolation process, there is a process called inpainting, in which pixels in a missing area are interpolated using the pixels around that area within a single image.
- FIG. 12 is a diagram for explaining an interpolation process for interpolating a missing area.
- the interpolation unit 772 interpolates the pixel values of the objects 1110 and 1120 using the pixels in the areas 1210 and 1220 around (near) the objects 1110 and 1120 detected in the image 1000.
- the surrounding areas 1210 and 1220 may be defined as a rectangle that is at least n pixels away from the defect area.
- inpainting for example, a known method described in Non-Patent Document 1 and Non-Patent Document 2 may be applied.
- the objects 1110 and 1120 are removed from the image 1000.
- the objects 1110 and 1120 are converted into a region with less discomfort by an interpolation process using pixels in a region near the objects.
- the above-described neighborhood may be determined from the size and position of the object to be interpolated.
- when the size of the object to be interpolated is L, the interpolation range can be determined by L × α + β.
- α may be a parameter determined from the acquired whole image and from the size and position of the object to be interpolated.
- β, expressed in pixels, may be a parameter used to supplement the range when L × α is small.
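A 1-D toy version of this neighborhood interpolation is sketched below; the α and β values, the mean-fill strategy, and the names are illustrative assumptions (real inpainting, such as the cited known methods, is far more sophisticated):

```python
def interpolation_range(object_size, alpha=0.5, beta=2):
    """Interpolation range L * alpha + beta; alpha/beta are illustrative."""
    return int(object_size * alpha + beta)

def fill_from_neighbors(row, mask):
    """Pixels flagged in `mask` (the detected object) are replaced by
    the mean of the unmasked pixels within the surrounding
    interpolation range of a single image row.
    """
    n = interpolation_range(sum(mask))
    out = list(row)
    for i, masked in enumerate(mask):
        if masked:
            lo, hi = max(0, i - n), min(len(row), i + n + 1)
            neighbors = [row[j] for j in range(lo, hi) if not mask[j]]
            if neighbors:
                out[i] = sum(neighbors) / len(neighbors)
    return out

# Two object pixels (value 0) surrounded by a flat background of 100
# are filled with the background value.
print(fill_from_neighbors([100, 100, 0, 0, 100, 100],
                          [False, False, True, True, False, False]))
```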
- the process of interpolating the missing area only needs to remove the objects 1110 and 1120 from the image 1000 to the extent that the result does not look unnatural.
- the outline of the objects 1110 and 1120 may remain slightly after the interpolation processing is executed.
- the uncomfortable feeling is reduced in the image after the interpolation process compared to before the interpolation process. Therefore, the objects 1110 and 1120 may not be completely removed from the image 1000. Some of the objects 1110 and 1120 may remain in the image after the interpolation processing.
- the interpolation unit 772 has described an example in which an object included in one image is interpolated using pixels around the object in the one image.
- the interpolating unit 772 may interpolate the object using frames that are temporally before, after, or both before and after the current frame.
- when the object is the rotary blade 220, a part of the rotary blade 220 may or may not exist in the image depending on the frame.
- among the temporally preceding and succeeding frames, there may be a frame in which no object exists, and interpolation processing may be performed using that frame.
- the interpolation unit 772 may combine the process of interpolating an object using surrounding pixels within a single image with the process of interpolating the object using temporally preceding and/or succeeding frames.
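The temporal variant can be sketched as follows; averaging the previous and next frames, and all names, are illustrative assumptions (a fallback to spatial inpainting when both neighboring frames are also occluded is omitted for brevity):

```python
def temporal_fill(frame, mask, prev_frame, next_frame):
    """An object pixel is replaced by the average of the temporally
    previous and next frames, where the rotating blade may be absent
    at those instants. Frames are flat pixel lists here.
    """
    out = list(frame)
    for i, masked in enumerate(mask):
        if masked:
            out[i] = (prev_frame[i] + next_frame[i]) / 2.0
    return out

# Pixel 1 is occluded by the blade in this frame but clear before/after.
print(temporal_fill([50, 0, 50], [False, True, False],
                    [50, 48, 50], [50, 52, 50]))  # [50, 50.0, 50]
```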
- the UAV mainly including the rotor blade has been described as an example.
- the present invention is not limited to this.
- the processing described in the above-described embodiment can be applied to any moving body that can have a mode in which the attitude of the machine body and the attitude of the imaging device are different.
- in the above description, the imaging device is attached to the lower part of the airframe, but the attachment location may be on the side or the top of the aircraft; it may be attached at any position.
- the example in which the image processing unit 770 switches whether to perform the processing of the candidate region extraction unit 771 and the interpolation unit 772 according to the determination result of the determination unit 760 has been described, but the configuration is not limited to this; other configurations may be used.
- the determination result of the determination unit 760 may be sent to the image acquisition unit 750 instead of the image processing unit 770.
- the image acquisition unit 750 may switch between outputting an image to the image processing unit 770 or outputting an image to the output unit 780 without going through the image processing unit 770 based on the determination result of the determination unit 760.
- the image processing unit 770 may output both the image before the interpolation by the interpolation unit 772 (referred to as an image before interpolation) and the image after the interpolation (referred to as an image after interpolation) to the output unit 780.
- the output unit 780 may output both the pre-interpolation image and the post-interpolation image to the memory 320. With this configuration, the user can compare both the original image and the image from which the object has been removed.
- the example in which the optical axis direction of the imaging device 240 is fixed in a fixed direction and is maintained in a fixed posture by the gimbal 230 has been described, but is not limited thereto.
- the optical axis direction of the imaging device 240 may be changed by an instruction from the remote controller terminal 400.
- the gimbal 230 continues to control the posture of the imaging device 240 so as to maintain the changed posture in the optical axis direction.
- the determination unit 760 may perform determination using the posture information of the imaging device 240 after the change.
- the mode of determining whether there is a possibility that an object exists in the image by using the inclination of the UAV body 210 has been described.
- the fact that the UAV body 210 is tilted means that the rotor wing 220 is also tilted in accordance with the tilt of the UAV body 210.
- if a gyro sensor is provided in the rotary blade 220, whether an object exists in the image may be determined using the attitude of the rotary blade 220.
- in the above description, when the value of a predetermined parameter exceeds the corresponding threshold in step S810, the process proceeds to step S820 to extract the candidate area.
- however, the present invention is not limited to this.
- the process of step S810 may be omitted and the process may be started from step S820. For example, whether a candidate area can be extracted from an image may be determined using the known information, the posture of the UAV body 210, and the posture of the imaging device 240.
- alternatively, steps S820 and S830 may be omitted. If the value of a predetermined parameter exceeds the corresponding threshold in step S810, it is considered possible that the object exists somewhere in the image.
- when steps S820 and S830 are performed, the area to be removed can be narrowed down, so the processing load of the removal process can be reduced.
- even without steps S820 and S830, however, the processing load is still reduced, because image processing need not be performed on every image.
- Image processing may be performed in the remote controller terminal 400.
- the remote controller terminal 400 may include an image processing unit. The data acquired by the posture acquisition unit 710, the wind speed acquisition unit 720, the operation instruction acquisition unit 730, the current value acquisition unit 740, and the image acquisition unit 750 in FIG. 7 may be transmitted to the remote controller terminal 400 via the output unit 780.
- the remote controller terminal 400 may perform the process shown in FIG. 8 using the transmitted data.
- Each unit for realizing the functions of the above-described embodiments can be implemented by, for example, hardware or software.
- program code for controlling hardware may be executed by various processors of a computer such as a CPU and MPU.
- Hardware such as a circuit for realizing the function of the program code may be provided.
- a part of the program code may be realized by hardware, and the remaining part may be executed by various processors.
Description
101 UAV
210 UAV airframe
220 Rotary blade
230 Gimbal
240 Imaging device
250 Camera
310 UAV control unit
320 Memory
330 Communication interface
340 Rotary blade mechanism
400 Remote controller terminal
410 Operation unit
420 Display unit
430 Support unit
710 Posture acquisition unit
720 Wind speed acquisition unit
730 Operation instruction acquisition unit
740 Current value acquisition unit
750 Image acquisition unit
760 Determination unit
770 Image processing unit
780 Output unit
Claims (31)
- 機体と、
撮像装置と、
前記撮像装置の姿勢と前記機体の姿勢との関係に基づいて、前記撮像装置が撮像した画像にオブジェクトが存在するか否かを判定する判定部と、
前記オブジェクトが存在すると判定された場合、前記オブジェクトを除去する処理を行う処理部と
を有する、移動体。 The aircraft,
An imaging device;
A determination unit that determines whether or not an object exists in an image captured by the imaging device based on a relationship between a posture of the imaging device and a posture of the body;
And a processing unit that performs a process of removing the object when it is determined that the object exists. - 前記判定部は、前記撮像装置の傾きと前記機体の傾きとが成す角度に基づいて前記判定を行う、請求項1に記載の移動体。 The moving body according to claim 1, wherein the determination unit performs the determination based on an angle formed by an inclination of the imaging device and an inclination of the airframe.
- 前記角度が所定の角度より大きい場合、前記判定部は、前記オブジェクトが存在すると判定する、請求項2に記載の移動体。 The moving body according to claim 2, wherein when the angle is larger than a predetermined angle, the determination unit determines that the object exists.
- 前記所定の角度は、前記機体の形状、前記機体の大きさ、前記撮像装置と前記機体との位置の関係、および前記撮像装置の画角を含む情報に基づいて設定される、請求項3に記載の移動体。 The predetermined angle is set based on information including a shape of the airframe, a size of the airframe, a positional relationship between the imaging device and the airframe, and an angle of view of the imaging device. The moving body described.
- 回転翼をさらに有し、
前記所定の角度は、前記回転翼の形状、前記回転翼の大きさ、前記回転翼と前記機体との位置の関係、前記撮像装置と前記機体との位置の関係、および前記撮像装置の画角を含む情報に基づいて設定される、請求項3に記載の移動体。 A rotating blade,
The predetermined angle includes the shape of the rotor blade, the size of the rotor blade, the relationship between the position of the rotor blade and the body, the relationship between the position of the imaging device and the body, and the angle of view of the imaging device. The mobile body according to claim 3, which is set based on information including - 前記機体の姿勢情報および前記撮像装置の姿勢情報を取得する取得部をさらに有し、
前記判定部は、前記機体の姿勢情報および前記撮像装置の姿勢情報を用いて前記判定を行う、請求項1から5のいずれか一項に記載の移動体。 It further includes an acquisition unit that acquires posture information of the aircraft and posture information of the imaging device,
The mobile unit according to claim 1, wherein the determination unit performs the determination using posture information of the body and posture information of the imaging device. - 前記取得部は、ジャイロセンサによって測定された姿勢情報を取得する、請求項6に記載の移動体。 The moving body according to claim 6, wherein the acquisition unit acquires posture information measured by a gyro sensor.
- 撮像装置と、
機体の傾きを示す第一の値に基づいて、前記撮像装置が撮像した画像にオブジェクトが存在するか否かを判定する判定部と、
前記オブジェクトが存在すると判定された場合、前記オブジェクトを除去する処理を行う処理部と
を有する、移動体。 An imaging device;
A determination unit that determines whether or not an object exists in an image captured by the imaging device, based on a first value indicating a tilt of the airframe;
And a processing unit that performs a process of removing the object when it is determined that the object exists. - 前記第一の値は、風速、前記移動体を操作する操作端末において操作された第二の値、および前記移動体の回転翼を駆動する駆動モータの電流値の少なくとも1つを含む、請求項8に記載の移動体。 The first value includes at least one of a wind speed, a second value operated at an operation terminal that operates the moving body, and a current value of a drive motor that drives a rotor blade of the moving body. The moving body according to 8.
- 前記風速を取得する第一取得部をさらに有し、
前記判定部は、取得された前記風速が所定の閾値を超える場合、前記オブジェクトが存在すると判定する、請求項9に記載の移動体。 A first acquisition unit that acquires the wind speed;
The moving body according to claim 9, wherein the determining unit determines that the object exists when the acquired wind speed exceeds a predetermined threshold. - 前記風速は、風速計を用いて取得される、請求項10に記載の移動体。 The moving body according to claim 10, wherein the wind speed is obtained using an anemometer.
- 前記操作端末において操作された第二の値を取得する第二取得部をさらに有し、
前記判定部は、取得された前記第二の値が所定の閾値を超える場合、前記オブジェクトが存在すると判定する、請求項9から11のいずれか一項に記載の移動体。 A second acquisition unit for acquiring a second value operated on the operation terminal;
The moving body according to claim 9, wherein the determination unit determines that the object exists when the acquired second value exceeds a predetermined threshold. - 複数の回転翼と、
前記複数の回転翼のそれぞれの電流値を取得する第三取得部と
をさらに有し、
前記判定部は、前記機体の進行方向とは反対側の前記回転翼の電流値が、前記機体の進行方向の側の前記回転翼の電流値より大きい場合、前記オブジェクトが存在すると判定する、請求項9から12のいずれか一項に記載の移動体。 A plurality of rotor blades,
A third acquisition unit for acquiring the current value of each of the plurality of rotor blades;
The determination unit determines that the object exists when a current value of the rotor blade on a side opposite to a traveling direction of the airframe is larger than a current value of the rotor blade on a traveling direction side of the airframe. Item 13. The moving body according to any one of Items 9 to 12. - 前記機体と前記撮像装置との間に更にジンバルを有する、請求項1から13のいずれか一項に記載の移動体。 The moving body according to any one of claims 1 to 13, further comprising a gimbal between the airframe and the imaging device.
- 前記ジンバルは、前記撮像装置の姿勢を一定に保つ制御をする、請求項14に記載の移動体。 15. The moving body according to claim 14, wherein the gimbal controls to keep the posture of the imaging device constant.
- 前記処理部は、前記画像において前記オブジェクトが含まれ得る領域を抽出する、請求項1から15のいずれか一項に記載の移動体。 The mobile unit according to any one of claims 1 to 15, wherein the processing unit extracts a region in which the object can be included in the image.
- 前記処理部は、前記領域から前記オブジェクトを検出する、請求項16に記載の移動体。 The moving body according to claim 16, wherein the processing unit detects the object from the area.
- 前記処理部は、前記領域を特徴量の情報に変換し、予め記憶されている前記オブジェクトの特徴量の情報と比較することで前記オブジェクトを検出する、請求項17に記載の移動体。 The mobile unit according to claim 17, wherein the processing unit detects the object by converting the area into feature amount information and comparing the region with feature value information of the object stored in advance.
- 前記特徴量の情報は、ヒストグラム又は空間周波数を含む、請求項18に記載の移動体。 The mobile object according to claim 18, wherein the feature amount information includes a histogram or a spatial frequency.
- 前記処理部は、前記処理部で検出された前記オブジェクトを補間する、請求項17から19のいずれか一項に記載の移動体。 The moving body according to any one of claims 17 to 19, wherein the processing unit interpolates the object detected by the processing unit.
- 前記処理部は、前記処理部で検出された前記オブジェクトに対応する画素を、前記オブジェクトの近傍の領域の画素を用いて補間する、請求項20に記載の移動体。 21. The moving object according to claim 20, wherein the processing unit interpolates pixels corresponding to the object detected by the processing unit using pixels in a region near the object.
- The moving body according to claim 20 or 21, wherein the processing unit interpolates pixels corresponding to the object detected by the processing unit using images temporally preceding and succeeding the image.
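Claims 21 and 22 give two interpolation strategies for the detected object's pixels: filling from nearby pixels in the same frame, and filling from temporally preceding and succeeding frames. Here is a toy one-dimensional sketch of both, under assumed names; a real implementation would operate on two-dimensional image data.

```python
# Minimal sketch (not the patent's implementation) of the two interpolation
# strategies: spatial (nearest unmasked neighbors in the same row) and
# temporal (average of the preceding and succeeding frames).

def interpolate_spatial(row, mask):
    """Fill masked pixels in a 1-D pixel row from their nearest unmasked neighbors."""
    out = list(row)
    for i, masked in enumerate(mask):
        if masked:
            left = next((row[j] for j in range(i - 1, -1, -1) if not mask[j]), None)
            right = next((row[j] for j in range(i + 1, len(row)) if not mask[j]), None)
            candidates = [v for v in (left, right) if v is not None]
            if candidates:
                out[i] = sum(candidates) // len(candidates)
    return out

def interpolate_temporal(prev_row, next_row, row, mask):
    """Fill masked pixels from the temporally preceding and succeeding frames."""
    return [(p + n) // 2 if m else v
            for v, m, p, n in zip(row, mask, prev_row, next_row)]

row = [10, 10, 99, 99, 10]                    # the 99s are propeller pixels
mask = [False, False, True, True, False]      # True where the object was detected
print(interpolate_spatial(row, mask))                    # [10, 10, 10, 10, 10]
print(interpolate_temporal([12] * 5, [14] * 5, row, mask))  # [10, 10, 13, 13, 10]
```

The temporal variant works best when the propeller occupies different pixels in adjacent frames, so the occluded background is visible before and after.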
- The moving body according to any one of claims 1 to 21, further comprising a storage unit, wherein the processing unit outputs at least one of an image before processing by the processing unit and an image after processing by the processing unit to the storage unit.
- The moving body according to any one of claims 1 to 23, further comprising a communication unit, wherein the processing unit outputs an image processed by the processing unit to the outside via the communication unit.
- The moving body according to any one of claims 1 to 24, wherein the object is a part of the airframe.
- A method of controlling a moving body having an airframe and an imaging device, the method comprising:
determining, based on a relationship between a posture of the imaging device and a posture of the airframe, whether an object is present in an image captured by the imaging device; and
removing the object when it is determined that the object is present.
- A method of controlling a moving body having an imaging device, the method comprising:
determining, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
removing the object when it is determined that the object is present.
- A computer-readable storage medium storing a control program for a moving body having an airframe and an imaging device, the control program causing a computer to execute:
determining, based on a relationship between a posture of the imaging device and a posture of the airframe, whether an object is present in an image captured by the imaging device; and
removing the object when it is determined that the object is present.
- A computer-readable storage medium storing a control program for a moving body having an imaging device, the control program causing a computer to execute:
determining, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
removing the object when it is determined that the object is present.
- A control program for a moving body having an airframe and an imaging device, the program causing a computer to execute:
determining, based on a relationship between a posture of the imaging device and a posture of the airframe, whether an object is present in an image captured by the imaging device; and
removing the object when it is determined that the object is present.
- A control program for a moving body having an imaging device, the program causing a computer to execute:
determining, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
removing the object when it is determined that the object is present.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/072307 WO2018020656A1 (en) | 2016-07-29 | 2016-07-29 | Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program |
JP2017519596A JP6436601B2 (en) | 2016-07-29 | 2016-07-29 | MOBILE BODY, MOBILE BODY CONTROL METHOD, STORAGE MEDIUM CONTAINING CONTROL PROGRAM, AND CONTROL PROGRAM |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/072307 WO2018020656A1 (en) | 2016-07-29 | 2016-07-29 | Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018020656A1 true WO2018020656A1 (en) | 2018-02-01 |
Family
ID=61016641
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/072307 WO2018020656A1 (en) | 2016-07-29 | 2016-07-29 | Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6436601B2 (en) |
WO (1) | WO2018020656A1 (en) |
2016
- 2016-07-29: WO PCT/JP2016/072307 patent/WO2018020656A1/en, Application Filing
- 2016-07-29: JP JP2017519596A patent/JP6436601B2/en, Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006281830A (en) * | 2005-03-31 | 2006-10-19 | Yamaha Motor Co Ltd | View point display system for camera |
JP2015090591A (en) * | 2013-11-06 | 2015-05-11 | 株式会社パスコ | Generation device and generation method for road surface ortho-image |
Non-Patent Citations (1)
Title |
---|
FUMIO OKURA ET AL.: "Fly-through MR Heijo-kyo : Augmented Telepresence Using Recorded Aerial Omnidirectional Videos Captured from Unmanned Airship", HUMAN INTERFACE SOCIETY KENKYU HOKOKUSHU 2010, vol. 12, 14 October 2010 (2010-10-14), pages 31 - 36 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022137105A (en) * | 2018-02-07 | 2022-09-21 | 株式会社アクセル | Imaging apparatus and program |
US20220274717A1 (en) * | 2019-08-27 | 2022-09-01 | Sony Group Corporation | Mobile object, information processing apparatus, information processing method, and program |
US11964775B2 (en) * | 2019-08-27 | 2024-04-23 | Sony Group Corporation | Mobile object, information processing apparatus, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP6436601B2 (en) | 2018-12-12 |
JPWO2018020656A1 (en) | 2018-07-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109417596B (en) | Multi-sensor image stabilization techniques | |
EP3086195B1 (en) | System for piloting a drone in first-person view mode | |
WO2020024185A1 (en) | Techniques for motion-based automatic image capture | |
JP2016119655A (en) | Video system for piloting drone in immersive mode | |
US11798172B2 (en) | Maximum temperature point tracking method, device and unmanned aerial vehicle | |
CN114476105A (en) | Automated landing surface topography assessment and related systems and methods | |
WO2019155335A1 (en) | Unmanned aerial vehicle including an omnidirectional depth sensing and obstacle avoidance aerial system and method of operating same | |
US20210289133A1 (en) | Method and system of controlling video play speed, control terminal and mobile platform | |
CN108780324A (en) | Unmanned plane, unmanned aerial vehicle (UAV) control method and apparatus | |
JP2017072986A (en) | Autonomous flying device, control method and program of autonomous flying device | |
JP6436601B2 (en) | MOBILE BODY, MOBILE BODY CONTROL METHOD, STORAGE MEDIUM CONTAINING CONTROL PROGRAM, AND CONTROL PROGRAM | |
WO2019183789A1 (en) | Method and apparatus for controlling unmanned aerial vehicle, and unmanned aerial vehicle | |
CN109949381A (en) | Image processing method, device, picture processing chip, camera assembly and aircraft | |
JP6630939B2 (en) | Control device, imaging device, moving object, control method, and program | |
US20200027238A1 (en) | Method for merging images and unmanned aerial vehicle | |
JP6949930B2 (en) | Control device, moving body and control method | |
US11363195B2 (en) | Control device, imaging device, imaging system, movable object, control method, and program | |
JP6681101B2 (en) | Inspection system | |
WO2019064457A1 (en) | Computer system, position estimation method, and program | |
US20210256732A1 (en) | Image processing method and unmanned aerial vehicle | |
WO2021035746A1 (en) | Image processing method and device, and movable platform | |
JP2021154857A (en) | Operation support device, operation support method, and program | |
JP2021103410A (en) | Mobile body and imaging system | |
JP7317684B2 (en) | Mobile object, information processing device, and imaging system | |
JP2020095519A (en) | Shape estimation device, shape estimation method, program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017519596 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16910559 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 16/05/2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16910559 Country of ref document: EP Kind code of ref document: A1 |