
WO2018020656A1 - Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program - Google Patents

Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program

Info

Publication number
WO2018020656A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
image
unit
moving body
uav
Prior art date
Application number
PCT/JP2016/072307
Other languages
French (fr)
Japanese (ja)
Inventor
幸良 笹尾
Original Assignee
SZ DJI TECHNOLOGY CO., LTD.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI TECHNOLOGY CO., LTD.
Priority to PCT/JP2016/072307
Priority to JP2017519596A (granted as JP6436601B2)
Publication of WO2018020656A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00: Type of UAV
    • B64U 10/10: Rotorcrafts
    • B64U 10/13: Flying platforms
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 20/00: Constructional aspects of UAVs
    • B64U 20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U 20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/84: Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/843: Demosaicing, e.g. interpolating colour pixel values

Definitions

  • the present invention relates to a moving body; more specifically, it relates to image processing performed by a moving body.
  • Patent Document 1 discloses a technique for removing reflection of an object in an image.
  • Patent Document 1 discloses a technique using a tablet computer with a camera function. The tablet computer determines whether there is an unnecessary reflection in the captured image by analyzing the spectrum of the image. When there is an unnecessary reflection, the tablet computer interpolates the area with the unnecessary reflection using the pixels in the area near the area with the unnecessary reflection.
  • a moving body according to an embodiment of the present invention includes an airframe, an imaging device, a determination unit that determines whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a processing unit that performs processing to remove the object when it is determined that the object is present.
  • the moving body determines whether or not the object is present in the image based on the relationship between the posture of the imaging device and the posture of the airframe.
  • the moving body performs processing for removing the object from the image captured by the imaging device. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • the determination unit may perform the determination based on the angle formed by the inclination of the imaging device and the inclination of the airframe.
  • when this angle exceeds a predetermined angle, the determination unit may determine that an object exists.
  • the predetermined angle may be set based on information including the shape of the airframe, the size of the airframe, the positional relationship between the imaging device and the airframe, and the angle of view of the imaging device. According to such a configuration, the attitude at which the object may appear in the image can be specified from known information about the airframe.
  • the moving body may further include a rotor blade.
  • the predetermined angle may be set based on information including the shape of the rotor blade, the size of the rotor blade, the positional relationship between the rotor blade and the airframe, the positional relationship between the imaging device and the airframe, and the angle of view of the imaging device. According to this configuration, the attitude at which the object may appear in the image can be specified from known information about the airframe and the rotor blade.
  • the moving body may further include an acquisition unit that acquires the attitude information of the aircraft and the attitude information of the imaging device.
  • the determination unit may perform determination using the attitude information of the aircraft and the attitude information of the imaging device.
  • the posture acquisition unit may acquire posture information measured by a gyro sensor.
  • a moving body according to another embodiment of the present invention includes an imaging device, a determination unit that determines whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a processing unit that performs processing to remove the object when it is determined that the object is present.
  • the moving body determines whether or not the object exists based on the first value indicating the inclination of the aircraft.
  • the moving body performs processing for removing the object from the image captured by the imaging device. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • the first value may include at least one of the wind speed, a second value input at the operation terminal that operates the moving body, and the current value of a drive motor that drives a rotor blade of the moving body.
  • the moving body may further include a first acquisition unit that acquires the wind speed.
  • the determination unit may determine that the object exists when the acquired wind speed exceeds a predetermined threshold.
  • the wind speed may be obtained using an anemometer.
  • the mobile object may further include a second acquisition unit that acquires a second value operated on the operation terminal.
  • the determination unit may determine that the object exists when the acquired second value exceeds a predetermined threshold.
  • the moving body may further include a plurality of rotor blades and a third acquisition unit that acquires current values of the rotor blades.
  • the determination unit may determine that the object exists when the current value of the rotor blade on the opposite side to the traveling direction of the aircraft is larger than the current value of the rotor blade on the side of the aircraft traveling direction.
  • a gimbal may be further provided between the airframe and the imaging device.
  • the gimbal may control to keep the posture of the imaging device constant.
  • the processing unit may extract a region where an object can be included in an image.
  • the processing unit may detect an object from the extracted area.
  • the processing unit may detect the object by converting the extracted region into feature amount information and comparing it with feature amount information of the object stored in advance.
  • the information indicating the feature amount may include a histogram or a spatial frequency.
  • the processing unit may interpolate the object detected by the processing unit.
  • the processing unit may interpolate pixels corresponding to the object detected by the processing unit using pixels in a region near the object.
  • the processing unit may interpolate the pixels corresponding to the object detected by the processing unit using frames that temporally precede or follow the image.
  • the moving body may further include a storage unit.
  • the processing unit may output at least one of an image before being processed by the processing unit and an image after being processed by the processing unit to the storage unit.
  • the mobile body may further include a communication unit.
  • the processing unit may output the image processed by the processing unit to the outside via the communication unit.
  • the object may be a part of the aircraft.
  • a control method is a control method for a moving body having a body and an imaging device.
  • this control method for a moving body includes a step of determining whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object is present.
  • processing for determining whether or not an object exists in the image is performed based on the relationship between the posture of the imaging device and the posture of the aircraft.
  • processing for removing the object from the image captured by the imaging device is performed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • a control method is a control method for a moving body having an imaging device.
  • this control method for a moving body includes a step of determining whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object is present.
  • processing for determining whether or not an object exists is performed based on the first value indicating the inclination of the aircraft.
  • processing for removing the object from the image captured by the imaging device is performed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • a storage medium is a computer-readable storage medium that stores a control program.
  • This control program is a control program for a moving body having an airframe and an imaging device.
  • the control program causes the computer to execute a step of determining whether an object exists in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object exists.
  • processing for determining whether or not an object exists in the image captured by the imaging device is executed.
  • processing for removing the object is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • a storage medium is a computer-readable storage medium that stores a control program.
  • This control program is a control program, executed by a computer, for a moving body having an imaging device.
  • the control program causes the computer to execute a step of determining whether an object exists in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object exists.
  • processing for determining whether or not an object exists is executed based on the first value indicating the inclination of the aircraft.
  • processing for removing the object from the image captured by the imaging device is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • a control program is a control program for a moving body having an airframe and an imaging device.
  • the control program causes the computer to execute a step of determining whether an object exists in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object exists.
  • processing for determining whether or not an object exists in the image captured by the imaging device is executed.
  • processing for removing the object is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • a control program is a control program for a moving object having an imaging device.
  • this control program causes a computer to execute a step of determining whether an object exists in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object exists.
  • processing for determining whether or not an object exists is executed based on the first value indicating the inclination of the aircraft.
  • processing for removing the object from the image captured by the imaging device is executed. According to such processing, it is possible to determine whether or not the image includes an object before performing processing for actually removing the object. Therefore, the processing load caused by executing the image processing can be reduced.
  • the processing load when an object is removed by image processing can be reduced.
  • FIG. 1 is a diagram illustrating an example of an image captured using an imaging device mounted on a multicopter that is an example of a moving object.
  • the image 11 does not show the multicopter aircraft.
  • the image 12 includes multi-copter propellers 13 and 14 as objects. Some areas in the image 12 are obscured by the propellers 13 and 14 of the multicopter.
  • the moving body performs processing for removing the object by image processing on the image determined that the object exists.
  • the moving body does not perform processing for removing the object by image processing on the image determined that the object does not exist. Since processing for analyzing an image and removing an object is not performed on all images, the processing load can be reduced.
  • Whether or not the object exists in the image (whether or not the object may be reflected in the image) can be determined based on the relative posture between the moving body and the imaging device.
  • the determination process based on the relative posture between the moving body and the imaging device has a lighter processing load than the process of determining whether there is a reflection by analyzing the contents of the image. Therefore, it is possible to reduce the processing load when the object is removed by image processing.
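To make this gating idea concrete, the following is a minimal sketch, not taken from the patent text; the function names and the 20-degree threshold are illustrative assumptions. It shows how a cheap attitude test can guard the expensive removal step:

```python
# Minimal sketch of the attitude-based gate. The threshold and function
# names are illustrative assumptions, not values from the patent.

THRESHOLD_DEG = 20.0  # hypothetical "predetermined angle"

def object_may_appear(body_pitch_deg: float, camera_pitch_deg: float) -> bool:
    """Cheap test: does the body/camera tilt difference exceed the threshold?"""
    return abs(body_pitch_deg - camera_pitch_deg) > THRESHOLD_DEG

def detect_and_remove_object(frame):
    """Placeholder for the expensive search-and-inpaint step (sketched later)."""
    return frame

def process_frame(frame, body_pitch_deg: float, camera_pitch_deg: float):
    # Run the heavy image analysis only when the cheap attitude test fires.
    if object_may_appear(body_pitch_deg, camera_pitch_deg):
        return detect_and_remove_object(frame)
    return frame
```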
  • the moving body may be a manned aircraft.
  • the moving body is a concept that includes other aircraft moving in the air, vehicles moving on the ground, ships moving on the water, robots, and the like.
  • FIG. 2 is a diagram showing an example of the appearance of the UAV 101 according to the present embodiment.
  • the UAV 101 includes a UAV body 210, a plurality of rotor blades 220, a gimbal 230, an imaging device 240, and a camera 250.
  • the flight of the UAV body 210 is controlled by controlling the rotation of the plurality of rotor blades 220.
  • the UAV 101 can be configured to have four rotor blades.
  • the number of rotor blades is not limited to four.
  • the number of rotor blades may be any number.
  • the UAV 101 may be a type of UAV having fixed wings that do not have rotating wings.
  • the UAV 101 may be a type of UAV having both a rotary wing and a fixed wing.
  • the gimbal 230 is provided in the UAV body 210.
  • the gimbal 230 supports the imaging device 240 in a rotatable manner.
  • the gimbal 230 can control the rotation of the imaging device 240 around the yaw axis, the pitch axis, and the roll axis.
  • the imaging device 240 captures a subject around the UAV body 210 and obtains image data.
  • the imaging device 240 is controlled to be rotatable by the gimbal 230.
  • the imaging device 240 can be configured to include at least a lens and an imaging sensor.
  • the plurality of cameras 250 can be sensing cameras for controlling the flight of the UAV 101.
  • two cameras 250 may be provided on the front surface, which is the nose, of the UAV body 210.
  • two cameras 250 may be provided on the bottom surface of the UAV body 210.
  • a pair of cameras 250 may be provided on at least one of the nose, the tail, the side surfaces, the bottom surface, and the top surface.
  • the camera 250 can be configured to include at least a lens and an imaging sensor.
  • FIG. 3 is a diagram showing an example of a block diagram of the configuration of the UAV 101 according to the present embodiment.
  • the UAV 101 includes a UAV control unit 310 that controls the entire UAV, a memory 320, and a communication interface 330.
  • the UAV control unit 310 can control the rotary blade mechanism 340, the gimbal 230, the imaging device 240, and the camera 250.
  • the UAV control unit 310 controls the entire UAV in accordance with a software program stored in the memory (storage unit) 320, for example.
  • the UAV control unit 310 controls the entire UAV in accordance with an instruction received from a remote controller terminal (operation terminal) through the communication interface (communication unit) 330.
  • the UAV control unit 310 controls the flight of the UAV and performs the imaging control of the imaging device 240.
  • the UAV control unit 310 can be configured by, for example, a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the memory 320 may store a software program that controls the entire UAV.
  • the memory 320 may store various data such as image data captured by the imaging device 240 and the camera 250, various information of the UAV 101, and the like.
  • a computer-readable storage medium can be used as the memory.
  • for example, SRAM, DRAM, EEPROM, or flash memory such as a USB memory can be used.
  • the memory 320 may be provided in the housing of the UAV 101.
  • the memory 320 may be removable from the UAV 101.
  • the communication interface 330 can receive an instruction from the remote controller terminal by wireless communication, and can transmit various data and information stored in the memory of the UAV 101.
  • the communication interface 330 can also receive a signal from a GNSS (Global Navigation Satellite System) positioning system.
  • the rotating blade mechanism 340 can include a plurality of rotating blades 220 and a plurality of drive motors that rotate the plurality of rotating blades 220.
  • the UAV 101 may have various sensors such as a barometer, a laser sensor, an acceleration sensor, and a gyro sensor.
  • the UAV 101 can include other devices and mechanisms.
  • FIG. 4 is a diagram illustrating an example of a remote controller terminal 400 that controls the UAV 101.
  • the remote controller terminal 400 includes an operation unit 410, a display unit 420, and a support unit 430.
  • the operation unit 410 receives an operation instruction input from the user.
  • An operation instruction from the user is transmitted from the remote controller terminal 400 to the UAV 101 through a communication interface (not shown).
  • the display unit 420 displays various information related to the UAV 101.
  • the display unit 420 may display an image captured by the UAV 101 with the imaging device 240.
  • the display unit 420 may display an image captured by the UAV 101 with the camera 250.
  • the display unit 420 may be removable from the remote controller terminal 400.
  • the display unit 420 may be various portable information terminal devices such as a tablet terminal and a smartphone terminal.
  • the support unit 430 supports the display unit 420.
  • the support unit 430 may be adjustable to an arbitrary angle.
  • the display unit 420 may be attached to or removed from the support unit 430.
  • the display unit 420 may be integrated with the support unit 430.
  • FIG. 5 is a diagram schematically showing the UAV 101 shown in FIG.
  • FIG. 5 is a diagram for explaining the angle of view of the imaging device 240 provided in the UAV 101.
  • a gimbal 230 is provided between the UAV body 210 and the imaging device 240.
  • the gimbal 230 can keep the posture of the imaging device 240 constant.
  • the gimbal 230 controls the imaging device 240 so that it does not vibrate even if vibration occurs in the UAV body 210.
  • the gimbal 230 can keep the angle (posture) of the imaging device 240 at an arbitrary angle (posture). Accordingly, the UAV body 210 and the imaging device 240 can maintain different angles (attitudes).
  • FIG. 6 is a diagram showing an example in which the UAV body 210 is tilted forward in the traveling direction from the posture shown in FIG. 5.
  • the imaging device 240 maintains the same posture as shown in FIG. 5.
  • a part 610 of the rotary blade 220 is included in the angle of view 600 of the imaging device 240. That is, a part 610 of the rotor blade 220 is reflected in an image captured by the imaging device 240.
  • whether the relationship shown in FIG. 6 arises is determined by the posture of the UAV body 210, the positional relationship between the rotor blades 220 and the UAV body 210, the posture of the imaging device 240, and the positional relationship between the imaging device 240 and the UAV body 210.
  • a part of the UAV body 210 (for example, a frame) may be included in an image captured by the imaging device 240.
  • the object in this embodiment is a part of the UAV body 210.
  • a part of the UAV body 210 may include a frame of the UAV body 210.
  • a part of the UAV body 210 may include a rotor blade 220.
  • other mechanisms (not shown) provided in the UAV 101 may be included in a part of the UAV body 210.
  • when moving in the traveling direction, the UAV body 210 tilts forward as shown in FIG. 6. This inclination is produced by making the rotational force of the rotor blades 220 on the rear side in the traveling direction relatively larger than that of the rotor blades 220 on the front side. This operation gives the UAV 101 a force that moves it in the traveling direction.
  • the relative attitude between the UAV body 210 and the imaging device 240 can change dynamically.
  • because the relative attitude between the UAV body 210 and the imaging device 240 can change dynamically, whether or not an object appears in the image also changes dynamically.
  • if an object appears in the image, the field of view obtained from the image is substantially narrower than in the normal case where the UAV 101 is not moving at high speed.
  • if the field of view obtained from the image narrows during high-speed movement, the operation of the UAV 101 by an operator who flies it while viewing the image on the display unit 420 of the remote controller terminal 400 may be hindered.
  • for example, when the UAV 101 moves at high speed, the UAV body 210 may be greatly inclined.
  • since the gimbal 230 keeps the imaging device 240 in a constant posture, an object may then appear in the image. That is, a part of the rotor blade 220, a part of the frame of the UAV body 210, or a part of another mechanism may be reflected in the image.
  • an operator who operates the UAV 101 while viewing the image on the display unit 420 of the remote controller terminal 400 confirms the image and quickly performs various operations. At this time, if the object exists in the image, there may be a delay in checking the current status of the UAV 101.
  • the UAV 101 performs processing for analyzing an image and determining whether an object exists. If the object exists, the UAV 101 performs processing for removing the object from the image by image processing. However, if these processes are performed on all the images obtained by the imaging device 240, the processing load on the UAV 101 increases.
  • processing for determining whether an object exists is performed on an image captured when a predetermined condition is satisfied.
  • the object is removed from the image determined that the object exists.
  • image processing is not performed on all images obtained by imaging by the imaging device 240. Therefore, the processing load can be reduced as compared with a case where image processing is performed on all images obtained by the imaging device 240.
  • FIG. 7 is a block diagram showing an example of the configuration of the UAV control unit 310 according to the present embodiment.
  • the UAV control unit 310 includes an attitude acquisition unit 710, a wind speed acquisition unit 720, an operation instruction acquisition unit 730, a current value acquisition unit 740, an image acquisition unit 750, a determination unit 760, an image processing unit 770, and an output unit 780.
  • the image processing unit 770 includes a candidate area extraction unit 771 and an interpolation unit 772.
  • the configuration shown in FIG. 7 is merely an example, and other components may be included.
  • the attitude acquisition unit 710 acquires the attitude information of the UAV body 210.
  • the posture information is, for example, information indicating the rotation angles about the yaw axis, the pitch axis, and the roll axis.
  • the UAV 101 may include an inertial measurement device (IMU: Inertial Measurement Unit).
  • the IMU may include three gyro sensors and three acceleration sensors with respect to three orthogonal axes.
  • the UAV 101 may include a triaxial geomagnetic sensor.
  • the attitude acquisition unit 710 may acquire the attitude information of the UAV body 210 using an acceleration sensor, a gyro sensor, a triaxial geomagnetic sensor, and the like.
  • the posture information acquired by the posture acquisition unit 710 is sent to the determination unit 760 and the candidate area extraction unit 771.
  • the posture acquisition unit 710 may acquire posture information of the imaging device 240. That is, the posture acquisition unit 710 may acquire posture information of the imaging device 240 controlled by the gimbal 230. The posture acquisition unit 710 may acquire the posture information of the imaging device 240 based on the control information of the gimbal 230.
  • the imaging device 240 may include an acceleration sensor, a gyro sensor, and a triaxial geomagnetic sensor, and the posture acquisition unit 710 may acquire the posture information of the imaging device 240 using information from these sensors.
  • the attitude of the imaging device 240 may be changed to an arbitrary angle by control from the remote controller terminal 400.
  • the wind speed acquisition unit (first acquisition unit) 720 acquires the wind speed.
  • the wind speed acquisition unit 720 sends the acquired wind speed to the determination unit 760.
  • the UAV 101 may include an anemometer.
  • the wind speed acquisition unit 720 may acquire a value obtained by an anemometer.
  • the wind speed acquisition unit 720 may calculate the wind speed based on the acceleration and angular velocity obtained by the IMU. For example, suppose that, under the control of the UAV control unit 310, the UAV 101 should by calculation have moved to a position A, but the calculation based on the values obtained by the IMU indicates that it has actually moved to a position A'. The wind speed acquisition unit 720 may calculate the wind speed from the relationship between the position A' and the position A.
  • the wind speed acquisition unit 720 may acquire information indicating the wind speed of the route from the outside through the communication interface 330.
  • the wind speed acquisition unit 720 may acquire the wind speed based on the difference between the ground speed and the air speed of the UAV 101.
  • the ground speed may be acquired using GNSS or the like, and the air speed may be acquired using an air speed measuring device.
  • the wind speed acquisition unit 720 may acquire the wind speed based on these differences.
  • the operation instruction acquisition unit (second acquisition unit) 730 acquires the operation instruction received by the operation unit 410 of the remote controller terminal 400 via the communication interface 330.
  • the operation instruction may be a value (second value) indicating the inclination of the controller stick of the operation unit 410; for example, the larger the inclination of the stick, the higher the commanded speed of the UAV 101.
  • the UAV control unit 310 may control the flight of the UAV 101 based on the acquired operation instruction.
  • the current value acquisition unit (third acquisition unit) 740 acquires a current value that flows through the drive motor of the rotary blade mechanism 340.
  • the current value acquisition unit 740 acquires a current value that flows through the drive motor of the rotary blade mechanism 340 corresponding to each of the multiple rotary blades 220.
  • the image acquisition unit 750 acquires an image captured using the imaging device 240.
  • the image acquisition unit 750 outputs the acquired image to the image processing unit 770.
  • the determination unit 760 determines whether an object exists in the image input to the image processing unit 770.
  • the determination unit 760 may make the determination based on at least one of the posture information acquired by the posture acquisition unit 710, the wind speed acquired by the wind speed acquisition unit 720, the operation instruction acquired by the operation instruction acquisition unit 730, and the current value acquired by the current value acquisition unit 740.
  • the result determined by the determination unit 760 is output to the image processing unit 770. Details will be described later.
  • the image processing unit 770 performs various image processes on the input image.
  • the image processing unit 770 includes a candidate area extraction unit 771 and an interpolation unit 772.
  • the image processing unit 770 may execute another process in addition to the candidate area extraction process by the candidate area extraction unit 771 and the interpolation process by the interpolation unit 772.
  • the image processing unit 770 outputs an image subjected to various types of processing to the output unit 780.
  • the candidate area extraction unit 771 extracts candidate areas of the input image.
  • a candidate area is an area where an object may exist.
  • the candidate area extraction unit 771 may extract candidate areas using known information.
  • the known information may include, for example, the shape and size of the UAV airframe 210 of the UAV 101.
  • the known information may include the shape and size of the rotor blade 220.
  • the known information may include a positional relationship between the rotor wing 220 and the UAV airframe 210.
  • the known information may include a positional relationship between the imaging device 240 and the UAV body 210.
  • the known information may include the angle of view of the imaging device 240 (horizontal angle of view and vertical angle of view with respect to the optical axis).
  • the candidate area extraction unit 771 may extract candidate areas in the image by using such known information.
  • the candidate area extraction unit 771 may further extract the candidate area using the posture information acquired by the posture acquisition unit 710. Details will be described later.
  • the interpolation unit 772 performs an interpolation process on the candidate region extracted by the candidate region extraction unit 771.
  • the interpolation unit 772 performs processing for detecting an object from the candidate area.
  • the interpolation unit 772 performs a process of removing the detected object.
  • as the process for removing the object, a known process called inpainting may be applied.
  • the output unit 780 outputs the image output by the image processing unit 770.
  • the output unit 780 may output an image to the memory 320.
  • the output unit 780 may transmit an image to the remote controller terminal 400 through the communication interface 330.
  • the output unit 780 may output the image to the memory 320 and transmit it to the remote controller terminal 400.
  • the output unit 780 may transmit an image to another device (for example, a server on the cloud) through the communication interface 330.
  • FIG. 8 is a flowchart showing an example of processing according to the present embodiment.
  • the process shown in the figure is executed by the UAV control unit 310. This process is executed at the timing when the image acquisition unit 750 acquires an image. In the case of a moving image, it may be executed at the acquisition timing of each frame (still image) constituting the moving image.
  • the process shown in FIG. 8 is an example of a control method executed by the UAV control unit 310.
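As a reading aid, the control flow of FIG. 8 can be sketched as follows. The step labels S810 through S850 follow the text, while the helper functions are stubs standing in for the processing described in the remainder of this section, not the patent's implementation:

```python
# Hedged sketch of the FIG. 8 flow. Step labels S810-S850 follow the text;
# the helpers are placeholder stubs for the processing described below.

def exceeds_threshold(params) -> bool:          # S810: attitude/wind/stick/current test
    return params.get("angle_over_limit", False)

def extract_candidate_regions(frame, params):   # S820: geometry-based regions
    return []

def detect_objects(frame, regions):             # S840: feature-based detection
    return []

def interpolate(frame, objects):                # S850: removal by interpolation
    return frame

def handle_frame(frame, params):
    if not exceeds_threshold(params):   # S810: no parameter fired
        return frame                    # -> skip all image processing
    regions = extract_candidate_regions(frame, params)  # S820
    if not regions:                     # S830: nothing to inspect
        return frame
    objects = detect_objects(frame, regions)            # S840
    if not objects:
        return frame
    return interpolate(frame, objects)                  # S850
```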
  • the determination unit 760 can determine whether or not a first parameter indicated by the relationship between the attitude of the imaging device 240 and the attitude of the UAV body 210 exceeds a predetermined threshold (step S810).
  • the determination unit 760 may determine whether the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240, indicated by the posture information acquired by the posture acquisition unit 710, is greater than a predetermined angle.
  • if the angle formed by the tilt of the imaging device 240 and the tilt of the UAV body 210 is greater than the predetermined angle, an object may exist in the image, so the UAV control unit 310 advances the process to step S820. This angle can be said to indicate the relative relationship between the posture of the imaging device 240 and the posture of the UAV body 210.
  • alternatively, the determination unit 760 may determine whether the difference between the inclination of the optical axis direction of the imaging device 240 with respect to a reference line and the inclination of the UAV body 210 with respect to the same reference line is greater than a predetermined angle. For example, if the reference line is the horizontal line, it may be determined whether the difference between the inclination (elevation angle) of the imaging device 240 with respect to the horizontal line and the inclination (elevation angle) of the UAV body 210 with respect to the horizontal line is greater than the predetermined angle. For these inclinations, information indicating the rotation angle about the pitch axis obtained from the gyro sensors of the UAV body 210 and the imaging device 240 may be used.
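For illustration, the angle formed by the two inclinations can be computed from direction vectors. This sketch assumes simple 3-D unit vectors for the optical axis and the airframe orientation; the vector representation is an assumption, not the patent's formulation:

```python
import math

def angle_between_deg(v1, v2) -> float:
    """Angle between two direction vectors, e.g. the camera optical axis
    and the airframe's longitudinal axis, in degrees."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

# Example: optical axis held level by the gimbal, body pitched 25 deg nose-down.
optical_axis = (1.0, 0.0, 0.0)
body_axis = (math.cos(math.radians(25)), 0.0, -math.sin(math.radians(25)))
assert round(angle_between_deg(optical_axis, body_axis)) == 25
```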
  • FIG. 9 is a diagram showing the traveling direction 901 of the UAV body 210 and the optical axis direction 902 of the imaging device 240.
  • the known information may include the shape and size of the UAV airframe 210.
  • the known information may include the shape and size of the rotor blade 220a.
  • the known information may include the positional relationship between the rotor blade 220a and the UAV body 210.
  • FIG. 9 shows an example in which information on the rotor blade 220a on the traveling direction side is included as known information, but information on the rotor blade 220b on the side opposite to the traveling direction may also be included.
  • the known information may include a positional relationship between the imaging device 240 and the UAV body 210.
  • the known information may include the angle of view of the imaging device 240 (horizontal angle of view and vertical angle of view with respect to the optical axis).
  • the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240 when an object exists in the image can be specified.
  • when the angle θ formed by the orientation 951 of the UAV body 210 with respect to the optical axis direction 952 of the imaging device 240 is larger than a predetermined angle, it may be determined that an object exists in the image.
  • the predetermined angle used as the threshold value may be set in advance from the known information described above.
  • depending on the position where the imaging device 240 is attached to the UAV body 210, it may be determined that an object exists in the image even if the inclination of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is smaller than in the example shown in FIG. 9.
  • conversely, it may be determined that no object exists in the image even if the inclination of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in the example shown in FIG. 9.
  • for example, when the imaging device 240 is attached to the lower side in the vertical direction of the UAV body 210, it may be determined that no object exists in the image even if the inclination of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in the example shown in FIG. 9.
  • the threshold value corresponding to the posture information is set based on known information. If the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240 exceeds a threshold value (predetermined angle), there is a possibility that the object exists in the image. Therefore, the UAV control unit 310 advances the process to step S820.
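One way to derive such a threshold from known information is a simplified 2-D side-view model: place the camera at the origin, rotate the body-fixed rotor tip by the body tilt, and find the smallest tilt at which the tip enters the vertical angle of view. This is a geometric sketch under stated assumptions; the dimensions below are made-up placeholders, not actual airframe values:

```python
import math

HALF_VFOV_DEG = 30.0  # half of the camera's vertical angle of view (assumed)
ROTOR_DX = 0.15       # rotor-tip forward offset from the camera [m] (assumed)
ROTOR_DZ = 0.10       # rotor-tip height above the camera [m] (assumed)

def rotor_visible(body_tilt_deg: float) -> bool:
    """After tilting the body (camera held level by the gimbal), does the
    rotor tip fall inside the camera's vertical field of view?"""
    t = math.radians(body_tilt_deg)
    # Rotate the body-fixed tip about the camera origin by the body tilt.
    x = ROTOR_DX * math.cos(t) + ROTOR_DZ * math.sin(t)
    z = -ROTOR_DX * math.sin(t) + ROTOR_DZ * math.cos(t)
    return abs(math.degrees(math.atan2(z, x))) <= HALF_VFOV_DEG

def predetermined_angle_deg() -> float:
    """Smallest body tilt at which the rotor tip enters the field of view."""
    for tilt in range(91):
        if rotor_visible(float(tilt)):
            return float(tilt)
    return float("inf")

print(predetermined_angle_deg())  # ~4 degrees with the placeholder geometry
```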
  • the determination unit 760 may determine whether or not the second parameter (first value) indicating the inclination of the aircraft exceeds a predetermined threshold value.
  • the parameter may be at least one of the wind speed acquired by the wind speed acquisition unit 720, the operation instruction acquired by the operation instruction acquisition unit 730, and the current value acquired by the current value acquisition unit 740.
  • a corresponding threshold value may be set in advance for each parameter.
  • the determination unit 760 may determine whether or not the wind speed acquired by the wind speed acquisition unit 720 exceeds a predetermined wind speed.
  • the determination unit 760 may determine whether or not the value indicated by the operation instruction acquired by the operation instruction acquisition unit 730 exceeds a predetermined input value.
  • the determination unit 760 may determine whether or not the current value acquired by the current value acquisition unit 740 exceeds a predetermined current value.
  • when any of these acquired values exceeds its corresponding threshold, the determination unit 760 may advance the process to step S820.
  • the determination unit 760 determines whether or not the wind speed acquired by the wind speed acquisition unit 720 exceeds a predetermined wind speed. For example, suppose 15 m/s is set as the predetermined wind speed threshold. The determination unit 760 determines that a value exceeding the threshold has been detected when the acquired wind speed exceeds 15 m/s. When a wind speed exceeding the predetermined threshold is measured, it is highly likely that the UAV body 210 is tilted. Therefore, since an object may exist in the image, the UAV control unit 310 advances the process to step S820.
  • the predetermined input value may be a value indicated by an operation instruction from the remote controller terminal 400.
  • for example, when the controller stick of the operation unit 410 of the remote controller terminal 400 is tilted beyond a predetermined position, the determination unit 760 determines that the input value exceeds the predetermined threshold.
  • the UAV control unit 310 advances the process to step S820.
  • the determination unit 760 determines whether or not the current value acquired by the current value acquisition unit 740 exceeds a predetermined current value.
  • the predetermined current value may be a current value flowing through the motor of the rotary blade mechanism 340.
  • for example, suppose the current flowing through the two motors on the side opposite to the traveling direction of the UAV 101 exceeds a predetermined threshold. In that case, it is highly likely that the UAV 101 is flying at high speed.
  • the predetermined current value may be the current value of the motor on the traveling direction side, the current value of the motor on the opposite side, or a combination of the two. For example, the threshold may be considered exceeded when the current flowing through the motor on the side opposite to the traveling direction is relatively larger than the current flowing through the motor on the traveling direction side. In either case, it is highly likely that the UAV 101 is flying at high speed, and hence that the UAV body 210 is tilted. Therefore, since an object may exist in the image, the UAV control unit 310 advances the process to step S820.
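Gathering the three checks above into one routine might look like the following sketch. Only the 15 m/s wind-speed figure comes from the text; the stick and motor-current thresholds are illustrative assumptions:

```python
WIND_LIMIT_MS = 15.0       # wind-speed threshold from the example above
STICK_LIMIT = 0.8          # hypothetical normalized stick deflection limit
CURRENT_RATIO_LIMIT = 1.2  # hypothetical rear/front motor-current ratio

def body_likely_tilted(wind_ms: float, stick: float,
                       front_current_a: float, rear_current_a: float) -> bool:
    """True when any 'first value' suggests the airframe is tilted."""
    if wind_ms > WIND_LIMIT_MS:
        return True                      # strong wind: body likely leaning
    if stick > STICK_LIMIT:
        return True                      # large commanded speed: body pitches
    if front_current_a > 0 and rear_current_a / front_current_a > CURRENT_RATIO_LIMIT:
        return True                      # rear motors working harder: forward tilt
    return False
```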
  • in step S810, it is not necessary to search the image to confirm whether the object actually exists in it.
  • in step S810, a process is performed to determine whether there is a possibility that the object exists in the image. If there is such a possibility, the process proceeds to the steps described later, in which the image processing unit 770 searches the image and detects the object.
  • it would be a heavy load for the image processing unit 770 to perform object detection and removal processing on every image acquired by the image acquisition unit 750.
  • therefore, the determination process in step S810 is performed. With this determination process, it is possible to judge whether an object is likely to exist in the image without performing image processing. Step S810 can be said to be a process for determining whether the UAV body 210 is tilted or is likely to tilt.
  • the determination unit 760 preferably performs the determination process using the posture information, the wind speed, the operation instruction, and the current value when the image acquisition unit 750 acquires the image, but is not limited thereto.
  • for images acquired within a predetermined time, the determination may be made using common posture information, wind speed, operation instruction, and current value. For example, for a 30 fps (frames per second) movie with the predetermined time set to 1 second, the posture information, wind speed, operation instruction, and current value at the time of the first of the 30 frames may be applied to the remaining 29 frames.
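A sketch of that reuse, assuming a 30 fps stream and the hypothetical body_likely_tilted helper from the previous sketch:

```python
FPS = 30  # frames per second of the movie, as in the example above

def decide_per_frame(frames, snapshots_per_second):
    """Reuse the first-frame sensor snapshot of each one-second window for
    all 30 frames in that window, instead of re-reading sensors per frame."""
    decisions = []
    for i, frame in enumerate(frames):
        snap = snapshots_per_second[i // FPS]  # dicts of sensor readings
        decisions.append((frame, body_likely_tilted(**snap)))
    return decisions
```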
  • in step S810, when the determination unit 760 detects a parameter that exceeds its threshold, the process proceeds to step S820. If no parameter exceeding a threshold is detected in step S810, the object is considered not to exist in the image, so the image processing described below is not performed and the processing load is reduced.
  • the predetermined threshold value used in the determination in step S810 may be set to an arbitrary value according to an instruction from the remote controller terminal 400.
  • the predetermined threshold value may be changed by an instruction from the remote controller terminal 400 during the flight.
  • in step S820, the candidate area extraction unit 771 extracts candidate areas.
  • the structure of the UAV 101 is known. Using these pieces of known information, it is possible to specify in which region in the image captured by the imaging device 240 there is a possibility that the object will appear. In other words, the candidate area extraction unit 771 can extract candidate areas using known information.
  • the candidate area extraction unit 771 may extract a candidate area using the posture information acquired by the posture acquisition unit 710 in addition to the known information.
  • candidate regions may be extracted from the tilt of the UAV body 210 and the tilt of the imaging device 240. In the situation of FIG. 9 described above, using the tilt (posture) of the UAV body 210, the tilt (posture) of the imaging device 240, and the known information, a candidate region is extracted in the upper part of the image captured by the imaging device 240.
  • FIG. 10 is a diagram illustrating an image 1000 captured by the imaging device 240.
  • the image 1000 includes candidate areas 1010 and 1020.
  • Candidate areas 1010 and 1020 are areas where objects can exist.
  • the candidate areas 1010 and 1020 are merely areas where objects may exist; an object does not necessarily appear in them.
  • the rotor blade 220 rotates around its rotation axis. Therefore, even if the UAV body 210 is tilted to such an extent that a part of the rotor blade 220 can appear in the image, depending on the imaging timing either an image that includes the rotor blade 220 or an image that does not include it may be captured.
  • the candidate regions 1010 and 1020 may be regions that include the circular range that the rotor blade 220 can sweep in the image around its rotation axis.
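As an illustrative sketch of region extraction: the linear mapping from angle to strip height below is an assumption made for brevity, whereas the patent derives the region from the airframe geometry and the angle of view. A top-of-image candidate strip could be computed like this:

```python
def candidate_region(image_w: int, image_h: int, angle_diff_deg: float,
                     vfov_deg: float, threshold_deg: float):
    """Return (x0, y0, x1, y1) of a candidate strip at the top of the image,
    or None when the attitude difference stays below the threshold."""
    excess = angle_diff_deg - threshold_deg
    if excess <= 0:
        return None                        # no region: step S830 ends the process
    frac = min(excess / vfov_deg, 0.5)     # cap the strip at half the frame
    return (0, 0, image_w, int(image_h * frac))

print(candidate_region(1920, 1080, 35.0, 60.0, 20.0))  # (0, 0, 1920, 270)
```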
  • in step S830, the candidate area extraction unit 771 determines whether a candidate area has been extracted. If no candidate area is extracted, the subsequent processing is unnecessary and the process ends; there is then no need to perform image processing such as detecting and removing an object. When a candidate area is extracted, the process proceeds to step S840.
  • in step S840, the interpolation unit 772 analyzes the candidate area extracted in step S820 to detect an object, and determines whether an object exists in the candidate area.
  • FIG. 11 is a diagram illustrating an example in which the interpolation unit 772 detects the objects 1110 and 1120. Examples of the object include a part of the UAV body 210 and a part of the rotor blade 220, as described above. Appearance features such as the shape and color of the UAV body 210 and the rotor blade 220 are known. Therefore, the interpolation unit 772 may determine whether these known objects are detected in the candidate area. The interpolation unit 772 may compare object information stored in advance in the memory 320 with the object information of the image captured by the imaging device 240.
  • feature amount information such as an object color histogram and spatial frequency is stored in the memory 320 in advance.
  • the interpolation unit 772 converts the candidate area of the image captured by the imaging device 240 into information on feature amounts such as a color histogram and spatial frequency.
  • the interpolation unit 772 may detect the objects 1110 and 1120 from the candidate areas 1010 and 1020 by comparing the feature amount information obtained by the conversion with the feature amount information stored in advance.
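A sketch of that comparison using OpenCV, with a hue histogram standing in for the stored feature amount; the 32-bin size and the 0.9 correlation threshold are illustrative assumptions, not values from the patent:

```python
import cv2
import numpy as np

def hue_histogram(bgr_patch: np.ndarray) -> np.ndarray:
    """Normalized 32-bin hue histogram used as the feature amount."""
    hsv = cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])
    return cv2.normalize(hist, hist).flatten().astype(np.float32)

def contains_object(candidate_patch: np.ndarray,
                    stored_hist: np.ndarray, threshold: float = 0.9) -> bool:
    """Compare the candidate region's features with stored object features."""
    score = cv2.compareHist(hue_histogram(candidate_patch),
                            stored_hist.astype(np.float32),
                            cv2.HISTCMP_CORREL)
    return score >= threshold
```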
  • if an object is detected in the candidate area, the process proceeds to step S850. If no object is detected, the process ends.
  • in step S850, the interpolation unit 772 interpolates the detected object.
  • the interpolation unit 772 treats the detected pixels of the objects 1110 and 1120 as a missing area.
  • the interpolation unit 772 performs an interpolation process for interpolating the missing area.
  • as an interpolation process, there is a process called inpainting, in which pixels in a missing area are interpolated using pixels around that area within a single image.
  • FIG. 12 is a diagram for explaining an interpolation process for interpolating a missing area.
  • the interpolation unit 772 interpolates the pixel values of the objects 1110 and 1120 using the pixels in the areas 1210 and 1220 around (near) the objects 1110 and 1120 detected in the image 1000.
  • the surrounding areas 1210 and 1220 may be defined as a rectangle that is at least n pixels away from the defect area.
  • for inpainting, for example, the known methods described in Non-Patent Document 1 and Non-Patent Document 2 may be applied.
  • the objects 1110 and 1120 are removed from the image 1000.
  • the regions of the objects 1110 and 1120 are replaced with less unnatural content by an interpolation process that uses pixels in nearby regions.
  • the above-described neighborhood may be determined from the size and position of the object to be interpolated.
  • the interpolation range can be determined by L × α + β, where L is the size of the object to be interpolated.
  • α may be a parameter determined from the acquired whole image and the size and position of the object to be interpolated.
  • β is expressed in pixels and may be a parameter used to compensate when L × α is small.
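Putting the range formula together with an off-the-shelf inpainting routine gives a sketch like the following. OpenCV's cv2.inpaint is used here as one widely available implementation, not necessarily the methods of Non-Patent Documents 1 and 2, and the α and β values are assumptions:

```python
import cv2
import numpy as np

ALPHA = 0.2  # scales the interpolation range with the object size L (assumed)
BETA = 8     # pixels; keeps the range useful when L x ALPHA is small (assumed)

def remove_object(image_bgr: np.ndarray, object_mask: np.ndarray) -> np.ndarray:
    """Treat masked pixels as a missing area and fill them from nearby pixels.
    object_mask: uint8, single channel, nonzero on detected object pixels."""
    ys, xs = np.nonzero(object_mask)
    if xs.size == 0:
        return image_bgr
    L = max(xs.max() - xs.min(), ys.max() - ys.min())  # object extent in pixels
    radius = int(L * ALPHA + BETA)  # neighborhood the interpolation draws on
    return cv2.inpaint(image_bgr, object_mask, inpaintRadius=radius,
                       flags=cv2.INPAINT_TELEA)
```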
  • the interpolation of the missing area only needs to remove the objects 1110 and 1120 from the image 1000 to the extent that the result does not look unnatural.
  • the outline of the objects 1110 and 1120 may remain slightly after the interpolation processing is executed.
  • even so, the unnaturalness of the image after the interpolation process is reduced compared to before it. Therefore, the objects 1110 and 1120 need not be completely removed from the image 1000, and parts of them may remain in the image after the interpolation process.
  • so far, an example has been described in which the interpolation unit 772 interpolates an object included in one image using pixels around the object within that image.
  • alternatively, the interpolation unit 772 may interpolate the object using frames that temporally precede the image, follow it, or both.
  • for example, when the object is the rotor blade 220, a part of the rotor blade 220 may or may not appear in the image depending on the frame.
  • among the temporally preceding and succeeding frames there may be a frame in which the object does not appear, so the interpolation process may be performed using that frame.
  • the interpolation unit 772 may also combine the process of interpolating the object using pixels around it within one image with the process of interpolating it using temporally preceding and/or succeeding frames.
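A sketch of that combination, assuming per-frame object masks are available and reusing the hypothetical remove_object sketch above for the intra-frame fallback:

```python
import numpy as np

def interpolate_with_neighbor(frame: np.ndarray, mask: np.ndarray,
                              prev_frame: np.ndarray,
                              prev_mask: np.ndarray) -> np.ndarray:
    """Copy object pixels from a temporally adjacent frame where that frame
    is object-free, then inpaint whatever the copy could not cover."""
    out = frame.copy()
    usable = (mask > 0) & (prev_mask == 0)   # propeller moved: pixel was clean
    out[usable] = prev_frame[usable]
    leftover = ((mask > 0) & ~usable).astype(np.uint8) * 255
    if leftover.any():
        out = remove_object(out, leftover)   # intra-frame fallback (see above)
    return out
```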
  • in the above embodiment, a UAV having rotor blades has been mainly described as an example.
  • the present invention is not limited to this.
  • the processing described in the above-described embodiment can be applied to any moving body that can have a mode in which the attitude of the machine body and the attitude of the imaging device are different.
  • in the above embodiment, the imaging device is attached to the lower part of the airframe.
  • however, the imaging device may be attached to the side of the airframe, to the top, or at any other position.
  • the example described above is one in which the image processing unit 770 switches whether to perform the processing of the candidate region extraction unit 771 and the interpolation unit 772 according to the determination result of the determination unit 760, but the present invention is not limited to this. Other configurations may be used.
  • the determination result of the determination unit 760 may be sent to the image acquisition unit 750 instead of the image processing unit 770.
  • the image acquisition unit 750 may switch between outputting an image to the image processing unit 770 or outputting an image to the output unit 780 without going through the image processing unit 770 based on the determination result of the determination unit 760.
  • the image processing unit 770 may output both the image before the interpolation by the interpolation unit 772 (referred to as an image before interpolation) and the image after the interpolation (referred to as an image after interpolation) to the output unit 780.
  • the output unit 780 may output both the pre-interpolation image and the post-interpolation image to the memory 320. With this configuration, the user can compare both the original image and the image from which the object has been removed.
  • in the above description, an example has been given in which the optical axis direction of the imaging device 240 is fixed and maintained in a constant posture by the gimbal 230, but the present invention is not limited to this.
  • the optical axis direction of the imaging device 240 may be changed by an instruction from the remote controller terminal 400.
  • the gimbal 230 continues to control the posture of the imaging device 240 so as to maintain the changed posture in the optical axis direction.
  • the determination unit 760 may perform determination using the posture information of the imaging device 240 after the change.
  • the mode of determining whether there is a possibility that an object exists in the image by using the inclination of the UAV body 210 has been described.
  • the fact that the UAV body 210 is tilted means that the rotor blade 220 is also tilted in accordance with the tilt of the UAV body 210.
  • therefore, if a gyro sensor is provided in the rotor blade 220, whether an object exists in the image may be determined using the attitude of the rotor blade 220.
  • in the above embodiment, when the value of a predetermined parameter exceeds its corresponding threshold in step S810, the process proceeds to step S820 to extract the candidate area.
  • however, the present invention is not limited to this.
  • the process of step S810 may be omitted and the process may be started from step S820. For example, it may be determined whether a candidate area is extracted from an image using known information, the posture of the UAV body 210, and the posture of the imaging device 240.
  • alternatively, steps S820 and S830 may be omitted. If the value of the predetermined parameter exceeds its threshold in step S810, the object may exist somewhere in the image, so the process may proceed directly to object detection.
  • when steps S820 and S830 are performed, the area to be processed can be narrowed down, so the processing load can be reduced further.
  • even if steps S820 and S830 are not performed, the processing load is still reduced, because image processing does not have to be performed on all images.
  • Image processing may be performed in the remote controller terminal 400.
  • the remote controller terminal 400 may include an image processing unit. Data acquired by the posture acquisition unit 710, the wind speed acquisition unit 720, the operation instruction acquisition unit 730, the current value acquisition unit 740, and the image acquisition unit 750 in FIG. 7 may be transmitted to the remote controller terminal 400 via the output unit 780.
  • the remote controller terminal 400 may perform the process shown in FIG. 8 using the transmitted data.
  • Each unit for realizing the functions of the above-described embodiments can be implemented by, for example, hardware or software.
  • Program code for controlling the hardware may be executed by various processors of a computer, such as a CPU or an MPU.
  • Hardware such as a circuit for realizing the functions of the program code may be provided.
  • A part of the program code may be realized by hardware, and the remaining part may be executed by the various processors.
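
As a rough illustration of the routing and dual output described in the first items of this list, the following Python sketch shows one way the flow could look. The function names and signatures are hypothetical and are not taken from the embodiment.

```python
# Hypothetical sketch of the routing described above: an image goes
# through the image processing unit only when the determination unit
# reports that the object may be present; otherwise it is passed
# straight to the output unit.

def route_image(image, object_may_be_present, interpolate, output):
    """Forward `image` along the path chosen by the determination result."""
    if object_may_be_present:
        pre_interpolation = image                 # image before interpolation
        post_interpolation = interpolate(image)   # image after interpolation
        # Both images may be output so the user can later compare the
        # original image with the image from which the object was removed.
        output(pre_interpolation)
        output(post_interpolation)
    else:
        # Bypass the image processing unit entirely.
        output(image)
```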

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Analyzing all captured images to determine whether a ghost image of an object is present causes an increase in processing load. This moving body determines whether an object is present in an image captured by an imaging device, on the basis of the relationship between the orientation of the imaging device and the orientation of an airframe. If it is determined that the object is present, processing for removing the object from the image captured by the imaging device is performed.

Description

MOBILE BODY, MOBILE BODY CONTROL METHOD, STORAGE MEDIUM CONTAINING CONTROL PROGRAM, AND CONTROL PROGRAM
 The present invention relates to a technique for a moving body, and more specifically to a technique related to image processing performed by a moving body.
 Patent Document 1 discloses a technique for removing the reflection of an object in an image. Specifically, it discloses a technique using a tablet computer with a camera function. The tablet computer analyzes the spectrum of a captured image to determine whether the image contains an unnecessary reflection. When there is an unnecessary reflection, the tablet computer interpolates the area containing the unnecessary reflection using pixels from neighboring areas.
Patent Document 1: JP 2015-126326 A
 When images are captured continuously, analyzing every captured image to determine whether an object is reflected in it causes an increase in processing load.
 A moving body according to an embodiment of the present invention includes an airframe, an imaging device, a determination unit that determines whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a processing unit that performs processing for removing the object when it is determined that the object is present.
 According to this aspect, the moving body determines whether the object is present in the image based on the relationship between the attitude of the imaging device and the attitude of the airframe. When it is determined that the object is present, the moving body performs processing for removing the object from the image captured by the imaging device. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 In one embodiment of the present invention, the determination unit may perform the determination based on the angle formed by the inclination of the imaging device and the inclination of the airframe.
 In one embodiment of the present invention, when the angle is larger than a predetermined angle, the determination unit may determine that the object is present.
 In one embodiment of the present invention, the predetermined angle may be set based on information including the shape of the airframe, the size of the airframe, the positional relationship between the imaging device and the airframe, and the angle of view of the imaging device. With this configuration, an attitude in which the object may appear in the image can be identified from known information about the airframe.
 In one embodiment of the present invention, the moving body may further include rotor blades. The predetermined angle may be set based on information including the shape of the rotor blades, the size of the rotor blades, the positional relationship between the rotor blades and the airframe, the positional relationship between the imaging device and the airframe, and the angle of view of the imaging device. With this configuration, an attitude in which the object may appear in the image can be identified from known information about the airframe and the rotor blades.
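
As a rough numerical illustration of how such a predetermined angle could follow from the known information, the Python sketch below estimates, in a simplified two-dimensional side view, the tilt margin before a rotor blade tip enters the camera's field of view. The geometry, variable names, and numbers are illustrative assumptions and not values from the embodiment.

```python
import math

# Simplified 2-D sketch: the camera is gimbal-stabilized, so tilting the
# airframe forward by some angle effectively rotates the rotor tip
# toward the optical axis by roughly that angle (the offset of the tilt
# pivot is ignored here for simplicity).

def predetermined_angle_deg(rotor_dx, rotor_dz, half_vfov_deg):
    """Tilt margin (deg) before the rotor tip enters the field of view.

    rotor_dx: horizontal distance from the camera to the rotor tip (m)
    rotor_dz: height of the rotor tip above the optical axis (m)
    half_vfov_deg: vertical half angle of view of the camera (deg)
    """
    tip_angle = math.degrees(math.atan2(rotor_dz, rotor_dx))
    return tip_angle - half_vfov_deg

# Example: tip 0.25 m ahead of and 0.20 m above the optical axis, with a
# 20-degree vertical half angle of view -> about 18.7 degrees of margin.
print(predetermined_angle_deg(0.25, 0.20, 20.0))
```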
 In one embodiment of the present invention, the moving body may further include an acquisition unit that acquires attitude information of the airframe and attitude information of the imaging device. The determination unit may perform the determination using the attitude information of the airframe and the attitude information of the imaging device.
 In one embodiment of the present invention, the posture acquisition unit may acquire attitude information measured by a gyro sensor.
 A moving body according to another embodiment of the present invention includes an imaging device, a determination unit that determines whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a processing unit that performs processing for removing the object when it is determined that the object is present.
 According to this aspect, the moving body determines whether the object is present based on the first value indicating the inclination of the airframe. When it is determined that the object is present, the moving body performs processing for removing the object from the image captured by the imaging device. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 In one embodiment of the present invention, the first value may include at least one of a wind speed, a second value operated on an operation terminal that operates the moving body, and a current value of a drive motor that drives a rotor blade of the moving body.
 In one embodiment of the present invention, the moving body may further include a first acquisition unit that acquires the wind speed. The determination unit may determine that the object is present when the acquired wind speed exceeds a predetermined threshold.
 In one embodiment of the present invention, the wind speed may be acquired using an anemometer.
 In one embodiment of the present invention, the moving body may further include a second acquisition unit that acquires the second value operated on the operation terminal. The determination unit may determine that the object is present when the acquired second value exceeds a predetermined threshold.
 In one embodiment of the present invention, the moving body may further include a plurality of rotor blades and a third acquisition unit that acquires the current value of each of the rotor blades. The determination unit may determine that the object is present when the current value of a rotor blade on the side opposite to the traveling direction of the airframe is larger than the current value of a rotor blade on the traveling direction side.
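
How these first values might be combined into a single determination can be pictured with the short sketch below; the thresholds, the OR-combination of the conditions, and the parameter names are assumptions made for illustration, not part of the claimed configuration.

```python
# Illustrative thresholds (assumed values, not from the embodiment).
WIND_SPEED_THRESHOLD = 8.0   # m/s
STICK_VALUE_THRESHOLD = 0.7  # normalized controller stick deflection

def object_may_be_present(wind_speed, stick_value,
                          rear_motor_current, front_motor_current):
    """Return True when any first value suggests the airframe is tilted
    enough for part of it to appear in the image."""
    if wind_speed > WIND_SPEED_THRESHOLD:
        return True
    if stick_value > STICK_VALUE_THRESHOLD:
        return True
    # A rear rotor drawing more current than a front rotor implies a
    # forward tilt of the airframe during horizontal movement.
    if rear_motor_current > front_motor_current:
        return True
    return False
```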
 In one embodiment of the present invention, a gimbal may further be provided between the airframe and the imaging device.
 In one embodiment of the present invention, the gimbal may perform control to keep the attitude of the imaging device constant.
 In one embodiment of the present invention, the processing unit may extract a region of the image in which the object can be included.
 In one embodiment of the present invention, the processing unit may detect the object from the extracted region.
 In one embodiment of the present invention, the processing unit may detect the object by converting the extracted region into feature amount information and comparing it with feature amount information of the object stored in advance.
 In one embodiment of the present invention, the information indicating the feature amount may include a histogram or a spatial frequency.
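
One way to picture the feature-amount comparison is a histogram-intersection check like the sketch below. The bin count, the similarity measure, and the threshold are assumptions for illustration; the embodiment does not prescribe a particular measure.

```python
import numpy as np

def detect_object(candidate_region, reference_hist, threshold=0.8):
    """Detect the object by comparing gray-level histograms.

    candidate_region: 2-D uint8 array (the extracted region)
    reference_hist: normalized 256-bin histogram of the object, stored
                    in advance
    """
    hist, _ = np.histogram(candidate_region, bins=256, range=(0, 256))
    hist = hist / max(hist.sum(), 1)  # normalize to a distribution
    # Histogram intersection: 1.0 means identical distributions.
    similarity = np.minimum(hist, reference_hist).sum()
    return similarity >= threshold    # True -> object detected
```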
 In one embodiment of the present invention, the processing unit may interpolate the object detected by the processing unit.
 In one embodiment of the present invention, the processing unit may interpolate the pixels corresponding to the detected object using pixels in a region near the object.
 In one embodiment of the present invention, the processing unit may interpolate the pixels corresponding to the detected object using images that temporally precede and follow the image.
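
A minimal sketch of the temporal variant, assuming the scene is almost static between consecutive frames, could look like the following; the simple averaging formula is an assumption, not a method prescribed by the embodiment.

```python
import numpy as np

def interpolate_temporally(frame, prev_frame, next_frame, object_mask):
    """Replace pixels hidden by the object with the average of the same
    pixels in the preceding and following frames.

    object_mask: boolean H x W array, True where the object was detected
    """
    result = frame.copy()
    # Average in a wider dtype to avoid uint8 overflow, then cast back.
    filled = (prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2
    result[object_mask] = filled[object_mask].astype(frame.dtype)
    return result
```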
 In one embodiment of the present invention, the moving body may further include a storage unit. The processing unit may output at least one of the image before being processed by the processing unit and the image after being processed by the processing unit to the storage unit.
 In one embodiment of the present invention, the moving body may further include a communication unit. The processing unit may output the image processed by the processing unit to the outside via the communication unit.
 In one embodiment of the present invention, the object may be a part of the airframe.
 A control method according to an embodiment of the present invention is a control method for a moving body having an airframe and an imaging device. This control method includes a step of determining whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object is present.
 According to this aspect, processing for determining whether the object is present in the image is performed based on the relationship between the attitude of the imaging device and the attitude of the airframe. When it is determined that the object is present, processing for removing the object from the image captured by the imaging device is performed. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 A control method according to another embodiment of the present invention is a control method for a moving body having an imaging device. This control method includes a step of determining whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object is present.
 According to this aspect, processing for determining whether the object is present is performed based on the first value indicating the inclination of the airframe. When it is determined that the object is present, processing for removing the object from the image captured by the imaging device is performed. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 A storage medium according to an embodiment of the present invention is a computer-readable storage medium storing a control program for a moving body having an airframe and an imaging device. The control program causes a computer to execute a step of determining whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object is present.
 According to this aspect, processing for determining whether the object is present in the image captured by the imaging device is executed based on the relationship between the attitude of the imaging device and the attitude of the airframe. For an image in which the object is present, processing for removing the object is executed. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 A storage medium according to another embodiment of the present invention is a computer-readable storage medium storing a control program for a moving body having an imaging device. The control program causes a computer to execute a step of determining whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object is present.
 According to this aspect, processing for determining whether the object is present is executed based on the first value indicating the inclination of the airframe. When it is determined that the object is present, processing for removing the object from the image captured by the imaging device is executed. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 A control program according to an embodiment of the present invention is a control program for a moving body having an airframe and an imaging device. The control program causes a computer to execute a step of determining whether an object is present in an image captured by the imaging device based on the relationship between the attitude of the imaging device and the attitude of the airframe, and a step of removing the object when it is determined that the object is present.
 According to this aspect, processing for determining whether the object is present in the image captured by the imaging device is executed based on the relationship between the attitude of the imaging device and the attitude of the airframe. For an image in which the object is present, processing for removing the object is executed. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 A control program according to another embodiment of the present invention is a control program for a moving body having an imaging device. The control program causes a computer to execute a step of determining whether an object is present in an image captured by the imaging device based on a first value indicating the inclination of the airframe, and a step of removing the object when it is determined that the object is present.
 According to this aspect, processing for determining whether the object is present is executed based on the first value indicating the inclination of the airframe. When it is determined that the object is present, processing for removing the object from the image captured by the imaging device is executed. With this processing, whether an image contains the object can be determined before the removal processing is actually performed, so the processing load caused by executing the image processing can be reduced.
 According to the present invention, the processing load of removing an object by image processing can be reduced.
FIG. 1 is a diagram showing an example of the reflection of a propeller.
FIG. 2 is a diagram showing an example of the appearance of a UAV.
FIG. 3 is a block diagram showing an example of the configuration of the UAV.
FIG. 4 is a diagram showing an example of a remote controller terminal.
FIG. 5 is a diagram explaining the angle of view of an imaging device provided in the UAV.
FIG. 6 is a diagram explaining a case where an object is reflected in an image.
FIG. 7 is a block diagram showing an example of the configuration of a UAV control unit.
FIG. 8 is a diagram showing an example of a flowchart.
FIG. 9 is a diagram explaining the inclination of the UAV and the inclination of the imaging device.
FIG. 10 is a diagram explaining candidate areas.
FIG. 11 is a diagram explaining an example of a detected object.
FIG. 12 is a diagram explaining interpolation processing.
 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. The configurations described in the following embodiments are merely examples, and the present invention is not limited to the illustrated configurations.
 The claims, the description, the drawings, and the abstract include matter subject to copyright protection. The copyright owner does not object to reproduction of these documents by anyone as they appear in the files or records of the Patent Office, but otherwise reserves all copyrights.
 An object may be present in an image captured by an imaging device mounted on a moving body. FIG. 1 is a diagram showing examples of images captured using an imaging device mounted on a multicopter, which is an example of a moving body. The multicopter's airframe is not reflected in the image 11. In the image 12, the propellers 13 and 14 of the multicopter are reflected as objects, and some areas of the image 12 are obscured by them.
 In this embodiment, it is determined whether an object is present in an image captured by the imaging device mounted on the moving body. For an image in which the object is determined to be present, the moving body performs processing for removing the object by image processing. For an image in which the object is determined not to be present, the moving body does not perform such processing. Since the processing of analyzing an image and removing the object is not performed on every image, the processing load can be reduced. Whether the object is present in an image (that is, whether the object may be reflected in the image) can be determined based on the relative attitude between the moving body and the imaging device. This determination imposes a lighter processing load than analyzing the contents of the image to determine whether there is a reflection. Therefore, the processing load of removing the object by image processing can be reduced.
 In this embodiment, an unmanned aerial vehicle (UAV) is used as an example of the moving body. The moving body may instead be a manned aircraft. The moving body is a concept that includes other aircraft moving through the air, vehicles moving on the ground, ships moving on the water, robots, and the like.
 FIG. 2 is a diagram showing an example of the appearance of the UAV 101 according to this embodiment. The UAV 101 includes a UAV body 210, a plurality of rotor blades 220, a gimbal 230, an imaging device 240, and cameras 250.
 The flight of the UAV body 210 is controlled by controlling the rotation of the plurality of rotor blades 220. For example, the UAV 101 can be configured to have four rotor blades, although the number of rotor blades is not limited to four and may be any number. The UAV 101 may be a type of UAV that has fixed wings instead of rotor blades, or a type that has both rotor blades and fixed wings.
 The gimbal 230 is provided on the UAV body 210 and rotatably supports the imaging device 240. The gimbal 230 can control the rotation of the imaging device 240 around, for example, the yaw axis, the pitch axis, and the roll axis.
 The imaging device 240 captures images of subjects around the UAV body 210 to obtain image data. The imaging device 240 is rotatably controlled by the gimbal 230 and can be configured to include at least a lens and an imaging sensor.
 The plurality of cameras 250 can be sensing cameras for controlling the flight of the UAV 101. For example, two cameras 250 may be provided on the front, which is the nose of the UAV body 210, and two more on the bottom surface of the UAV body 210. By using the parallax between the images captured by a pair of cameras 250, the distance to objects around the UAV body 210 can be obtained. A pair of cameras 250 may be provided on at least one of the nose, the tail, the side surfaces, the bottom surface, and the top surface. Each camera 250 can be configured to include at least a lens and an imaging sensor.
 FIG. 3 is a block diagram showing an example of the configuration of the UAV 101 according to this embodiment. The UAV 101 includes a UAV control unit 310 that controls the entire UAV, a memory 320, and a communication interface 330. The UAV control unit 310 can control a rotor blade mechanism 340, the gimbal 230, the imaging device 240, and the cameras 250.
 The UAV control unit 310 controls the entire UAV in accordance with, for example, a software program stored in the memory (storage unit) 320, or in accordance with instructions received from a remote controller terminal (operation terminal) through the communication interface (communication unit) 330. For example, the UAV control unit 310 controls the flight of the UAV and the imaging performed by the imaging device 240. The UAV control unit 310 can be configured by, for example, a microprocessor such as a CPU or an MPU, or a microcontroller such as an MCU.
 The memory 320 may store the software program that controls the entire UAV, as well as various data such as image data captured by the imaging device 240 and the cameras 250 and various information about the UAV 101. A computer-readable storage medium can be used as the memory; for example, SRAM, DRAM, EEPROM, or flash memory such as USB memory can be used. The memory 320 may be provided in the housing of the UAV 101 or may be removable from the UAV 101.
 The communication interface 330 can receive instructions from the remote controller terminal by wireless communication and can transmit various data and information stored in the memory of the UAV 101. The communication interface 330 can also receive signals from a GNSS (Global Navigation Satellite System) positioning system.
 The rotor blade mechanism 340 can include the plurality of rotor blades 220 and a plurality of drive motors that rotate them.
 The UAV 101 may have various sensors such as a barometer, a laser sensor, an acceleration sensor, and a gyro sensor, and can include other devices and mechanisms.
 FIG. 4 is a diagram showing an example of the remote controller terminal 400 that controls the UAV 101. The remote controller terminal 400 includes an operation unit 410, a display unit 420, and a support unit 430.
 The operation unit 410 accepts input of operation instructions from the user. The operation instructions are transmitted from the remote controller terminal 400 to the UAV 101 through a communication interface (not shown).
 The display unit 420 displays various information about the UAV 101, such as images captured by the imaging device 240 or the cameras 250 of the UAV 101. The display unit 420 may be removable from the remote controller terminal 400 and may be any of various portable information terminal devices such as a tablet terminal or a smartphone terminal.
 The support unit 430 supports the display unit 420 and may be adjustable to an arbitrary angle. The display unit 420 may be attachable to and removable from the support unit 430, or may be integrated with it.
 FIG. 5 is a diagram schematically showing the UAV 101 shown in FIG. 2 and explains the angle of view of the imaging device 240 provided in the UAV 101. The gimbal 230 is provided between the UAV body 210 and the imaging device 240. The gimbal 230 can keep the attitude of the imaging device 240 constant and controls the imaging device 240 so that it does not vibrate even when vibration occurs in the UAV body 210. The gimbal 230 can keep the angle (attitude) of the imaging device 240 at an arbitrary angle (attitude). Therefore, the UAV body 210 and the imaging device 240 can maintain different angles (attitudes).
 FIG. 6 is a diagram showing an example in which the UAV body 210 is tilted toward the front in the traveling direction relative to the attitude shown in FIG. 5, while the imaging device 240 keeps the same attitude as in FIG. 5. In the relationship shown in FIG. 6, a part 610 of a rotor blade 220 is included in the angle of view 600 of the imaging device 240; that is, the part 610 of the rotor blade 220 is reflected in an image captured by the imaging device 240. The relationship shown in FIG. 6 is defined by the attitude of the UAV body 210, the positional relationship between the rotor blades 220 and the UAV body 210, the attitude of the imaging device 240, and the positional relationship between the imaging device 240 and the UAV body 210. Although FIG. 6 illustrates an example in which a part 610 of a rotor blade 220 is included in the image, the present invention is not limited to this; a part of the UAV body 210 (for example, its frame) may be included in the image captured by the imaging device 240. Thus, the object in this embodiment is a part of the UAV body 210. The part of the UAV body 210 may include the frame of the UAV body 210 or a rotor blade 220, and other mechanisms (not shown) provided in the UAV 101 may also be included in the part of the UAV body 210.
 In general, when the UAV 101 having a plurality of rotor blades 220 moves horizontally, the UAV body 210 tilts toward the front in the traveling direction as shown in FIG. 6. This tilt is produced by making the rotational force of the rotor blades 220 on the rear side in the traveling direction relatively larger than that of the rotor blades 220 on the front side. This operation gives the UAV 101 a force that moves it in the traveling direction.
 Even when the UAV body 210 tilts, the imaging device 240 continues to keep a constant attitude by means of the gimbal 230. Therefore, the relative attitude between the UAV body 210 and the imaging device 240 can change dynamically, which means that whether the object is present in the image can also change dynamically. The faster the UAV 101 moves, the larger the tilt of the UAV body 210 becomes, and the larger the tilt, the higher the possibility that the object is present in the image.
 If the object is present in the image while the UAV 101 is moving at high speed, the field of view obtained from the image becomes substantially narrower than in the normal case where the UAV is not moving at high speed. If the field of view narrows during high-speed movement, an operator who operates the UAV 101 while watching the image on the display unit 420 of the remote controller terminal 400 may have difficulty operating the UAV 101.
 The object may also be present in the image in cases other than high-speed movement, for example when a strong wind occurs. As a result of being buffeted by a strong wind, the UAV body 210 may tilt greatly. Even in this case, the gimbal 230 keeps the imaging device 240 in a constant attitude, so the object may be present in the image; that is, a part of a rotor blade 220, a part of the frame of the UAV body 210, or a part of another mechanism may be reflected in the image. When the UAV 101 is buffeted by wind, the operator watching the image on the display unit 420 of the remote controller terminal 400 checks the image and quickly performs various operations. If the object is present in the image at that moment, checking the current situation of the UAV 101 may be delayed.
 A typical way to prevent the object from appearing in images is to design the UAV so that the object never appears in captured images regardless of the mutual attitudes of the UAV body 210 and the imaging device 240, for example by physically separating the imaging device 240 from the rotor blades 220 and the UAV body 210. However, this approach constrains the airframe design of the UAV 101, and such a design may not be possible depending on the airframe specifications.
 A technique that does not constrain the airframe design is to remove the object from the image by image processing. The UAV 101 analyzes each image to determine whether the object is present, and if it is, removes the object from the image by image processing. However, performing these processes on every image obtained by the imaging device 240 increases the processing load on the UAV 101.
 In this embodiment, the processing for determining whether the object is present is performed on images captured when a predetermined condition is satisfied, and the object is removed from images in which it is determined to be present. In this way, image processing is not performed on every image obtained by the imaging device 240, so the processing load can be reduced compared with performing image processing on every image.
 FIG. 7 is a block diagram showing an example of the configuration of the UAV control unit 310 according to this embodiment. The UAV control unit 310 includes a posture acquisition unit 710, a wind speed acquisition unit 720, an operation instruction acquisition unit 730, a current value acquisition unit 740, an image acquisition unit 750, a determination unit 760, an image processing unit 770, and an output unit 780. The image processing unit 770 includes a candidate area extraction unit 771 and an interpolation unit 772. FIG. 7 is merely an example, and other components may be included.
 The posture acquisition unit 710 acquires attitude information of the UAV body 210, for example information indicating the rotation angles around the yaw axis, the pitch axis, and the roll axis. The UAV 101 may include an inertial measurement unit (IMU), which may include three gyro sensors and three acceleration sensors for three orthogonal axes. The UAV 101 may also include a three-axis geomagnetic sensor. The posture acquisition unit 710 may acquire the attitude information of the UAV body 210 using the acceleration sensors, the gyro sensors, the three-axis geomagnetic sensor, and the like. The acquired attitude information is sent to the determination unit 760 and the candidate area extraction unit 771.
 The posture acquisition unit 710 may also acquire attitude information of the imaging device 240, that is, the attitude information of the imaging device 240 as controlled by the gimbal 230. The posture acquisition unit 710 may acquire this information based on the control information of the gimbal 230. Alternatively, the imaging device 240 may be provided with an acceleration sensor, a gyro sensor, and a three-axis geomagnetic sensor, and the posture acquisition unit 710 may acquire the attitude information of the imaging device 240 using the information from these sensors. The attitude of the imaging device 240 may be changeable to an arbitrary angle under control from the remote controller terminal 400.
 The wind speed acquisition unit (first acquisition unit) 720 acquires the wind speed and sends it to the determination unit 760. The UAV 101 may include an anemometer, and the wind speed acquisition unit 720 may acquire the value obtained by the anemometer.
 The wind speed acquisition unit 720 may instead calculate the wind speed based on the acceleration and angular velocity obtained by the IMU. For example, suppose that, under the control of the UAV control unit 310, the UAV 101 should, by calculation, move to a position A, but the calculation based on the values obtained from the IMU shows that the UAV 101 has actually moved to a position A'. The wind speed acquisition unit 720 may calculate the wind speed based on the relationship between the position A' and the position A.
 The wind speed acquisition unit 720 may acquire information indicating the wind speed along the route from the outside through the communication interface 330.
 The wind speed acquisition unit 720 may also acquire the wind speed based on the difference between the ground speed and the airspeed of the UAV 101. For example, the ground speed may be acquired using GNSS or the like, and the airspeed may be acquired using an airspeed sensor; the wind speed acquisition unit 720 may then derive the wind speed from the difference between them.
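
A minimal sketch of the ground-speed/airspeed approach, assuming both velocities are available as two-dimensional vectors in m/s, is shown below.

```python
import numpy as np

def wind_speed(ground_velocity, air_velocity):
    """Wind speed as the magnitude of the difference between the
    velocity over the ground (e.g. from GNSS) and the velocity through
    the air (from an airspeed sensor)."""
    wind_vector = np.asarray(ground_velocity) - np.asarray(air_velocity)
    return float(np.linalg.norm(wind_vector))

print(wind_speed([6.0, 0.0], [2.0, 1.0]))  # about 4.1 m/s
```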
 The operation instruction acquisition unit (second acquisition unit) 730 acquires, via the communication interface 330, the operation instructions accepted by the operation unit 410 of the remote controller terminal 400. An operation instruction may be a value (second value) indicating the tilt of a controller stick of the operation unit 410; a larger stick tilt may be an instruction to increase the speed of the UAV 101. The UAV control unit 310 may control the flight of the UAV 101 based on the acquired operation instructions.
 The current value acquisition unit (third acquisition unit) 740 acquires the current values supplied to the drive motors of the rotor blade mechanism 340, one for each of the plurality of rotor blades 220.
 The image acquisition unit 750 acquires images captured by the imaging device 240 and outputs them to the image processing unit 770.
 The determination unit 760 determines whether the object is present in the image input to the image processing unit 770. The determination unit 760 may make the determination based on at least one of the attitude information acquired by the posture acquisition unit 710, the wind speed acquired by the wind speed acquisition unit 720, the operation instruction acquired by the operation instruction acquisition unit 730, and the current values acquired by the current value acquisition unit 740. The determination result is output to the image processing unit 770. Details will be described later.
 The image processing unit 770 performs various kinds of image processing on the input image. The image processing unit 770 includes the candidate area extraction unit 771 and the interpolation unit 772, and may execute other processing in addition to the candidate area extraction by the candidate area extraction unit 771 and the interpolation by the interpolation unit 772. The image processing unit 770 outputs the processed image to the output unit 780.
 The candidate area extraction unit 771 extracts candidate areas from the input image. A candidate area is an area in which the object may be present. In this embodiment, the structure of the UAV 101 is assumed to be known, and the candidate area extraction unit 771 may extract candidate areas using this known information. The known information may include, for example, the shape and size of the UAV body 210 of the UAV 101, the shape and size of the rotor blades 220, the positional relationship between the rotor blades 220 and the UAV body 210, the positional relationship between the imaging device 240 and the UAV body 210, and the angle of view of the imaging device 240 (the horizontal and vertical angles of view with respect to the optical axis). Using this known information, the candidate area extraction unit 771 may extract candidate areas within the image, and it may additionally use the attitude information acquired by the posture acquisition unit 710. Details will be described later.
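
As one simplified way to picture the extraction, the sketch below maps the amount by which the tilt exceeds the predetermined angle to a band of rows at the top of the image. The linear pixels-per-degree mapping is a small-angle approximation introduced for illustration, not the method defined in the embodiment.

```python
def candidate_rows(tilt_deg, threshold_deg, half_vfov_deg, image_height):
    """Return the top band of rows (0 .. n) that may contain the object.

    tilt_deg: relative tilt between the UAV body and the optical axis
    threshold_deg: the predetermined angle at which the object appears
    half_vfov_deg: vertical half angle of view of the camera
    image_height: image height in pixels
    """
    overshoot = max(0.0, tilt_deg - threshold_deg)  # degrees past the threshold
    # Small-angle approximation: degrees map linearly to pixel rows.
    rows_per_degree = (image_height / 2) / half_vfov_deg
    n = min(image_height, int(overshoot * rows_per_degree))
    return (0, n)

print(candidate_rows(25.0, 18.0, 30.0, 1080))  # -> (0, 126)
```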
 The interpolation unit 772 performs interpolation processing on the candidate areas extracted by the candidate area extraction unit 771. The interpolation unit 772 performs processing for detecting the object in a candidate area and processing for removing the detected object. A known technique called inpainting may be applied as the processing for removing the object.
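
For the removal step, a hedged sketch using OpenCV's inpainting function, which fills the masked pixels from the surrounding area, could look like this. This is one publicly known inpainting implementation, not necessarily the one used in the embodiment.

```python
import cv2

def remove_object(image_bgr, object_mask):
    """Fill in the detected object using surrounding pixels.

    image_bgr: H x W x 3 uint8 image
    object_mask: H x W uint8 mask, 255 where the detected object is
    """
    return cv2.inpaint(image_bgr, object_mask, inpaintRadius=3,
                       flags=cv2.INPAINT_TELEA)
```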
 The output unit 780 outputs the image produced by the image processing unit 770. The output unit 780 may output the image to the memory 320, transmit it to the remote controller terminal 400 through the communication interface 330, or both. The output unit 780 may also transmit the image to another device (for example, a server on the cloud) through the communication interface 330.
 The following describes in detail the processing performed by the determination unit 760, the candidate area extraction unit 771, and the interpolation unit 772.
 FIG. 8 is a flowchart showing an example of the processing according to this embodiment. The processing shown in the figure is executed by the UAV control unit 310 at the timing when the image acquisition unit 750 acquires an image. In the case of a moving image, it may be executed at the acquisition timing of each frame (still image) constituting the moving image. The processing shown in FIG. 8 is an example of the control method executed by the UAV control unit 310.
 In step S810, the determination unit 760 can determine whether a first parameter indicating the relationship between the attitude of the imaging device 240 and the attitude of the UAV body 210 exceeds a predetermined threshold. The determination unit 760 may determine whether the angle formed by the inclination of the UAV body 210 and the inclination of the imaging device 240, indicated by the attitude information acquired by the posture acquisition unit 710, is larger than a predetermined angle.
 An example will be described in which the determination unit 760 determines whether the angle formed by the inclination of the UAV body 210 and the inclination of the imaging device 240, indicated by the attitude information acquired by the posture acquisition unit 710, is larger than the predetermined angle. When this angle is larger than the predetermined angle, the object may be present in the image, and the UAV control unit 310 advances the process to step S820. The angle can be said to indicate the relative relationship between the attitude of the imaging device 240 and the attitude of the UAV body 210. Assuming a certain reference line, the determination unit 760 may determine whether the difference between the inclination of the optical axis direction of the imaging device 240 with respect to the reference line and the inclination of the UAV body 210 with respect to the reference line is larger than the predetermined angle. For example, if the reference line is the horizontal, the determination unit 760 may determine whether the difference between the inclination (elevation/depression angle) of the imaging device 240 with respect to the horizontal and the inclination (elevation/depression angle) of the UAV body 210 with respect to the horizontal is larger than the predetermined angle. For the inclinations, information indicating the pitch-axis rotation angles obtained from the gyro sensors of the UAV body 210 and the imaging device 240 may be used.
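
Expressed as code, the step S810 decision reduces to a single angle comparison; the sketch below assumes pitch angles measured against the horizontal (e.g. from the gyro sensors), and the example numbers are illustrative.

```python
def step_s810(camera_pitch_deg, body_pitch_deg, predetermined_angle_deg):
    """Return True when the relative tilt between the imaging device and
    the UAV body suggests the object may be in the image."""
    relative_tilt = abs(body_pitch_deg - camera_pitch_deg)
    return relative_tilt > predetermined_angle_deg

# Example: camera held level by the gimbal, body pitched 25 degrees
# forward, predetermined angle of 18 degrees -> proceed to step S820.
print(step_s810(0.0, -25.0, 18.0))  # True
```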
 FIG. 9 is a diagram showing the traveling direction 901 of the UAV body 210 and the optical axis direction 902 of the imaging device 240. As described above, the structure of the UAV 101 is known. The known information may include the shape and size of the UAV body 210, the shape and size of the rotor blade 220a, and the positional relationship between the rotor blade 220a and the UAV body 210. The example of FIG. 9 uses information about the rotor blade 220a on the traveling direction side as the known information, but information about the rotor blade 220b on the opposite side may also be included. The known information may further include the positional relationship between the imaging device 240 and the UAV body 210 and the angle of view of the imaging device 240 (the horizontal and vertical angles of view with respect to the optical axis). Using this known information, the angle formed by the inclination of the UAV body 210 and the inclination of the imaging device 240 at which the object appears in the image can be identified. For example, when the angle θ formed by the orientation 951 of the UAV body 210 with respect to the optical axis direction 952 of the imaging device 240 is larger than the predetermined angle, it may be determined that the object is present in the image. The predetermined angle used as the threshold may be set in advance from the known information described above.
 Examples will now be described in which the determination result differs depending on the known information, using the example shown in FIG. 9 for comparison.

 When the shape of the UAV body 210 extends further in the optical axis direction of the imaging device 240 than in the example of FIG. 9, it may be determined that an object is present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is smaller than in that example.

 When the UAV body 210 is larger than in the example of FIG. 9, it may be determined that an object is present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is smaller than in that example.

 When the shape of the rotor blade 220a extends further in the optical axis direction of the imaging device 240 than in the example of FIG. 9, it may be determined that an object is present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is smaller than in that example.

 When the rotor blade 220a is larger than in the example of FIG. 9, it may be determined that an object may be present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is smaller than in that example.

 When the traveling-direction-side rotor blade 220a is mounted at a position closer to the side opposite to the traveling direction of the UAV body 210 than in the example of FIG. 9, it may be determined that no object is present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in that example.

 When the angle of view of the imaging device 240 is smaller than in the example of FIG. 9, it may be determined that no object is present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in that example.

 When the imaging device 240 is mounted vertically lower on the UAV body 210 than in the example of FIG. 9, it may be determined that no object is present in the image even when the tilt of the UAV body 210 with respect to the optical axis direction of the imaging device 240 is larger than in that example.

 Thus, the threshold corresponding to the attitude information is set based on the known information. When the angle formed by the tilt of the UAV body 210 and the tilt of the imaging device 240 exceeds the threshold (the predetermined angle), an object may be present in the image, so the UAV control unit 310 advances the process to step S820.
 In step S810, the determination unit 760 may determine whether a second parameter (the first value) indicating the tilt of the airframe exceeds a predetermined threshold. The parameter may be at least one of the wind speed acquired by the wind speed acquisition unit 720, the operation instruction acquired by the operation instruction acquisition unit 730, and the current value acquired by the current value acquisition unit 740. A corresponding threshold may be set in advance for each parameter. The determination unit 760 may determine whether the wind speed acquired by the wind speed acquisition unit 720 exceeds a predetermined wind speed, whether the value indicated by the operation instruction acquired by the operation instruction acquisition unit 730 exceeds a predetermined input value, and whether the current value acquired by the current value acquisition unit 740 exceeds a predetermined current value.

 When a parameter indicating the tilt of the airframe exceeds its corresponding threshold, the attitude of the UAV body 210 is likely tilted. In other words, when information acquired by any of the acquisition units exceeds the corresponding threshold, the attitude of the UAV body 210 is likely tilted, so objects such as the rotor blades 220 and the UAV body 210 may appear in the captured image. The determination unit 760 may advance the process to step S820 when any one of the parameters exceeds its corresponding threshold, or when two or more kinds of parameters exceed their corresponding thresholds.

 An example will be described in which the determination unit 760 determines whether the wind speed acquired by the wind speed acquisition unit 720 exceeds a predetermined wind speed. Suppose that 15 m/s is set as the wind-speed threshold. When the wind speed acquired by the wind speed acquisition unit 720 exceeds 15 m/s, the determination unit 760 determines that a value exceeding the predetermined threshold has been detected. A measured wind speed above the predetermined threshold means that the UAV body 210 is likely tilted. An object may therefore be present in the image, so the UAV control unit 310 advances the process to step S820.

 An example will be described in which the determination unit 760 determines whether the value indicated by the operation instruction acquired by the operation instruction acquisition unit 730 exceeds a predetermined input value. The input value may be the value indicated by an operation instruction from the remote controller terminal 400. When the value indicated by an operation instruction from the remote controller terminal 400 exceeds a predetermined threshold, the determination unit 760 determines that a value exceeding the predetermined threshold has been detected. For example, when the operation unit 410 of the remote controller terminal 400 is tilted beyond a specified position, the input value is determined to exceed the predetermined threshold. When the operation unit 410 is tilted beyond the specified position, the UAV 101 is likely to fly at high speed, and high-speed flight is likely to tilt the UAV body 210. An object may therefore be present in the image, so the UAV control unit 310 advances the process to step S820.

 An example will be described in which the determination unit 760 determines whether the current value acquired by the current value acquisition unit 740 exceeds a predetermined current value. The predetermined current value may relate to the current flowing through a motor of the rotor blade mechanism 340. For example, the current supplied to the two motors on the side opposite to the traveling direction of the UAV 101 may exceed a predetermined threshold; in this case, the UAV 101 is likely flying at high speed. Likewise, when the current values of the two motors on the traveling-direction side of the UAV 101 are smaller than the currents supplied to the two motors on the opposite side, the UAV 101 is likely flying tilted. The predetermined current value may be defined for the traveling-direction-side motors, for the motors on the opposite side, or for a combination of the two. For example, when the current supplied to the motors on the side opposite to the traveling direction of the UAV 101 is relatively larger than the current supplied to the traveling-direction side, the predetermined current value may be regarded as exceeding the threshold. In any of these cases, the UAV 101 is likely flying at high speed, and high-speed flight is likely to tilt the UAV body 210. An object may therefore be present in the image, so the UAV control unit 310 advances the process to step S820.
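 For illustration only, the parameter checks above might be coded as in the following Python sketch; the dataclass, the thresholds other than the 15 m/s wind speed taken from the text, and the any-one-parameter policy are all assumptions.

```python
# Hedged sketch of the non-attitude checks in step S810. Exceeding any
# single threshold suggests the airframe is tilted, so an object may
# appear in the image and the process should move to step S820.
from dataclasses import dataclass

@dataclass
class FlightState:
    wind_speed_mps: float          # from an anemometer
    stick_input: float             # remote-controller deflection, 0.0-1.0
    front_motor_current_a: float   # motors on the traveling-direction side
    rear_motor_current_a: float    # motors on the opposite side

WIND_LIMIT_MPS = 15.0   # example threshold taken from the text
STICK_LIMIT = 0.8       # hypothetical "specified position"
CURRENT_LIMIT_A = 12.0  # hypothetical motor-current threshold

def body_probably_tilted(s: FlightState) -> bool:
    return (s.wind_speed_mps > WIND_LIMIT_MPS
            or s.stick_input > STICK_LIMIT
            or s.rear_motor_current_a > CURRENT_LIMIT_A
            or s.rear_motor_current_a > s.front_motor_current_a)
```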
 At the time of step S810, it is not necessary to search the image to confirm whether an object is actually present. At this point, the process only determines whether an object may be present in the image. If an object may actually be present, the process proceeds to a later step in which the image processing unit 770 searches the image and detects the object.

 Performing object detection and removal on every image acquired by the image acquisition unit 750 would place a heavy load on the image processing unit 770. In the present embodiment, the determination process of step S810 is therefore performed. This determination makes it possible to decide, without any image processing, whether an object may be present in an image. Step S810 can also be regarded as a process of determining whether the UAV body 210 is tilted or is likely to tilt.
 In step S810, the determination unit 760 preferably performs the determination using the attitude information, wind speed, operation instruction, and current value at the time the image acquisition unit 750 acquired the image, but this is not a limitation. Common attitude information, wind speed, operation instructions, and current values may be used for a predetermined period. For example, for a 30 fps (frames per second) movie with the predetermined period set to 1 second, the attitude information, wind speed, operation instruction, and current value at the time of the first of 30 frames may be applied to the remaining 29 frames.
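 Assuming the 1-second interval and 30 fps from the example above, reusing one sensor sample across a run of frames might look like the sketch below; the function name and the mapping are assumptions made purely for illustration.

```python
# Hedged sketch: for a 30 fps movie judged once per second, the sensor
# readings (attitude, wind speed, operation instruction, current value)
# of the first frame in each 1-second run are reused for the other 29.
FPS = 30
SAMPLE_PERIOD_FRAMES = FPS  # one sample per second

def reference_frame(frame_idx: int) -> int:
    """Index of the frame whose sensor readings are applied to frame_idx."""
    return (frame_idx // SAMPLE_PERIOD_FRAMES) * SAMPLE_PERIOD_FRAMES
```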
 In step S810, when the determination unit 760 detects a parameter exceeding its threshold, the process proceeds to step S820. When no parameter exceeding a threshold is detected in step S810, it is considered that no object is present in the image. The image processing described below is therefore not performed, and the processing load is reduced.

 The predetermined thresholds used in the determination of step S810 may be set to arbitrary values according to instructions from the remote controller terminal 400, and may be changed by instructions from the remote controller terminal 400 during flight.
 In step S820, the candidate region extraction unit 771 extracts candidate regions. As explained above, the structure of the UAV 101 is known. Using this known information, it is possible to specify in which regions of an image captured by the imaging device 240 an object may appear. In other words, the candidate region extraction unit 771 can extract candidate regions using the known information.

 The candidate region extraction unit 771 may extract candidate regions using the attitude information acquired by the attitude acquisition unit 710 in addition to the known information; candidate regions may be extracted from the tilt of the UAV body 210 and the tilt of the imaging device 240. In FIG. 9 described above, using the tilt (attitude) of the UAV body 210, the tilt (attitude) of the imaging device 240, and the known information, a candidate region is extracted at the top of the image captured by the imaging device 240.
 FIG. 10 shows an image 1000 captured by the imaging device 240. The image 1000 contains candidate regions 1010 and 1020, which are regions where objects may be present. At the time of step S820, it is not necessary to confirm whether an object is actually present in the image 1000; the candidate regions 1010 and 1020 are merely regions in which objects can appear. For example, the rotor blade 220 rotates about its rotation axis. Even when the UAV body 210 is tilted enough for part of the rotor blade 220 to appear in the image, depending on the imaging timing, some captured images will include the rotor blade 220 and others will not. The candidate regions 1010 and 1020 may be defined as regions that include the circular range the rotor blade 220 can sweep in the image about its rotation axis.
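 As one illustrative simplification (not the embodiment's method), a candidate band at the top of the image could be sized from the body-versus-camera tilt difference and the vertical angle of view, as in the sketch below; the linear mapping and all names are assumptions.

```python
# Hedged sketch of step-S820 candidate-region extraction: estimate how
# many pixel rows at the top edge of the image the airframe or rotors
# could occupy, from the tilt difference and the vertical field of view.
def candidate_band_rows(img_h: int, tilt_diff_deg: float,
                        threshold_deg: float, vfov_deg: float) -> int:
    """Rows from the top edge that may contain an object (0 = none)."""
    overshoot_deg = max(0.0, tilt_diff_deg - threshold_deg)
    return min(img_h, int(img_h * overshoot_deg / vfov_deg))
```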
 In step S830, the candidate region extraction unit 771 determines whether any candidate region has been extracted. If no candidate region is extracted, the subsequent processing is unnecessary and the process ends; there is then no need to perform image processing such as object detection and removal. If a candidate region is extracted, the process proceeds to step S840.
 In step S840, the interpolation unit 772 analyzes the candidate regions extracted in step S820 to detect objects, and determines whether an object is present in a candidate region. FIG. 11 shows an example in which the interpolation unit 772 has detected objects 1110 and 1120. As described above, examples of objects include part of the UAV body 210 and part of a rotor blade 220. Appearance textures, such as the shapes and colors of the UAV body 210 and the rotor blades 220, are known. The interpolation unit 772 therefore only needs to determine whether these known objects are detected within the candidate regions. The interpolation unit 772 may compare object information stored in advance in the memory 320 with object information from the image captured by the imaging device 240.

 For example, feature information of the objects, such as color histograms and spatial frequencies, is stored in advance in the memory 320. The interpolation unit 772 converts the candidate regions of the image captured by the imaging device 240 into feature information such as color histograms and spatial frequencies, and may detect the objects 1110 and 1120 in the candidate regions 1010 and 1020 by comparing the converted feature information with the stored feature information.
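 A sketch of such a feature comparison using OpenCV color histograms follows; the bin counts and the correlation threshold are assumptions, and the reference histogram is presumed to have been computed the same way from stored imagery of the known object.

```python
# Hedged sketch of the step-S840 feature comparison: convert a candidate
# region to a color histogram and compare it with a histogram of the
# known object (rotor/airframe) stored in memory in advance.
import cv2
import numpy as np

def matches_known_object(candidate_bgr: np.ndarray,
                         reference_hist: np.ndarray,
                         min_correlation: float = 0.8) -> bool:
    hist = cv2.calcHist([candidate_bgr], [0, 1, 2], None,
                        [8, 8, 8], [0, 256, 0, 256, 0, 256])
    cv2.normalize(hist, hist)
    corr = cv2.compareHist(hist, reference_hist, cv2.HISTCMP_CORREL)
    return corr >= min_correlation
```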
 When an object is detected in a candidate region, the process proceeds to step S850. When no object is detected in the candidate regions, the process ends.
 In step S850, the interpolation unit 772 interpolates the detected objects. For example, the interpolation unit 772 treats the pixels of the detected objects 1110 and 1120 as a missing region and performs interpolation processing on that region. One such interpolation technique, called inpainting, interpolates the pixels of a missing region using the pixels surrounding that region within a single image.

 FIG. 12 illustrates the interpolation processing for a missing region. The interpolation unit 772 interpolates the pixel values of the objects 1110 and 1120 using the pixels of the regions 1210 and 1220 surrounding (neighboring) the objects 1110 and 1120 detected in the image 1000. The surrounding regions 1210 and 1220 may be defined, for example, as rectangles at least n pixels away from the missing region. For this inpainting, known techniques such as those described in Non-Patent Documents 1 and 2 may be applied. By this processing, the objects 1110 and 1120 are removed from the image 1000 and, through interpolation using the pixels of their neighboring regions, converted into regions that look natural.
 Specifically, the above-mentioned neighborhood may be determined from the size and position of the object to be interpolated. For example, when interpolating along one direction of the object, letting LX (pixels) be the object's length in that direction, the interpolation range can be determined as LX × α + β. Here, α may be a parameter determined from the whole acquired image and from the size and position of the object to be interpolated, and β is in pixels and may be a parameter used to compensate, for example, when LX is small.
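 For illustration, one possible realization uses OpenCV's inpainting, with the LX × α + β rule setting the interpolation range; mapping that range to the inpaint radius, and the α and β values, are assumptions rather than the embodiment's specification.

```python
# Hedged sketch of step S850: treat the detected object's pixels as a
# missing region and inpaint them from the surrounding pixels. The
# interpolation range follows the LX * alpha + beta rule from the text.
import cv2
import numpy as np

def remove_object(image_bgr: np.ndarray, object_mask: np.ndarray,
                  alpha: float = 0.1, beta: int = 3) -> np.ndarray:
    """object_mask: uint8 image, 255 on the detected object's pixels."""
    lx = cv2.boundingRect(object_mask)[2]    # object length in pixels (LX)
    radius = max(1, int(lx * alpha + beta))  # interpolation range
    return cv2.inpaint(image_bgr, object_mask, radius, cv2.INPAINT_TELEA)
```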
 For the interpolation of the missing region, it is sufficient that the objects 1110 and 1120 are removed from the image 1000 to the extent that they do not look unnatural. Depending on the interpolation technique, or on the pixels surrounding the objects 1110 and 1120, faint outlines of the objects may remain after the interpolation is executed. Even so, the interpolated image looks less unnatural than the image before interpolation. The objects 1110 and 1120 therefore need not be completely removed from the image 1000; parts of them may remain in the interpolated image.
 In the description so far, the interpolation unit 772 interpolates an object contained in one image using the pixels surrounding that object in the same image. The interpolation unit 772 may instead interpolate the object using temporally preceding frames, succeeding frames, or both. For example, when the object is the rotor blade 220, part of the rotor blade 220 may appear in some frames and not in others. Since some of the preceding and succeeding frames may not contain the object, the interpolation may be performed using such frames.
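 The sketch below illustrates such temporal interpolation in its simplest form; it ignores motion compensation between frames, which is a significant simplifying assumption, and all names are illustrative.

```python
# Hedged sketch: fill the object's pixels from the previous or next
# frame wherever that frame does not contain the object at the same
# position. Boolean masks mark object pixels in each frame.
import numpy as np

def fill_from_neighbor_frames(cur: np.ndarray, cur_mask: np.ndarray,
                              prev: np.ndarray, prev_mask: np.ndarray,
                              nxt: np.ndarray, next_mask: np.ndarray) -> np.ndarray:
    out = cur.copy()
    use_prev = cur_mask & ~prev_mask              # object absent in prev
    use_next = cur_mask & ~next_mask & ~use_prev  # else try next frame
    out[use_prev] = prev[use_prev]
    out[use_next] = nxt[use_next]
    return out
```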
 The interpolation unit 772 may combine the process of interpolating an object in one image using the surrounding pixels in that image with the process of interpolating the object using temporally preceding frames, succeeding frames, or both.
 The embodiment described above mainly takes a UAV having rotor blades as an example, but this is not a limitation. The processing described in the embodiment is applicable to any moving body in which the attitude of the body and the attitude of the imaging device can differ.

 In the embodiment described above, the imaging device is attached to the lower part of the airframe, but this is not a limitation. The imaging device may be attached to a side of the airframe, to its upper part, or at any other position.

 In the embodiment described above, the image processing unit 770 switches whether to perform the processing of the candidate region extraction unit 771 and the interpolation unit 772 according to the determination result of the determination unit 760, but this is not a limitation, and other configurations are possible. For example, the determination result of the determination unit 760 may be sent to the image acquisition unit 750 instead of the image processing unit 770. Based on the determination result of the determination unit 760, the image acquisition unit 750 may switch between outputting images to the image processing unit 770 and outputting them to the output unit 780 without passing through the image processing unit 770.

 The image processing unit 770 may output both the image before interpolation by the interpolation unit 772 (called the pre-interpolation image) and the interpolated image (called the post-interpolation image) to the output unit 780, and the output unit 780 may output both images to the memory 320. With this configuration, the user can compare the original image with the image from which the objects have been removed.

 An example has been described in which the optical axis direction of the imaging device 240 is fixed in a constant direction and kept in a constant attitude by the gimbal 230, but this is not a limitation. The optical axis direction of the imaging device 240 may be changed by an instruction from the remote controller terminal 400, in which case the gimbal 230 continues to control the attitude of the imaging device 240 so as to maintain the changed optical axis direction. The determination unit 760 may perform the determination using the attitude information of the imaging device 240 after the change.
 The embodiment described above determines whether an object may be present in the image using the tilt of the UAV body 210. When the UAV body 210 tilts, the rotor blades 220 tilt together with it. When a rotor blade 220 is equipped with a gyro sensor, whether an object is present in the image may be determined using the attitude of the rotor blade 220.

 In the processing of FIG. 8, when the value of a given parameter exceeds the threshold corresponding to that parameter in step S810, the process proceeds to step S820 to extract candidate regions, but this is not a limitation. Step S810 may be omitted and the processing may start from step S820. For example, whether a candidate region can be extracted from the image may be determined using the known information, the attitude of the UAV body 210, and the attitude of the imaging device 240.

 In the processing of FIG. 8, steps S820 and S830 may be omitted. When the value of a given parameter exceeds the threshold corresponding to that parameter in step S810, an object may be present somewhere in the image. Performing steps S820 and S830 identifies the region to be removed and thereby reduces the processing load. However, the determination of step S810 alone already narrows down the images subjected to image processing, so the processing load is reduced even without steps S820 and S830, because image processing need not be performed on every image.

 In the embodiment described above, the UAV 101 performs the image processing, but this is not a limitation. The image processing may be performed in the remote controller terminal 400, which may include an image processing unit. The data acquired by the attitude acquisition unit 710, the wind speed acquisition unit 720, the operation instruction acquisition unit 730, the current value acquisition unit 740, and the image acquisition unit 750 of FIG. 7 may be transmitted to the remote controller terminal 400 via the output unit 780, and the remote controller terminal 400 may perform the processing shown in FIG. 8 using the transmitted data.

 Each unit for realizing the functions of the embodiment described above can be implemented by, for example, hardware or software. When implemented by software, program code (a control program) that controls the hardware may be executed by various processors of a computer, such as a CPU or an MPU. Hardware such as circuits for realizing the functions of the program code may be provided. Part of the program code may be realized in hardware while the remaining part is executed by the various processors.
 101 UAV
 210 UAV body
 220 Rotor blade
 230 Gimbal
 240 Imaging device
 250 Camera
 310 UAV control unit
 320 Memory
 330 Communication interface
 340 Rotor blade mechanism
 400 Remote controller terminal
 410 Operation unit
 420 Display unit
 430 Support unit
 710 Attitude acquisition unit
 720 Wind speed acquisition unit
 730 Operation instruction acquisition unit
 740 Current value acquisition unit
 750 Image acquisition unit
 760 Determination unit
 770 Image processing unit
 771 Candidate region extraction unit
 772 Interpolation unit
 780 Output unit

Claims (31)

  1.  A moving body comprising:
     an airframe;
     an imaging device;
     a determination unit that determines, based on a relationship between an attitude of the imaging device and an attitude of the airframe, whether an object is present in an image captured by the imaging device; and
     a processing unit that performs processing to remove the object when it is determined that the object is present.
  2.  The moving body according to claim 1, wherein the determination unit performs the determination based on an angle formed by a tilt of the imaging device and a tilt of the airframe.
  3.  The moving body according to claim 2, wherein the determination unit determines that the object is present when the angle is greater than a predetermined angle.
  4.  The moving body according to claim 3, wherein the predetermined angle is set based on information including a shape of the airframe, a size of the airframe, a positional relationship between the imaging device and the airframe, and an angle of view of the imaging device.
  5.  The moving body according to claim 3, further comprising a rotor blade,
     wherein the predetermined angle is set based on information including a shape of the rotor blade, a size of the rotor blade, a positional relationship between the rotor blade and the airframe, a positional relationship between the imaging device and the airframe, and an angle of view of the imaging device.
  6.  The moving body according to any one of claims 1 to 5, further comprising an acquisition unit that acquires attitude information of the airframe and attitude information of the imaging device,
     wherein the determination unit performs the determination using the attitude information of the airframe and the attitude information of the imaging device.
  7.  The moving body according to claim 6, wherein the acquisition unit acquires attitude information measured by a gyro sensor.
  8.  A moving body comprising:
     an imaging device;
     a determination unit that determines, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
     a processing unit that performs processing to remove the object when it is determined that the object is present.
  9.  The moving body according to claim 8, wherein the first value includes at least one of a wind speed, a second value operated at an operation terminal that operates the moving body, and a current value of a drive motor that drives a rotor blade of the moving body.
  10.  The moving body according to claim 9, further comprising a first acquisition unit that acquires the wind speed,
     wherein the determination unit determines that the object is present when the acquired wind speed exceeds a predetermined threshold.
  11.  The moving body according to claim 10, wherein the wind speed is acquired using an anemometer.
  12.  The moving body according to any one of claims 9 to 11, further comprising a second acquisition unit that acquires the second value operated at the operation terminal,
     wherein the determination unit determines that the object is present when the acquired second value exceeds a predetermined threshold.
  13.  The moving body according to any one of claims 9 to 12, further comprising:
     a plurality of rotor blades; and
     a third acquisition unit that acquires a current value of each of the plurality of rotor blades,
     wherein the determination unit determines that the object is present when the current value of the rotor blade on the side opposite to a traveling direction of the airframe is greater than the current value of the rotor blade on the traveling-direction side of the airframe.
  14.  The moving body according to any one of claims 1 to 13, further comprising a gimbal between the airframe and the imaging device.
  15.  The moving body according to claim 14, wherein the gimbal performs control to keep the attitude of the imaging device constant.
  16.  The moving body according to any one of claims 1 to 15, wherein the processing unit extracts a region of the image in which the object can be included.
  17.  The moving body according to claim 16, wherein the processing unit detects the object from the region.
  18.  The moving body according to claim 17, wherein the processing unit detects the object by converting the region into feature information and comparing it with feature information of the object stored in advance.
  19.  The moving body according to claim 18, wherein the feature information includes a histogram or a spatial frequency.
  20.  The moving body according to any one of claims 17 to 19, wherein the processing unit interpolates the object detected by the processing unit.
  21.  The moving body according to claim 20, wherein the processing unit interpolates pixels corresponding to the object detected by the processing unit using pixels of a region neighboring the object.
  22.  The moving body according to claim 20 or 21, wherein the processing unit interpolates pixels corresponding to the object detected by the processing unit using images temporally preceding and succeeding the image.
  23.  The moving body according to any one of claims 1 to 21, further comprising a storage unit,
     wherein the processing unit outputs, to the storage unit, at least one of an image before being processed by the processing unit and an image after being processed by the processing unit.
  24.  The moving body according to any one of claims 1 to 23, further comprising a communication unit,
     wherein the processing unit outputs an image processed by the processing unit to the outside via the communication unit.
  25.  The moving body according to any one of claims 1 to 24, wherein the object is a part of the airframe.
  26.  A method for controlling a moving body having an airframe and an imaging device, the method comprising:
     determining, based on a relationship between an attitude of the imaging device and an attitude of the airframe, whether an object is present in an image captured by the imaging device; and
     removing the object when it is determined that the object is present.
  27.  A method for controlling a moving body having an imaging device, the method comprising:
     determining, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
     removing the object when it is determined that the object is present.
  28.  A computer-readable storage medium storing a control program for a moving body having an airframe and an imaging device, the control program causing a computer to execute:
     determining, based on a relationship between an attitude of the imaging device and an attitude of the airframe, whether an object is present in an image captured by the imaging device; and
     removing the object when it is determined that the object is present.
  29.  A computer-readable storage medium storing a control program for a moving body having an imaging device, the control program causing a computer to execute:
     determining, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
     removing the object when it is determined that the object is present.
  30.  A control program for a moving body having an airframe and an imaging device, the control program causing a computer to execute:
     determining, based on a relationship between an attitude of the imaging device and an attitude of the airframe, whether an object is present in an image captured by the imaging device; and
     removing the object when it is determined that the object is present.
  31.  A control program for a moving body having an imaging device, the control program causing a computer to execute:
     determining, based on a first value indicating a tilt of an airframe, whether an object is present in an image captured by the imaging device; and
     removing the object when it is determined that the object is present.
PCT/JP2016/072307 2016-07-29 2016-07-29 Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program WO2018020656A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2016/072307 WO2018020656A1 (en) 2016-07-29 2016-07-29 Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program
JP2017519596A JP6436601B2 (en) 2016-07-29 2016-07-29 MOBILE BODY, MOBILE BODY CONTROL METHOD, STORAGE MEDIUM CONTAINING CONTROL PROGRAM, AND CONTROL PROGRAM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/072307 WO2018020656A1 (en) 2016-07-29 2016-07-29 Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program

Publications (1)

Publication Number Publication Date
WO2018020656A1 true WO2018020656A1 (en) 2018-02-01

Family

ID=61016641

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/072307 WO2018020656A1 (en) 2016-07-29 2016-07-29 Moving body, method for controlling moving body, storage medium having control program stored thereon, and control program

Country Status (2)

Country Link
JP (1) JP6436601B2 (en)
WO (1) WO2018020656A1 (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006281830A (en) * 2005-03-31 2006-10-19 Yamaha Motor Co Ltd View point display system for camera
JP2015090591A (en) * 2013-11-06 2015-05-11 株式会社パスコ Generation device and generation method for road surface ortho-image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FUMIO OKURA ET AL.: "Fly-through MR Heijo-kyo : Augmented Telepresence Using Recorded Aerial Omnidirectional Videos Captured from Unmanned Airship", HUMAN INTERFACE SOCIETY KENKYU HOKOKUSHU 2010, vol. 12, 14 October 2010 (2010-10-14), pages 31 - 36 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022137105A (en) * 2018-02-07 2022-09-21 株式会社アクセル Imaging apparatus and program
US20220274717A1 (en) * 2019-08-27 2022-09-01 Sony Group Corporation Mobile object, information processing apparatus, information processing method, and program
US11964775B2 (en) * 2019-08-27 2024-04-23 Sony Group Corporation Mobile object, information processing apparatus, information processing method, and program

Also Published As

Publication number Publication date
JP6436601B2 (en) 2018-12-12
JPWO2018020656A1 (en) 2018-07-26

Similar Documents

Publication Publication Date Title
CN109417596B (en) Multi-sensor image stabilization techniques
EP3086195B1 (en) System for piloting a drone in first-person view mode
WO2020024185A1 (en) Techniques for motion-based automatic image capture
JP2016119655A (en) Video system for piloting drone in immersive mode
US11798172B2 (en) Maximum temperature point tracking method, device and unmanned aerial vehicle
CN114476105A (en) Automated landing surface topography assessment and related systems and methods
WO2019155335A1 (en) Unmanned aerial vehicle including an omnidirectional depth sensing and obstacle avoidance aerial system and method of operating same
US20210289133A1 (en) Method and system of controlling video play speed, control terminal and mobile platform
CN108780324A (en) Unmanned plane, unmanned aerial vehicle (UAV) control method and apparatus
JP2017072986A (en) Autonomous flying device, control method and program of autonomous flying device
JP6436601B2 (en) MOBILE BODY, MOBILE BODY CONTROL METHOD, STORAGE MEDIUM CONTAINING CONTROL PROGRAM, AND CONTROL PROGRAM
WO2019183789A1 (en) Method and apparatus for controlling unmanned aerial vehicle, and unmanned aerial vehicle
CN109949381A (en) Image processing method, device, picture processing chip, camera assembly and aircraft
JP6630939B2 (en) Control device, imaging device, moving object, control method, and program
US20200027238A1 (en) Method for merging images and unmanned aerial vehicle
JP6949930B2 (en) Control device, moving body and control method
US11363195B2 (en) Control device, imaging device, imaging system, movable object, control method, and program
JP6681101B2 (en) Inspection system
WO2019064457A1 (en) Computer system, position estimation method, and program
US20210256732A1 (en) Image processing method and unmanned aerial vehicle
WO2021035746A1 (en) Image processing method and device, and movable platform
JP2021154857A (en) Operation support device, operation support method, and program
JP2021103410A (en) Mobile body and imaging system
JP7317684B2 (en) Mobile object, information processing device, and imaging system
JP2020095519A (en) Shape estimation device, shape estimation method, program, and recording medium

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017519596

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16910559

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 16/05/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16910559

Country of ref document: EP

Kind code of ref document: A1