WO2022162922A1 - Distance measuring device and distance measuring method - Google Patents
Distance measuring device and distance measuring method
- Publication number
- WO2022162922A1 (application PCT/JP2021/003432)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- unit
- measuring device
- distance measuring
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/28—Measuring arrangements characterised by the use of optical techniques for measuring areas
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
Definitions
- the present invention relates to a distance measuring device and a distance measuring method.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2001-304855
- the distance measuring device may be a distance measuring device that measures the distance to an object by projecting light.
- the distance measuring device may include a control section that controls the light projection state based on the detection result of the target.
- the distance measuring device may include a light projecting section that projects light controlled by the control section onto the object.
- the distance measuring device may include a processing unit that determines the distance to the object based on the reflected light detection result.
- the distance measuring device may include an imaging section that captures an image of the object.
- the control unit may detect the object based on the imaging result obtained by the imaging unit.
- the light projecting section may project light onto the object via the optical system.
- the imaging unit may capture an image of the object via an optical system or an imaging optical system different from the optical system.
- the light projecting section may include a light source section that emits light.
- the control unit may control the light projection state by controlling either one of the light source unit and the optical system.
- the light projection state may include any one of the light irradiation direction and the light intensity.
- the imaging unit may further detect reflected light from the object.
- the distance measuring device may include a detector that detects reflected light from the object.
- the distance measuring device may include an analysis section that analyzes the imaging result obtained by the imaging section. The analysis unit may analyze the image of the object in the imaging result to identify the object.
- the analysis unit may analyze the image of the object based on a machine learning model.
- the machine learning model may be constructed in advance by performing machine learning using an image of an object whose distance is to be measured as teacher data.
- the analysis unit may analyze the image of the object by an image processing method. Image processing methods may include at least edge detection methods.
- the control unit may control any one of the optical system, the imaging optical system different from the optical system, and the imaging unit so that the object analyzed by the analysis unit is imaged in the center of the imaging area of the imaging unit.
- the analysis unit may identify the object at the center of the image.
- the imaging unit may image the object multiple times at different timings.
- the analysis unit may identify the object from image differences between the plurality of images.
- the analysis unit may detect the resolution of the image.
- the controller may control the optical system based on the resolution detection result to enlarge or reduce the image of the object.
- the analysis unit may analyze the image of the object to identify the center or center of gravity of the object.
- the control unit may control any one of the light projecting unit and the optical system to project the light onto the center of the specified object.
- the analysis unit may analyze the image of the object to identify the object.
- the controller may control the light to scan the identified object with the light.
- the processing unit may determine the distance to the object based on the relationship between the position of the object scanned by the light and the detection result of the reflected light.
- the distance measuring device may further include a display section for displaying the image of the object obtained by the imaging section on the display screen.
- the display may display an object showing the location on the object where the light is projected, or the object for which the distance was determined, superimposed on the image of the object.
- the display unit may display the image of the object obtained by the imaging unit so that the place on the object onto which the light is projected is positioned at the center of the screen.
- the display screen may include a touch detection sensor that detects a touch operation by the user.
- the control unit may control the light to project the light to the at least one place when at least one place included in the image is selected by a touch operation on the display screen.
- the distance measuring device may further include a calculation unit that calculates the distances to a plurality of locations, or the distance and/or area between the plurality of locations, when the plurality of locations included in the image are selected by touch operations on the display screen.
- the optical system may include at least one corrective optical element among a lens element, a prism, and a mirror.
- the controller may drive at least one corrective optical element.
- the light projecting unit may project light onto the object after the imaging unit has captured an image of the object.
- the controller may control the optical system or the imaging optical system to correct the blurring of the distance measuring device.
- the distance measurement method may control the light projection state based on the detection result of the object.
- the distance measuring method may project controlled light onto the object in the controlling step.
- the distance measurement method may determine the distance to the object based on the reflected light detection result.
- FIG. 1 shows the configuration of a distance measuring device according to this embodiment.
- FIG. 2A shows the deflection of light by the mirror, and FIG. 2B shows the deflection of light by the correction lens.
- FIG. 3 shows an example of an object that can be identified from a captured image.
- FIG. 4A shows an object identified from a captured image.
- FIG. 4B shows the detected deviation from the reference axis for the identified object.
- FIG. 4C shows the light being deflected toward the identified object.
- FIG. 5A shows an example of display on the display screen.
- FIG. 5B shows another example of display on the display screen.
- FIGS. 6A and 6B show an example of display operation when a touch operation on the display screen is detected.
- FIG. 7A shows an example of display operation when multiple touch operations on the display screen are detected.
- FIG. 7B shows another example of display operation when multiple touch operations on the display screen are detected.
- FIG. 8 shows the flow of a distance measurement method according to the embodiment.
- FIG. 9 shows the configuration of a distance measuring device according to a first modification.
- FIG. 10 shows the configuration of a distance measuring device according to a second modification.
- FIG. 11 shows an example of control of the correction lens.
- FIG. 1 shows the configuration of a distance measuring device 100 according to this embodiment.
- the distance measuring device 100 is a device that measures the distance to an object by projecting light B3. Note that measuring the distance is also simply referred to as ranging, and the measuring operation by the distance measuring device 100 is also referred to as a ranging operation.
- in the following, the direction in which the light projecting unit 10 emits the light B3 along the reference axis L10 (the left direction in the drawing) is referred to as the front, and the opposite direction (the right direction in the drawing) as the rear.
- the orientation (also referred to as direction) of the reference axis L10 is uniquely determined by the orientation of the body of the distance measuring device 100 (that is, the housing that accommodates the components).
- the distance measuring device 100 includes a light projection section 10 , a detection section 20 , an imaging section 30 , an analysis section 51 , a control section 52 , a processing section 61 , a display section 70 and a calculation section 62 .
- the analysis unit 51, the control unit 52, the processing unit 61, and the calculation unit 62 are functional units that are realized when an arithmetic processing unit (not shown) executes a dedicated program.
- the light projecting unit 10 is a unit that projects light B3 controlled by a control unit 52 ( to be described later) onto an object via a light projecting observation optical system (an example of an optical system) 12 .
- the light projection unit 10 includes a light source 11 and a light projection observation optical system 12 .
- the light source 11 generates pulsed light B1 at a constant cycle and enters the projection observation optical system 12 .
- as the light source 11, for example, a semiconductor laser that oscillates infrared light can be employed.
- the light B1 is emitted a predetermined number of times, e.g., 320 times, at a constant cycle, e.g., a cycle of 500 to 700 μs, in one ranging operation.
- the light source 11 may have a driving device (not shown), and the control unit 52 may control the driving device to tilt the light source 11 . In this case, the emission direction of the light B1 emitted from the light source 11 is changed, and the light B1 can be deflected toward the object.
- the projection observation optical system 12 is an optical system composed of a plurality of optical elements that shape and direct the light B1 , and includes a mirror 13, a correction lens 14, and an objective lens 15 as an example. These optical elements are arranged along the reference axis L 10 of the projection observation optical system 12 .
- the mirror 13 is a mirror device that reflects or transmits light according to its wavelength, and has a dichroic reflecting surface 13a and a driving device 13b.
- the dichroic reflecting surface 13a is a mirror element that reflects light in the infrared band and transmits light in the visible band.
- the dichroic reflecting surface 13a is arranged on the reference axis L10; it reflects the light B1 emitted from the light source 11 and sends it forward along the reference axis L10, and it transmits the visible light A1 entering from the front of the distance measuring device 100 through the objective lens 15 and sends it toward the imaging unit 30 arranged behind.
- the driving device 13b has driving elements such as an actuator and an electric motor, and is controlled by the control unit 52, based on the detection result of the inclination of the dichroic reflecting surface 13a by a rotation sensor (not shown) or the like, to tilt the dichroic reflecting surface 13a. As shown in FIG. 2A, by tilting the dichroic reflecting surface 13a with respect to the reference axis L10 with the driving device 13b, the emission direction of the light B3 can be deflected toward an object that moves relative to the body or that strays from the reference axis L10.
- the correction lens 14 is a lens device that deflects the light B2 and has a lens element 14a and a drive device 14b.
- Lens element 14 a is an internal focus lens as an example and is arranged on reference axis L 10 between mirror 13 and objective lens 15 .
- the driving device 14b has a driving element such as a voice coil motor or a piezoelectric motor, and is controlled by the control unit 52, based on the detection result of the displacement of the lens element 14a by a displacement sensor (not shown) or the like, to displace the lens element 14a in a direction intersecting the reference axis L10 (in the present embodiment, two axial directions perpendicular to each other in a plane perpendicular to the reference axis L10). As shown in FIG. 2B, the driving device 14b displaces the lens element 14a with respect to the reference axis L10, thereby deflecting the light B3 (tilting it with respect to the reference axis L10).
- the correcting lens 14 may be a vari-angle prism that is controlled by the control unit 52 and deforms asymmetrically with respect to the central axis.
- the objective lens 15 is an optical element that collimates the light B2, which is output from the light source 11 and enters via the mirror 13 and the correction lens 14, and sends it forward from the distance measuring device 100 as the light B3, and that also collimates the visible light A1 entering the distance measuring device 100 from the front and sends it backwards.
- the objective lens 15 may consist of a plurality of optical elements including at least one lens element. The focal position may be moved back and forth by displacing the objective lens 15 or an optical element constituting the objective lens 15 along the reference axis L10.
- the projection observation optical system 12 may include a prism (not shown) in place of or in combination with the mirror 13 .
- the prism is an optical element that forwards the light B1 emitted from the light source 11 and forwards the visible light A1 entering through the objective lens 15 from the front of the distance measuring device 100.
- a roof prism, a Porro prism, or the like can be adopted as the prism.
- the prism has a drive device that drives its holding frame; by displacing and/or rotating the prism with respect to the reference axis L10, the emission direction of the light B3 can be deflected toward an object that moves relative to the body.
- the mirror 13 , correction lens 14 , and prism are examples of correction optical elements, and at least one of them may be included in the projection observation optical system 12 .
- the detection unit 20 is a unit that detects reflected light from an object generated by projecting the light B3 .
- the detection section 20 includes a light receiving lens 21 and a detection element 22 .
- the light receiving lens 21 is an optical element that collects the reflected light C1 from the object.
- the reflected light C1 condensed by the light receiving lens 21 is sent to the detection element 22 as the reflected light C2 .
- the light receiving lens 21 has a reference axis L 20 different from that of the objective lens 15 (light projecting observation optical system 12) of the light projecting section 10 .
- the detection element 22 is an element that receives the reflected light C2 and outputs a detection signal corresponding to its intensity.
- as the detection element 22, for example, a photodiode or a phototransistor that is sensitive to the band of the light B3 can be employed.
- the detector element 22 may include a bandpass filter on or in front of its detection surface that transmits light in a narrow band including the reflected light C2 and blocks or attenuates light in other bands.
- the detection signal is converted into a digital signal and supplied to the processing section 61 .
- reflected light C 1 reflected (or scattered) from an object positioned in front of the distance measuring device 100 enters the light receiving lens 21 .
- the reflected light C1 is collected by the light receiving lens 21 and detected by the detection element 22 as the reflected light C2 .
- the detection signal is output to the processing section 61 .
- the imaging unit 30 is a unit that images an object through the projection observation optical system 12 .
- the imaging unit 30 has, for example, a CMOS image sensor, and performs surface detection of the visible light A3 entering from the front of the body via the projection observation optical system 12.
- a filter element that transmits visible light and cuts infrared light may be provided on the light receiving surface of the image sensor.
- the detection result, that is, the captured image of the object, is transmitted to the analysis section 51 and the display section 70. Since the optical axis of the light B3 emitted forward of the body and the optical axis of the visible light A1 entering from the front of the body coincide on the reference axis L10, the light B3 can be projected onto the object at the center of the image captured by the imaging unit 30.
- the analysis unit 51 is a unit that analyzes the imaging result obtained by the imaging unit 30, that is, the captured image of the object.
- the analysis unit 51 analyzes the captured image of the target based on, for example, a machine learning model, and identifies the target from the captured image.
- the machine learning model may be, for example, a deep neural network (DNN) constructed by deep learning, built in advance by machine learning using images of objects whose distances are to be measured as teacher data.
- the number of specified objects is not limited to one, and may be plural.
- a plurality of types of target objects may be machine-learned in advance so that the user can select which type of target object to specify during distance measurement.
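For concreteness, here is a minimal sketch of how such an identification step could look, assuming a torchvision detector fine-tuned on pin-flag images; the model choice, class index, score threshold, and weights file are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch: identifying the pin flag with a fine-tuned detector.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
    weights=None, num_classes=2)              # classes: 0 = background, 1 = pin flag (assumed)
model.load_state_dict(torch.load("pin_flag_detector.pt"))  # hypothetical weights file
model.eval()

def identify_target(image_chw: torch.Tensor):
    """Return the (x, y) center of the highest-scoring pin-flag box, or None."""
    with torch.no_grad():
        out = model([image_chw])[0]           # dict with 'boxes', 'labels', 'scores'
    keep = out["scores"] > 0.5                # assumed confidence threshold
    if not keep.any():
        return None                           # object not detected; keep imaging
    best = out["scores"][keep].argmax()
    x0, y0, x1, y1 = out["boxes"][keep][best]
    return float(x0 + x1) / 2, float(y0 + y1) / 2
```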
- FIG. 3 shows an example of a target object that can be specified from the captured image 200 by the analysis unit 51.
- a user who is a golf player uses the distance measuring device 100 to measure the distance to the pin flag 202 as an object in order to know the distance to the cup 201 on the green 210 on the golf course.
- the analysis unit 51 uses a machine learning model to identify the pin flag 202 from the captured image 200 of the object obtained by the imaging unit 30, and calculates its deviation from the image center (that is, the reference axis L10 of the projection observation optical system 12).
- the target object is not limited to the pin flag 202 and may be combined with an object positioned near the cup 201, such as the green 210 or the pin (supporting the flag of the pin flag 202). As a result, the accuracy of target object identification by the machine learning model is improved. Also, when using an object positioned near the cup 201, color information specific to that object (for example, green for the green 210) may be used.
- FIG. 4A shows an object identified from the captured image 200.
- the analysis unit 51 identifies the object on the reference axis L10.
- the analysis unit 51 identifies the pin flag 202 based on the machine learning model.
- the analysis unit 51 identifies the center or the center of gravity from the shape of the pin flag 202 .
- the pin flag 202 may also be imaged a plurality of times at different timings by the imaging unit 30, and the center or center of gravity of the pin flag 202 may be specified from the image difference between the plurality of images (the offset between the images with the smallest pixel value difference).
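A minimal sketch of that parenthetical idea, finding the inter-frame offset with the smallest pixel-value difference and moving the tracked center by it; the window size and search radius are illustrative assumptions, and the center is assumed to lie well inside the frame:

```python
import numpy as np

def track_center(prev: np.ndarray, curr: np.ndarray, center, radius=8, half=32):
    """Shift `center` by the offset that minimizes the inter-frame difference."""
    cx, cy = center
    ref = prev[cy - half:cy + half, cx - half:cx + half].astype(np.float32)
    best, best_off = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            win = curr[cy + dy - half:cy + dy + half,
                       cx + dx - half:cx + dx + half].astype(np.float32)
            diff = np.abs(win - ref).sum()    # sum of absolute pixel differences
            if diff < best:
                best, best_off = diff, (dx, dy)
    return cx + best_off[0], cy + best_off[1]
```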
- an image of an object may be registered in the device in advance, and a suitable machine learning model may be automatically selected according to the situation in which the device is operated. For example, if the device is used during a golf game, when the user selects the golf game, the flag may be automatically selected as the (trained) machine learning model. Alternatively, the user may capture an image of an object and register it in the device during the first ranging operation or the like, and set the device so that the registered (trained) machine learning model is automatically selected during subsequent operations.
- FIG. 4B shows the detected deviation from the reference axis L10 for the identified object.
- the object is off the image center (reference axis L 10 ).
- the analysis unit 51 recognizes the target object at an arbitrary position in the captured image 200 and identifies the target object relative to the image center corresponding to the position of the reference axis L10 .
- the deviation S of the object (the center coordinates of the object with respect to the center coordinates of the image center in the display screen 71a) is detected.
- a detection result of the deviation S is transmitted to the control unit 52 .
- the center of the image in the display screen 71a also corresponds to the intersection of the reference axis L10 and the imaging element.
- the captured image of the target object may be analyzed by an image processing method.
- image processing methods include an edge detection method that detects the contours of objects in captured images. As a result, various objects existing on the golf course in the captured image 200 shown in FIG. 3, such as bunkers 220, forests 230, and other hazards, can be detected.
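As one plausible realization of this edge-detection step, OpenCV's Canny detector could be used; the thresholds and file name below are assumptions:

```python
import cv2

image = cv2.imread("captured_image.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
edges = cv2.Canny(image, threshold1=50, threshold2=150)         # thresholds are guesses
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
# Each contour outlines a candidate object (flag, bunker, trees, ...).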
- the analysis unit 51 may detect the resolution of the captured image of the object when the object cannot be detected, and the control unit 52 may control the projection observation optical system 12 based on the detection result of the resolution to enlarge or reduce the image of the object. Image analysis by the analysis unit 51 will be further described later.
- the control section 52 is a unit that controls the projection state of the light B 3 based on the imaging result obtained by the imaging section 30 .
- the control unit 52 controls the light projecting unit 10 and/or the projection observation optical system 12 to tilt the mirror 13 with respect to the reference axis L10, displace the correction lens 14 from the reference axis L10, drive the prism (not shown), and/or tilt the light source 11, thereby deflecting the light B3 in a direction determined by the deviation S (and the distance to the object).
- as a result, the light B3 continues to irradiate the pin flag 202, the object identified by the analysis unit 51, in particular its center or center of gravity.
- the deflection angle ⁇ and the angle of view of the display screen 71a determined by the magnification of the optical system are in a corresponding relationship.
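A sketch of that correspondence under a simple small-angle model: the pixel deviation S maps to a deflection angle through the angle of view of the screen. The field-of-view value and image width are illustrative assumptions:

```python
import math

def deflection_angle(s_px: float, width_px: int, fov_deg: float) -> float:
    """Map a horizontal pixel deviation to a deflection angle (small-angle model)."""
    angle_per_px = math.radians(fov_deg) / width_px
    return s_px * angle_per_px          # radians; the sign gives the direction

theta = deflection_angle(s_px=120, width_px=1920, fov_deg=6.0)  # about 0.0065 rad
```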
- the control unit 52 may control the light projecting unit 10 and/or the projection observation optical system 12 to deflect the light B3 so as to scan with the light B3. Further, the control unit 52 may control the projection state of the light B3 by changing not only the projection direction of the light B3 but also its intensity. For example, when the reflectance of the object is low (or high) and the intensity of the reflected light C1 detected by the detector 20 is weak (strong), the intensity of the light B3 may be increased (decreased). Also, when scanning an object with the light B3, the scan may run across the object, that is, over the object and its surrounding area, or only over the area of the object.
- the control unit 52 controls the light projecting unit 10 and/or the projection observation optical system 12 as described above so that the object analyzed by the analysis unit 51 is imaged at the center of the imaging area of the imaging unit 30. By imaging the object at the center of the imaging area in this manner, the detection accuracy is improved.
- the processor 61 is a unit that determines the distance to the object based on the detection result of the reflected light C1 by the detector 20. The processing unit 61 measures the detection time T from the emission of the light B3 to the detection of the reflected light C1 and determines the distance D as D = cT/2, where c is the speed of light.
- the detection time T may be determined by averaging the results obtained for multiple irradiations of the measurement light.
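A minimal time-of-flight sketch consistent with this description, computing D = cT/2 with T averaged over the repeated pulses (e.g. the 320 emissions mentioned above); the timestamp inputs are assumed to come from the emission and detection hardware:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_times(emit_times, detect_times):
    """D = c * T / 2, with T averaged over all valid pulse round trips."""
    round_trips = [d - e for e, d in zip(emit_times, detect_times) if d > e]
    t_avg = sum(round_trips) / len(round_trips)
    return C * t_avg / 2.0  # one-way distance in metres

# For scale: 385 yards (about 352 m) corresponds to T of roughly 2.35 µs.
```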
- the processing unit 61 may determine the distance to the object by scanning the object with the light B3 .
- in this case, the analysis unit 51 analyzes the captured image of the object to identify the object,
- the control unit 52 controls the light projecting unit 10 and/or the projection observation optical system 12 so that the light B3 scans the identified object, and
- at the same time, the detection unit 20 detects the reflected light C1 to determine the distance to the object.
- the distance may be determined for the scan position on the object where the detected intensity of the reflected light C1 is maximum, or distances may be determined for all scan positions within the object and the average or minimum distance taken as the distance to the object.
- the processing unit 61 may determine the distance to the object by scanning the entire deflection range or the entire field angle range of the captured image with the light B3 .
- the control unit 52 controls the projection observation optical system 12 to scan the deflection range or the field angle range of the captured image using the light B3 , and at the same time the detection unit 20 detects the reflected light C1.
- the processing unit 61 detects an object based on the relationship between the scanning position of the light B3 and the detection result of the reflected light C1, and determines the distance to the object.
- that is, based on the relationship between the scan position and the detection result of the reflected light C1, the processing unit 61 may identify the target at, for example, the scan position where the detection intensity of the reflected light C1 is maximum, and determine the distance to the target. Further, based on the relationship between the scan position and the detection time T, the target may be identified at, for example, the position where the detection time T is minimum, and its distance determined.
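Both selection rules can be sketched in a few lines; the `samples` structure, a list of (scan_position, intensity, detection_time) tuples, is an assumed stand-in for the scan data:

```python
def locate_target(samples):
    """Return the scan positions picked by each rule described above."""
    by_intensity = max(samples, key=lambda s: s[1])   # strongest echo
    by_time = min(samples, key=lambda s: s[2])        # shortest detection time T
    return by_intensity[0], by_time[0]
```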
- the processing unit 61 supplies the determined distance to the object to the display unit 70 .
- the processing unit 61 may store the determined distance to the object in a storage device (not shown).
- the display unit 70 is a unit that displays the captured image of the object obtained by the imaging unit 30 and the distance to the object determined by the processing unit 61 , and has a display device 71 and a touch detection sensor 72 .
- the display device 71 may be an electronic viewfinder or liquid crystal display having a display screen exposed on the aircraft.
- in this embodiment, a liquid crystal display is adopted as the display device 71 because the touch detection sensor 72 is used together with the display device 71.
- the touch detection sensor 72 is, for example, a capacitance sensor, is arranged on the display screen 71a of the display device 71, and detects a user's touch operation and a touched place on the display screen 71a.
- the calculation unit 62 is a unit that, when one or more locations included in the image are selected, calculates the distances to those locations and the distance and/or area between a plurality of locations. These calculations will be described later.
- FIG. 5A shows an example of a display displayed on the display screen 71a by the display unit 70.
- the display unit 70 displays on the display screen 71a the captured image 200 and a mark 240 that indicates the location on the object where the light B3 is projected (or for which the distance is displayed).
- that is, in the captured image 200 of the object obtained by the imaging unit 30, the display unit 70 superimposes the mark 240 on the pin flag 202 onto which the light B3 is projected.
- the determined distance "385y (yards)" to the object is displayed.
- the mark 240 may be displayed so as to be positioned at the center of the screen.
- the object for which the distance is displayed may be highlighted, or an object (for example, a mark or an arrow) indicating it may be superimposed on the image.
- FIG. 5B shows an example of display operation when the touch detection sensor 72 of the display unit 70 detects a user's touch operation on the display screen 71a.
- a pin flag 202 is displayed as the object on the display screen 71a of the display unit 70.
- the touch detection sensor 72 detects the location (coordinates) on the display screen 71a touched by the user.
- the analysis unit 51 calculates the deviation S from the image center (reference axis L10) to the touched location, and the deflection angle θ of the light B3 directed toward the object is calculated.
- the control unit 52 directs and projects the light B3 toward the pin flag 202 at the calculated deflection angle θ by controlling the light projecting unit 10 and/or the projection observation optical system 12, and the distance to the pin flag 202 is calculated.
- FIGS. 6A and 6B show an example of display operation when the touch detection sensor 72 of the display unit 70 detects a user's touch operation on the display screen 71a.
- as shown in FIG. 6A, the distance measuring device 100 identifies the pin flag 202, which is the object in the captured image 200, locks onto it, projects the light B3, and continuously measures the distance to the pin flag 202.
- in this state, assume that the user selects the bunker 220 included in the captured image 200 by performing a touch operation with a finger or the like on the display screen 71a.
- the display unit 70 detects the location on the display screen 71a touched by the user with the touch detection sensor 72, and displays the mark 242 on the bunker 220, which is the selected location, superimposed on the captured image 200.
- the analysis unit 51 detects the deviation S from the pin flag 202, the previously specified object, to the bunker 220, the selected location, and based on the detection result of the deviation S the control unit 52 controls the light projecting unit 10 and/or the projection observation optical system 12 to direct and project the light B3 onto the selected bunker 220.
- as a result, as shown in FIG. 6B, the display unit 70 displays on the display screen 71a the captured image 200 and a mark 242 indicating the location on the bunker 220 onto which the light B3 is projected. The determined distance "373y (yards)" to the bunker 220 is also displayed.
- FIG. 7A shows an example of the display operation when the touch detection sensor 72 of the display unit 70 detects a plurality of touch operations by the user on the display screen 71a.
- the user captures an image of a cup 201 on the green 210 as an object using the distance measuring device 100, and the captured image 200 is displayed on the display screen 71a.
- the user touches the display screen 71a with a finger or the like to select the location of the cup 201 on the green 210 and the forest 230 behind the green 210 included in the captured image 200 .
- the display unit 70 detects the locations on the display screen 71a touched by the user with the touch detection sensor 72 and displays marks 246 at the two selected locations superimposed on the captured image 200.
- the control unit 52 controls the projection observation optical system 12 so that the light B3 is projected onto each location,
- the processing unit 61 determines the distance to each location, and
- the calculation unit 62 calculates the distance between the two locations from the two determined distances and the angle of view between the two locations.
- the distance "150y" to the cup 201 and the determined distance "170y” to the forest 230 are displayed on the display screen 71a, and the calculated distance "30y” between the cup 201 and the forest 230 is displayed. Displayed on the display screen 71a, the user can evaluate the risk of going out of bounds if the shot goes over the green 210 from the distance from the cup 201 to the forest 230.
- FIG. 7B shows another example of the display operation when the touch detection sensor 72 of the display unit 70 detects a plurality of touch operations by the user on the display screen 71a.
- the user takes an image of the green 210 as the object with the distance measuring device 100, and the captured image 200 is displayed on the display screen 71a.
- the user touches the display screen 71a with a finger or the like to select a plurality (four in this example) of locations along the outline of the green 210 included in the captured image 200 .
- the display unit 70 detects the locations on the display screen 71a touched by the user with the touch detection sensor 72 and displays marks 242 at the four selected locations superimposed on the captured image 200.
- the calculation unit 62 calculates the area of the region 244 bounded by the four locations. The calculation result of the area of the region 244 is displayed on the display screen 71a, and the user can know the approximate size of the green 210.
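A hedged sketch of one way the area of region 244 could be computed: reconstruct each touched location as a 3D point from its measured range and viewing angles, then sum triangle areas by fan triangulation (valid for a convex, roughly planar region such as a green). The per-touch azimuth/elevation inputs are assumptions about the implementation:

```python
import numpy as np

def polygon_area(ranges, azimuths, elevations):
    """Area of the polygon spanned by points given as (range, azimuth, elevation)."""
    pts = []
    for r, az, el in zip(ranges, azimuths, elevations):
        pts.append(r * np.array([np.cos(el) * np.sin(az),   # x: right
                                 np.sin(el),                # y: up
                                 np.cos(el) * np.cos(az)])) # z: forward
    pts = np.array(pts)
    area = 0.0
    for i in range(1, len(pts) - 1):          # fan triangulation from pts[0]
        area += 0.5 * np.linalg.norm(
            np.cross(pts[i] - pts[0], pts[i + 1] - pts[0]))
    return area
```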
- FIG. 8 shows the flow of the distance measurement method according to this embodiment.
- the ranging operation is started when the user presses an operation button (not shown) provided on the body of the distance measuring device 100 .
- a user who is a golf player uses the distance measuring device 100 to find the distance to the cup 201 on the green 210 on the golf course as shown in FIG.
- in step S102, the imaging unit 30 captures an image of the object through the projection observation optical system 12. As shown in FIG. 3, the pin flag 202, which is the object, and the surrounding green 210 and the like are imaged. The captured image is displayed on the display screen 71a of the display device 71 by the display unit 70.
- next, the projection state of the light B3 is controlled based on the imaging result obtained in step S102. Specifically, the following steps S104 to S110 are executed.
- in step S104, the analysis unit 51 analyzes the captured image of the object obtained in step S102. It should be noted that at the time of the first ranging operation or the like, as shown in FIG. 4A, the user swings the body up, down, left, and right to move the pin flag 202, which is the object, onto the reference axis L10 (that is, the center of the image displayed on the display screen 71a), and the analysis unit 51 identifies the pin flag 202. After this, the analysis unit 51 can identify the pin flag 202 at any position within the captured image.
- in step S106, the analysis unit 51 determines whether or not an object has been detected. If the pin flag 202 does not exist in the captured image, the analysis unit 51 cannot detect the pin flag 202, determines that the object has not been detected, and returns to step S102. In that case, the user moves the body to position the pin flag 202 within the captured image. By doing so, the analysis unit 51 detects the pin flag 202 in the captured image, determines that the object has been detected, and proceeds to step S108.
- in step S108, the analysis unit 51 identifies the center of the pin flag 202, which is the object. As described above, the analysis unit 51 identifies the center of the pin flag 202 from the shape center of the object or the image difference between a plurality of images, and, as shown in FIG. 4B, detects the deviation S of the object (the center coordinates of the object with respect to the center coordinates of the image center in the display screen 71a).
- in step S110, the controller 52 controls at least one correction optical element included in the projection observation optical system 12 based on the analysis result of the object image obtained in step S108.
- the control of the correction optical element is as described above. This allows the light B3 to illuminate the center of the pin flag 202 as indicated by the arrow in FIG. 4C.
- in step S112, the distance to the object is measured.
- the light projecting unit 10 projects light B3 onto the center of the pin flag 202 via the light projecting observation optical system 12 .
- the detector 20 detects the reflected light C1 from the pin flag 202 caused by the projection of the light B3.
- the processing unit 61 determines the distance D to the pin flag 202 based on the detection result of the reflected light C1. The details of projecting the light B3, detecting the reflected light C1, and determining the distance D are as described above.
- note that the timing at which the imaging unit 30 captures an image of the object in step S102 and the timing at which the light projecting unit 10 projects the light B3 onto the object in step S112 are shifted from each other. As a result, it is possible to prevent the reflected light C1 generated by projecting the light B3 onto the object from being detected by the imaging unit 30 when the object is imaged in step S102.
- in step S114, the display unit 70 displays on the display screen 71a the captured image 200 including the pin flag 202 obtained in step S102 and its surroundings, together with the distance "385y (yards)" to the object determined in step S112.
- in step S116, it is determined whether or not to end the ranging operation. If the operation button (not shown) has not been pressed again by the user, it is determined that the ranging operation should continue, and the process returns to step S102.
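The flow of steps S102 to S116 can be summarized as a control loop; every function below is a hypothetical stand-in for the units described above, not an actual API of the device:

```python
def ranging_loop(device):
    while not device.stop_requested():                       # S116
        image = device.imaging_unit.capture()                # S102: image the object
        target = device.analysis_unit.identify(image)        # S104: analyze the image
        if target is None:                                   # S106: not detected
            continue                                         # user re-aims; image again
        deviation = device.analysis_unit.deviation(target)   # S108: deviation S
        device.control_unit.steer(deviation)                 # S110: drive correction optics
        distance = device.processing_unit.measure()          # S112: project B3, detect C1
        device.display_unit.show(image, distance)            # S114: display result
```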
- as described above, the distance measuring device 100 according to this embodiment includes the light projecting section 10 that projects the light B3 onto the object through the projection observation optical system 12, the imaging unit 30 that images the object through the projection observation optical system 12, the analysis unit 51 that analyzes the image of the object obtained by the imaging unit 30, and the control unit 52 that controls the projection observation optical system 12 based on the analysis result of the analysis unit 51.
- that is, the object is imaged through the projection observation optical system 12 through which the light projected onto the object passes, the captured image of the object obtained thereby is analyzed, and the projection observation optical system 12 is controlled based on the analysis result. This makes it possible to accurately project light onto the object and determine the distance to the object.
- moreover, since the user does not need to collimate the area including the target and visually recognize the target, the target can be easily recognized.
- the distance measuring device 100 described above has a configuration in which the imaging unit 30 captures an image of the object, and the detection unit 20 detects the reflected light C1 from the object generated by projecting the light B3.
- instead, the imaging unit may capture an image of the object and also detect the reflected light from the object generated by projecting the light B3.
- FIG. 9 shows the configuration of the distance measuring device 110 according to the first modified example.
- the distance measuring device 110 includes a light projection section 10 , an imaging section 30 d , an analysis section 51 , a control section 52 , a processing section 61 , a display section 70 and a calculation section 62 .
- Units other than the imaging unit 30d are configured in the same manner as those in the distance measuring device 100 described above.
- the imaging unit 30d is a unit that captures an image of the object via the projection observation optical system 12d and that detects the reflected light (infrared light) C1 from the object generated by projecting the light B3 from the light projecting unit 10.
- as the imaging unit 30d, an image sensor having sensitivity in both the visible band and the infrared band, such as a CMOS image sensor, can be employed.
- a captured image of the object obtained by receiving the visible light A1 is supplied to the analysis unit 51 and the display unit 70 .
- a detection signal of the reflected light (infrared light) C 3 from the object derived from the light B 3 is converted into a digital signal and supplied to the processing unit 61 .
- the processing unit 61 determines the distance to the object based on the detection result of the reflected light (infrared light) C3 by the imaging unit 30d. The details are as described above.
- in this way, the imaging unit 30d captures an image of the object and also detects the reflected light from the object generated by projecting the light B3. Since the functions can be shared, costs can be reduced.
- in the distance measuring devices 100 and 110 described above, a configuration in which the light projecting unit 10 and the imaging unit 30 share one optical system is adopted. Alternatively, a configuration in which the light projecting unit and the imaging unit each have an independent optical system may be employed.
- FIG. 10 shows the configuration of a distance measuring device 120 according to a second modified example.
- the distance measuring device 120 includes a light projecting section 10d, a detecting section 20d, an imaging section 30, an analyzing section 51, a control section 52, a processing section 61, a display section 70, and a calculating section 62.
- Units other than the light projecting unit 10d and the detecting unit 20d are configured in the same manner as those in the distance measuring device 100 described above.
- the light projecting unit 10d is a unit that projects light B3 onto an object via a light projecting optical system (an example of a first optical system) 12dd .
- the light projecting section 10d includes a light source 11 and a light projecting optical system 12dd.
- the light source 11 generates pulsed light B1 at a constant cycle and enters the light projecting optical system 12dd.
- the projection optical system 12dd is an optical system composed of a plurality of optical elements for shaping and directing the light B1, and includes a correction lens 14 and an objective lens 15 as an example. These optical elements are arranged along the reference axis L10 of the projection optical system 12dd .
- the correction lens 14 is a lens device that deflects the light B1 and has a lens element 14a and a drive device 14b. Their configurations are as described above.
- the objective lens 15 is an optical element that collimates the light B 1 that is output from the light source 11 and enters through the correction lens 14 and sends the collimated light B 3 to the front of the distance measuring device 100 .
- the configuration of the objective lens 15 is as described above.
- the detection unit 20d is a unit that detects reflected light C1 from an object generated by projecting light B3 through a detection observation optical system (an example of a second optical system) 22d.
- the detection section 20d includes a light receiving lens 21, a correction lens 23, a mirror 24, and a detection element 22.
- the light receiving lens 21 is an optical element that collects the reflected light C1 from the object. Reflected light C 1 condensed by the light receiving lens 21 is sent to the correcting lens 23 .
- the correcting lens 23 is a lens device that changes the light receiving angle of the reflected light C2 , and has a lens element 23a and a driving device 23b.
- the lens element 23 a is, for example, an internal focus lens and is arranged on the reference axis L 20 between the light receiving lens 21 and the mirror 24 .
- the driving device 23b has a driving element such as a voice coil motor or a piezoelectric motor, and is controlled by the control unit 52, based on the detection result of the displacement of the lens element 23a by a displacement sensor (not shown) or the like, to displace the lens element 23a in a direction intersecting the reference axis L20 (in this embodiment, two axial directions perpendicular to each other in a plane perpendicular to the reference axis L20).
- the mirror 24 is a mirror device that reflects or transmits light according to its wavelength, and has a dichroic reflecting surface 24a.
- the dichroic reflecting surface 24a is a mirror element that reflects light in the infrared band and transmits light in the visible band.
- the dichroic reflecting surface 24a is arranged on the reference axis L20; it reflects the reflected light (infrared light) C2 and sends it to the detection element 22, and it transmits the visible light A2, which enters the distance measuring device 120 together with the reflected light C1 via the light receiving lens 21 and the correction lens 23, sending it out toward the imaging unit 30 arranged behind.
- the detection element 22 is an element that receives the reflected light C3 and outputs a detection signal corresponding to its intensity.
- the sensing element 22 is configured as previously described.
- the detection signal is converted into a digital signal and supplied to the processing section 61 .
- the imaging unit 30 captures an image of the object by receiving the visible light A3 via the detection/observation optical system 22d.
- the analysis unit 51 analyzes the image of the object obtained by the imaging unit 30 and identifies the object from the captured image. The details of the analysis of the captured image are as described above.
- the control unit 52 controls the correction lens 14 included in the projection optical system 12dd and the correction lens 23 included in the detection/observation optical system 22d based on the analysis result by the analysis unit 51 .
- FIG. 11 shows an example of control of the correction lenses 14 and 23.
- when the analysis unit 51 detects the deviation S of the object with respect to the image center (reference axis L10) (the center coordinates of the object with respect to the center coordinates of the image center in the display screen 71a), the control unit 52 causes the driving device 14b to displace the lens element 14a with respect to the reference axis L10, so that the light B3 is deflected (tilted with respect to the reference axis L10).
- at the same time, the controller 52 displaces the lens element 23a with respect to the reference axis L20 by the driving device 23b, so that the light receiving angle of the reflected light C1 (and visible light A1) is tilted with respect to the reference axis L20.
- here, the deflection angle of the light B3 and the light receiving angle of the reflected light C1 (and the visible light A1) are controlled to be equal. Therefore, it is possible to irradiate the object with the light B3 and receive the reflected light C1 generated from the object.
- the deflection angle control of the light B3 and the light receiving angle control of the reflected light C1 (and visible light A1 ) can be performed simultaneously, so that the processing time can be shortened.
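A small sketch of this simultaneous control: one deflection command is issued to both correction-lens drivers so that projection and reception stay aligned; the driver interface is a hypothetical stand-in:

```python
import threading

def steer_both(theta, projection_driver, detection_driver):
    """Send the same deflection angle to both correction-lens drivers at once."""
    t1 = threading.Thread(target=projection_driver.set_angle, args=(theta,))
    t2 = threading.Thread(target=detection_driver.set_angle, args=(theta,))
    t1.start(); t2.start()
    t1.join(); t2.join()   # both lens elements reach the same angle together
```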
- the control unit 52 controls the detection/observation optical system 22d as described above so that the object analyzed by the analysis unit 51 is imaged at the center of the imaging area of the imaging unit 30. By imaging the object at the center of the imaging area in this way, the detection accuracy is improved.
- the correction lenses 14 and 23 may be vari-angle prisms that are controlled by the control unit 52 and deform asymmetrically with respect to the central axis.
- the distance measuring device 120 described above has a configuration in which the imaging unit 30 captures an image of the object, and the detection element 22 detects the reflected light from the object generated by projecting the light B3.
- instead, the imaging unit 30 may capture an image of the object and also detect the reflected light from the object generated by projecting the light B3.
- the correcting lenses 14 and 23 may be used for controlling camera shake correction of the body.
- alternatively, the object may be controlled to be positioned at the center of the image by shifting the imaging unit 30 in a plane orthogonal to the reference axis L10 using a driving device (not shown).
- 70...display unit, 71...display device, 71a...display screen, 72...touch detection sensor, 100, 110, 120...distance measuring device, 200...captured image, 201...cup, 202...pin flag, 210...green, 220...bunker, 230...forest, 240, 242, 246...marks, 244...area, A1, A2, A3...visible light, B1, B2, B3...light, C1, C2, C3...reflected light, L10, L20...reference axes.
Description
Patent Literature 1: Japanese Patent Application Laid-Open No. 2001-304855
A distance measuring device that projects light to measure the distance to an object may be provided.
The distance measuring device may include a control unit that controls the projection state of the light based on a detection result of the object.
The distance measuring device may include a light projecting unit that projects the light controlled by the control unit onto the object.
The distance measuring device may include a processing unit that determines the distance to the object based on a detection result of the reflected light (a minimal sketch is given below).
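The disclosure leaves the distance calculation itself open; one common realization is pulsed time of flight, in which the distance is half the round-trip time multiplied by the speed of light. A minimal sketch under that assumption (the function name and values are illustrative only):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_s: float) -> float:
    # The projected pulse travels to the object and back,
    # hence the factor of 1/2.
    return 0.5 * SPEED_OF_LIGHT_M_PER_S * round_trip_s

# A round trip of about 1.33 microseconds corresponds to roughly 200 m.
print(f"{distance_from_round_trip(1.334e-6):.1f} m")
```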
(Item 2)
The distance measuring device may include an imaging unit that images the object.
The control unit may detect the object based on an imaging result obtained by the imaging unit.
(Item 3)
The light projecting unit may project the light onto the object via an optical system.
(Item 4)
The imaging unit may image the object via the optical system, or via an imaging optical system different from the optical system.
(Item 5)
The light projecting unit may include a light source unit that emits the light.
The control unit may control either the light source unit or the optical system to control the projection state of the light.
(Item 6)
The projection state of the light may include either the irradiation direction of the light or the intensity of the light.
(Item 7)
The imaging unit may further detect the reflected light from the object.
(Item 8)
The distance measuring device may include a detection unit that detects the reflected light from the object.
(Item 9)
The distance measuring device may include an analysis unit that analyzes the imaging result obtained by the imaging unit.
The analysis unit may analyze the image of the object in the imaging result to identify the object.
(Item 10)
The analysis unit may analyze the image of the object based on a machine learning model.
The machine learning model may be constructed in advance by machine learning using, as training data, images of the objects whose distances are to be measured (a sketch is given below).
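As an illustration of how such a pre-trained model might be applied at run time, the following sketch assumes a generic classifier interface; `TrainedObjectModel`, `predict`, and the returned values are hypothetical stand-ins, not an API defined by this disclosure.

```python
import numpy as np

class TrainedObjectModel:
    """Hypothetical stand-in for a model trained in advance on images of
    measurement targets (e.g. a pin flag) used as supervised training data."""
    def predict(self, image: np.ndarray):
        # A real model (e.g. a convolutional neural network) would return
        # the object class, a confidence score and the object's pixel position.
        return "pin_flag", 0.97, (412, 230)

def identify_object(image: np.ndarray, model: TrainedObjectModel):
    label, confidence, position = model.predict(image)
    if confidence < 0.5:
        return None  # nothing recognized as a measurement target
    return label, position

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder captured image
print(identify_object(frame, TrainedObjectModel()))
```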
(Item 11)
The analysis unit may analyze the image of the object by an image processing method.
The image processing method may include at least an edge detection method (a sketch is given below).
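A minimal sketch of such edge-based identification, assuming OpenCV's Canny detector and taking the largest edge-bounded contour as a proxy for the object; the thresholds and the synthetic test image are illustrative only.

```python
import cv2
import numpy as np

def locate_object_by_edges(gray: np.ndarray):
    edges = cv2.Canny(gray, 50, 150)              # binary edge map
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)  # assume object = largest contour
    x, y, w, h = cv2.boundingRect(largest)
    return (x + w // 2, y + h // 2)               # approximate object center

gray = np.zeros((480, 640), dtype=np.uint8)
cv2.rectangle(gray, (200, 150), (400, 350), 255, 2)  # synthetic object outline
print(locate_object_by_edges(gray))
```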
(Item 12)
The control unit may control any one of the optical system, the imaging optical system different from the optical system, and the imaging unit so that the object analyzed by the analysis unit is imaged at the center of the imaging area of the imaging unit.
(Item 13)
The analysis unit may identify the object at the image center.
(Item 14)
The imaging unit may image the object a plurality of times at different timings.
The analysis unit may identify the object from the image differences between the plurality of images (a sketch is given below).
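A sketch of such frame differencing, assuming two grayscale frames taken at different timings; the threshold value is illustrative.

```python
import numpy as np

def object_mask_from_frames(frame_a: np.ndarray, frame_b: np.ndarray,
                            threshold: int = 25) -> np.ndarray:
    # Pixels that change between the two timings are attributed to the
    # object (or to its apparent motion against the background).
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

a = np.zeros((4, 4), dtype=np.uint8)
b = a.copy()
b[1:3, 1:3] = 200          # region that appeared between the two timings
print(object_mask_from_frames(a, b))
```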
(Item 15)
The analysis unit may detect the resolution of the image.
The control unit may control the optical system based on the detection result of the resolution to enlarge or reduce the image of the object.
(Item 16)
The analysis unit may analyze the image of the object to identify the center or the center of gravity of the object.
The control unit may control either the light projecting unit or the optical system to project the light onto the identified center of the object (a sketch is given below).
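The center of gravity can be computed, for example, as the centroid of the object pixels identified by the analysis unit; a sketch under that assumption, in which the mask and coordinates are illustrative:

```python
import numpy as np

def center_of_gravity(mask: np.ndarray):
    """Centroid (x, y) of the object pixels in a binary mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("no object pixels in mask")
    return float(xs.mean()), float(ys.mean())

mask = np.zeros((480, 640), dtype=np.uint8)
mask[100:200, 300:360] = 1          # object region found by the analysis
cx, cy = center_of_gravity(mask)
print(f"project the light toward pixel ({cx:.1f}, {cy:.1f})")
```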
(Item 17)
The analysis unit may analyze the image of the object to identify the object.
The control unit may control the light to scan the identified object with the light.
The processing unit may determine the distance to the object based on the relationship between the scan position of the light on the object and the detection result of the reflected light (a sketch is given below).
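One way to realize this relationship is to record, for each scan position, the reflection intensity and the measured distance, and to adopt the position with the strongest return; a sketch under that assumption, where `project_and_measure` is a hypothetical callable standing in for the light projecting, detection, and processing units:

```python
def scan_object(scan_positions, project_and_measure):
    """Scan the identified object and keep per-position results.

    project_and_measure(pos) points the light at `pos` and returns
    (reflection_intensity, distance_m); here it is a stand-in.
    """
    results = [(pos, *project_and_measure(pos)) for pos in scan_positions]
    # Adopt the position with the strongest return as representative;
    # weak returns at other positions are treated as misses.
    pos, intensity, distance = max(results, key=lambda r: r[1])
    return pos, distance

fake_returns = {(0, 0): (0.1, 0.0), (1, 0): (0.9, 152.3), (2, 0): (0.2, 0.0)}
print(scan_object(list(fake_returns), lambda p: fake_returns[p]))
```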
(Item 18)
The distance measuring device may further include a display unit that displays the image of the object obtained by the imaging unit on a display screen.
(Item 19)
The display unit may display, superimposed on the image of the object, a marker (display object) indicating the location on the object onto which the light is projected, or indicating the object whose distance has been determined.
(Item 20)
The display unit may display the image of the object obtained by the imaging unit such that the location on the object onto which the light is projected is positioned at the center of the screen.
(Item 21)
The display screen may include a touch detection sensor that detects a touch operation by a user.
When at least one location included in the image is selected by a touch operation on the display screen, the control unit may control the light so as to project the light onto the at least one location.
(Item 22)
When a plurality of locations included in the image are selected by touch operations on the display screen, the distance measuring device may further include a calculation unit that calculates the distances to the plurality of locations, or the distance between and/or the area defined by the plurality of locations (a sketch is given below).
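Once the selected locations have been converted to coordinates (e.g. from the measured distances and projection angles), pairwise distances and the enclosed area follow from elementary geometry; a sketch assuming planar coordinates in metres, using the shoelace formula for the area:

```python
import math

def pairwise_distance(p, q):
    return math.dist(p, q)          # Euclidean distance between two points

def polygon_area(points):
    """Shoelace formula over the selected locations (planar coordinates)."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

pts = [(0.0, 0.0), (30.0, 0.0), (30.0, 20.0), (0.0, 20.0)]
print(pairwise_distance(pts[0], pts[1]))   # 30.0 m between two selections
print(polygon_area(pts))                   # 600.0 m^2 enclosed by four selections
```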
(Item 23)
The optical system may include at least one correction optical element among a lens element, a prism, and a mirror.
The control unit may drive the at least one correction optical element.
(Item 24)
The light projecting unit may project the light onto the object after the imaging unit has imaged the object.
(Item 25)
The control unit may control the optical system or the imaging optical system to correct shake of the distance measuring device.
A distance measuring method that projects light to measure the distance to an object may be provided.
The distance measuring method may control the projection state of the light based on a detection result of the object.
The distance measuring method may project the light controlled in the controlling step onto the object.
The distance measuring method may determine the distance to the object based on a detection result of the reflected light.
Claims (26)
- 1. A distance measuring device that projects light to measure a distance to an object, the device comprising:
a control unit that controls a projection state of the light based on a detection result of the object;
a light projecting unit that projects the light controlled by the control unit onto the object; and
a processing unit that determines the distance to the object based on a detection result of reflected light.
- 2. The distance measuring device according to claim 1, further comprising an imaging unit that images the object, wherein the control unit detects the object based on an imaging result obtained by the imaging unit.
- 3. The distance measuring device according to claim 2, wherein the light projecting unit projects the light onto the object via an optical system.
- 4. The distance measuring device according to claim 3, wherein the imaging unit images the object via the optical system, or via an imaging optical system different from the optical system.
- 5. The distance measuring device according to claim 3 or 4, wherein the light projecting unit includes a light source unit that emits the light, and the control unit controls either the light source unit or the optical system to control the projection state of the light.
- 6. The distance measuring device according to any one of claims 3 to 5, wherein the projection state of the light includes either an irradiation direction of the light or an intensity of the light.
- 7. The distance measuring device according to any one of claims 3 to 6, wherein the imaging unit further detects reflected light from the object.
- 8. The distance measuring device according to any one of claims 3 to 6, further comprising a detection unit that detects reflected light from the object.
- 9. The distance measuring device according to any one of claims 3 to 8, further comprising an analysis unit that analyzes the imaging result obtained by the imaging unit, wherein the analysis unit analyzes an image of the object in the imaging result to identify the object.
- 10. The distance measuring device according to claim 9, wherein the analysis unit analyzes the image of the object based on a machine learning model, and the machine learning model is constructed in advance by machine learning using, as training data, images of objects whose distances are to be measured.
- 11. The distance measuring device according to claim 9, wherein the analysis unit analyzes the image of the object by an image processing method, and the image processing method includes at least an edge detection method.
- 12. The distance measuring device according to any one of claims 9 to 11, wherein the control unit controls any one of the optical system, the imaging optical system different from the optical system, and the imaging unit such that the object analyzed by the analysis unit is imaged at the center of an imaging area of the imaging unit.
- 13. The distance measuring device according to claim 11 or 12, wherein the analysis unit identifies the object at the image center.
- 14. The distance measuring device according to any one of claims 9 to 13, wherein the imaging unit images the object a plurality of times at different timings, and the analysis unit identifies the object from image differences between the plurality of images.
- 15. The distance measuring device according to any one of claims 9 to 14, wherein the analysis unit detects a resolution of the image, and the control unit controls the optical system based on a detection result of the resolution to enlarge or reduce the image of the object.
- 16. The distance measuring device according to any one of claims 9 to 15, wherein the analysis unit analyzes the image of the object to identify a center or a center of gravity of the object, and the control unit controls either the light projecting unit or the optical system to project the light onto the identified center of the object.
- 17. The distance measuring device according to any one of claims 9 to 16, wherein the analysis unit analyzes the image of the object to identify the object, the control unit controls the light to scan the identified object with the light, and the processing unit determines the distance to the object based on a relationship between a scan position of the light on the object and the detection result of the reflected light.
- 18. The distance measuring device according to any one of claims 3 to 17, further comprising a display unit that displays the image of the object obtained by the imaging unit on a display screen.
- 19. The distance measuring device according to claim 18, wherein the display unit displays, superimposed on the image of the object, a marker (display object) indicating a location on the object onto which the light is projected, or indicating the object whose distance has been determined.
- 20. The distance measuring device according to claim 18, wherein the display unit displays the image of the object obtained by the imaging unit such that the location on the object onto which the light is projected is positioned at the center of the screen.
- 21. The distance measuring device according to any one of claims 18 to 20, wherein the display screen includes a touch detection sensor that detects a touch operation by a user, and when at least one location included in the image is selected by a touch operation on the display screen, the control unit controls the light to project the light onto the at least one location.
- 22. The distance measuring device according to claim 21, further comprising a calculation unit that, when a plurality of locations included in the image are selected by touch operations on the display screen, calculates distances to the plurality of locations, or a distance between and/or an area defined by the plurality of locations.
- 23. The distance measuring device according to any one of claims 3 to 22, wherein the optical system includes at least one correction optical element among a lens element, a prism, and a mirror, and the control unit drives the at least one correction optical element.
- 24. The distance measuring device according to any one of claims 3 to 23, wherein the light projecting unit projects the light onto the object after the imaging unit images the object.
- 25. The distance measuring device according to any one of claims 3 to 24, wherein the control unit controls the optical system or the imaging optical system to correct shake of the distance measuring device.
- 26. A distance measuring method for projecting light to measure a distance to an object, the method comprising: controlling a projection state of the light based on a detection result of the object; projecting the light controlled in the controlling onto the object; and determining the distance to the object based on a detection result of reflected light.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/003432 WO2022162922A1 (ja) | 2021-01-29 | 2021-01-29 | Distance measuring device and distance measuring method |
JP2022577994A JPWO2022162922A1 (ja) | 2021-01-29 | 2021-01-29 | |
US18/274,797 US20240118418A1 (en) | 2021-01-29 | 2021-01-29 | Distance measuring device and distance measuring method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/003432 WO2022162922A1 (ja) | 2021-01-29 | 2021-01-29 | Distance measuring device and distance measuring method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022162922A1 (ja) | 2022-08-04 |
Family
ID=82653215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/003432 WO2022162922A1 (ja) | 2021-01-29 | 2021-01-29 | 距離測定装置及び距離測定方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240118418A1 (ja) |
JP (1) | JPWO2022162922A1 (ja) |
WO (1) | WO2022162922A1 (ja) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH1188870A (ja) * | 1997-09-05 | 1999-03-30 | Tokai Rika Co Ltd | Monitoring system |
- JP2001124544A (ja) * | 1999-10-25 | 2001-05-11 | Asahi Optical Co Ltd | Distance measuring device |
- JP2005156356A (ja) * | 2003-11-26 | 2005-06-16 | Matsushita Electric Ind Co Ltd | Subject distance measuring and display device and portable terminal device |
- JP2009192415A (ja) * | 2008-02-15 | 2009-08-27 | Toyota Central R&D Labs Inc | Object range-finding device and program |
- JP2012186538A (ja) * | 2011-03-03 | 2012-09-27 | Nikon Corp | Electronic camera, image display device, program, and recording medium |
- WO2015008587A1 (ja) * | 2013-07-16 | 2015-01-22 | Fujifilm Corporation | Imaging device and three-dimensional measuring device |
- WO2016030925A1 (ja) * | 2014-08-27 | 2016-03-03 | Nikon Vision Co., Ltd. | Rangefinder and distance measuring method |
- JP2018096709A (ja) * | 2016-12-08 | 2018-06-21 | Fujitsu Limited | Distance measuring device and distance measuring method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022162922A1 (ja) | 2022-08-04 |
US20240118418A1 (en) | 2024-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP6777987B2 (ja) | | Measuring device |
- JP4057200B2 (ja) | | Coordinate input device and recording medium for coordinate input device |
- EP2846187B1 (en) | | Projection system with infrared monitoring |
- CN109313263B (zh) | | Method for operating a laser distance measuring instrument |
- US7633602B2 (en) | | Measurement apparatus and measurement method |
- US20080291179A1 (en) | | Light pen input system and method, particularly for use with large area non-CRT displays |
- TW201411272A (zh) | | Image sensor positioning device and method |
- JP2018132328A (ja) | | Surveying instrument |
- US12025468B2 (en) | | Optical sensor with overview camera |
- US20080074650A1 (en) | | Measurement apparatus and measurement method |
- EP3945283B1 (en) | | Surveying instrument |
- EP3812700A1 (en) | | Surveying instrument |
- CN109470201B (zh) | | Method for operating a handheld laser rangefinder and handheld laser rangefinder |
- JP2018179918A (ja) | | Shape measuring system and shape measuring method |
- US7301617B2 (en) | | Surveying apparatus |
- US20220229182A1 (en) | | Surveying instrument |
- WO2022162922A1 (ja) | | Distance measuring device and distance measuring method |
- JP2015108582A (ja) | | Three-dimensional measurement method and device |
- CN115720619A (zh) | | Measuring device |
- JP6972311B2 (ja) | | Distance detection device, optical apparatus, and attitude detection method for distance detection device |
- US20210199800A1 (en) | | Method for measuring distance of image displayed on television camera |
- JP4349937B2 (ja) | | Ophthalmic apparatus |
- JPH076775B2 (ja) | | Three-dimensional shape data capturing device |
- WO2024210090A1 (ja) | | Surveying device |
- JP7266151B2 (ja) | | Pointed position detection device, pointed position detection method, pointed position detection program, and projection system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21922936; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2022577994; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 18274797; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.10.2023) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21922936; Country of ref document: EP; Kind code of ref document: A1 |