WO2011059127A1 - Infrared sensor and sensing method using the same - Google Patents
Infrared sensor and sensing method using the same
- Publication number
- WO2011059127A1 (PCT/KR2009/006699)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- infrared
- image
- light
- imaging device
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
Definitions
- the present invention relates to an infrared sensor and a sensing method using the same, and more particularly to an infrared sensor that detects an object by irradiating infrared rays into a detection area and that can determine the position or distance of the object more accurately and precisely by removing the disturbance caused by external light emitters, and to a sensing method using the same.
- a sensor for measuring the presence or absence of an object or an obstacle in the detection area by irradiating infrared rays to the detection area and measuring the reflected light is widely used in industrial fields.
- a technique for measuring the position of an object has been developed by acquiring an infrared image using an image pickup device.
- in such a technique, a predetermined amount of light is irradiated into the detection area by the sensor's illumination, the light reflected by an object in the detection area and returned is filtered through an infrared filter, and an infrared image is acquired through an image pickup device.
- the acquired infrared image is analyzed to determine the position of an object or obstacle existing in the sensing area.
- however, when an external light emitter is present, the infrared image it produces may be recognized as an object in the sensing area, and the infrared sensor may therefore malfunction.
- when an infrared sensor is used indoors, it is difficult to distinguish the infrared image produced by indoor lighting such as an incandescent or fluorescent lamp, or by light entering through a window, from the infrared image produced by the sensor's own illumination. That is, since the infrared image acquired through the image pickup device also contains the infrared contribution of indoor lighting or external light, a distant object, an indoor light, or external light may be erroneously recognized as an object present in the sensing area.
- because the infrared image generated by the external illuminator is mixed with the infrared image generated by the sensor's own illumination, the detection accuracy for objects in the detection area is reduced, and the infrared sensor can be used only under limited lighting conditions in order to minimize the influence of the external illuminant.
- an infrared sensor for achieving the above object includes: an illumination unit for irradiating infrared rays into the detection area; an imaging device which acquires an infrared image of the sensing area in response to the infrared rays incident from the sensing area; and a controller configured to adjust the light emission time during which the illumination unit irradiates infrared rays and the photographing time of the imaging device, wherein the controller comprises: an illumination controller for adjusting the light emission time of the illumination unit; an imaging device controller for adjusting the photographing time of the imaging device; a memory unit for storing the infrared images obtained by the imaging device; and an image processor which compares and processes, among the infrared images stored in the memory unit, an infrared image obtained when the illumination unit irradiated infrared rays and an infrared image obtained when it did not, thereby calculating the position or distance of an object in the detection area.
- the lighting unit may be composed of a halogen lamp or an infrared LED.
- the imaging device may be composed of a CCD or a CMOS.
- the infrared sensor may further include an infrared transmission filter for filtering the light incident from the sensing region to transmit the light in the infrared wavelength region to the image pickup device.
- the image processor may calculate a distance between the infrared sensor and an object in the sensing area from the reflected light amount recorded in each pixel of the image pickup device.
- the controller may control the sensor to acquire two infrared images, one taken when infrared rays are irradiated and one taken when they are not, according to a preset first time interval, and to acquire the two infrared images again after a preset second time interval.
- the operations of the illumination unit, the image pickup device, the memory unit, and the image processing unit may be performed continuously as one cycle, and the image pickup device may capture the next infrared images while the image processing unit processes the two infrared images already obtained.
- a sensing method using an infrared sensor includes: (a) acquiring an infrared image of the sensing area through an imaging device while the lighting unit is controlled not to irradiate infrared rays; (b) acquiring an infrared image of the sensing area through the imaging device while the lighting unit is controlled to irradiate infrared rays; (c) comparing and processing the infrared images obtained in steps (a) and (b) to extract only the infrared image formed by the infrared rays irradiated from the illumination unit being reflected by an object in the sensing area and returning; and (d) processing the extracted infrared image in the controller to calculate the position of the object in the sensing area.
- another sensing method using an infrared sensor includes: (a) acquiring an infrared image of the sensing area through an imaging device while the lighting unit is controlled to irradiate infrared rays; (b) acquiring an infrared image of the sensing area through the imaging device while the lighting unit is controlled not to irradiate infrared rays; (c) comparing and processing the infrared images obtained in steps (a) and (b) to extract only the infrared image formed by the infrared rays irradiated from the illumination unit being reflected by an object in the sensing area and returning; and (d) processing the extracted infrared image in the controller to calculate the position of the object in the sensing area.
- Infrared images obtained in the steps (a) and (b) may be compared and processed through subtraction.
- the step (d) may further include calculating a distance between the infrared sensor and an object in the sensing area by analyzing the reflected light amount of the extracted infrared images.
- since the infrared sensor of the present invention and the sensing method using the same are not limited by lighting conditions, the infrared image produced by an external light emitter can be removed through comparative analysis of the infrared images even when it is mixed in, and an object in the detection area can therefore be identified accurately.
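- for readers who prefer pseudocode, the summary above maps onto a short routine: capture one frame with the illumination off, one with it on, subtract the two, and test what remains. The sketch below is a minimal illustration only; set_illumination, capture_frame, and the threshold value are hypothetical placeholders rather than anything specified in the disclosure.
```python
import numpy as np

def sense_once(set_illumination, capture_frame, threshold=10):
    """One sensing cycle: isolate light that originated from the sensor's own IR source.

    set_illumination(bool) and capture_frame() -> 8-bit grayscale np.ndarray are
    hypothetical hardware hooks; a real driver API would replace them.
    """
    set_illumination(False)
    ambient = capture_frame().astype(np.int16)   # external emitters only
    set_illumination(True)
    lit = capture_frame().astype(np.int16)       # external emitters + our own IR light
    set_illumination(False)

    # Pixels that became brighter while our illumination was on are reflections of it.
    reflected = np.clip(lit - ambient, 0, 255).astype(np.uint8)

    # Illustrative presence test: any sufficiently bright reflection counts as an object.
    object_present = bool(np.any(reflected > threshold))
    return reflected, object_present
```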
- FIG. 1 is a view schematically showing the configuration of an infrared sensor according to an embodiment of the present invention.
- FIGS. 2 and 3 are views for explaining a process of obtaining an infrared image using the infrared sensor of the present invention.
- FIG. 4 is a flowchart illustrating a sensing method using an infrared sensor according to an exemplary embodiment of the present invention.
- FIG. 5 is a graph for explaining a method of operating an infrared sensor according to the flowchart shown in FIG. 4.
- FIG. 1 is a view schematically showing the configuration of an infrared sensor according to an embodiment of the present invention.
- the infrared sensor 100 of the present invention may include an illumination unit 200, an imaging device 300, and a controller 400, and may further include an infrared transmission filter 500.
- the lighting unit 200 emits light to detect an object in the detection area.
- the lighting unit 200 may be, for example and without limitation, a general white light source, a red light source, a halogen lamp, or an infrared LED whose output includes infrared wavelengths; the infrared sensor 100 can be operated more effectively by using an illumination source that emits only infrared wavelengths.
- the imaging device 300 acquires an infrared image in the sensing area in response to infrared rays incident from the sensing area of the infrared sensor 100.
- the imaging device 300 captures an infrared image in which the infrared rays irradiated from the illumination unit 200 into the sensing region are reflected by an object and returned.
- the type of the imaging device 300 is not particularly limited, and any image acquisition device capable of taking an image by reacting with light may be used, and typically, a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) Can be used.
- the imaging device 300 preferably uses a device sensitive to the infrared wavelength band.
- the infrared transmission filter 500 is mounted on the front of the image pickup device 300, and filters the light incident from the detection area to transfer the light in the infrared wavelength region to the image pickup device 300. Since the general image pickup device 300 reacts to various wavelength ranges of light, an infrared filter 500 capable of selecting and transmitting only an infrared wavelength range may maximize the detection effect of the infrared sensor 100.
- alternatively, the lens coating of the image pickup device 300 can be configured to transmit only light in the infrared wavelength region.
- the controller 400 adjusts the light emission time during which the lighting unit 200 irradiates infrared light and the photographing time of the image capturing device 300, and specifically includes a lighting control unit 410, an imaging device control unit 420, a memory unit 430, and an image processor 440.
- the lighting controller 410 adjusts the light emission time of the lighting unit 200.
- the imaging device controller 420 adjusts the shooting time of the imaging device 300.
- the infrared sensor 100 of the present invention can obtain an infrared image photographed through the imaging device 300 while the illumination unit 200 is not irradiating light and, as necessary, an infrared image photographed through the imaging device 300 while the illumination unit 200 is irradiating light.
- the memory unit 430 stores the infrared images acquired by the image pickup device 300 and, as described above, stores both the image photographed without irradiation and the image photographed with irradiation.
- the image processor 440 compares and processes, among the infrared images stored in the memory unit 430, the infrared image obtained when the illumination unit 200 irradiated infrared rays and the infrared image obtained when it did not, and from this calculation determines the position or distance of the object in the sensing area.
- the infrared image obtained while the illumination unit 200 irradiates infrared rays contains the infrared contributions of both the illumination unit 200 and any external light emitter, whereas the infrared image obtained while the illumination unit 200 does not irradiate contains only the contribution of the external light emitter. Therefore, by analyzing the two infrared images together, an infrared image responsive only to the infrared rays generated by the illumination unit 200 can be obtained.
- FIGS. 2 and 3 are views for explaining a process of obtaining an infrared image using the infrared sensor of the present invention.
- the infrared image 30 obtained through the imaging device 300 while the illumination unit 200 is not emitting light contains infrared information from an external light emitter 20.
- the infrared image 40 acquired through the image pickup device 300 while the lighting unit 200 emits light contains not only the infrared information from the external light emitter 20 but also, at the same time, the infrared information from the light irradiated by the lighting unit 200, reflected by the object 10, and returned.
- by comparing and processing the infrared image 40 acquired while the lighting unit 200 emits light with the infrared image 30 acquired while it does not, the infrared image 50 can be obtained, in which the infrared information from the external light emitter 20 is removed and only the infrared information originating from the illumination unit 200 is extracted.
- for the comparison operation, a difference (subtraction) operation may be used, as well as other comparison processing methods such as an exclusive OR.
- in this way, an infrared image 50 containing more accurate information about the nearby object 10 can be extracted.
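- the comparison operation can be realized in several ways; the snippet below contrasts a clipped one-sided subtraction, an absolute difference, and an exclusive OR of binarized frames, the last being the XOR-style comparison mentioned above. The random arrays merely stand in for the frames 30 and 40, and the binarization threshold of 128 is an arbitrary illustrative choice.
```python
import numpy as np

# Placeholder stand-ins for infrared image 30 (illumination off) and image 40 (illumination on).
image_30 = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
image_40 = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

# Clipped subtraction: keeps only pixels that brightened when our illumination was on.
image_50 = np.clip(image_40.astype(np.int16) - image_30.astype(np.int16),
                   0, 255).astype(np.uint8)

# Absolute difference: also reacts to pixels that darkened (e.g. a flickering external light).
image_50_abs = np.abs(image_40.astype(np.int16) - image_30.astype(np.int16)).astype(np.uint8)

# Exclusive OR on binarized frames: flags pixels whose bright/dark state changed.
changed = np.logical_xor(image_30 > 128, image_40 > 128)
```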
- because the two infrared images 30 and 40 are captured at slightly different times, the infrared information from the external light emitter 20 recorded in each of them may differ slightly, and an error may therefore appear at the boundary of the object 10 in the infrared image 50 obtained by the difference operation. To compensate, a morphological dilation operation is applied as preprocessing to the infrared image 30 containing the external light emitter 20 information, enlarging the infrared region of the external light emitter 20 enough to absorb this timing error; performing the difference operation after the preprocessing compensates for the error caused by the time difference between the infrared images 30 and 40 and increases the reliability of the infrared sensor 100.
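- a minimal sketch of that preprocessing, assuming OpenCV and NumPy are available: dilate the bright regions of the dark frame (image 30) so that they still cover the corresponding external-emitter pixels in the lit frame (image 40) despite the small time offset, then subtract. The 5x5 kernel implied by the default radius is an arbitrary size chosen for illustration.
```python
import cv2
import numpy as np

def remove_external_emitters(image_30, image_40, dilation_radius=2):
    """Clipped subtraction of a dilated dark frame from the lit frame.

    image_30: 8-bit frame captured with the sensor's illumination off.
    image_40: 8-bit frame captured with the sensor's illumination on.
    dilation_radius: how far to grow the external-emitter regions (illustrative value).
    """
    size = 2 * dilation_radius + 1
    kernel = np.ones((size, size), dtype=np.uint8)
    dilated_30 = cv2.dilate(image_30, kernel)  # enlarge external-emitter regions
    diff = image_40.astype(np.int16) - dilated_30.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```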
- the method of analyzing the infrared image 50, from which the infrared information of the external light emitter 20 has been removed and which contains only the infrared information produced by the light of the illumination unit 200 reflecting off the object and returning, in order to calculate the position or distance of the object 10, will now be described in detail with reference to FIG. 3.
- the infrared rays irradiated from the illumination unit 200 to the sensing region are reflected by the object 10 existing in the near field and input to the imaging device 300.
- the amount of light reflected from the object 10 has a correlation with the reflection direction of the object 10, the material of the object 10, the distance between the object 10 and the infrared sensor 100, and the like.
- the distance (r) between the infrared sensor 100 and the object 10 can be derived from the amount of light recorded in each pixel.
- the direction (θ, φ) of the object 10 with respect to the infrared sensor 100 is also measured.
- a relative position between the infrared sensor 100 and the object 10 can then be calculated by determining the three factors, the distance r and the direction angles θ and φ, and applying a spherical coordinate system in three-dimensional space.
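- the text above only states that the reflected light amount correlates with distance and that the distance and direction angles fix the relative position; the concrete mapping from intensity to distance depends on the emitter power, optics, and surface properties. The sketch below therefore uses an inverse-square stand-in model and a simple pinhole approximation for the pixel-to-angle conversion, both of which are assumptions of this illustration rather than part of the disclosure.
```python
import numpy as np

def pixel_to_position(intensity, px, py, image_shape,
                      fov_h_deg=60.0, fov_v_deg=45.0, k=1.0e4):
    """Estimate the 3-D position of a reflecting point from a single pixel.

    Assumed models (not from the disclosure):
      - distance: r = sqrt(k / intensity), i.e. intensity falls off as 1/r^2,
        with k a calibration constant;
      - direction: pixel offsets map linearly to angles within the field of view.
    """
    h, w = image_shape
    theta = np.deg2rad((px - w / 2.0) / w * fov_h_deg)  # horizontal angle
    phi = np.deg2rad((py - h / 2.0) / h * fov_v_deg)    # vertical angle
    r = np.sqrt(k / max(float(intensity), 1.0))         # distance from reflected light amount

    # Spherical (r, theta, phi) -> Cartesian, with x forward, y right, z up.
    x = r * np.cos(phi) * np.cos(theta)
    y = r * np.cos(phi) * np.sin(theta)
    z = r * np.sin(phi)
    return np.array([x, y, z])
```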
- FIG. 4 is a flowchart illustrating a sensing method using an infrared sensor according to an exemplary embodiment of the present invention.
- first, an infrared image 30, containing the contribution of the external light emitter 20, is obtained through the imaging device 300 while the illumination unit 200 is controlled not to irradiate infrared rays (S510). Even if the lighting unit 200 emits light during this step (S510) because of a control error, the effect on the detection result can be made negligible as long as the emission time is short relative to the photographing time of the imaging device 300.
- next, while the illumination unit 200 is controlled to irradiate infrared rays, the infrared image 40 is obtained through the imaging device 300 (S520). The infrared image 40 obtained in step S520 contains infrared information from both the external light emitter 20 and the lighting unit 200.
- steps S510 and S520 may be performed in the reverse order.
- the infrared images 30 and 40 acquired in steps S510 and S520 are then compared and processed through the control unit 400, so that only the infrared image 50 formed by the infrared rays irradiated by the illumination unit 200 being reflected by the object 10 in the detection area and returning is extracted (S530).
- the above-described difference operation can be used for the comparison operation process.
- the controller 400 processes the extracted infrared image 50 to calculate the position of the object 10 in the detection area.
- the controller 400 may analyze the reflected light amount of the extracted infrared image 50 to calculate a distance between the infrared sensor 100 and the object 10 in the sensing area (S540).
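- putting steps S510 through S540 together, one detection cycle could be wired up as below, reusing the remove_external_emitters and pixel_to_position helpers sketched earlier; set_illumination and capture_frame remain hypothetical hardware hooks, and picking the single brightest pixel as the object is a simplification made only for illustration.
```python
import numpy as np

def detection_cycle(set_illumination, capture_frame):
    # S510: infrared image 30, captured with the illumination unit off.
    set_illumination(False)
    image_30 = capture_frame()
    # S520: infrared image 40, captured with the illumination unit on.
    set_illumination(True)
    image_40 = capture_frame()
    set_illumination(False)
    # S530: infrared image 50 - only the reflection of the sensor's own infrared light
    # (dilation plus clipped subtraction, as sketched above).
    image_50 = remove_external_emitters(image_30, image_40)
    # S540: take the strongest reflection and estimate its position / distance.
    py, px = np.unravel_index(int(np.argmax(image_50)), image_50.shape)
    return pixel_to_position(image_50[py, px], px, py, image_50.shape)
```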
- FIG. 5 is a graph for explaining a method of operating an infrared sensor according to the flowchart shown in FIG. 4.
- the lighting unit 200 operates according to a control signal of the lighting control unit 410 and emits light mainly in the second step.
- the imaging device 300 operates according to a control signal of the imaging device controller 420 and operates mainly in the first and second steps.
- the light emission time of the lighting unit 200 has a close relationship with the photographing time of the imaging device 300.
- in the first step, the imaging device 300 photographs the sensing region during a time interval in which the illumination unit 200 does not emit light. Subsequently, in the second step, the imaging device 300 photographs the sensing area while the illumination unit 200 emits light.
- the first step and the second step may be reversed.
- the controller 400 controls the photographing of the second step to start a preset first time interval after the photographing of the imaging device 300 in the first step ends, so that the two infrared images 30 and 40 can be obtained with a minimal time difference between them.
- the controller 400 also controls the imaging device 300 so that its next photographing, which takes place in the fourth step, starts a preset second time interval after the photographing of the second step ends.
- although it is desirable to minimize the second time interval, it takes at least twice the first time interval.
- since the photographing for the next work cycle can be performed while the image processing of the third and fourth steps is in progress, the second time interval can be minimized.
- the memory unit 430 stores the infrared images 30 and 40 acquired through the imaging device 300 when the imaging time of the imaging device 300 ends.
- the image processor 440 compares and processes the infrared images 30 and 40 stored in the memory unit 430 in the third step, and extracts only the infrared image 50 by the lighting unit 200. Next, the image processor 440 analyzes the infrared image 50 extracted in the third step, and calculates the position or distance of the object 10 in the sensing area.
- the above-described four steps may be repeatedly performed to continuously measure the object 10 in the sensing area.
- before the image processing of the N-th cycle is completed, the first step of the next, (N+1)-th, cycle may be started in order to minimize the operating cycle of the infrared sensor 100.
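- the pipelining described here, in which the next pair of frames is exposed while the current pair is still being processed, can be mimicked in software with a background worker. The thread-and-queue arrangement below is purely an implementation choice of this illustration, and the dummy frame generator stands in for real hardware.
```python
import queue
import threading
import numpy as np

def acquisition_loop(set_illumination, capture_frame, pairs, stop):
    """Steps 1 and 2 of each cycle: keep producing (dark, lit) frame pairs."""
    while not stop.is_set():
        set_illumination(False)
        dark = capture_frame()
        set_illumination(True)
        lit = capture_frame()
        set_illumination(False)
        try:
            pairs.put((dark, lit), timeout=0.5)
        except queue.Full:
            pass  # drop the pair if the processing side is falling behind

def processing_loop(pairs, stop):
    """Steps 3 and 4 of each cycle: run while the next pair is already being exposed."""
    while not stop.is_set():
        try:
            dark, lit = pairs.get(timeout=0.5)
        except queue.Empty:
            continue
        image_50 = np.clip(lit.astype(np.int16) - dark.astype(np.int16), 0, 255).astype(np.uint8)
        # ... position / distance calculation on image_50 would go here ...

if __name__ == "__main__":
    # Dummy hooks so the sketch runs without hardware.
    fake_frame = lambda: np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    stop, pairs = threading.Event(), queue.Queue(maxsize=2)
    t1 = threading.Thread(target=acquisition_loop, args=(lambda on: None, fake_frame, pairs, stop))
    t2 = threading.Thread(target=processing_loop, args=(pairs, stop))
    t1.start(); t2.start()
    threading.Timer(1.0, stop.set).start()  # let the two loops overlap for about a second
    t1.join(); t2.join()
```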
- in this way, the infrared sensor 100 of the present invention can remove the infrared information generated by the external light emitter 20 and extract only the information produced by the light of the illumination unit 200 reflecting off the object 10.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Claims (11)
- An infrared sensor that detects an object by irradiating infrared rays into a sensing area, comprising: an illumination unit that irradiates infrared rays onto the sensing area; an imaging device that acquires an infrared image of the sensing area in response to infrared rays incident from the sensing area; and a controller that adjusts the light emission time during which the illumination unit irradiates infrared rays and the photographing time of the imaging device, wherein the controller comprises: an illumination controller that adjusts the light emission time of the illumination unit; an imaging device controller that adjusts the photographing time of the imaging device; a memory unit that stores the infrared images acquired by the imaging device; and an image processor that compares and processes, among the infrared images stored in the memory unit, an infrared image acquired when the illumination unit irradiated infrared rays and an infrared image acquired when the illumination unit did not irradiate infrared rays, thereby calculating the position or distance of an object in the sensing area.
- The infrared sensor of claim 1, wherein the illumination unit is a halogen lamp or an infrared LED.
- The infrared sensor of claim 1, wherein the imaging device is a CCD or a CMOS.
- The infrared sensor of claim 1, further comprising an infrared transmission filter that filters light incident from the sensing area and transmits light in the infrared wavelength region to the imaging device.
- The infrared sensor of claim 1, wherein the image processor calculates the distance between the infrared sensor and an object in the sensing area from the amount of reflected light recorded in each pixel of the imaging device.
- The infrared sensor of claim 1, wherein the controller controls the sensor to acquire two infrared images, one when infrared rays are irradiated and one when they are not, according to a preset first time interval, and to acquire the two infrared images again after a preset second time interval.
- The infrared sensor of claim 6, wherein the operations of the illumination unit, the imaging device, the memory unit, and the image processor are performed continuously as one cycle, and the imaging device captures the next infrared image while the image processor processes the two acquired infrared images.
- A sensing method using an infrared sensor that comprises an illumination unit irradiating infrared rays onto a sensing area, an imaging device acquiring an infrared image of the sensing area, and a controller adjusting the light emission time of the illumination unit and the photographing time of the imaging device, the method comprising: (a) acquiring an infrared image of the sensing area through the imaging device while the illumination unit is controlled not to irradiate infrared rays; (b) acquiring an infrared image of the sensing area through the imaging device while the illumination unit is controlled to irradiate infrared rays; (c) comparing and processing the infrared images acquired in steps (a) and (b) to extract only the infrared image formed by the infrared rays irradiated by the illumination unit being reflected by an object in the sensing area and returning; and (d) processing the extracted infrared image in the controller to calculate the position of the object in the sensing area.
- A sensing method using an infrared sensor that comprises an illumination unit irradiating infrared rays onto a sensing area, an imaging device acquiring an infrared image of the sensing area, and a controller adjusting the light emission time of the illumination unit and the photographing time of the imaging device, the method comprising: (a) acquiring an infrared image of the sensing area through the imaging device while the illumination unit is controlled to irradiate infrared rays; (b) acquiring an infrared image of the sensing area through the imaging device while the illumination unit is controlled not to irradiate infrared rays; (c) comparing and processing the infrared images acquired in steps (a) and (b) to extract only the infrared image formed by the infrared rays irradiated by the illumination unit being reflected by an object in the sensing area and returning; and (d) processing the extracted infrared image in the controller to calculate the position of the object in the sensing area.
- The sensing method of claim 8 or claim 9, wherein the infrared images acquired in steps (a) and (b) are compared and processed through subtraction.
- The sensing method of claim 8 or claim 9, wherein step (d) further comprises calculating the distance between the infrared sensor and the object in the sensing area by analyzing the amount of reflected light of the extracted infrared images.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020107005487A KR101182188B1 (ko) | 2009-11-13 | 2009-11-13 | Infrared sensor and sensing method using the same |
PCT/KR2009/006699 WO2011059127A1 (ko) | 2009-11-13 | 2009-11-13 | Infrared sensor and sensing method using the same |
US13/125,963 US8854471B2 (en) | 2009-11-13 | 2009-11-13 | Infrared sensor and sensing method using the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2009/006699 WO2011059127A1 (ko) | 2009-11-13 | 2009-11-13 | Infrared sensor and sensing method using the same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011059127A1 true WO2011059127A1 (ko) | 2011-05-19 |
Family
ID=43991776
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2009/006699 WO2011059127A1 (ko) | 2009-11-13 | 2009-11-13 | 적외선 센서 및 이를 이용한 감지 방법 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8854471B2 (ko) |
KR (1) | KR101182188B1 (ko) |
WO (1) | WO2011059127A1 (ko) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9007466B2 (en) * | 2011-04-27 | 2015-04-14 | General Electric Company | System and method for thermographic inspection |
KR101289797B1 (ko) | 2011-11-21 | 2013-07-26 | 삼성테크윈 주식회사 | Zoom illumination system and imaging apparatus employing the same |
EP2845165A4 (en) * | 2011-12-09 | 2015-08-19 | Microsoft Technology Licensing Llc | AMBIENT LIGHT WARNING FOR A PICTOR SENSOR |
KR102074857B1 (ko) * | 2012-09-26 | 2020-02-10 | 삼성전자주식회사 | Proximity sensor and proximity sensing method using an event-based vision sensor |
FI124434B (en) * | 2012-10-31 | 2014-08-29 | Metso Automation Oy | Method and apparatus for track monitoring |
KR101484201B1 (ko) | 2013-03-29 | 2015-01-16 | 현대자동차 주식회사 | Driver head position measurement system |
JP6442710B2 (ja) * | 2013-07-23 | 2018-12-26 | パナソニックIpマネジメント株式会社 | Solid-state imaging device, imaging device, and driving method thereof |
IL233356A (en) * | 2014-06-24 | 2015-10-29 | Brightway Vision Ltd | Sensor-based imaging system with minimum wait time between sensor exposures |
CN105828053A (zh) * | 2016-06-03 | 2016-08-03 | 京东方科技集团股份有限公司 | Video monitoring method and device |
CN107621641B (zh) * | 2017-09-20 | 2019-06-21 | 歌尔股份有限公司 | Infrared obstacle detection method and device, and robot |
KR102114662B1 (ko) * | 2017-12-06 | 2020-05-25 | 서울과학기술대학교 산학협력단 | Indoor positioning system and method using infrared LEDs |
CN109991267A (zh) * | 2019-03-25 | 2019-07-09 | 电子科技大学 | Long-pulse infrared nondestructive testing device |
KR102220950B1 (ko) * | 2019-07-31 | 2021-02-26 | 엘지전자 주식회사 | Method and apparatus for controlling a vehicle in an autonomous driving system |
CN115131451A (zh) * | 2021-03-24 | 2022-09-30 | 鸿富锦精密电子(郑州)有限公司 | Infrared laser radiation intensity visualization method, electronic device, and storage medium |
KR102412095B1 (ko) * | 2021-04-21 | 2022-06-22 | 이승우 | Monitoring apparatus and method |
KR102640510B1 (ko) * | 2022-05-20 | 2024-02-23 | (재)한국나노기술원 | Eye-safe infrared smart scale |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100803414B1 (ko) | 2000-08-16 | 2008-02-13 | 레이던 컴퍼니 | Near object detection system |
JP2002296488A (ja) * | 2001-03-29 | 2002-10-09 | Fuji Photo Film Co Ltd | Camera |
JP5109221B2 (ja) * | 2002-06-27 | 2012-12-26 | 新世代株式会社 | Information processing apparatus provided with an input system using a stroboscope |
DE10343406A1 (de) | 2003-09-19 | 2005-04-14 | Daimlerchrysler Ag | Distance determination of an object |
KR100580630B1 (ko) * | 2003-11-19 | 2006-05-16 | 삼성전자주식회사 | Apparatus and method for identifying a person using infrared rays |
US7362885B2 (en) * | 2004-04-20 | 2008-04-22 | Delphi Technologies, Inc. | Object tracking and eye state identification method |
US8180168B2 (en) * | 2007-01-16 | 2012-05-15 | Hewlett-Packard Development Company, L.P. | One-pass filtering and infrared-visible light decorrelation to reduce noise and distortions |
US8374438B1 (en) * | 2007-10-04 | 2013-02-12 | Redshift Systems Corporation | Visual template-based thermal inspection system |
US8160382B2 (en) * | 2007-10-15 | 2012-04-17 | Lockheed Martin Corporation | Method of object recognition in image data using combined edge magnitude and edge direction analysis techniques |
-
2009
- 2009-11-13 WO PCT/KR2009/006699 patent/WO2011059127A1/ko active Application Filing
- 2009-11-13 KR KR1020107005487A patent/KR101182188B1/ko active IP Right Grant
- 2009-11-13 US US13/125,963 patent/US8854471B2/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19990048343A (ko) * | 1997-12-09 | 1999-07-05 | 정몽규 | Apparatus and method for detecting the position of an occupant |
KR20040028600A (ko) * | 2001-08-16 | 2004-04-03 | 레이던 컴퍼니 | Near object detection system |
KR20040083497A (ko) * | 2002-02-02 | 2004-10-02 | 키네티큐 리미티드 | Head position sensor |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012142783A1 (zh) * | 2011-04-18 | 2012-10-26 | 广州市晶华光学电子有限公司 | 360-degree automatic tracking hunting camera and working method thereof |
US10171010B2 (en) | 2012-07-16 | 2019-01-01 | Gsi Helmholtzzentrum Fuer Schwerionenforschung Gmbh | Method and apparatus for generating energy using piezo elements |
CN104458752A (zh) * | 2013-09-18 | 2015-03-25 | 电装波动株式会社 | Appearance inspection system and appearance inspection method |
CN104458752B (zh) * | 2013-09-18 | 2018-10-23 | 电装波动株式会社 | Appearance inspection system and appearance inspection method |
US9558690B2 (en) | 2014-02-28 | 2017-01-31 | Samsung Display Co., Ltd. | Electronic device and display method thereof |
CN109360406A (zh) * | 2018-11-22 | 2019-02-19 | 东南大学 | Automatic following control method and system based on infrared signals |
Also Published As
Publication number | Publication date |
---|---|
KR20110057083A (ko) | 2011-05-31 |
US8854471B2 (en) | 2014-10-07 |
KR101182188B1 (ko) | 2012-09-12 |
US20120013745A1 (en) | 2012-01-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011059127A1 (ko) | Infrared sensor and sensing method using the same | |
KR101180027B1 (ko) | Optical sensor and sensing method using the same | |
WO2017039259A1 (ko) | Apparatus and method for diagnosing power equipment using a thermal imaging camera | |
CN102841197B (zh) | Analysis device and analysis method | |
WO2012023639A1 (ko) | Method and apparatus for counting objects using multiple sensors | |
WO2015115802A1 (ko) | Apparatus and method for extracting depth information | |
KR101998639B1 (ko) | Intelligent ignition-point monitoring system using combined images from a thermal imaging camera and a color camera | |
WO2013062345A1 (ko) | Color illumination control method for improving image quality in a vision system | |
WO1998039627A3 (en) | Calibration of imaging systems | |
JP2004246252A (ja) | Image information collecting apparatus and method | |
EP3080788A2 (en) | Flame detection system and method | |
WO2017138754A1 (ko) | Vehicle underbody imaging apparatus and vehicle underbody imaging method using the same | |
KR101295959B1 (ko) | Apparatus and method for measuring the position of a mobile robot using indoor ceiling images | |
WO2016068353A1 (ko) | Image-processing-based distance measuring apparatus and method | |
CN104657702B (zh) | Eyeball detection device, pupil detection method, and iris recognition method | |
EP3053514A1 (en) | Organ imaging apparatus | |
CN112672137A (zh) | Method for acquiring a depth image, structured light system, and electronic device | |
CN112825491B (zh) | Method and system for detecting a light-emitting device | |
WO2017183796A1 (ko) | Electronic device equipped with an infrared optical device and method for controlling the infrared optical device | |
WO2021091121A1 (ko) | Light control apparatus and method for colorimetry | |
WO2020017755A1 (ko) | Apparatus and method for measuring the surface of an electronic device | |
JP2008164338A (ja) | Position detection device | |
WO2023191429A1 (ko) | Interchangeable camera lens control device and camera including the same | |
KR20150138007A (ko) | Method, apparatus, and system for determining thickness | |
WO2019009474A1 (ko) | Camera-integrated laser detection device and operation method thereof | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 20107005487 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13125963 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09851304 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.08.2012) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09851304 Country of ref document: EP Kind code of ref document: A1 |