WO2016203078A2 - Procedimiento de mejora de imagen térmica o ir basado en información de escena para videoanálisis - Google Patents
- Publication number
- WO2016203078A2 WO2016203078A2 PCT/ES2016/070443 ES2016070443W WO2016203078A2 WO 2016203078 A2 WO2016203078 A2 WO 2016203078A2 ES 2016070443 W ES2016070443 W ES 2016070443W WO 2016203078 A2 WO2016203078 A2 WO 2016203078A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- region
- interest
- procedure
- video analysis
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 104
- 238000004458 analytical method Methods 0.000 title claims abstract description 48
- 238000001514 detection method Methods 0.000 claims abstract description 46
- 238000012545 processing Methods 0.000 claims abstract description 17
- 238000001228 spectrum Methods 0.000 claims abstract description 8
- 230000009466 transformation Effects 0.000 claims description 20
- 238000001914 filtration Methods 0.000 claims description 18
- 230000006872 improvement Effects 0.000 claims description 14
- 210000004027 cell Anatomy 0.000 claims description 8
- 230000008569 process Effects 0.000 claims description 7
- 230000011218 segmentation Effects 0.000 claims description 6
- 230000003068 static effect Effects 0.000 claims description 6
- 238000009499 grossing Methods 0.000 claims description 5
- 210000004460 N cell Anatomy 0.000 claims description 4
- 230000002123 temporal effect Effects 0.000 claims description 2
- 238000010586 diagram Methods 0.000 description 9
- 230000000694 effects Effects 0.000 description 9
- 238000004364 calculation method Methods 0.000 description 7
- 238000010191 image analysis Methods 0.000 description 7
- 238000001429 visible spectrum Methods 0.000 description 4
- 230000007423 decrease Effects 0.000 description 2
- 238000002329 infrared spectrum Methods 0.000 description 2
- 238000003780 insertion Methods 0.000 description 2
- 230000037431 insertion Effects 0.000 description 2
- 238000013507 mapping Methods 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 230000002146 bilateral effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 238000013213 extrapolation Methods 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 238000010606 normalization Methods 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 239000002689 soil Substances 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B29/00—Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
- G08B29/18—Prevention or correction of operating errors
- G08B29/20—Calibration, including self-calibrating arrangements
- G08B29/24—Self-calibration, e.g. compensating for environmental drift or ageing of components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
Definitions
- the present invention refers to an image improvement procedure for video analysis or automatic video surveillance systems whose input image is IR or thermal spectrum, in which depth or scene information is used to make this improvement.
- a generic video analysis system aims to determine the presence of people, vehicles or other specific objects (objective) in a given area of space from the images captured by an image acquisition device, preferably a fixed camera, which observes said particular area of space.
- the input image to the aforementioned video analysis system is IR (infrared) or thermal
- one of the main problems that the system must face is the lack of contrast between the scene (background) and the target (foreground), which makes it difficult to detect the target.
- the ground temperature can be around 37 °C, which can make it difficult to detect human targets due to the lack of image contrast. This effect is further accentuated in the distant areas of the image, where objects appear smaller. This problem affects not only automatic surveillance or video analysis systems but also those verified by an operator.
- the present invention has as its main purpose to describe an image improvement procedure for video analysis or automatic video surveillance systems whose input image is IR or thermal spectrum in which, using depth or scene information, the resulting image has sufficient contrast to determine the presence of a specific object. Explanation of the invention.
- the present invention manages to overcome all the drawbacks as well as defects mentioned above in the state of the art and achieve the purpose described in the previous paragraph.
- Video analysis or automatic surveillance systems comprise at least one image acquisition device, through which images of a specific area of space are obtained, together with an image digitization system that provides a digital image at its output.
- said image is subjected to an image processing system that applies at least one image improvement procedure, such that at its output said image must be of sufficient quality to detect a particular type of object (person, animal or any other user-defined object).
- video analysis or automatic surveillance systems offer at least two modes of operation, calibration and detection. These modes of operation are performed by their corresponding calibration and detection systems.
- the calibration mode of operation is generally used at the beginning of the commissioning of the video analysis or automatic surveillance system, since its purpose is to provide the image with a spatial reference so that the detection system can reference all the calculations it performs during the detection procedure: calculation of distance traveled, speed, size of objects, etc.
- the calibration stage performed by the calibration system must provide the equivalence between the approximate size in pixels of the object to be detected (usually a person) and each of the pixel coordinates of the image.
- the object of the present invention is an image improvement procedure for video analysis or automatic video surveillance systems whose input image is IR or thermal spectrum in which the depth or scene information of the image is used for this purpose.
- the first step of this procedure is that said depth or scene information of the image is entered by the user or comes from the calibration stage or the calibration system, since the variation of the approximate size of the object to be detected for each of the pixel coordinates of the image is an indirect form of the depth of the actual scene captured in the image.
- Strong calibration procedure that is based on obtaining both the intrinsic parameters of the camera (focal length, pixel size, radial lens distortion, etc.) and the extrinsic parameters (height and angular orientation) to measure distance and real speed over the image and extrapolation of the sizes of the objects to be detected for each pixel using basic geometry.
- the intrinsic parameters of the camera focal length, pixel size, radial lens distortion, etc.
- the extrinsic parameters height and angular orientation
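As a rough illustration of the size-extrapolation idea behind the strong calibration, a pinhole-camera sketch can be written. The function name and the simplifications (tilt, camera height and radial distortion are ignored) are assumptions for illustration, not the patent's exact procedure:

```python
def expected_size_px(focal_mm, pixel_mm, distance_m, object_m=1.7):
    """Pinhole projection: an object of height object_m at distance_m
    projects to focal * object / distance on the sensor; dividing by the
    pixel pitch converts that to pixels. Tilt, camera height and radial
    distortion, which the strong calibration also uses, are ignored here."""
    image_mm = focal_mm * (object_m * 1000.0) / (distance_m * 1000.0)
    return image_mm / pixel_mm
```

For example, a 1.7 m person 20 m away, seen through an 8 mm lens with 10 µm pixels, projects to 68 pixels of height.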
- Calibration phase that obtains the size of a person for each image position from the size and position data obtained for each object identified as a person in the sample acquisition phase.
- the so-called spatial filtering is based on selecting and applying a filter to the image in order to reduce noise, enhance details, smooth the image, etc., obtaining an image with better contrast.
- These filters are simply small sub-images that are convolved with the main image to generate a response. Thus, depending on the size of the filter, different scales of image information are addressed.
- the spatial filtering and how to apply it is based on at least the following steps: for each point of the image, the size in pixels of the object to be detected (w, h) is obtained by means of the calibration system, w and h being respectively the width and height in pixels of the object to be detected at that point.
- a spatial filter of size between 3 x 3 pixels and max (w, h) x max (w, h) is constructed for each point of the image
- Each point in the image is convolved with the spatial filter of the size corresponding to that point.
- said filters adjust their size in a variable manner at each image position based on the image or scene depth information obtained during the calibration phase.
- said information should comprise at least the size of the object to be detected that was estimated during calibration.
- a spatial filtering is used to reduce the noise; it is possible to choose between filters that reduce noise in the spatial domain, among which are the linear filters, such as the mean filter, and the nonlinear ones, such as the median or bilateral filter, and filters that reduce noise in the transformed domain, among which are filters based on the wavelet transform.
- the chosen filter must adapt its size at each position of the image depending on the depth or scene information obtained during the calibration phase.
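A minimal sketch of such a scene-adaptive filter, assuming a per-pixel size map produced by the calibration stage and using a plain mean filter (the function names and the quantization of kernel sizes are illustrative choices, not the patent's prescription):

```python
import numpy as np

def box_filter(img, k):
    """Mean filter of odd size k using edge padding (pure NumPy)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge").astype(np.float64)
    out = np.zeros(img.shape, dtype=np.float64)
    # Sum the k*k shifted copies, then normalize; a loop keeps it clear.
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def adaptive_spatial_filter(img, size_map):
    """Each pixel takes its value from the box filter whose size matches
    the expected object size at that position. size_map is a hypothetical
    per-pixel size estimate from calibration, clipped to a minimum of 3."""
    sizes = np.clip(size_map, 3, None)
    sizes += (sizes % 2 == 0)          # force odd kernel sizes
    out = np.empty(img.shape, dtype=np.float64)
    for k in np.unique(sizes):         # one convolution per distinct size
        out[sizes == k] = box_filter(img, int(k))[sizes == k]
    return out
```

Running one convolution per distinct kernel size (rather than per pixel) keeps the cost proportional to the number of size levels in the map.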
- the equalization of an image is actually the normalization of its histogram. That is, the main idea is that, given an input image I, that input image is transformed based on the information of its histogram so that the output image has a histogram as flat as possible.
- I is the input image, whose possible values i range over 0 ≤ i ≤ L − 1
- the probability that the input image I has the value i is defined as p(i) = n_i / n,
- where n_i is the number of pixels whose value equals i and n is the total number of pixels in the image.
- the transformation function is that indicated on page 91 of the book by González and Woods entitled “Digital Image Processing” of Prentice Hall 2002.
- this simple equalization comprises at least the following steps:
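Those steps can be sketched for an 8-bit image as follows; the transformation T is the standard CDF-based mapping referenced above (the function name is illustrative):

```python
import numpy as np

def equalize(img, L=256):
    """Simple histogram equalization: map each level i through the
    cumulative distribution so the output histogram is as flat as possible."""
    hist = np.bincount(img.ravel(), minlength=L)
    p = hist / img.size                             # p(i) = n_i / n
    cdf = np.cumsum(p)
    T = np.round((L - 1) * cdf).astype(img.dtype)   # transformation T(i)
    return T[img]
```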
- the depth or scene information of the image can be used to significantly improve the contrast of the image.
- depth or scene information is used to focus the improvement of the image in those areas where the objects to be detected by the video analysis system are smaller, achieving that in areas of difficult detection
- the contrast is as large as possible.
- step number 1 of the procedure for simple equalization is modified by calculating the transformation T using only the information of the pixels of the regions where the objects to be detected are smaller. In this way, the contrast in difficult detection areas is maximized even if the easy detection zones (where the objects to be detected appear large) may be penalized.
- ROI region of interest
- This region is not restricted to any specific shape or size to the point that it could well be made up of subregions. Preferably, it is defined as a rectangle
- the sub-image formed by the pixels of the input image I contained in the region r is defined as I_r, and the histogram of this sub-image as p_r(i) = n_{r,i} / n_r,
- where n_{r,i} is the number of pixels of the sub-image I_r whose value equals i, and n_r is the total number of pixels of the sub-image I_r.
- the image enhancement procedure based on simple equalization comprises at least the following steps:
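A sketch of this region-driven variant: the transformation is computed only from the histogram of the sub-image I_r and then applied to the whole image (the function name and the rectangular parameterization of r are assumptions for illustration):

```python
import numpy as np

def roi_equalize(img, r, L=256):
    """Equalization whose transformation T is computed only from the
    histogram of the sub-image I_r, r = (x, y, w, h) being a hypothetical
    rectangle covering the zone where targets appear smallest, and then
    applied to the whole image, maximizing contrast in that zone."""
    x, y, w, h = r
    sub = img[y:y + h, x:x + w]
    p_r = np.bincount(sub.ravel(), minlength=L) / sub.size  # p_r(i) = n_{r,i} / n_r
    T = np.round((L - 1) * np.cumsum(p_r)).astype(img.dtype)
    return T[img]
```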
- fig. 1 illustrates the block diagram of a video analysis or video surveillance system according to the invention
- fig. 2 shows the block diagram of the detection system
- fig. 3 represents the block diagram of a scene calibration system based on a strong calibration procedure
- fig. 4 illustrates the block diagram of a scene calibration system based on a weak calibration procedure
- fig. 5 shows an image to which a scene calibration system has been applied
- fig. 6 represents an equalization procedure.
- fig. 7 illustrates the operation of a hysteresis based switch.
- Fig. 1 illustrates the block diagram of a video analysis or automatic surveillance system (1) according to the invention, comprising at least one image acquisition device (2) from which images of an area of space are obtained, an image digitization system (3) that digitizes the image obtained by said image acquisition device (2), an image processing system (4) and two alternative operating systems, the scene calibration system (5) and the detection system (6).
- the image acquisition devices (2) make it possible to obtain images in the IR or thermal spectrum. Preferably they are fixed cameras with this type of image capture. Also included are image acquisition devices (2) that obtain images in the near-IR spectrum, such as day/night cameras operating in this section of the electromagnetic spectrum during night surveillance.
- the image acquisition device (2), in the case that it already provides a digital image, or the image digitization system (3), can be prepared to transmit the images by any transmission medium (cable, fiber, wireless, etc.).
- the image processing system (4) applies at least one image improvement procedure such that at the output of said system the image is of sufficient quality to detect a particular type of object, preferably a person.
- the video analysis or automatic surveillance system (1) has two alternative operating systems, the detection system (6) and the scene calibration system (5).
- the detection system (6) is applied regularly during the operation of the video analysis or automatic surveillance system (1) since it is the one that allows the detection of specific objects, preferably people.
- the scene calibration system (5) is preferably applied only once, at the beginning of the commissioning of the video analysis or automatic surveillance system (1), and must provide the image with a spatial reference so that the detection system (6) can reference all the calculations it performs during the detection process (calculation of the distance traveled, speed, size of objects, etc.), as well as providing direct or indirect information about the depth of the real scene captured in the image for the image processing system (4).
- the scene calibration system (5) can be any type of system that obtains the depth of the actual scene captured in the image either directly or indirectly.
- the scene calibration system (5) is a system that obtains the variation of the approximate size of the object to be detected for each of the pixel coordinates of the image since it is an indirect way of measuring the depth of the real scene captured in the image.
- Fig. 2 shows the block diagram of the detection system (6) comprising a static scene segmentation system (7), a candidate generation system (8), a classification system (9), a monitoring system (10) and a decision system (20).
- the static scene segmentation system (7) classifies the pixels into at least two types, moving objects and objects belonging to the background of the image.
- the candidate generation system (8) groups the pixels that refer to moving objects and assigns a unique identifier to each moving object of the image. It should be noted that for both the static segmentation system (7) and the candidate generation system (8) it is very relevant to have a sufficiently contrasted image.
- the classification system (9) classifies moving objects according to whether each is an object to be detected, preferably a person and/or vehicle, or not. This system needs, as just mentioned, the scene information obtained during the calibration phase to perform the calculations (speed measurement, size, distance traveled, etc.) necessary to classify the objects; hence, in the block diagram of Fig. 2, a block called calibration appears to reference this need for information.
- a tracking system (10) maintains the temporal coherence of the objects in order to finally, depending on the detection rules introduced by the user, generate the respective intrusion alarms.
- the decision system (20) is responsible for determining, from some rules (hence there is a block called rules to reference this need for information), whether the objects classified by the classification system (9) should be considered as intruders, generating the corresponding alarm in that case.
- Fig. 3 illustrates the block diagram of a scene calibration system (5) based on a strong calibration procedure, such as that described in "Multiple view geometry in computer vision" from Cambridge University Press 2003.
- Said calibration system (5) may comprise at least one parameter insertion system of the image acquisition device (14) and a scene parameter calculation system (15).
- the parameter insertion system of the image acquisition device (14) obtains, directly or through the user, the intrinsic parameters of the image acquisition device (2), such as focal length, pixel size and radial lens distortion, and the extrinsic parameters, such as height and angular orientation.
- the scene parameter calculation system (15) obtains the size of the objects to be detected for each pixel.
- Fig. 4 illustrates the block diagram of a scene calibration system (5) based on a weak calibration procedure, such as that described in patent ES2452790 "Procedure and image analysis system".
- said scene calibration system (5) comprises at least one static scene segmentation system (7), a candidate generation system (8), a tracking system (10), an observed size/position mapping system (11) and a scene parameter estimation system (12).
- the static scene segmentation (7), candidate generation (8) and tracking (10) systems perform the same functions as those described in the detection system (6), and may even be the same.
- the size / position mapping system (11) obtains the variation of the approximate size of the object to be detected for each of the pixel coordinates of the image.
- the scene parameter estimation system (12) allows obtaining other parameters necessary for the detection system (6), such as: speed, size and distance measurement.
- the scene calibration system (5) of the video analysis or automatic surveillance system (1) uses the calibration procedure described in Spanish patent ES2452790 "Procedure and image analysis system" to obtain depth or scene information.
- FIG. 5 shows an image to which the calibration system has been applied and in which the rectangles (16) indicate the approximate size of the object to detect, preferably, people, at the point where the rectangle (16) is drawn.
- the image processing system (4) performs an image enhancement process comprising a processing step in which, through said depth or scene information (entered by the user or obtained through any scene calibration system (5), preferably those using the procedures just described), the contrast of the images captured by the image acquisition device (2) is improved.
- the image processing system (4) comprises at least one filter that adjusts its size in a variable manner at each image position based on the image or scene depth information, preferably, a percentage of the size of the object to be detected that has been estimated by the scene calibration system (5).
- This spatial filtering can be applied to the entire image or only to a region of interest r f .
- the criteria for defining said region of interest r f are preferably:
- the criterion for defining the region of interest r_f will be all those pixels for which the expected object size given by the scene calibration system (5) is in the range [T_min, T_min + ε(T_max − T_min)), ε being a number between 0 and 1.
- a criterion for defining the region of interest (r f ) comprises at least the following steps:
- the regions of interest (r f ) are not restricted to any form or size.
- said region of interest (r f ) is defined as a rectangle (17)
- r_f = [x, y, w_rf, h_rf]
- the relevant content at the level of object detection is usually concentrated in the central part of the image, so the region is preferably defined there.
- Fig. 5 a rectangle (17) is drawn that defines a region of interest for that image. It should be noted that this type of rectangular region is suitable for any scene calibration system (5) since the final result of the calibration is a person size map per pixel.
- the weighting function g(x, y), in the most general case, is a function that takes values between 0 and 1; O(x, y) is the filtered image and I(x, y) is the input image.
- the value of g(x, y) in the center of the region of interest (r_f) is maximum (equal to 1) and, as the values of x or y move away from the center, the value of g(x, y) decreases and therefore the unequalized image gains relevance as the function 1 − g(x, y) grows.
- this entire stage performs a smoothing of the equalization and can also be understood as the introduction of an artificial spotlight that illuminates the area of the region of interest (r_f).
- the image processing system (4) comprises at least one equalization procedure in which the depth or scene information obtained through the scene calibration system (5) is used.
- said equalization procedure is based on a simple equalization procedure centered on a region that is considered of interest (r).
- the criteria for defining said region of interest are preferably:
- the criterion for defining the region of interest (r) will be all those pixels for which the expected object size given by the scene calibration system (5) is in the range [T_min, T_min + ε(T_max − T_min)), ε being a number between 0 and 1 that allows the level of equalization to be regulated.
- a criterion for defining the region of interest (r) comprises at least the following steps:
- the region of interest is defined as the convex zone that surrounds the marked cells.
- the regions of interest (r) are not restricted to any form or size.
- the relevant content at the level of object detection is usually concentrated in the central part of the image, so the region is preferably defined there.
- I_w is the total width of the input image;
- y_hor is the vertical coordinate that delimits the detection limit (preferably, the vertical coordinate from which the expected size of the object to be detected, a person, is smaller than the minimum size that the system needs to detect a person); it can be entered by the user or obtained from the calibration system (5).
- Fig. 5 a rectangle (17) is drawn that defines a region of interest for that image.
- this type of rectangular region is suitable for any scene calibration system (5) since the final result of the calibration is a person size map per pixel.
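The size-range criterion above can be sketched directly on the calibration output. This assumes a per-pixel size map (as in the rectangles of Fig. 5); the function name and the mask representation of the region are illustrative:

```python
import numpy as np

def roi_from_size_map(size_map, eps):
    """Boolean mask of pixels whose expected object size lies in
    [T_min, T_min + eps * (T_max - T_min)), i.e. the zones where targets
    appear smallest; eps in (0, 1] regulates the level of equalization."""
    t_min = float(size_map.min())
    t_max = float(size_map.max())
    upper = t_min + eps * (t_max - t_min)
    return (size_map >= t_min) & (size_map < upper)
```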
- the equalization procedure defines the sub-image formed by the pixels of the input image I contained in the region r as I_r, and the histogram of this sub-image as p_r(i) = n_{r,i} / n_r.
- equalization procedure it comprises at least the following steps:
- an equalization smoothing stage is proposed by a weighted sum of the image equalized with the above method and the unequalized image, as follows:
- I_F(x, y) = g(x, y) · O(x, y) + (1 − g(x, y)) · I(x, y)
- the weighting function g(x, y) can be any type of function whose value in the center of the region of interest is maximum, although preferably it is a two-dimensional Gaussian function centered in the center of the region of interest (r) and with standard deviations depending on the width and height dimensions of the region of interest itself (r).
- the value of g(x, y) in the center of the region of interest (r) is maximum (equal to 1) and, as the values of x or y move away from the center, the value of g(x, y) decreases and therefore the unequalized image gains relevance as the function 1 − g(x, y) grows. Consequently, this entire stage performs a smoothing of the equalization and can also be understood as the introduction of an artificial spotlight that illuminates the area of the region of interest (r).
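The Gaussian-weighted blend can be sketched as follows. Tying the standard deviations to half the region's width and height is an assumption for illustration; the patent only requires that they depend on the region's dimensions:

```python
import numpy as np

def smooth_equalization(I, O, r):
    """Blend I_F = g*O + (1 - g)*I with a 2-D Gaussian g centered on the
    region of interest r = (x, y, w, h). Sigmas are taken as half the ROI
    width/height (an illustrative choice)."""
    x, y, w, h = r
    cx, cy = x + w / 2.0, y + h / 2.0
    sx, sy = max(w / 2.0, 1e-6), max(h / 2.0, 1e-6)
    yy, xx = np.mgrid[0:I.shape[0], 0:I.shape[1]]
    g = np.exp(-((xx - cx) ** 2 / (2 * sx ** 2) + (yy - cy) ** 2 / (2 * sy ** 2)))
    return g * O + (1 - g) * I
```

Near the ROI center the output follows the equalized image O; toward the borders it falls back to the unequalized input I, which is exactly the "artificial spotlight" behavior described above.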
- a basic equalization procedure has been described in which the depth or scene information obtained through the scene calibration system (5) is used. However, two types of equalization procedure can be considered depending on the nature of the input image, local equalization procedure and remote equalization procedure.
- the local equalization procedure is the simplest and is the one shown in Fig. 6. As can be seen, in this type of equalization procedure the image from the image acquisition device (2), or the image resulting from applying the image digitization system (3), is equalized using the depth information available to the scene calibration system (5) of the video analysis or automatic surveillance system (1).
- a step that studies the range of the histogram by calculating the entropy in the region of interest is incorporated into the equalization method according to the invention, a measure that, although indirect, is much more robust than simply studying the width of the histogram.
- the entropy of the image in the region of interest is defined as:
- This metric will be larger the closer the histogram is to a uniform probability distribution and the wider the dynamic range of the image.
- two threshold values, H_HL and H_LH, are set for which the equalization will be activated (on) or deactivated (off).
- the operation of this hysteresis-based switch is illustrated in Fig. 7. Specifically, if the procedure is in the "off" equalization mode and the calculated entropy rises above H_LH, the equalization procedure is activated. On the contrary, if it is in the "on" equalization state and the entropy falls below H_HL, the equalization procedure is deactivated.
- H̄_r(t) = H̄_r(t − 1) · (1 − p) + H_r(t) · p, where p is a very small value between 0 and 1.
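The entropy measure and the hysteresis switch can be sketched together. The patent elides the exact entropy formula; the standard Shannon entropy of the ROI histogram is used here, and the class and function names are illustrative:

```python
import numpy as np

def roi_entropy(sub, L=256):
    """Shannon entropy of the sub-image histogram: H = -sum p(i)*log2 p(i)
    (assumed here as the entropy the text refers to)."""
    p = np.bincount(sub.ravel(), minlength=L) / sub.size
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

class HysteresisSwitch:
    """On/off equalization switch of Fig. 7: off -> on when the smoothed
    entropy rises above H_LH, on -> off when it falls below H_HL."""
    def __init__(self, h_hl, h_lh, p=0.05):
        self.h_hl, self.h_lh, self.p = h_hl, h_lh, p
        self.on = False
        self.h_bar = None  # temporally smoothed entropy

    def update(self, h_t):
        # H_bar(t) = H_bar(t-1)*(1-p) + H(t)*p
        if self.h_bar is None:
            self.h_bar = h_t
        else:
            self.h_bar = self.h_bar * (1 - self.p) + h_t * self.p
        if not self.on and self.h_bar > self.h_lh:
            self.on = True
        elif self.on and self.h_bar < self.h_hl:
            self.on = False
        return self.on
```

Because H_HL < H_LH, entropy values between the two thresholds leave the switch in its current state, avoiding rapid on/off flicker as the scene changes.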
- the remote equalization procedure is based on remotely defining the region of interest, based on the depth or scene information obtained in the scene calibration system (5), for those image acquisition devices (2) or image digitization systems (3) that have software that executes an equalization procedure. That is, the equalization procedure is performed by the image acquisition device (2) or the image digitization system (3), but on the region of interest defined from the depth or scene information obtained in the scene calibration system (5).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Security & Cryptography (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/580,257 US10452922B2 (en) | 2015-06-15 | 2016-06-13 | IR or thermal image enhancement method based on background information for video analysis |
CA2989188A CA2989188A1 (en) | 2015-06-15 | 2016-06-13 | Method for ir or thermal image enchancement based on scene information for video analysis |
GB1721740.7A GB2557035B (en) | 2015-06-15 | 2016-06-13 | Method for IR or thermal image enhancement based on scene information for video analysis |
IL256202A IL256202B (en) | 2015-06-15 | 2017-12-08 | An information-based ir or thermal image enhancement method for video analysis |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ES201530836A ES2563098B1 (es) | 2015-06-15 | 2015-06-15 | Procedimiento de mejora de imagen IR basado en información de escena para videoanálisis |
ESP201530836 | 2015-06-15 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2016203078A2 true WO2016203078A2 (es) | 2016-12-22 |
WO2016203078A3 WO2016203078A3 (es) | 2017-07-27 |
WO2016203078A4 WO2016203078A4 (es) | 2017-09-14 |
Family
ID=55440517
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/ES2016/070443 WO2016203078A2 (es) | 2015-06-15 | 2016-06-13 | Procedimiento de mejora de imagen térmica o ir basado en información de escena para videoanálisis |
Country Status (6)
Country | Link |
---|---|
US (1) | US10452922B2 (es) |
CA (1) | CA2989188A1 (es) |
ES (1) | ES2563098B1 (es) |
GB (1) | GB2557035B (es) |
IL (1) | IL256202B (es) |
WO (1) | WO2016203078A2 (es) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102615070B1 (ko) * | 2016-10-12 | 2023-12-19 | 삼성전자주식회사 | Display apparatus and control method thereof |
CN109658515B (zh) * | 2017-10-11 | 2022-11-04 | 阿里巴巴集团控股有限公司 | Point cloud meshing method, apparatus, device and computer storage medium |
CN109346100A (zh) * | 2018-10-25 | 2019-02-15 | 烟台市奥境数字科技有限公司 | Network transmission method for an interactive digital media teaching system |
US10896492B2 (en) * | 2018-11-09 | 2021-01-19 | Qwake Technologies, Llc | Cognitive load reducing platform having image edge enhancement |
EP3888344B1 (en) * | 2018-11-27 | 2024-05-29 | Google LLC | Methods and systems for colorizing infrared images |
CN109409345B (zh) * | 2018-12-24 | 2020-10-02 | 台州和沃文化传播有限公司 | Intelligent playing performance device |
CN112102207A (zh) * | 2020-10-29 | 2020-12-18 | 北京澎思科技有限公司 | Method, apparatus, electronic device and readable storage medium for determining temperature |
TWI806006B (zh) * | 2021-02-20 | 2023-06-21 | 緯創資通股份有限公司 | Thermal image positioning method and system thereof |
CN113449664B (zh) * | 2021-07-06 | 2023-09-12 | 河南慧联世安信息技术有限公司 | Fire monitoring system and method for fire scenes |
CN114216931B (zh) * | 2021-11-05 | 2024-07-02 | 中国矿业大学 | Method for detecting spontaneous coal combustion on belt conveyors based on infrared images |
CN115035350B (zh) * | 2022-06-29 | 2024-05-07 | 电子科技大学 | Method for detecting small targets against aerial and ground backgrounds based on edge detection enhancement |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100080485A1 (en) * | 2008-09-30 | 2010-04-01 | Liang-Gee Chen | Depth-Based Image Enhancement |
ES2452790A1 (es) * | 2013-03-28 | 2014-04-02 | Davantis Technologies Sl | Image analysis method and system |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5046118A (en) * | 1990-02-06 | 1991-09-03 | Eastman Kodak Company | Tone-scale generation method and apparatus for digital x-ray images |
AU4594796A (en) * | 1994-11-25 | 1996-06-19 | Yuriy Alexandrov | System and method for diagnosis of living tissue diseases |
US7308126B2 (en) * | 1997-08-28 | 2007-12-11 | Icad, Inc. | Use of computer-aided detection system outputs in clinical practice |
US6696945B1 (en) | 2001-10-09 | 2004-02-24 | Diamondback Vision, Inc. | Video tripwire |
US7706576B1 (en) | 2004-12-28 | 2010-04-27 | Avaya Inc. | Dynamic video equalization of images using face-tracking |
US7596241B2 (en) | 2005-06-30 | 2009-09-29 | General Electric Company | System and method for automatic person counting and detection of specific events |
US8139828B2 (en) * | 2005-10-21 | 2012-03-20 | Carestream Health, Inc. | Method for enhanced visualization of medical images |
US7826666B2 (en) * | 2008-02-27 | 2010-11-02 | Honeywell International Inc. | Methods and apparatus for runway segmentation using sensor analysis |
US8339475B2 (en) * | 2008-12-19 | 2012-12-25 | Qualcomm Incorporated | High dynamic range image combining |
US8274565B2 (en) * | 2008-12-31 | 2012-09-25 | Iscon Video Imaging, Inc. | Systems and methods for concealed object detection |
ITTO20090161A1 (it) | 2009-03-03 | 2010-09-04 | Galileo Avionica Spa | Equalization and processing of IR images
US8054290B2 (en) | 2009-05-27 | 2011-11-08 | Microsoft Corporation | Image contrast enhancement in depth sensor |
SE536510C2 (sv) | 2012-02-21 | 2014-01-14 | Flir Systems Ab | Image processing method for detail enhancement and noise reduction
CN103400351B (zh) | 2013-07-30 | 2015-12-23 | 武汉大学 | Low-light image enhancement method and system based on Kinect depth map
US9460499B2 (en) * | 2014-05-30 | 2016-10-04 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Systems and methods for selective enhancement of a region of interest in an image |
2015
- 2015-06-15 ES ES201530836A patent/ES2563098B1/es active Active

2016
- 2016-06-13 GB GB1721740.7A patent/GB2557035B/en active Active
- 2016-06-13 WO PCT/ES2016/070443 patent/WO2016203078A2/es active Application Filing
- 2016-06-13 CA CA2989188A patent/CA2989188A1/en not_active Abandoned
- 2016-06-13 US US15/580,257 patent/US10452922B2/en active Active

2017
- 2017-12-08 IL IL256202A patent/IL256202B/en active IP Right Grant
Non-Patent Citations (8)
Title |
---|
DIEGO ARACENA PIZARRO ET AL: "COMPARISON OF DIGITAL CAMERA CALIBRATION TECHNIQUES", REVISTA FACULTAD DE INGENIERÍA - UNIVERSIDAD DE TARAPACÁ, vol. 13, no. 1, 1 January 2005 (2005-01-01), pages 57 - 67, XP055315519, DOI: 10.4067/S0718-13372005000100007 * |
E JAUREGI ET AL: "Approaches to door identification for robot navigation", 1 March 2010 (2010-03-01), XP055319069, Retrieved from the Internet <URL:http://cdn.intechopen.com/pdfs/10237/InTech-Approaches_to_door_identification_for_robot_navigation.pdf> [retrieved on 20161114] * |
GUANG DENG: "An Entropy Interpretation of the Logarithmic Image Processing Model With Application to Contrast Enhancement", IEEE TRANSACTIONS ON IMAGE PROCESSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 18, no. 5, 1 May 2009 (2009-05-01), pages 1135 - 1140, XP011253695, ISSN: 1057-7149 * |
H SAIPOL HADI ET AL: "A Review of Infrared Spectrum in Human Detection for Surveillance Systems", INTERNATIONAL JOURNAL OF INTERACTIVE DIGITAL MEDIA, 1 January 2013 (2013-01-01), XP055315518, Retrieved from the Internet <URL:http://magicx.my/ijidm/wp-content/uploads/2013-1-3-03-saipol.pdf> [retrieved on 20161111] * |
JIA-GUU LEU: "IMAGE CONTRAST ENHANCEMENT BASED ON THE INTENSITIES OF EDGE PIXELS", CVGIP GRAPHICAL MODELS AND IMAGE PROCESSING, ACADEMIC PRESS, DULUTH, MA, US, vol. 54, no. 6, 1 November 1992 (1992-11-01), pages 497 - 506, XP000332318, ISSN: 1077-3169, DOI: 10.1016/1049-9652(92)90069-A * |
RENÉ G AARNINK ET AL: "A preprocessing algorithm for edge detection with multiple scales of resolution", EUROPEAN JOURNAL OF ULTRASOUND, 1 January 1997 (1997-01-01), XP055348937, Retrieved from the Internet <URL:http://www.sciencedirect.com/science/article/pii/S0929826696002091/pdf?md5=dca10196eae47bb95efb4bc3e1c24277&pid=1-s2.0-S0929826696002091-main.pdf> [retrieved on 20170222] * |
TALI TREIBITZ ET AL: "Resolution loss without imaging blur", JOURNAL OF THE OPTICAL SOCIETY OF AMERICA A, vol. 29, no. 8, 1 August 2012 (2012-08-01), US, pages 1516, XP055348823, ISSN: 1084-7529, DOI: 10.1364/JOSAA.29.001516 * |
XIANGZHI BAI ET AL: "Infrared image enhancement through contrast enhancement by using multiscale new top-hat transform", INFRARED PHYSICS AND TECHNOLOGY, vol. 54, no. 2, 22 December 2010 (2010-12-22), pages 61 - 69, XP028173496, ISSN: 1350-4495, [retrieved on 20101222], DOI: 10.1016/J.INFRARED.2010.12.001 * |
Also Published As
Publication number | Publication date |
---|---|
IL256202B (en) | 2021-05-31 |
GB2557035B (en) | 2021-05-26 |
US20180225522A1 (en) | 2018-08-09 |
WO2016203078A4 (es) | 2017-09-14 |
CA2989188A1 (en) | 2016-12-22 |
GB201721740D0 (en) | 2018-02-07 |
ES2563098B1 (es) | 2016-11-29 |
GB2557035A (en) | 2018-06-13 |
IL256202A (en) | 2018-02-28 |
ES2563098A1 (es) | 2016-03-10 |
US10452922B2 (en) | 2019-10-22 |
WO2016203078A3 (es) | 2017-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
ES2563098B1 (es) | IR image enhancement method based on scene information for video analysis | |
US10580161B2 (en) | Imaging system, object detection device and method of operating same | |
US10032283B2 (en) | Modification of at least one parameter used by a video processing algorithm for monitoring of a scene | |
CN108549874B (zh) | 一种目标检测方法、设备及计算机可读存储介质 | |
US7912252B2 (en) | Time-of-flight sensor-assisted iris capture system and method | |
US9619708B2 (en) | Method of detecting a main subject in an image | |
US10748294B2 (en) | Method, system, and computer-readable recording medium for image object tracking | |
CN108010105B (zh) | 图像处理设备、图像处理方法和存储介质 | |
JP2013089252A (ja) | 映像処理方法及び装置 | |
JP6351243B2 (ja) | 画像処理装置、画像処理方法 | |
KR101339026B1 (ko) | 열화상 카메라 가시성 개선 방법 및 장치 | |
CN110659547B (zh) | 物体识别方法、装置、车辆和计算机可读存储介质 | |
CN109359577B (zh) | 一种基于机器学习的复杂背景下人数检测系统 | |
JP2013152669A (ja) | 画像監視装置 | |
US11657592B2 (en) | Systems and methods for object recognition | |
JP2009025910A (ja) | 障害物検出装置、障害物検出システム及び障害物検出方法 | |
KR101341243B1 (ko) | 기상 현상으로 인해 훼손된 영상을 복원하는 장치 및 방법 | |
WO2016063595A1 (ja) | 画像処理装置、画像処理方法及びプログラム | |
CN103337076B (zh) | 视频监控目标出现范围确定方法和装置 | |
Gundawar et al. | Improved single image dehazing by fusion | |
Zeng et al. | Detection of salient object using pixel blurriness | |
US12096163B2 (en) | Device and a method of encoding images including a privacy filter | |
CN112668370B (zh) | 一种基于深度图像的生物特征活体识别检测方法、装置 | |
ES2452790B1 (es) | Image analysis method and system | |
Chaudhuri et al. | Frequency and spatial domains adaptive-based enhancement technique for thermal infrared images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16758226 Country of ref document: EP Kind code of ref document: A2 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15580257 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2989188 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 201721740 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20160613 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16758226 Country of ref document: EP Kind code of ref document: A2 |