JP2001333420A - Image supervisory method and device - Google Patents
Info
- Publication number
- JP2001333420A (application JP2000154399A)
- Authority
- JP
- Japan
- Prior art keywords
- image
- illumination
- luminance value
- monitoring
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Input (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Editing Of Facsimile Originals (AREA)
- Burglar Alarm Systems (AREA)
- Stroboscope Apparatuses (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Description
[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image monitoring device that performs continuous day-and-night monitoring with low current consumption by capturing, at image input time, a normal-exposure image taken with the LED illumination device at increased illuminance and a high-speed-exposure image taken with the LED illumination device at reduced illuminance, and combining the two into a single image to be processed.
[0002]

2. Description of the Related Art

An imaging apparatus that obtains a wide-dynamic-range image of a moving object is known (Japanese Patent Laid-Open No. 7-75026). In that apparatus, given a first image captured with a first exposure time and a second image captured with a second exposure time, two weighting functions (f and g) are generated according to the luminance values of the second image. The per-pixel luminance values of the first image are weighted by f, those of the second image are weighted by g, and the weighted values are added; the composite image formed from the resulting wide-dynamic-range luminance values is used as the image to be processed.
[0003]

SUMMARY OF THE INVENTION

As described above, a device that synthesizes an image by applying different weights to a first image captured with a first exposure time and a second image captured with a second exposure time is effective only when the monitored scene is guaranteed an illuminance above a certain level. In scenes that light does not reach, at night or even in the daytime, an illumination device must be turned on, yet such a device gives no consideration to controlling the on/off state or the intensity of that illumination. Consequently, in continuous day-and-night monitoring, lighting costs rise and the service life of the illumination device is shortened.
[0004] An object of the present invention is to overcome the above problems of the prior art and to provide an image monitoring device capable of continuous day-and-night monitoring with low current consumption by inputting an image taken at increased illuminance and an image taken at reduced illuminance, and combining them into an image to be processed.
[0005]

MEANS FOR SOLVING THE PROBLEM

To achieve the above object, the present invention is characterized in that, at image input time, a first image taken with the illumination device at increased illuminance and a second image taken with the illumination device dimmed or turned off are captured and combined into an image to be processed.
[0006] In the image monitoring method of the present invention, when a surveillance scene is photographed with an ITV camera or the like equipped with an illumination device, a normal-exposure image is first input as the first image with the illuminance of the illumination device increased; then a high-speed-exposure image, offset by one field, is captured as the second image with the illuminance decreased. The first and second images are added using weighting coefficients that depend on the per-pixel luminance values to create a composite image. Here, the illumination for the first image is set to an illuminance at which objects in dark surroundings, such as at night, can be recognized, while the illumination for the second image is set to an illuminance at which the regions that saturate in the first image can be recognized, or is turned off.
[0007] The weighting coefficient for each pixel is determined from the luminance value of the corresponding pixel of the first image: where the first image's luminance value is small, the weight given to the first image's luminance value is made large and that given to the second image's small; where the first image's luminance value is large (saturated), the weight given to the first image is made small and that given to the second image large.
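The per-pixel weighting just described can be sketched as follows. This is an illustrative implementation, not the patent's own code; the linear weighting law in `normal_weight` and its endpoint value `w_min` are assumptions.

```python
def normal_weight(n_lum, w_min=0.7):
    """Weight for the normal-exposure pixel: 1.0 at luminance 0,
    falling linearly to w_min at luminance 255 (assumed linear law)."""
    return 1.0 - (1.0 - w_min) * (n_lum / 255.0)

def composite_pixel(n_lum, h_lum):
    """Blend one pixel pair: dark normal-exposure pixels keep the normal
    image, saturated ones lean on the high-speed-exposure image."""
    a = normal_weight(n_lum)
    return a * n_lum + (1.0 - a) * h_lum

# A dark pixel is taken almost entirely from the normal-exposure image;
# a saturated pixel draws a large share from the high-speed image.
print(composite_pixel(10, 0))
print(composite_pixel(255, 40))
```

The key property is that the weight is keyed to the *first* image's luminance only, so the decision "is this region saturated?" is made on the strongly illuminated frame.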
[0008] Because the first image is taken with the illumination device at increased illuminance, regions that strongly reflect the projected light tend to saturate, and an object present in such a region may not be recognizable in the first image; the remaining (unsaturated) regions, however, have recognizable brightness. In the second image, taken with the illumination dimmed or off, the strongly reflecting regions take on appropriate luminance values and objects there can be recognized, while the remaining regions have low luminance and are unsuited to recognition processing.
[0009] Next, an inter-frame difference is computed on this composite image as the image to be processed. A change region of the difference image, or an integrated region of several such regions, serves as the tracking-source region and is used as a template pattern; taking as the search region a region at substantially the same position in the other frame, gray-scale pattern matching by normalized correlation is performed. Change regions whose similarity exceeds a predetermined value are excluded as disturbances, and change regions of low similarity are detected as objects to be monitored.
[0010] An image monitoring apparatus to which the method of the present invention is applied is a device that comprises imaging means, such as an ITV camera with an illumination device, for capturing images of a monitored area, and that processes the captured images to monitor objects entering the area. It is characterized by comprising: image input control means that switches the illumination state of the illumination device between strong and weak according to the field of the input image; image input means that periodically captures a first image under the strong illumination state and a second image under the weak illumination state; image synthesis means that combines the captured first and second images into an image to be processed; and moving object detection means that detects the object by image processing based on the inter-frame images to be processed. The image input control means may also control the exposure state of the imaging means together with the illumination state.
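The four claimed means map naturally onto a small control loop. The skeleton below is hypothetical (class and method names are not from the patent) and stands in for the image input control, image input, image synthesis, and detection means:

```python
# Hypothetical skeleton mirroring the claimed means; names and the
# alternating strong/weak illumination logic are illustrative only.
class ImageMonitor:
    def __init__(self, camera, illuminator, detector):
        self.camera = camera          # imaging means (e.g. ITV camera)
        self.illuminator = illuminator
        self.detector = detector      # moving object detection means

    def capture_pair(self):
        """Image input control + input means: strong illumination for
        field i, weak or no illumination for field i+1."""
        self.illuminator.set_strong()
        first = self.camera.grab(exposure="normal")
        self.illuminator.set_weak()
        second = self.camera.grab(exposure="high_speed")
        return first, second

    def step(self, combine):
        """One monitoring cycle: capture, synthesize, detect."""
        first, second = self.capture_pair()
        target = combine(first, second)   # image synthesis means
        return self.detector(target)
```

The `combine` callable corresponds to the per-pixel weighted addition; the detector corresponds to the frame-difference and pattern-matching stages described later.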
[0011] According to the present invention, taking the luminance values of the normal-exposure image as the reference, the addition gives greater weight to the normal-exposure image's luminance values in its unsaturated regions and greater weight to the high-speed-exposure image's luminance values in the saturated regions of the normal-exposure image, so a composite image with a wide dynamic range is obtained. Furthermore, because the illuminance is lowered or the illumination turned off during high-speed exposure, illumination power is reduced compared with keeping the illuminance high at all times, and the service life of the illumination device is extended.
[0012] The present invention described above is well suited to continuous day-and-night monitoring. Its operation is explained below using the scene in which object recognition is most difficult: a dark scene, such as at night, in which the illumination light is reflected by a highly reflective object such as a mirror or a white wall and saturates the image.
[0013] As shown in FIG. 2, a surveillance scene is photographed with an ITV camera equipped with an illumination device. At input timing field i (610), the exposure state is set to normal exposure 611, the illuminance 612 of the illumination device is increased, and a first image 613 is input. At the next input timing, field i+1 (615), the exposure state is set to high-speed exposure 616, the illuminance 617 of the illumination device is weakened (or the illumination turned off), and a second image 618 is input. Using the captured first image 613 and second image 618, a composite image 630 is created by addition with weighting coefficients that depend on the per-pixel luminance values. The weighting coefficient for each pixel is determined from the luminance value of the corresponding pixel of the first image 613: where that luminance value is small, the weight of the first image 613 is made large and that of the second image 618 small; where it is large (saturated), the weight of the first image 613 is made small and that of the second image 618 large.
[0014] In this example, because the first image 613 is taken under strong illumination and normal exposure, the region 614 receiving light reflected from a mirror or the like is saturated, and even if an object is present in region 614 it cannot be identified in the first image 613. The other (unsaturated) regions, however, attain appropriate luminance values thanks to the strong illumination and the normal exposure at an ordinary shutter speed. Conversely, in the second image 618, taken with the illumination off or weak and with a fast shutter speed, even the reflected-light region 614 attains an appropriate luminance value, so an object 619 present in region 614 can be recognized; the other regions, however, are dark with low luminance.
[0015] Thus, under strong illumination and normal exposure, the region receiving light reflected from a mirror or the like saturates, whereas under weak illumination and high-speed exposure even that region is unlikely to saturate. By controlling the exposure according to the illumination state and combining the resulting normal-exposure first image and high-speed-exposure second image at a predetermined ratio, a high-sensitivity, wide-dynamic-range image is obtained over the whole scene, including both the dark nighttime areas and the regions receiving reflected light; accurate monitoring of moving objects therefore becomes possible even where strong stray light partially strikes a dark monitored area. In a dark scene with no light at night, a saturated region rarely appears in the normal-exposure image taken with the illumination on; the high-speed-exposure image taken with the illumination weak or off is then rarely needed, and its addition ratio can be kept small.
[0016]

DESCRIPTION OF THE PREFERRED EMBODIMENTS

An embodiment of the present invention is described below with reference to the drawings. FIG. 1 is a block diagram of an image monitoring apparatus according to one embodiment of the present invention. The image monitoring apparatus of this embodiment consists of a monitoring device main body 10 and a display device 7000; alternatively, it may consist of a monitoring device main body 20, an ITV camera 100, an illumination device 200, and the display device 7000. The monitoring device main body 20 may be realized by mounting an image processing board in a personal computer. The monitoring device main body 10 may be realized as an intelligent camera with illumination, integrating the LED illumination, the camera, the image processing board, and the personal computer.
[0017] In this embodiment, a surveillance scene is first photographed with the ITV camera 100 equipped with the LED illumination device 200. For the first image, the image input control unit 500 increases the illuminance of the illumination device 200 and sets a normal exposure state at an ordinary shutter speed; for the second image, offset from the first by one field, it weakens the illuminance of the illumination device 200 and sets a high-speed exposure state with a fast shutter speed. The image input unit 1000 inputs the first image in field i and then the second image in field i+1. Using the captured first and second images, the image synthesis unit 2000 adds them with weighting coefficients that depend on the per-pixel luminance values to synthesize a wide-dynamic-range image, which becomes the image to be processed.
[0018] The difference image creation unit 3000 takes the wide-dynamic-range image synthesized by the image synthesis unit 2000 as the image to be processed and performs per-pixel difference processing between its frames. The change region extraction unit 4000 binarizes the difference image and extracts change regions. The object detection unit 5000 merges neighboring change regions extracted by the change region extraction unit 4000 into integrated regions and, for each integrated region, performs object determination by gray-scale pattern matching using normalized correlation between the immediately preceding frame and the current frame. The display control unit 6000 stores the detection images of detected objects and, on each display request from a monitoring center or the like (not shown), displays information such as the detection position, detection date, and time on the display device 7000 in real time.
[0019] FIG. 3 is a block diagram showing an embodiment of the image input control unit 500. When the image input timing setting unit 510 sets the image input timing to field i, the exposure state setting unit 520 sets the exposure state to normal exposure, the illumination intensity setting unit 530 increases the output current to raise the illuminance of the illumination device 200, and the image input unit 1000 performs its processing. Next, when the image input timing setting unit 510 sets field i+1, the exposure state setting unit 520 sets the exposure state to high-speed exposure, the illumination intensity setting unit 530 reduces the output current to lower the illuminance of the LED or other illumination device, and the image input unit 1000 again performs its processing. A normal-exposure image under strong LED illumination and, offset by one field, a high-speed-exposure image under weak LED illumination are thereby captured.
[0020] Note that the image input control unit 500 of this embodiment controls only the illumination intensity; exposure control is left to the camera's ordinary automatic exposure, under which the normal-exposure period is about 1/30 second and the high-speed-exposure period is 1/500 to 1/1000 second.
[0021] FIG. 4 is a block diagram showing an embodiment of the interior of the image input unit 1000. When the image input control unit 500 increases the illuminance of the illumination device 200 in field i and sets the normal exposure state, the A/D conversion unit 1010 performs A/D conversion and the normal-exposure image input unit 1020 captures the normal-exposure image. Conversely, when the image input control unit 500 weakens the illuminance of the illumination device 200 in field i+1 and sets the high-speed exposure state, the A/D conversion unit 1010 performs A/D conversion and the high-speed-exposure image input unit 1030 captures the high-speed-exposure image.
[0022] FIG. 5 is a time chart showing image capture and composite image processing. The example described here is a monitoring device equipped with a controllable illumination device and capable of continuous 24-hour monitoring. On the time axis 1500t, one image processing cycle 1510 (from image capture through object detection processing) takes k seconds; for online processing, k is 100 ms to several seconds. The illumination is turned on at this k-second interval to capture images and is turned off at all other times.
[0023] The captured images comprise a normal-exposure image with a period of about 1/30 second and a high-speed-exposure image with a period of 1/500 to 1/1000 second. When the normal-exposure image contains a region saturated by external light, the high-speed-exposure image merely needs enough illuminance to render that region properly, so illumination is usually unnecessary for it. The illumination is therefore turned on only during capture of the normal image (1540) and turned off afterward. The high-speed-exposure image used for the composite is captured at the timing (marked by circles) at which the afterglow of the illumination has died away.
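The power saving from gating the illumination to the normal-exposure capture follows from simple duty-cycle arithmetic. The LED current figure below is an assumed example, not a value from the patent:

```python
# Rough illustration (assumed numbers) of the low-power claim: with a
# k-second processing cycle and the lamp on only for the ~1/30 s
# normal-exposure capture, the duty cycle and hence the average LED
# current drop sharply compared with continuous illumination.
def average_led_current(i_on_ma, cycle_s, on_s=1.0 / 30):
    duty = on_s / cycle_s
    return i_on_ma * duty

always_on = average_led_current(300.0, cycle_s=1.0, on_s=1.0)  # lamp never off
gated = average_led_current(300.0, cycle_s=1.0)                # on 1/30 s per 1 s cycle
print(always_on, gated)  # 300 mA vs 10 mA average
```

With k = 1 s the duty cycle is about 3%, which is also consistent with the claim that the illumination device's service life is extended.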
[0024] FIG. 6 is a block diagram showing an embodiment of the image synthesis unit 2000. The pixel luminance value calculation unit 2100 first takes in the normal-exposure image and calculates the luminance value of each pixel. The normal-exposure-image composition ratio calculation unit 2200 computes the normal-exposure ratio according to the calculated luminance value of pixel (i, j): a larger ratio for low luminance, a smaller ratio for high luminance. The high-speed-exposure-image composition ratio calculation unit 2300, taking the total composition ratio as 100, computes the high-speed-exposure ratio as the remainder after subtracting the ratio computed by unit 2200 from 100; that is, a smaller ratio where the luminance value of pixel (i, j) calculated by unit 2100 is low and a larger ratio where it is high. The image addition unit 2400 applies the ratio from unit 2200 and the ratio from unit 2300 to the per-pixel luminance values calculated by unit 2100 and adds the results pixel by pixel to create the composite image.
[0025] FIG. 7 is a flowchart showing an example of the processing procedure of the image synthesis unit 2000. First, the ratio coefficient α of the normal-exposure image is set (s101). Next, the luminance values of the normal-exposure image and the high-speed-exposure image are calculated for pixel (i, j) (s102). The luminance value of pixel (i, j) of the normal image is multiplied by the ratio coefficient α to obtain the normal-image ratio luminance value (Nnode) (s103), and the luminance value of pixel (i, j) of the high-speed-exposure image is multiplied by the high-speed ratio coefficient (1 − α) to obtain the high-speed-image ratio luminance value (Hnode) (s104). Nnode and Hnode are then added to give the luminance value of pixel (i, j) of the composite image (s105), and the above processing is repeated for all pixels (s106). Note that in step s105, to reduce rounding error, Nnode and Hnode are each multiplied by a coefficient of 2 to the power n before the addition, and the sum is divided by the same coefficient afterward.
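The s101–s106 loop, including the error-reducing fixed-point step of s105, can be sketched as follows. The shift width (2 to the power 8) and the value of α are assumed for illustration:

```python
# Sketch of the FIG. 7 loop (s101-s106). SHIFT = 8 (i.e. 2^8) is an
# assumed width for the error-reducing fixed-point step of s105.
SHIFT = 8
SCALE = 1 << SHIFT

def synthesize(normal, high, alpha=0.7):
    """Per-pixel weighted sum in integer arithmetic: scale both
    weighted terms by 2^n, add, then shift back down (s105)."""
    a = int(alpha * SCALE)                      # fixed-point ratio coefficient (s101)
    out = []
    for n_lum, h_lum in zip(normal, high):      # s106: repeat for all pixels
        n_part = n_lum * a                      # s103: Nnode, scaled by 2^n
        h_part = h_lum * (SCALE - a)            # s104: Hnode, scaled by 2^n
        out.append((n_part + h_part) >> SHIFT)  # s105: add, divide by 2^n
    return out

print(synthesize([0, 128, 255], [200, 200, 200]))
```

Working in scaled integers keeps the per-pixel blend in fast integer operations while losing at most one least-significant bit to truncation.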
[0026] FIG. 8 is an explanatory diagram showing an example of the composition ratio. In this example, the high-speed-exposure composition ratio 2420 is a straight line 2430 as a function of the normal-exposure-image luminance value 2410, and the gradient of this line gives the ratio coefficient 1 − α. The normal-exposure composition ratio is 100% minus the high-speed-exposure composition ratio, i.e. the value α, so the two always sum to 100%. Here, when the luminance value of the normal-exposure image (256 gradations) is at its minimum (0), the normal-exposure share is 100%; at its maximum (255) it is about 70%, the remainder being the high-speed-exposure share. Thus the normal-exposure ratio is large where the pixel luminance of the reference normal-exposure image is low, and the high-speed-exposure ratio is large where it is high, so a sharp image with a wide dynamic range can be obtained.
[0027] FIG. 9 is an explanatory diagram showing one principle by which the image synthesis unit 2000 obtains a wide-dynamic-range image. The example uses a normal-exposure image 2460 and a high-speed-exposure image 2470, with 8-bit output. In the normal-exposure image 2460, the range Isn 2465 at which normal exposure saturates corresponds to the luminance value 255. For the high-speed-exposure image 2470, when the high-speed-exposure range is expanded n-fold, the saturation range becomes Ish 2475, n times the normal saturation range, and the luminance value of range 2475 is likewise 255. Compared with the normal-exposure image, the high-speed-exposure image has lower sensitivity in dark regions and higher sensitivity in bright regions.
[0028] Here, let α (0 ≤ α ≤ 1) be the ratio coefficient of normal exposure, Nnode the luminance value of pixel (i, j) of the normal-exposure image, and Hnode the luminance value of pixel (i, j) of the high-speed-exposure image. The luminance value Bnode of pixel (i, j) of the composite image can then be calculated by Equation 1.
[0029]

[Equation 1] (reconstructed from the weighting described in paragraphs [0007] and [0025]; the original equation image is not available)

Bnode = α · Nnode + (1 − α) · Hnode
[0030] From Equation 1, the saturation range Isb 2485 of the composite image lies between the normal saturation range Isn 2465 and the high-speed saturation range Ish 2475, which shows that a composite image with a wide dynamic range is obtained.
[0031] FIG. 10 is a block diagram showing an embodiment of the difference image creation unit 3000. The noise reduction unit 3100 removes noise from the image to be processed synthesized by the image synthesis unit 2000 by smoothing, median filtering, and the like; the inter-pixel difference unit 3200 then performs an inter-frame difference for each pixel of the image to be processed to create the difference image.
[0032] FIG. 11 is a block diagram showing an embodiment of the change region extraction unit 4000. The luminance frequency distribution calculation unit 4100 calculates the luminance histogram of the difference image created by the difference image creation unit 3000 from the per-pixel inter-frame differences. The binarization threshold calculation unit 4500 derives a binarization threshold for the difference image from that histogram by a general-purpose method such as the mode method. The binary image creation unit 4600 binarizes the difference image with the calculated threshold to create a binary image. The small area removal unit 4800 filters out small areas unsuitable as detection targets and applies dilation and erosion to remove noise, creating the binary image of the change regions.
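The difference-binarize-filter chain of FIG. 10 and FIG. 11 can be sketched minimally as follows. A fixed threshold stands in for the histogram-based mode method, and connected-component labeling stands in for the filtering, dilation, and erosion; both substitutions are simplifications, not the patent's procedure:

```python
# Minimal sketch of FIG. 10/11: inter-frame difference, binarization,
# and small-area removal. A fixed threshold replaces the mode method,
# and change regions are 4-connected components of the binary image.
def change_regions(prev, curr, thresh=30, min_area=2):
    h, w = len(prev), len(prev[0])
    binary = [[1 if abs(curr[y][x] - prev[y][x]) > thresh else 0
               for x in range(w)] for y in range(h)]
    seen, regions = set(), []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and (y, x) not in seen:
                stack, comp = [(y, x)], []
                seen.add((y, x))
                while stack:                     # flood-fill one component
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1,cx),(cy+1,cx),(cy,cx-1),(cy,cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] \
                                and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(comp) >= min_area:        # small-area removal
                    regions.append(comp)
    return regions
```

Each returned region is a list of pixel coordinates; isolated single-pixel differences are discarded as noise, which plays the role of the small area removal unit 4800.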
[0033] FIG. 12 is a block diagram showing an embodiment of the object detection unit 5000. The extraction region integration unit 5100 creates a label image from the binary image produced by the change region extraction unit 4000, judges whether the distance between labels is within a predetermined distance, judges whether the circumscribed rectangle of each group of labels within that distance is within a predetermined size, and integrates the label groups within the predetermined size. The circumscribed rectangle calculation unit 5300 calculates the circumscribed rectangular region of each integrated label group. The circumscribed rectangular region similarity calculation unit 5500 takes the circumscribed rectangular region in one of the two images (the immediately preceding frame and the current frame) as a template pattern and performs normalized correlation against the other image at substantially the same position as the circumscribed rectangular region (the search extension being ±1 pixel or less). The similarity in the normalized correlation is calculated by the normalized correlation processing of Equation 2.
[0034]
(Equation 2)
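Expression 2 itself is rendered as an image in the original. The surrounding description matches standard zero-mean normalized cross-correlation, so a sketch under that assumption is:

```python
from math import sqrt

def normalized_correlation(tmpl, img):
    """Zero-mean normalized cross-correlation between two equal-size
    grayscale patches; returns a similarity in [-1, 1]. Assumes this
    is what the patent's Expression 2 computes (the formula is an
    image in the original)."""
    t = [v for row in tmpl for v in row]
    g = [v for row in img for v in row]
    mt = sum(t) / len(t)          # mean brightness of template
    mg = sum(g) / len(g)          # mean brightness of target patch
    num = sum((a - mt) * (b - mg) for a, b in zip(t, g))
    den = sqrt(sum((a - mt) ** 2 for a in t) *
               sum((b - mg) ** 2 for b in g))
    return num / den if den else 0.0
```

Subtracting the means normalizes the brightness of the two patches, which is what the text means by obtaining the brightness difference after normalization.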
[0035] That is, the brightness of the registered template pattern and that of the target image are normalized and the brightness difference is obtained (an application of the grayscale pattern matching processing of the 3F-8 license plate recognition system, IPSJ 49th National Convention, late 1994); the operation of Expression 2 is executed over the entire matching region to calculate the similarity. The object determination unit 5700 determines that a moving object is present when the calculated similarity is equal to or less than a predetermined value, and otherwise excludes the change as a disturbance.
[0036] FIG. 13 is a flowchart showing an example of the processing procedure by which the extraction region integration unit 5100 integrates extraction regions. Step s210 labels the binary image; when labels 1 to Ln have been assigned, Ln is the total number of labels. To iterate over the Ln labels, step s220 initializes the label number (i) to be processed. Next, step s230 increments the label number (i = i + 1), and step s240 checks whether all labels have been processed. If not, the processing from step s250 onward is performed on the i-th label.
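The labeling of step s210 can be sketched as a flood-fill connected-component pass. The 4-connectivity used here is an assumption; the patent does not specify the connectivity.

```python
def label_binary(img):
    """4-connected component labeling of a binary image by flood fill.
    Returns a label image (0 = background, 1..Ln = components) and
    the total label count Ln, as in step s210."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    ln = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not labels[y][x]:
                ln += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    # bounds and visited checks happen on pop
                    if (0 <= cy < h and 0 <= cx < w
                            and img[cy][cx] and not labels[cy][cx]):
                        labels[cy][cx] = ln
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels, ln
```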
[0037] First, in step s250, the centroid (center) coordinates of the i-th label are calculated. Step s260 checks whether the X- and Y-direction distances between the centroid of the i-th label and the centroids of the (i+1)-th to Ln-th labels are within the allowable range. If not, the procedure returns to step s230. If within the allowable range, step s270 checks whether the size of the circumscribed rectangle of the labels within the allowable range is within a predetermined size range. If the circumscribed rectangle is not within the allowable range, the procedure returns to step s230 to generate a new integrated region. If it is within the allowable range, step s280 adds all labels within the allowable range to the i-th label and deletes the added labels. Step s290 sorts the i-th to Ln-th labels in ascending order, and the procedure returns to step s230 to perform the integrated region generation process from the new i-th label. In this way, label groups within the allowable distance and circumscribed rectangle ranges are integrated one after another.
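The merging loop of steps s250 through s290 can be condensed into a greedy pass like the following. Only the centroid-distance test is shown; the rectangle-size test of step s270 is omitted for brevity, and the per-axis limits `max_dx`/`max_dy` are hypothetical parameter names for the allowable ranges.

```python
def integrate_labels(centroids, max_dx, max_dy):
    """Greedily merge labels: any label whose centroid lies within
    (max_dx, max_dy) of an earlier unmerged label is absorbed into
    that label's group, mirroring steps s250-s290 (distance test
    only; the s270 rectangle-size test is omitted)."""
    groups = []                      # each group: list of label indices
    assigned = [False] * len(centroids)
    for i, (xi, yi) in enumerate(centroids):
        if assigned[i]:
            continue
        group = [i]
        assigned[i] = True
        for j in range(i + 1, len(centroids)):
            xj, yj = centroids[j]
            if (not assigned[j]
                    and abs(xj - xi) <= max_dx
                    and abs(yj - yi) <= max_dy):
                group.append(j)      # step s280: absorb nearby label
                assigned[j] = True
        groups.append(group)
    return groups
```

For a vertically elongated target such as a person, `max_dy` would be set larger than `max_dx`, as the following paragraph explains.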
[0038] Here, the allowable X- and Y-direction distances are set, for example, by making the Y-direction distance larger than the X-direction distance when the object is a vertically elongated person, or by making the X-direction distance larger than the Y-direction distance when the object is horizontally elongated. The same applies to the allowable range of the circumscribed rectangle; in either case, the ranges may be set appropriately for the detection target according to how far the regions are to be integrated.
[0039] FIG. 14 is an explanatory diagram showing an example of the procedure by which the circumscribed rectangle calculation unit 5300 calculates a circumscribed rectangle. Since the label group integrated by the extraction region integration unit 5100 shares the same label number, extracting only that group yields label a 5310, label b 5320, label c 5330, label d 5340, label e 5350, label f 5360, and label g 5370. The rectangle that circumscribes these labels is calculated as the circumscribed rectangle 5400.
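Computing the circumscribed rectangle of an integrated label group reduces to an axis-aligned bounding box over the group's pixel coordinates:

```python
def circumscribed_rectangle(points):
    """Axis-aligned bounding box (x_min, y_min, x_max, y_max) over all
    pixel coordinates belonging to one integrated label group, as
    computed by unit 5300."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs), min(ys), max(xs), max(ys))
```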
[0040] FIG. 15 is an explanatory diagram showing an example of determining a moving object from the similarity produced by the normalized correlation processing of the circumscribed rectangular area similarity calculation unit 5500. The circumscribed rectangular area 5400 is a change area extracted by the inter-frame difference, that is, the change area (moving object area) in the current frame image. In the immediately preceding frame, the moving object is not yet present inside the circumscribed rectangle 5400. In other words, the circumscribed rectangular area 5400 in the immediately preceding frame 5510 is a background scene with no moving object, while the circumscribed rectangular area 5400 in the current frame image 5530 is a scene containing the moving object. When the circumscribed rectangular areas 5400 of the immediately preceding frame image and the current frame image are compared by correlation, most of the background is hidden, so the similarity becomes considerably small. This makes it possible to determine accurately whether a moving object is present.
[0041] FIG. 16 is a flowchart showing an example of the processing procedure of the circumscribed rectangular area similarity calculation unit 5500. Step s310 registers, as a template pattern, the grayscale image of the circumscribed rectangular area calculated for the integrated region by the circumscribed rectangle calculation unit on the current input image. Step s320 sets, on the immediately preceding frame image, a circumscribed rectangular area at substantially the same position as in step s310 as the pattern matching area. Step s330 performs grayscale pattern matching against the pattern registered in step s310 and calculates the similarity according to Expression 2. Step s340 determines whether the calculated similarity is below a predetermined value. If it is below the predetermined value, step s350 determines that the region is a moving object because it is not similar to the background; otherwise, step s360 determines that the region is a disturbance because it is similar to the background. Step s370 determines whether all the generated circumscribed rectangular areas have been processed; if not, the procedure returns to step s310.
[0042] FIG. 17 is an explanatory diagram showing an example of a detection result displayed on the display device 7000. When a person 6200 is detected, the display control unit 6000 uses the stored circumscribed rectangle position of the detected object to display a circumscribed rectangular frame 6300 on the display device 7000, superimposed on the current input image. In FIG. 17, when the display control unit 6000 displays the detected object on the display device 7000, any display method that clearly indicates the detected object may be used. By displaying the result on the display device 7000 in this manner, the observer can grasp, for example in the case of person monitoring, the person and the detection information online on the display device 7000. If the display device 7000 is at a remote location, the detection of a person may be reported to a videophone or the like via standard communication means such as RS-232C and displayed on the remote display device.
[0043] [Effects of the Invention] According to the present invention, when a monitoring scene is imaged with an ITV camera equipped with an illumination device, a first image is captured in a normal exposure state with the illumination strengthened at the input timing of field i, and a second image is captured in a high-speed exposure state with the illumination weakened or turned off at the input timing of field i+1. The two images are added with weighting coefficients that depend on the luminance value of each pixel to create a composite image. A monitoring image with a wide dynamic range can therefore be created, and high-sensitivity monitoring remains possible even when strong noise light such as a headlight strikes a dark scene being monitored at night. Furthermore, since illumination is applied only while the normal exposure image is being captured, current consumption is reduced and the life of the illumination device is extended.
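The luminance-dependent weighting of claim 3 can be sketched as follows. The saturation level of 230 and the linear weight ramp are illustrative values; the patent specifies only that unsaturated areas weight the first (normal exposure) image more heavily and saturated areas weight the second (high-speed) image more heavily.

```python
def fuse_exposures(normal_img, fast_img, sat=230):
    """Per-pixel weighted sum of a normal-exposure frame and a
    high-speed-exposure frame. Below the saturation level the normal
    image dominates; above it, the weight ramps linearly toward the
    fast image. The threshold 230 and the ramp are illustrative."""
    out = []
    for row_n, row_f in zip(normal_img, fast_img):
        fused_row = []
        for vn, vf in zip(row_n, row_f):
            if vn < sat:                       # normal image not saturated
                wn = 1.0
            else:                              # ramp normal weight to 0 at 255
                wn = (255 - vn) / (255 - sat)
            fused_row.append(int(round(wn * vn + (1 - wn) * vf)))
        out.append(fused_row)
    return out
```

A pixel saturated by a headlight in the normal exposure thus takes its value from the high-speed exposure, while dark pixels keep the illuminated normal-exposure value, yielding the wide dynamic range described above.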
FIG. 1 is a block diagram showing an embodiment of the image monitoring apparatus according to the present invention.
FIG. 2 is an explanatory diagram showing the operation of the present invention.
FIG. 3 is a block diagram showing an embodiment of the image input control unit.
FIG. 4 is a block diagram showing an embodiment of the image input unit.
FIG. 5 is a time chart showing the timing of image capture and illumination.
FIG. 6 is a block diagram showing an embodiment of the image synthesis unit.
FIG. 7 is a flowchart showing an example of the processing procedure of the image synthesis unit.
FIG. 8 is an explanatory diagram showing an example of the composition ratio of the image synthesis unit.
FIG. 9 is an explanatory diagram showing the principle of dynamic range expansion by image synthesis.
FIG. 10 is a block diagram showing an embodiment of the difference image creation unit.
FIG. 11 is a block diagram showing an embodiment of the change area extraction unit.
FIG. 12 is a block diagram showing an embodiment of the object detection unit.
FIG. 13 is a flowchart showing an example of the procedure by which the object detection unit integrates extraction regions.
FIG. 14 is an explanatory diagram showing a procedure by which the circumscribed rectangle calculation unit calculates a circumscribed rectangle.
FIG. 15 is an explanatory diagram showing an example of moving object determination by normalized correlation processing.
FIG. 16 is a flowchart showing an example of the processing procedure of the circumscribed rectangular area similarity calculation unit.
FIG. 17 is an explanatory diagram showing an example of a detection result displayed on the display device.
100: ITV camera; 200: illumination device; 500: image input control unit; 510: image input timing setting unit; 520: exposure state setting unit; 530: illumination intensity setting unit; 1000: image input unit; 1010: A/D conversion unit; 1020: normal exposure image input unit; 1030: high-speed exposure image input unit; 2000: image synthesis unit; 3000: difference image creation unit; 4000: change area extraction unit; 5000: object detection unit; 6000: display control unit; 7000: display device.
Continuation of front page: (51) Int. Cl.7: G03B 15/05; G06T 1/00 430; G06T 3/00 300; G06T 7/20; G08B 13/196; H04N 1/387; H04N 5/225. (72) Inventors: Yoshiki Kobayashi, Hitachi Research Laboratory, Hitachi, Ltd., 7-1-1 Omika-cho, Hitachi-shi, Ibaraki; Hisao Ota, Digital Media Products Division, Hitachi, Ltd., 1410 Inada, Hitachinaka-shi, Ibaraki; Shigehisa Sakimura, Hitachi Research Laboratory, Hitachi, Ltd., 7-1-1 Omika-cho, Hitachi-shi, Ibaraki; Yoshiro Sakuma, Secom Co., Ltd., 6-11-23 Shimorenjaku, Mitaka-shi, Tokyo. F-terms (reference): 2H053, 5B047, 5B057, 5C022, 5C054, 5C076, 5C084, 5L096.
Claims (5)
1. A method of monitoring an object entering a monitoring area by image-processing a difference image between input images obtained by imaging the monitoring area, the method comprising: at periodic image input, first strengthening the illumination and capturing a first image; in the next field, weakening or turning off the illumination and capturing a second image; creating a composite image based on the first image and the second image; and generating the difference image by an inter-frame difference that uses the composite image as a processing target image.
2. The image monitoring method according to claim 1, wherein the illumination for the first image is an illuminance at which an object can be recognized in a dark place such as at night, and the illumination for the second image is either an illuminance at which the portions that saturate in the first image can be recognized, or is turned off.
3. The image monitoring method according to claim 1, wherein the composite image is created by adding the luminance values of the two images pixel by pixel with weighting coefficients such that, in areas where the luminance value of the first image is unsaturated, the luminance value of the first image is weighted more heavily, while in areas where the luminance value of the first image is saturated, the luminance value of the second image is weighted more heavily.
4. An apparatus for monitoring an object entering a monitoring area by processing captured images, the apparatus comprising: imaging means equipped with an illumination device for imaging the monitoring area; image input control means for controlling the illumination state of the illumination device to be strong or weak according to the field of the input image; image input means for periodically capturing a first image under the strong illumination state and a second image under the weak illumination state; image synthesis means for synthesizing the captured first and second images to create a processing target image; and moving object detection means for detecting the object by image processing based on the processing target images between frames.
5. The image monitoring apparatus according to claim 4, wherein the image input control means controls the exposure state of the imaging means together with the illumination state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000154399A JP2001333420A (en) | 2000-05-22 | 2000-05-22 | Image supervisory method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000154399A JP2001333420A (en) | 2000-05-22 | 2000-05-22 | Image supervisory method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
JP2001333420A true JP2001333420A (en) | 2001-11-30 |
Family
ID=18659502
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2000154399A Pending JP2001333420A (en) | 2000-05-22 | 2000-05-22 | Image supervisory method and device |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP2001333420A (en) |
Cited By (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100595084B1 (en) | 2003-08-20 | 2006-06-30 | 엘지전자 주식회사 | Method for managing digital slow shutter mode in monitoring camera |
JP2007013777A (en) * | 2005-07-01 | 2007-01-18 | Toyota Central Res & Dev Lab Inc | Imaging apparatus |
JP2007311896A (en) * | 2006-05-16 | 2007-11-29 | Fujifilm Corp | Imaging apparatus and imaging control program |
US7433494B2 (en) | 2002-09-19 | 2008-10-07 | Denso Corporation | Moving body detecting apparatus |
JP2009071789A (en) * | 2007-09-18 | 2009-04-02 | Denso Corp | Vehicle periphery monitoring system |
JP2010026858A (en) * | 2008-07-22 | 2010-02-04 | Panasonic Corp | Authentication imaging apparatus |
WO2010135575A2 (en) * | 2009-05-20 | 2010-11-25 | Express Imaging Systems, Llc | Long-range motion detection for illumination control |
JP2011029974A (en) * | 2009-07-27 | 2011-02-10 | Aisin Seiki Co Ltd | Display image correcting device |
JP2011188277A (en) * | 2010-03-09 | 2011-09-22 | Sony Corp | Image processor, image processing method and program |
US8118456B2 (en) | 2008-05-08 | 2012-02-21 | Express Imaging Systems, Llc | Low-profile pathway illumination system |
US8508137B2 (en) | 2009-05-20 | 2013-08-13 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination |
US8610358B2 (en) | 2011-08-17 | 2013-12-17 | Express Imaging Systems, Llc | Electrostatic discharge protection for luminaire |
US8629621B2 (en) | 2011-08-24 | 2014-01-14 | Express Imaging Systems, Llc | Resonant network for reduction of flicker perception in solid state lighting systems |
US8878440B2 (en) | 2012-08-28 | 2014-11-04 | Express Imaging Systems, Llc | Luminaire with atmospheric electrical activity detection and visual alert capabilities |
US8896215B2 (en) | 2012-09-05 | 2014-11-25 | Express Imaging Systems, Llc | Apparatus and method for schedule based operation of a luminaire |
US8901825B2 (en) | 2011-04-12 | 2014-12-02 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination using received signals |
US8922124B2 (en) | 2011-11-18 | 2014-12-30 | Express Imaging Systems, Llc | Adjustable output solid-state lamp with security features |
US8926139B2 (en) | 2009-05-01 | 2015-01-06 | Express Imaging Systems, Llc | Gas-discharge lamp replacement with passive cooling |
US8926138B2 (en) | 2008-05-13 | 2015-01-06 | Express Imaging Systems, Llc | Gas-discharge lamp replacement |
US9125261B2 (en) | 2008-11-17 | 2015-09-01 | Express Imaging Systems, Llc | Electronic control to regulate power for solid-state lighting and methods thereof |
US9131552B2 (en) | 2012-07-25 | 2015-09-08 | Express Imaging Systems, Llc | Apparatus and method of operating a luminaire |
KR20150120547A (en) * | 2014-04-17 | 2015-10-28 | 재단법인 다차원 스마트 아이티 융합시스템 연구단 | Method and Apparatus for Event Detection using Frame Grouping |
US9185777B2 (en) | 2014-01-30 | 2015-11-10 | Express Imaging Systems, Llc | Ambient light control in solid state lamps and luminaires |
US9204523B2 (en) | 2012-05-02 | 2015-12-01 | Express Imaging Systems, Llc | Remotely adjustable solid-state lamp |
CN105120191A (en) * | 2015-07-31 | 2015-12-02 | 小米科技有限责任公司 | Video recording method and device |
US9210759B2 (en) | 2012-11-19 | 2015-12-08 | Express Imaging Systems, Llc | Luminaire with ambient sensing and autonomous control capabilities |
US9210751B2 (en) | 2012-05-01 | 2015-12-08 | Express Imaging Systems, Llc | Solid state lighting, drive circuit and method of driving same |
US9241401B2 (en) | 2010-06-22 | 2016-01-19 | Express Imaging Systems, Llc | Solid state lighting device and method employing heat exchanger thermally coupled circuit board |
US9288873B2 (en) | 2013-02-13 | 2016-03-15 | Express Imaging Systems, Llc | Systems, methods, and apparatuses for using a high current switching device as a logic level sensor |
US9301365B2 (en) | 2012-11-07 | 2016-03-29 | Express Imaging Systems, Llc | Luminaire with switch-mode converter power monitoring |
US9360198B2 (en) | 2011-12-06 | 2016-06-07 | Express Imaging Systems, Llc | Adjustable output solid-state lighting device |
KR20160091331A (en) * | 2013-11-25 | 2016-08-02 | 르노 에스.아.에스. | System and method for forming nighttime images for a motor vehicle |
US9414449B2 (en) | 2013-11-18 | 2016-08-09 | Express Imaging Systems, Llc | High efficiency power controller for luminaire |
US9445485B2 (en) | 2014-10-24 | 2016-09-13 | Express Imaging Systems, Llc | Detection and correction of faulty photo controls in outdoor luminaires |
US9462662B1 (en) | 2015-03-24 | 2016-10-04 | Express Imaging Systems, Llc | Low power photocontrol for luminaire |
US9466443B2 (en) | 2013-07-24 | 2016-10-11 | Express Imaging Systems, Llc | Photocontrol for luminaire consumes very low power |
US9497393B2 (en) | 2012-03-02 | 2016-11-15 | Express Imaging Systems, Llc | Systems and methods that employ object recognition |
US9538612B1 (en) | 2015-09-03 | 2017-01-03 | Express Imaging Systems, Llc | Low power photocontrol for luminaire |
US9572230B2 (en) | 2014-09-30 | 2017-02-14 | Express Imaging Systems, Llc | Centralized control of area lighting hours of illumination |
US9924582B2 (en) | 2016-04-26 | 2018-03-20 | Express Imaging Systems, Llc | Luminaire dimming module uses 3 contact NEMA photocontrol socket |
US9985429B2 (en) | 2016-09-21 | 2018-05-29 | Express Imaging Systems, Llc | Inrush current limiter circuit |
US10098212B2 (en) | 2017-02-14 | 2018-10-09 | Express Imaging Systems, Llc | Systems and methods for controlling outdoor luminaire wireless network using smart appliance |
WO2018220993A1 (en) * | 2017-05-29 | 2018-12-06 | ソニーセミコンダクタソリューションズ株式会社 | Signal processing device, signal processing method and computer program |
US10164374B1 (en) | 2017-10-31 | 2018-12-25 | Express Imaging Systems, Llc | Receptacle sockets for twist-lock connectors |
US10219360B2 (en) | 2017-04-03 | 2019-02-26 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US10230296B2 (en) | 2016-09-21 | 2019-03-12 | Express Imaging Systems, Llc | Output ripple reduction for power converters |
US10568191B2 (en) | 2017-04-03 | 2020-02-18 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US10904992B2 (en) | 2017-04-03 | 2021-01-26 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
WO2021070713A1 (en) * | 2019-10-09 | 2021-04-15 | 株式会社デンソー | Object recognition device |
US11212887B2 (en) | 2019-11-04 | 2021-12-28 | Express Imaging Systems, Llc | Light having selectively adjustable sets of solid state light sources, circuit and method of operation thereof, to provide variable output characteristics |
US11234304B2 (en) | 2019-05-24 | 2022-01-25 | Express Imaging Systems, Llc | Photocontroller to control operation of a luminaire having a dimming line |
US11317497B2 (en) | 2019-06-20 | 2022-04-26 | Express Imaging Systems, Llc | Photocontroller and/or lamp with photocontrols to control operation of lamp |
US11375599B2 (en) | 2017-04-03 | 2022-06-28 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
-
2000
- 2000-05-22 JP JP2000154399A patent/JP2001333420A/en active Pending
Cited By (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7433494B2 (en) | 2002-09-19 | 2008-10-07 | Denso Corporation | Moving body detecting apparatus |
KR100595084B1 (en) | 2003-08-20 | 2006-06-30 | 엘지전자 주식회사 | Method for managing digital slow shutter mode in monitoring camera |
JP2007013777A (en) * | 2005-07-01 | 2007-01-18 | Toyota Central Res & Dev Lab Inc | Imaging apparatus |
JP4716182B2 (en) * | 2006-05-16 | 2011-07-06 | 富士フイルム株式会社 | Imaging apparatus and imaging control program |
JP2007311896A (en) * | 2006-05-16 | 2007-11-29 | Fujifilm Corp | Imaging apparatus and imaging control program |
JP2009071789A (en) * | 2007-09-18 | 2009-04-02 | Denso Corp | Vehicle periphery monitoring system |
US8118456B2 (en) | 2008-05-08 | 2012-02-21 | Express Imaging Systems, Llc | Low-profile pathway illumination system |
US8926138B2 (en) | 2008-05-13 | 2015-01-06 | Express Imaging Systems, Llc | Gas-discharge lamp replacement |
JP2010026858A (en) * | 2008-07-22 | 2010-02-04 | Panasonic Corp | Authentication imaging apparatus |
US9967933B2 (en) | 2008-11-17 | 2018-05-08 | Express Imaging Systems, Llc | Electronic control to regulate power for solid-state lighting and methods thereof |
US9125261B2 (en) | 2008-11-17 | 2015-09-01 | Express Imaging Systems, Llc | Electronic control to regulate power for solid-state lighting and methods thereof |
US8926139B2 (en) | 2009-05-01 | 2015-01-06 | Express Imaging Systems, Llc | Gas-discharge lamp replacement with passive cooling |
US8810138B2 (en) | 2009-05-20 | 2014-08-19 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination |
WO2010135575A3 (en) * | 2009-05-20 | 2011-02-03 | Express Imaging Systems, Llc | Long-range motion detection for illumination control |
US8541950B2 (en) | 2009-05-20 | 2013-09-24 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination |
WO2010135575A2 (en) * | 2009-05-20 | 2010-11-25 | Express Imaging Systems, Llc | Long-range motion detection for illumination control |
US8987992B2 (en) | 2009-05-20 | 2015-03-24 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination |
US8872964B2 (en) | 2009-05-20 | 2014-10-28 | Express Imaging Systems, Llc | Long-range motion detection for illumination control |
US8508137B2 (en) | 2009-05-20 | 2013-08-13 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination |
JP2011029974A (en) * | 2009-07-27 | 2011-02-10 | Aisin Seiki Co Ltd | Display image correcting device |
JP2011188277A (en) * | 2010-03-09 | 2011-09-22 | Sony Corp | Image processor, image processing method and program |
US9241401B2 (en) | 2010-06-22 | 2016-01-19 | Express Imaging Systems, Llc | Solid state lighting device and method employing heat exchanger thermally coupled circuit board |
US8901825B2 (en) | 2011-04-12 | 2014-12-02 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination using received signals |
US9713228B2 (en) | 2011-04-12 | 2017-07-18 | Express Imaging Systems, Llc | Apparatus and method of energy efficient illumination using received signals |
US8610358B2 (en) | 2011-08-17 | 2013-12-17 | Express Imaging Systems, Llc | Electrostatic discharge protection for luminaire |
US8629621B2 (en) | 2011-08-24 | 2014-01-14 | Express Imaging Systems, Llc | Resonant network for reduction of flicker perception in solid state lighting systems |
US8922124B2 (en) | 2011-11-18 | 2014-12-30 | Express Imaging Systems, Llc | Adjustable output solid-state lamp with security features |
US9360198B2 (en) | 2011-12-06 | 2016-06-07 | Express Imaging Systems, Llc | Adjustable output solid-state lighting device |
US9497393B2 (en) | 2012-03-02 | 2016-11-15 | Express Imaging Systems, Llc | Systems and methods that employ object recognition |
US9210751B2 (en) | 2012-05-01 | 2015-12-08 | Express Imaging Systems, Llc | Solid state lighting, drive circuit and method of driving same |
US9204523B2 (en) | 2012-05-02 | 2015-12-01 | Express Imaging Systems, Llc | Remotely adjustable solid-state lamp |
US9801248B2 (en) | 2012-07-25 | 2017-10-24 | Express Imaging Systems, Llc | Apparatus and method of operating a luminaire |
US9131552B2 (en) | 2012-07-25 | 2015-09-08 | Express Imaging Systems, Llc | Apparatus and method of operating a luminaire |
US8878440B2 (en) | 2012-08-28 | 2014-11-04 | Express Imaging Systems, Llc | Luminaire with atmospheric electrical activity detection and visual alert capabilities |
US9693433B2 (en) | 2012-09-05 | 2017-06-27 | Express Imaging Systems, Llc | Apparatus and method for schedule based operation of a luminaire |
US8896215B2 (en) | 2012-09-05 | 2014-11-25 | Express Imaging Systems, Llc | Apparatus and method for schedule based operation of a luminaire |
US9301365B2 (en) | 2012-11-07 | 2016-03-29 | Express Imaging Systems, Llc | Luminaire with switch-mode converter power monitoring |
US9433062B2 (en) | 2012-11-19 | 2016-08-30 | Express Imaging Systems, Llc | Luminaire with ambient sensing and autonomous control capabilities |
US9210759B2 (en) | 2012-11-19 | 2015-12-08 | Express Imaging Systems, Llc | Luminaire with ambient sensing and autonomous control capabilities |
US9288873B2 (en) | 2013-02-13 | 2016-03-15 | Express Imaging Systems, Llc | Systems, methods, and apparatuses for using a high current switching device as a logic level sensor |
US9466443B2 (en) | 2013-07-24 | 2016-10-11 | Express Imaging Systems, Llc | Photocontrol for luminaire consumes very low power |
US9414449B2 (en) | 2013-11-18 | 2016-08-09 | Express Imaging Systems, Llc | High efficiency power controller for luminaire |
US9781797B2 (en) | 2013-11-18 | 2017-10-03 | Express Imaging Systems, Llc | High efficiency power controller for luminaire |
KR102155374B1 (en) | 2013-11-25 | 2020-09-11 | 르노 에스.아.에스. | System and method for forming nighttime images for a motor vehicle |
KR20160091331A (en) * | 2013-11-25 | 2016-08-02 | 르노 에스.아.에스. | System and method for forming nighttime images for a motor vehicle |
US9185777B2 (en) | 2014-01-30 | 2015-11-10 | Express Imaging Systems, Llc | Ambient light control in solid state lamps and luminaires |
KR101579000B1 (en) | 2014-04-17 | 2015-12-22 | 재단법인 다차원 스마트 아이티 융합시스템 연구단 | Method and Apparatus for Event Detection using Frame Grouping |
KR20150120547A (en) * | 2014-04-17 | 2015-10-28 | 재단법인 다차원 스마트 아이티 융합시스템 연구단 | Method and Apparatus for Event Detection using Frame Grouping |
US9572230B2 (en) | 2014-09-30 | 2017-02-14 | Express Imaging Systems, Llc | Centralized control of area lighting hours of illumination |
US9445485B2 (en) | 2014-10-24 | 2016-09-13 | Express Imaging Systems, Llc | Detection and correction of faulty photo controls in outdoor luminaires |
US9462662B1 (en) | 2015-03-24 | 2016-10-04 | Express Imaging Systems, Llc | Low power photocontrol for luminaire |
KR101743194B1 (en) * | 2015-07-31 | 2017-06-02 | 시아오미 아이엔씨. | Video recording method and device, program and recording medium |
CN105120191A (en) * | 2015-07-31 | 2015-12-02 | 小米科技有限责任公司 | Video recording method and device |
US9538612B1 (en) | 2015-09-03 | 2017-01-03 | Express Imaging Systems, Llc | Low power photocontrol for luminaire |
US9924582B2 (en) | 2016-04-26 | 2018-03-20 | Express Imaging Systems, Llc | Luminaire dimming module uses 3 contact NEMA photocontrol socket |
US9985429B2 (en) | 2016-09-21 | 2018-05-29 | Express Imaging Systems, Llc | Inrush current limiter circuit |
US10230296B2 (en) | 2016-09-21 | 2019-03-12 | Express Imaging Systems, Llc | Output ripple reduction for power converters |
US10098212B2 (en) | 2017-02-14 | 2018-10-09 | Express Imaging Systems, Llc | Systems and methods for controlling outdoor luminaire wireless network using smart appliance |
US10219360B2 (en) | 2017-04-03 | 2019-02-26 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US10390414B2 (en) | 2017-04-03 | 2019-08-20 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US10568191B2 (en) | 2017-04-03 | 2020-02-18 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US10904992B2 (en) | 2017-04-03 | 2021-01-26 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US11653436B2 (en) | 2017-04-03 | 2023-05-16 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
US11375599B2 (en) | 2017-04-03 | 2022-06-28 | Express Imaging Systems, Llc | Systems and methods for outdoor luminaire wireless control |
WO2018220993A1 (en) * | 2017-05-29 | 2018-12-06 | ソニーセミコンダクタソリューションズ株式会社 | Signal processing device, signal processing method and computer program |
US10164374B1 (en) | 2017-10-31 | 2018-12-25 | Express Imaging Systems, Llc | Receptacle sockets for twist-lock connectors |
US11234304B2 (en) | 2019-05-24 | 2022-01-25 | Express Imaging Systems, Llc | Photocontroller to control operation of a luminaire having a dimming line |
US11317497B2 (en) | 2019-06-20 | 2022-04-26 | Express Imaging Systems, Llc | Photocontroller and/or lamp with photocontrols to control operation of lamp |
US11765805B2 (en) | 2019-06-20 | 2023-09-19 | Express Imaging Systems, Llc | Photocontroller and/or lamp with photocontrols to control operation of lamp |
JP2021060368A (en) * | 2019-10-09 | 2021-04-15 | 株式会社デンソー | Object recognition device |
JP7099426B2 (en) | 2019-10-09 | 2022-07-12 | 株式会社デンソー | Object recognition device |
WO2021070713A1 (en) * | 2019-10-09 | 2021-04-15 | 株式会社デンソー | Object recognition device |
US11212887B2 (en) | 2019-11-04 | 2021-12-28 | Express Imaging Systems, Llc | Light having selectively adjustable sets of solid state light sources, circuit and method of operation thereof, to provide variable output characteristics |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2001333420A (en) | Image supervisory method and device | |
US6141433A (en) | System and method for segmenting image regions from a scene likely to represent particular objects in the scene | |
JP4878644B2 (en) | Moving object noise removal processing apparatus and moving object noise removal processing program | |
US6778180B2 (en) | Video image tracking engine | |
JP3423861B2 (en) | Method and apparatus for monitoring a moving object | |
JP2000175176A (en) | Supervisory method for mobile object and image supervisory device | |
CN110572636B (en) | Camera contamination detection method and device, storage medium and electronic equipment | |
JP2009244946A (en) | Traffic light recognizing apparatus, traffic light recognizing method, and traffic light recognizing program | |
JP3569163B2 (en) | Moving object monitoring device | |
JP2007272532A (en) | Fire detection apparatus | |
JP2000306684A (en) | Image monitor device and its illumination control method | |
JP4542929B2 (en) | Image signal processing device | |
JP4611776B2 (en) | Image signal processing device | |
JPH10289321A (en) | Image monitoring device | |
JP2003242440A (en) | Character recognizing method and its device | |
Spagnolo et al. | Advances in background updating and shadow removing for motion detection algorithms | |
JP3232502B2 (en) | Fog monitoring system | |
JP2009044526A (en) | Photographing device, photographing method, and apparatus and method for recognizing person | |
JP4740755B2 (en) | Monitoring device using images | |
JP2006059183A (en) | Image processor | |
JP4491360B2 (en) | Image signal processing device | |
JP3423886B2 (en) | Moving object monitoring device | |
JP2011071925A (en) | Mobile tracking apparatus and method | |
JP4656977B2 (en) | Sensing device | |
JP3532769B2 (en) | Image monitoring device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20050928 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621 |
20070411 | RD02 | Notification of acceptance of power of attorney | JAPANESE INTERMEDIATE CODE: A7422 |
20070411 | RD04 | Notification of resignation of power of attorney | JAPANESE INTERMEDIATE CODE: A7424 |
20071214 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007 |
20071225 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131 |
20080225 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523 |
20080318 | A02 | Decision of refusal | JAPANESE INTERMEDIATE CODE: A02 |