
JP5133677B2 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
JP5133677B2
Authority
JP
Japan
Prior art keywords
state
face
person
abnormality
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2007337163A
Other languages
Japanese (ja)
Other versions
JP2009157780A (en)
Inventor
勝 徳田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Car Mate Manufacturing Co Ltd
Original Assignee
Car Mate Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Car Mate Manufacturing Co Ltd filed Critical Car Mate Manufacturing Co Ltd
Priority to JP2007337163A priority Critical patent/JP5133677B2/en
Publication of JP2009157780A publication Critical patent/JP2009157780A/en
Application granted granted Critical
Publication of JP5133677B2 publication Critical patent/JP5133677B2/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Emergency Alarm Devices (AREA)
  • Alarm Systems (AREA)
  • Telephonic Communication Services (AREA)
  • Image Analysis (AREA)

Description

The present invention relates to a remote monitoring system, and in particular to a monitoring system that applies image recognition to facial images of a baby, infant, or elderly person captured by a camera in order to determine that person's state, and continually provides the monitor with information equivalent to what direct observation would give, so that the monitor does not have to watch the subject at all times.

Conventionally, various techniques have been disclosed for monitoring a target from a remote location. In such remote monitoring systems, the most basic arrangement is to install a surveillance camera near the target and let the monitor watch it on a display installed at a distant location. In addition, for example, Japanese Patent Laid-Open No. 1-129590 collects surrounding sounds and, when their sound pressure or frequency is judged abnormal, turns the surveillance camera toward the point where the abnormal sound occurred and transmits the image of that point to the monitor's display, thereby reporting the abnormality to the monitor.

Japanese Patent Laid-Open No. 55-132165 connects a sound collecting device installed near the target and a loudspeaker installed near the monitor through a telephone extension so that the monitor can listen to the sounds around the target.

The abnormality notification system of Japanese Patent Laid-Open No. 10-11674 sets thresholds for sensors such as sound, temperature, and body temperature, compares the measured values with the set values, and notifies the monitor when an abnormality is found.

As prior art similar to the present invention, Japanese Patent Laid-Open No. 2001-92963 describes a face matching technique; however, the present invention analyzes facial expressions rather than matching faces, so it differs somewhat. Japanese Patent Laid-Open No. 2001-43345 describes a facial expression recognition technique: rather than distinguishing between expressions and emotions that differ greatly, such as laughter, anger, joy, and sadness, it recognizes the degree of strength or intensity of a given expression, such as distress, and uses time-series data such as expression transitions. As for face detection, obtaining a frontal image affects recognition accuracy, and Japanese Patent Laid-Open Nos. 2006-259900 and 2004-46399 describe techniques for that purpose.

Japanese Patent Laid-Open No. 11-99140 describes a sleeping-state abnormality detection device that uses moving-image processing and pattern matching to detect abnormal facial movement and report it.

Japanese Patent Laid-Open No. 1-129590
Japanese Patent Laid-Open No. 55-132165
Japanese Patent Laid-Open No. 10-11674
Japanese Patent Laid-Open No. 2001-92963
Japanese Patent Laid-Open No. 2001-43345
Japanese Patent Laid-Open No. 2006-259900
Japanese Patent Laid-Open No. 2004-46399
Japanese Patent Laid-Open No. 11-99140
P. Viola and M. Jones: "Rapid object detection using a boosted cascade of simple features", Proc. CVPR, Vol. 1, pp. 511-520, 2001
Paul Ekman, Wallace V. Friesen (Japanese translation by 工藤 力): "表情分析入門" (Introduction to Facial Expression Analysis), Seishin Shobo, 1987

The prior art described above attempts to improve the accuracy of abnormality monitoring through the individual or combined use of sensors for sound, temperature, body temperature, and the like. Under normal conditions, however, temperature and body temperature are unlikely to change suddenly, so in practice such systems rely heavily on sound information.
When sound information is relied on, the influence of ambient noise and television sound is large, and it is difficult to improve accuracy. Cases in which no sound is produced at all can also be assumed.
Because the conditions for reporting an abnormality cannot be set very precisely in this way, false alarms are frequent and the burden on the monitor remains large.

Regarding the prior art on face authentication: the present system does perform recognition related to facial expressions, but it performs sequential processing rather than observing transition states. In addition, as a means of obtaining a frontal image when the face cannot be detected by the camera because of the orientation of the monitored person's face, the present technique attracts the monitored person's attention by sound or by the movement of an object. The system that detects abnormalities in the sleeping state detects the movement of the monitored person and its duration from moving images and judges from them whether the movement is abnormal, but it makes this judgment only under limited conditions and can be said to be insufficient for judging the state of an infant.

However, if the state of the monitored person can be determined from facial expressions by image recognition, then by sending a report together with video only when the expression is abnormal, highly accurate information can be provided.

An object of the present invention is to provide a monitoring system that determines the facial expression of a monitored person with high accuracy by image recognition and learning, and issues a report only when there is an abnormality in the monitored person.

The monitoring system of the present invention comprises a portable terminal carried by a monitor and an abnormality reporting device installed near the monitoring target and capable of two-way wireless communication with the portable terminal. The abnormality reporting device comprises face detection means for detecting the face of the monitoring target, face state determination means for determining the state of the face from the detected face, sound collection means for collecting surrounding sounds, means for determining an abnormality in the surrounding sounds from the collected sounds, and control means that judges an abnormality in the face state or in the surrounding sounds on the basis of predetermined values stored in advance and transmits an abnormality report signal to the portable terminal when an abnormality is judged to exist. When, during face detection, the face cannot be detected by the camera because of the orientation of the monitored person's face, the abnormality reporting device attracts the monitored person's attention and turns the person's face toward the camera by, for example, moving a rattle connected to the abnormality reporting device, playing a registered sound such as the mother's voice from a speaker, or using a calling sound or the movement of an object triggered from the portable terminal. The system also comprises means for judging the state of the monitored person from the reaction at that time. The portable terminal comprises display means, audio means, and control means for displaying an abnormality report on the display means and generating a warning sound from the audio means on the basis of the abnormality report signal received from the abnormality reporting device.

Further, the portable terminal preferably comprises means for periodically acquiring, at settable intervals, a history of reports from the abnormality reporting device together with video and sound, and presenting it as a table or graph of the monitored person's state so that at least one of the person's health state, feeding state, and excretion state can be grasped.

The portable terminal preferably also comprises means for improving the accuracy of state determination by reviewing the past history when a report arrives from the abnormality reporting device, or at any other time, so that the detected states can be learned.

The face detection means preferably has means for control by a one-axis or two-axis motor, and comprises face tracking means that detects the position and direction of the monitored person's face and controls the camera so that the face is always kept within its field of view.
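The patent only states that the camera is controlled so that the face stays in view; a minimal sketch of such a tracking correction, assuming a proportional control step with an arbitrary motor interface, gain, and deadband (none of which are specified in the patent), is shown below.

```python
def pan_tilt_correction(face_center, frame_size, gain=0.1, deadband=20):
    """Proportional correction for a one- or two-axis camera mount.

    face_center : (x, y) of the detected face; frame_size : (width, height).
    Returns (pan_step, tilt_step) in arbitrary motor units. The gain, deadband,
    and the motor interface itself are illustrative assumptions.
    """
    cx, cy = face_center
    w, h = frame_size
    dx, dy = cx - w / 2, cy - h / 2          # offset of the face from the frame center
    pan = gain * dx if abs(dx) > deadband else 0.0
    tilt = gain * dy if abs(dy) > deadband else 0.0
    return pan, tilt
```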

The face detection means preferably also comprises means for judging, from the face detection state, at least one of a "present" state, an "absent" state, a "covered-by-futon" state, and a "lying face down" state, and means for reporting an abnormality when the monitored person is in an undesirable state.

When the face detection means judges the "present" state, the face state determination means preferably comprises means for estimating basic states of the face such as "awake", "sleeping", "crying", "in a good mood", "normal", "smiling", and "vomiting", and estimating the state of the monitored person together with information on facial movement, as well as means for reporting an abnormality when the monitored person is in an undesirable state.

The portable terminal preferably also comprises means for grasping the monitored person's state as a regular daily rhythm, statistically displaying the management of sleep times such as naps, and prompting attention when the rhythm deviates.

The abnormality reporting device preferably also comprises means for attracting the monitored person's attention and acquiring a frontal image when, during face detection, the person's face is in a blind spot of the camera.

The abnormality reporting device preferably also detects the breathing of the sleeping monitored person from the moving parts of the image.

According to the present invention, when the face cannot be detected by the camera because of the orientation of the monitored person's face, means are provided for attracting the monitored person's attention and turning the person's face toward the camera by, for example, moving a rattle connected to the abnormality reporting device, playing a registered sound such as the mother's voice from a speaker, or using a calling sound or the movement of an object triggered from the portable terminal, together with means for judging the state of the monitored person from the reaction at that time. The portable terminal comprises display means, audio means, and control means for displaying an abnormality report on the display means and generating a warning sound from the audio means on the basis of the abnormality report signal received from the abnormality reporting device. The monitor thus automatically obtains judgments about the facial information he or she wants to monitor, and even without watching the monitored person receives the same sense of security as when watching, which reduces the labor and burden of constant observation. Furthermore, because a learning function is provided, state estimation errors caused by individual differences between monitored persons can be reduced every time the system is used, improving accuracy.

Hereinafter, an embodiment of the present invention will be described concretely with reference to the drawings.

FIGS. 1 and 2 are block diagrams showing an embodiment of a monitoring system according to the present invention. FIG. 1 shows an abnormality reporting device installed near an infant or elderly person who is the monitoring target. FIG. 2 shows a portable terminal device that the monitor keeps at hand.

The abnormality reporting device of FIG. 1 is installed so that the camera 5 faces the monitored person. The shooting range 35 is adjusted according to how much the monitored person can move; for example, for a newborn who hardly moves, the face may be sized to occupy about half of the frame. The input from the camera 5 is converted into a digital signal of a predetermined format by the video input unit 4 and loaded into the memory 3. The CPU 2 then determines whether there is a face in the image.

Face detection is performed by a face detection method using the well-known Adaboost approach, as in Reference 1. In the initial state the device is installed so that the face is always within the frame, and detection is carried out.
Once detection is complete, the position of the face is tracked on the screen, and the CPU 2 may adjust the camera mount through the camera controller 6 to control the attitude of the camera 5 so that the face is always at the center of the frame; alternatively, the motor control may be omitted and a warning issued whenever no face can be detected in the frame. When the monitored person's face is oriented in the same direction as the camera, the face can no longer be detected by the camera.
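As an illustration of the Adaboost-based face detection of Reference 1 (the Viola-Jones cascade), the following Python sketch uses OpenCV's pretrained frontal-face Haar cascade; the camera index, detection parameters, and the simple warning path are illustrative assumptions, not details taken from the patent.

```python
import cv2

# A minimal sketch of Viola-Jones (Adaboost cascade) face detection, as in Reference 1.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # camera 5 in the embodiment; index 0 assumed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Face center coordinates (36 in the figures) used later for tracking.
        cx, cy = x + w // 2, y + h // 2
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    if len(faces) == 0:
        print("warning: no face detected in frame")  # simple warning path
    cv2.imshow("monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```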

In this case, the position at which detection was lost is used to judge whether the face has left the camera's field of view or whether it is still within the field of view but cannot be detected because it is facing the same direction as the camera. In the former case, the camera controller 6 may be driven to control the camera mount 7 and move the camera from its center toward the direction in which the face was last detected, or the device may simply record that the face has left the field of view. This flow is shown in FIG. 3. Each time an image is input 23, face detection 24 is performed as in FIG. 4(a); if a face is detected within the camera's field of view or within the specified face detection range (face detected? 25), the face center coordinates 36 are acquired 26. Since a face has been detected, it is present in the frame, the "present" state 27 is confirmed, and a normal state is judged 28. If face detection is not possible, the history of face center coordinates from the preceding frames is examined and the face movement vector is calculated 29; if the movement is large and the current position is estimated to have moved outside the detection range, the state is judged to be "absent" 30, corresponding to FIG. 4(c). If the movement is small, motion 31 around the face is detected within the motion detection area 44 by computing the optical flow of the moving image in that area, as in FIG. 5; if there is much motion (41-42-43 in FIG. 4(c)), the state is judged to be "covered by futon" 32, and if there is little motion (38-39-40 in FIG. 4(b)), it is judged to be "turned over" 33. The "absent", "covered by futon", and "turned over" states are judged abnormal 34, and if the state in which no face can be detected continues for a predetermined time, set separately in advance, the terminal is notified that the person is "absent", "covered by futon", or "turned over".
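The decision flow just described (movement vector from the face-center history, then optical flow in the motion detection area 44) could look roughly like the following sketch; the thresholds and the choice of Farneback optical flow are assumptions made for illustration, since the patent does not specify them.

```python
import numpy as np
import cv2

def classify_face_loss(center_history, prev_gray, gray, motion_area,
                       move_thresh=80.0, flow_thresh=1.5):
    """Rough sketch of the FIG. 3 decision flow when no face is detected.

    center_history : list of recent face-center coordinates (x, y)
    prev_gray/gray : previous and current grayscale frames
    motion_area    : (x, y, w, h) motion detection area 44 around the last face
    The thresholds are illustrative assumptions, not values from the patent.
    """
    if len(center_history) >= 2:
        move = np.linalg.norm(np.subtract(center_history[-1], center_history[-2]))
    else:
        move = 0.0
    if move > move_thresh:
        return "absent"            # large movement vector -> left the detection range (30)

    x, y, w, h = motion_area
    flow = cv2.calcOpticalFlowFarneback(prev_gray[y:y+h, x:x+w],
                                        gray[y:y+h, x:x+w],
                                        None, 0.5, 3, 15, 3, 5, 1.2, 0)
    mean_flow = np.linalg.norm(flow, axis=2).mean()
    if mean_flow > flow_thresh:
        return "covered_by_futon"  # much motion around the face (32)
    return "turned_over"           # little motion (33)
```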

In the latter case, the monitored person's attention is attracted by moving a rattle connected to the abnormality reporting device, by playing a registered sound such as the mother's voice from a speaker, or by a calling sound or the movement of an object triggered from the portable terminal, so that the monitored person's face turns toward the camera. The state of the monitored person is then judged from the reaction at that time.

When the face detection means can detect the face and judges the "present" state, the state is determined further by the flow shown in FIG. 6. Once the position of the face is detected, the state of each facial organ is checked within the face region. Mouth detection is performed and the area around the mouth is examined for anything unusual. If anything differs from the normal state, it is judged to be foreign matter and the state is judged to be "vomiting". The "vomiting" state is judged abnormal and reported to the terminal.
When detecting each organ, the feature points 64 of FIG. 8 are located, their positional relationships are computed from the feature vectors 65, and the expression and state are analyzed. Alternatively, as in FIG. 7, the ratio of the vertical length 58 to the horizontal length 59 of the eyes, the ratio of the vertical length 60 to the horizontal length 61 of the mouth, and the ratio and positional relationship of the vertical length 62 and horizontal length 63 of the rectangle formed by the eyes and mouth may be used.
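A minimal sketch of the feature quantities of FIG. 7, assuming a landmark detector that returns the corner points of each organ; the landmark ordering and the two-element feature vector are assumptions for illustration.

```python
import numpy as np

def aspect_ratio(points):
    """Vertical/horizontal ratio of a facial organ from its corner landmarks.

    points : array of (x, y) landmarks ordered [left, right, top, bottom].
    This ordering is an illustrative assumption; any landmark detector that
    yields the eye and mouth corners could be used.
    """
    left, right, top, bottom = np.asarray(points, dtype=float)
    horizontal = np.linalg.norm(right - left)    # e.g. eye width 59, mouth width 61
    vertical = np.linalg.norm(bottom - top)      # e.g. eye height 58, mouth height 60
    return vertical / horizontal if horizontal > 0 else 0.0

def face_feature_vector(eye_pts, mouth_pts):
    """Crude feature vector combining the ratios described for FIGS. 7 and 8."""
    return np.array([aspect_ratio(eye_pts), aspect_ratio(mouth_pts)])
```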

If there is nothing around the mouth, the open/closed state of the eyes is examined from the eye features of FIG. 7; it may be detected, for example, from the aspect ratio of the eye feature vector. If the eyes are found to be closed, the state is judged to be "sleeping"; otherwise it is judged to be "awake", and whether the face is smiling is then determined. The other states, "crying", "in a good mood", and "normal", are likewise judged from the relationships among the feature-point vectors of FIG. 8. For this purpose, data are collected in advance from an unspecified large number of subjects and the correlation between the feature quantities and the "states" is established. During use, the monitor teaches the system the states, and this feedback is used to raise the accuracy of face state detection.
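The patent does not name a specific learning algorithm, so the following sketch stands in with a simple nearest-neighbor rule over feature vectors, seeded with pre-collected data and refined by the monitor's corrections; the example feature values are invented for illustration.

```python
import numpy as np

class FaceStateClassifier:
    """Sketch of state estimation from feature vectors with monitor feedback.

    A nearest-neighbor rule over labeled feature vectors stands in for the
    correlation between features and states described above; the learning
    method is an assumption, not taken from the patent.
    """

    def __init__(self):
        self.samples = []   # list of (feature_vector, state_label)

    def learn(self, features, state):
        """Called with data from pre-collected subjects or monitor corrections."""
        self.samples.append((np.asarray(features, dtype=float), state))

    def estimate(self, features, default="normal"):
        if not self.samples:
            return default
        features = np.asarray(features, dtype=float)
        dists = [np.linalg.norm(features - f) for f, _ in self.samples]
        return self.samples[int(np.argmin(dists))][1]

# Usage: pre-train from many subjects, then refine with the monitor's feedback.
clf = FaceStateClassifier()
clf.learn([0.05, 0.30], "sleeping")   # eyes nearly closed (illustrative values)
clf.learn([0.30, 0.60], "smiling")
print(clf.estimate([0.07, 0.35]))     # -> "sleeping"
```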

On the terminal side, the data history accumulated so far can be displayed and checked, for example per day as in FIG. 9; it may also be made viewable per week or per month. In the example of FIG. 9, the state display items 66 are "absent", "awake", "sleeping", "covered by futon", "vomiting", and "face down", but these can be set arbitrarily. The event items 67 are "milk" and "diaper", and these too can be set arbitrarily. The states can be displayed as a graph 68, and the time and number of events can also be displayed 69. Recording information such as milk and diaper changes in this way can also serve as a guideline for the next occasion. Entries can be made as shown at 71 in the terminal's input mode, and if a state display is wrong, a learning input 70 can be made.
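A rough sketch of the per-day history view of FIG. 9, assuming a simple (timestamp, kind, label) record format; the sample records and the text-only output are illustrative and do not reproduce the patent's table or graph layout.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative records; in the embodiment these would come from the report history.
records = [
    ("2007-12-27 07:30", "state", "awake"),
    ("2007-12-27 08:00", "event", "milk"),
    ("2007-12-27 09:10", "state", "sleeping"),
    ("2007-12-27 11:45", "event", "diaper"),
    ("2007-12-27 12:00", "state", "crying"),
]

def daily_summary(records, day):
    states = defaultdict(list)
    events = defaultdict(int)
    for ts, kind, label in records:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        if t.strftime("%Y-%m-%d") != day:
            continue
        if kind == "state":
            states[label].append(t.strftime("%H:%M"))
        else:
            events[label] += 1
    return states, events

states, events = daily_summary(records, "2007-12-27")
for label, times in states.items():
    print(f"{label:10s} {', '.join(times)}")   # state display items 66 / graph 68
for label, count in events.items():
    print(f"{label:10s} x{count}")             # event items 67 / counts 69
```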

In the sleeping state, it can be confirmed whether anything is abnormal by observing the subject's breathing. As shown in FIG. 10, infants breathe mainly with the abdomen, so after the face position is detected, breathing can be observed by examining the motion 72 at a position a fixed distance away, near the chest and abdomen. Whether the subject is breathing can be judged simply by determining, through moving-image processing, whether that motion repeats steadily.
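A sketch of breathing detection by frame differencing inside the chest/abdomen region 72, under the assumption that the region is a fixed rectangle offset from the detected face; the threshold is illustrative, not a value from the patent.

```python
import numpy as np
import cv2

def chest_motion_signal(frames, roi):
    """Mean absolute frame difference inside the chest/abdomen region 72.

    frames : iterable of grayscale frames; roi : (x, y, w, h) below the face.
    The ROI placement and the threshold below are illustrative assumptions.
    """
    x, y, w, h = roi
    signal, prev = [], None
    for frame in frames:
        patch = frame[y:y+h, x:x+w]
        if prev is not None:
            signal.append(float(cv2.absdiff(patch, prev).mean()))
        prev = patch
    return np.array(signal)

def is_breathing(signal, min_std=0.5):
    # A steadily repeating motion shows up as variation in the difference signal;
    # a nearly flat signal suggests no chest movement and should trigger a report.
    return len(signal) > 1 and signal.std() > min_std
```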

Apart from face detection, moving-image analysis of the monitored person's movement and face orientation is used to judge whether the infant being monitored is in an active state or a resting state. The resting state poses no particular problem, but care against injury and the like is needed in the active state, so the monitor is notified when the active state begins.

At the same time, the audio signal is monitored, and a report is issued when the surrounding sound changes abruptly.
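A sketch of detecting an abrupt change in the surrounding sound by comparing short-frame RMS levels of raw microphone samples; the frame length and ratio threshold are assumptions, and the patent does not specify how the audio signal is analyzed or captured.

```python
import numpy as np

def abrupt_sound_change(samples, frame_len=1024, ratio=4.0):
    """Flag an abrupt change in surrounding sound from raw audio samples.

    samples : 1-D numpy array from the microphone (capture method not specified
    in the patent); frame_len and the ratio threshold are assumptions.
    """
    n = len(samples) // frame_len
    rms = np.array([
        np.sqrt(np.mean(samples[i*frame_len:(i+1)*frame_len].astype(float) ** 2))
        for i in range(n)
    ])
    if len(rms) < 2:
        return False
    baseline = np.median(rms[:-1]) + 1e-9
    return rms[-1] / baseline > ratio   # latest frame much louder than the baseline
```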

The portable terminal of this embodiment may be a device having existing communication means, such as a mobile phone. Although the embodiment has been described as a monitoring system for infants, the system can also be used to monitor sick people, persons requiring nursing care, and so on. Furthermore, for example, a camera may be set on a personal computer monitor to check the condition of employees at work; in that case the system may check the degree of fatigue and prompt a break after a certain time. It may also check the condition of a driver while driving, the condition of workers in a factory, or the condition of someone watching television, and can be used in a wide variety of scenes, such as checking human fatigue and condition.

The present invention can be used in a monitoring system that can easily check the state of an infant or the like from a remote location and notify the monitor of an abnormal state as necessary.

FIG. 1: Components of the abnormality reporting device of one embodiment
FIG. 2: Components of the abnormality receiving terminal of one embodiment
FIG. 3: Flow of the face detection state of one embodiment
FIG. 4: Face position history vectors of one embodiment
FIG. 5: Motion detection area around the face of one embodiment
FIG. 6: Flow of the face detection state of one embodiment
FIG. 7: Feature quantities of the facial organs of one embodiment
FIG. 8: Feature quantities of the facial organs of one embodiment
FIG. 9: Display example of the information terminal of one embodiment
FIG. 10: Chest motion detection range of one embodiment

Explanation of symbols

1 Abnormality reporting device
2 CPU
3 Memory
4 Video input
5 Camera
6 Camera controller
7 Camera mount
8 Audio input
9 Microphone
10 Transmitter/receiver
11 Antenna
12 Abnormality reporting device
13 CPU
14 Memory
15 Video input
16 Camera
17 Camera controller
18 Camera mount
19 Audio input
20 Microphone
21 Transmitter/receiver
22 Antenna
23 Image input
24 Face detection
25 Face detected?
26 Face center coordinate acquisition
27 Present
28 Normal state judgment
29 Face coordinate history movement vector large?
30 Absent
31 Motion around the face?
32 Covered by futon
33 Turned over
34 Abnormal state judgment
35 Face detection area
36 Face center coordinates
37 Detected face area
38 (n-1)th detected face center position history
39 nth detected face center position history
40 (n+1)th detected face center position history
41 (n-1)th detected face center position history
42 nth detected face center position history
43 (n+1)th detected face center position history
44 Motion detection area
45 Present
46 Mouth detection
47 Something around the mouth?
48 Foreign object detection
49 Vomiting state
50 Abnormal state judgment
51 Eye open/closed state
52 Sleeping state
53 Smiling?
54 Laughing state
55 Other state
56 Feature quantity learning
57 Normal state judgment
58 Vertical length of the eye
59 Horizontal length of the eye
60 Vertical length of the mouth
61 Horizontal length of the mouth
62 Vertical length of the rectangle formed by the eyes and mouth
63 Horizontal length of the rectangle formed by the eyes and mouth
64 Feature point
65 Feature vector
66 State display item
67 Event display item
68 State graph
69 Event graph
70 State learning input
71 Event input
72 Chest motion

Claims (9)

1. A monitoring system comprising a portable terminal carried by a monitor and an abnormality reporting device installed near a monitoring target and capable of two-way wireless communication with the portable terminal, wherein
the abnormality reporting device comprises face detection means for detecting the face of the monitoring target, face state determination means for determining the state of the face from the detected face, sound collection means for collecting surrounding sounds, means for determining an abnormality in the surrounding sounds from the collected sounds, and control means that judges an abnormality in the face state or in the surrounding sounds on the basis of predetermined values stored in advance and transmits an abnormality report signal to the portable terminal when it judges that there is an abnormality,
the abnormality reporting device comprises means for attracting the attention of the monitored person and turning the monitored person's face toward the camera when, during face detection, the face cannot be detected by the camera because of the orientation of the monitored person's face, and
the portable terminal comprises display means, audio means, and control means for displaying an abnormality report on the display means and generating a warning sound from the audio means on the basis of the abnormality report signal received from the abnormality reporting device.

2. The monitoring system according to claim 1, wherein the means for attracting the attention of the monitored person and turning the monitored person's face toward the camera when the face cannot be detected by the camera because of the orientation of the monitored person's face consists of a registered sound.

3. The monitoring system according to claim 2, wherein the registered sound is a calling sound.

4. The monitoring system according to any one of claims 1 to 3, wherein the means for attracting the attention of the monitored person and turning the monitored person's face toward the camera when the face cannot be detected by the camera because of the orientation of the monitored person's face consists of the movement of an object.

5. The monitoring system according to any one of claims 1 to 4, wherein the state of the monitored person is judged from the reaction when the means for attracting the attention of the monitored person and turning the monitored person's face toward the camera is actuated.

6. The monitoring system according to any one of claims 1 to 5, wherein the portable terminal comprises means for periodically acquiring, at settable intervals, a history of reports from the abnormality reporting device together with video and sound, and presenting it as a table or graph of the monitored person's state so that at least one of the monitored person's health state, feeding state, and excretion state can be grasped.

7. The monitoring system according to any one of claims 1 to 6, wherein the portable terminal comprises means for improving the accuracy of state determination by reviewing the past history when a report arrives from the abnormality reporting device, or at any other time, so that the detected states can be learned.

8. The monitoring system according to any one of claims 1 to 7, wherein the face detection means comprises face state determination means for judging, from the face detection state, at least one of a "present" state, an "absent" state, a "covered-by-futon" state, and a "lying face down" state.

9. The monitoring system according to claim 8, wherein, when the face detection means judges the "present" state, the face state determination means comprises means for estimating basic states of the face such as "awake", "sleeping", "crying", "in a good mood", "normal", "smiling", and "vomiting", and estimating the state of the monitored person together with information on facial movement.
JP2007337163A 2007-12-27 2007-12-27 Monitoring system Expired - Fee Related JP5133677B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007337163A JP5133677B2 (en) 2007-12-27 2007-12-27 Monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007337163A JP5133677B2 (en) 2007-12-27 2007-12-27 Monitoring system

Publications (2)

Publication Number Publication Date
JP2009157780A JP2009157780A (en) 2009-07-16
JP5133677B2 true JP5133677B2 (en) 2013-01-30

Family

ID=40961713

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007337163A Expired - Fee Related JP5133677B2 (en) 2007-12-27 2007-12-27 Monitoring system

Country Status (1)

Country Link
JP (1) JP5133677B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7569017B2 (en) 2019-07-04 2024-10-17 日本電気硝子株式会社 Exterior glass components, housings and doors

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5845506B2 (en) * 2009-07-31 2016-01-20 兵庫県 Action detection device and action detection method
CN102236781A (en) * 2010-04-22 2011-11-09 鸿富锦精密工业(深圳)有限公司 System and method for sensing infant sleep
KR101270312B1 (en) 2011-10-18 2013-05-31 아주대학교산학협력단 Dangerous condition observation apparataus and method for human body
KR101464344B1 (en) * 2014-03-25 2014-11-25 (주)그린아이티코리아 Surveillance camera and image managing system, and method for detecting abnormal state by training normal state of surveillance image
US20160292983A1 (en) * 2015-04-05 2016-10-06 Smilables Inc. Wearable infant monitoring device
JP2018514814A (en) * 2015-04-05 2018-06-07 スマイラブルズ インコーポレイテッド High-function infant monitoring system, infant monitoring hub and infant learning acceptability detection system
KR101861613B1 (en) * 2016-08-08 2018-05-28 (주)허니냅스 Sensor device and operation method of sensor device, and newborn baby vomit detecting system using sensor device
JP2019217103A (en) * 2018-06-21 2019-12-26 ノーリツプレシジョン株式会社 Assistance system, assistance method, and assistance program
JP7196467B2 (en) * 2018-08-29 2022-12-27 カシオ計算機株式会社 Opening/closing state determination device, opening/closing state determination method, and program
JP6867701B2 (en) * 2018-12-03 2021-05-12 株式会社チームボックス Monitoring equipment, monitoring system, monitoring method and monitoring program
CN109756626B (en) * 2018-12-29 2021-09-24 维沃移动通信有限公司 Reminding method and mobile terminal
WO2020168468A1 (en) * 2019-02-19 2020-08-27 深圳市汇顶科技股份有限公司 Help-seeking method and device based on expression recognition, electronic apparatus and storage medium
JP7443283B2 (en) * 2021-03-29 2024-03-05 公益財団法人鉄道総合技術研究所 Wakefulness estimation method, wakefulness estimation device, and wakefulness estimation program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001285844A (en) * 2000-03-31 2001-10-12 Matsushita Electric Ind Co Ltd Radio repeating and monitoring system
JP3495692B2 (en) * 2000-09-05 2004-02-09 積水化学工業株式会社 Watching service system
US6611206B2 (en) * 2001-03-15 2003-08-26 Koninklijke Philips Electronics N.V. Automatic system for monitoring independent person requiring occasional assistance
WO2005010535A2 (en) * 2003-07-22 2005-02-03 Ronjo Company Method of monitoring sleeping infant


Also Published As

Publication number Publication date
JP2009157780A (en) 2009-07-16

Similar Documents

Publication Publication Date Title
JP5133677B2 (en) Monitoring system
US10595766B2 (en) Abnormal motion detector and monitor
US8493220B2 (en) Arrangement and method to wake up a sleeping subject at an advantageous time instant associated with natural arousal
EP1371042B1 (en) Automatic system for monitoring person requiring care and his/her caretaker automatic system for monitoring person requiring care and his/her caretaker
KR100838099B1 (en) Automatic system for monitoring independent person requiring occasional assistance
US20160345832A1 (en) System and method for monitoring biological status through contactless sensing
EP3432772B1 (en) Using visual context to timely trigger measuring physiological parameters
KR20170096901A (en) Infant Health Monitoring System
CN113384247A (en) Nursing system and automatic nursing method
US11574532B2 (en) Visible-light-image physiological monitoring system with thermal detecting assistance
CN113408477A (en) Infant sleep monitoring system, method and equipment
KR100961476B1 (en) Behavior pattern recognition device and its method
Siedel et al. Contactless interactive fall detection and sleep quality estimation for supporting elderly with incipient dementia
JP7388199B2 (en) Biological information collection system, biological information collection method and program
JP2023101317A (en) Sleep determination system, sleep determination method, and sleep determination program
CN220545148U (en) Infant monitoring system
CN213070105U (en) Turnover monitoring system based on IMU
CN105976163A (en) Alarm clock reminding method and system
Lavanya et al. Comprehensive Non-Intrusive Patient Monitoring System using Advanced AI and ML
JP2006247014A (en) Monitoring camera system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20101118

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120406

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120508

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120628

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20121106


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20121108

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20151116

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Ref document number: 5133677

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250


LAPS Cancellation because of no payment of annual fees