
WO2022070616A1 - Monitoring device, vehicle, monitoring method, and program - Google Patents

Monitoring device, vehicle, monitoring method, and program Download PDF

Info

Publication number
WO2022070616A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
monitoring device
image
abnormality
unit
Prior art date
Application number
PCT/JP2021/029488
Other languages
French (fr)
Japanese (ja)
Inventor
虎喜 岩丸
崚 武智
Original Assignee
本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority to JP2022553515A (JP7488908B2)
Publication of WO2022070616A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/31Detection related to theft or to other events relevant to anti-theft systems of human presence inside or outside the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras

Definitions

  • the present invention relates to a vehicle monitoring technique.
  • Patent Document 1 proposes a method of setting a security level according to the surrounding environment based on the sensing data of the surrounding environment of the vehicle.
  • an object of the present invention is to provide an advantageous technique for detecting an abnormality of a parked vehicle with a simple configuration / processing.
  • a monitoring device as one aspect of the present invention is a monitoring device for monitoring the state of a parked vehicle, and comprises: photographing means provided on the vehicle; specific means for identifying a characteristic portion of an image obtained by the photographing means while the vehicle is parked; and determination means for determining that an abnormality has occurred in the vehicle when a change in the characteristic portion is detected in an image obtained by the photographing means while the vehicle is parked.
  • Block diagram showing a configuration example of the monitoring system (FIG. 1)
  • Diagram showing an arrangement example of the photographing unit (cameras) in the vehicle (FIG. 2)
  • Flowchart showing the monitoring process for monitoring a parked vehicle (FIG. 3)
  • Diagrams each showing an example of an image obtained by the photographing unit (FIGS. 4A, 4B, 5A, 5B)
  • Diagrams each showing an example of a notification screen presented to the user (FIGS. 6A, 6B)
  • the monitoring system according to the present invention is a system (device) for monitoring a parked vehicle and detecting an abnormality caused in the vehicle.
  • vehicle abnormalities include, for example, mischief to a parked vehicle and theft of the vehicle; in the case of a saddle-type vehicle, a fall of the vehicle is another example.
  • in the following embodiments, a motorcycle is described as an example of a saddle-type vehicle, but the monitoring system according to the present invention can also be applied to other types of saddle-type vehicles such as tricycles, and to vehicles other than saddle-type vehicles such as four-wheeled vehicles.
  • FIG. 1 is a block diagram showing a configuration example of the monitoring system 100 of the present embodiment.
  • the monitoring system 100 of the present embodiment may include a monitoring device 10 provided in the vehicle V and an information terminal 20 owned by a user of the vehicle V.
  • the monitoring device 10 and the information terminal 20 can be communicably connected to each other using a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark), or may be connected so as to communicate with each other via the Internet.
  • the monitoring device 10 may include, for example, a processing unit 11, a storage unit 12, a photographing unit 13, an audio output unit 14, and a communication unit 15 that are communicably connected to each other via the system bus 16.
  • the processing unit 11 is, for example, an ECU (Electronic Control Unit), and may include a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with an external device, and the like.
  • the storage unit 12 stores programs executed by the processor and data used by the processor for processing, and the processing unit 11 can read a program stored in the storage unit 12 into a storage device such as a memory and execute it.
  • in the present embodiment, the storage unit 12 stores an application program (monitoring program) for monitoring the parked vehicle V, and the processing unit 11 can read the monitoring program from the storage unit 12 into a storage device such as a memory and execute it. That is, the processing unit 11 can serve as a computer that executes the monitoring program.
  • the processing unit 11 of the present embodiment may be provided with, for example, a specific unit 11a, a determination unit 11b, and a notification unit 11c.
  • the specific unit 11a identifies a characteristic portion of the image obtained by the photographing unit 13 while the vehicle V is parked, for example by using a known image processing technique.
  • the determination unit 11b determines that an abnormality has occurred in the vehicle V when the characteristic portion identified by the specific unit 11a changes in the image obtained by the photographing unit 13 while the vehicle V is parked.
  • when the determination unit 11b determines that an abnormality has occurred in the vehicle V, the notification unit 11c notifies the user of the vehicle V (specifically, the information terminal 20) of the abnormality via the communication unit 15.
  • the photographing unit 13 includes, for example, a camera, and is provided in the vehicle V so as to photograph a part of the vehicle V and / or the surroundings of the vehicle V and acquire an image (image).
  • the photographing unit 13 may take a picture so as to repeatedly (for example, periodically) acquire a still image, or may take a picture so as to acquire a moving image.
  • the photographing unit 13 of the present embodiment may include a plurality of cameras 13a to 13f arranged in the vehicle V as shown in FIG.
  • FIG. 2 is a top view of the vehicle V, which is a saddle-type vehicle, and shows an arrangement example of the photographing unit 13 (cameras 13a to 13f) on the vehicle V.
  • the fan shape of the broken line represents the shooting direction (shooting range) of each camera
  • "FR" indicates the front of the vehicle V
  • "RR” indicates the rear of the vehicle.
  • the camera 13a is provided on the vehicle V so as to acquire an image behind the vehicle V including an image of a part of the vehicle V (for example, the seat 32).
  • the camera 13b is provided on the vehicle V so as to acquire an image in front of the vehicle V including an image of a part of the vehicle V (for example, a bar handle 31 (handle), a seat 32).
  • the cameras 13c to 13f are provided on the vehicle V so as to acquire an image around the vehicle V.
  • the camera 13c can photograph the front of the vehicle V
  • the camera 13d can photograph the rear of the vehicle V
  • the camera 13e can photograph the left side of the vehicle V
  • the camera 13f can photograph the right side of the vehicle V.
  • the audio output unit 14 includes, for example, a speaker, and outputs sound to the outside of the vehicle V.
  • the sound output from the audio output unit 14 can be set arbitrarily and may be, for example, a warning sound, music, or a human voice.
  • the communication unit 15 is communicably connected to the information terminal 20 by using a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark).
  • the communication unit 15 may have a function as a transmission unit for transmitting information to the information terminal 20 and a function as a reception unit for receiving information from the information terminal 20.
  • the information terminal 20 is a device capable of transmitting and receiving information to and from the vehicle V (monitoring device 10) such as a smartphone or tablet terminal, and includes, for example, a processing unit 21, a storage unit 22, a display unit 23, and a communication unit 24.
  • the processing unit 21 includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with an external device, and the like.
  • the storage unit 22 stores a program executed by the processor, data used by the processor for processing, and the like, and the processing unit 21 reads the program stored in the storage unit 22 into a storage device such as a memory and executes the program. be able to.
  • the display unit 23 is, for example, a display, and displays (outputs) various information such as information received from the monitoring device 10.
  • the communication unit 24 is communicably connected to the vehicle V (monitoring device 10) using a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark).
  • the communication unit 24 may have a function as a transmission unit for transmitting information to the vehicle V (monitoring device 10) and a function as a reception unit for receiving information from the vehicle V (monitoring device 10).
  • FIG. 3 is a flowchart showing a monitoring process performed by the processing unit 11.
  • the monitoring process of the vehicle V may be understood as a determination process (detection process) for determining (detecting) an abnormality of the vehicle V based on the image obtained by the photographing unit 13.
  • in step S11, the processing unit 11 determines whether to start the monitoring process. For example, when the ignition (ignition switch) of the vehicle V is turned off, the processing unit 11 can determine that the vehicle V has been parked (parking of the vehicle V has started) and that the monitoring process should be started. The processing unit 11 may also determine that the monitoring process should be started when it receives an instruction to start the monitoring process from the user (such as the driver) via an input unit (not shown) provided in the vehicle V. The processing unit 11 repeatedly executes step S11 until it determines to start the monitoring process, and proceeds to step S12 when it determines to start the monitoring process.
  • in step S12, the processing unit 11 (specific unit 11a) identifies a characteristic portion of the image obtained by the photographing unit 13 while the vehicle V is parked (specifically, after it has been determined in step S11 that the monitoring process is to be started).
  • the processing unit 11 can use a known image processing technique when identifying the characteristic portion of the image.
  • as an example, the processing unit 11 extracts portions (feature points) having feature quantities such as corners, curvature, changes in brightness, and changes in color from the image obtained by the photographing unit 13, and can identify (recognize) information indicating the feature quantities and positional relationships of the extracted feature points as the characteristic portion of the image.
  • among the objects included in the image obtained by the photographing unit 13, the processing unit 11 preferably identifies the characteristic portion from an object whose position (and/or posture) in the image does not change for a predetermined time (hereinafter referred to as a target object). This makes it possible to avoid identifying the characteristic portion from a moving object (for example, a person or a car) in the image obtained by the photographing unit 13, and to reduce false detection of an abnormality of the vehicle V.
  • the predetermined time can be arbitrarily set, but as an example, it can be set within the range of several tens of seconds to several minutes.
  • FIG. 4A shows an example of the image 41 obtained by the photographing unit 13 (camera 13b).
  • the image 41 shown in FIG. 4A includes an image of the bar handle 31 and the seat 32 of the vehicle V.
  • in the parked vehicle V, the steering is locked and the positional relationship between the camera 13b and the bar handle 31 is fixed; therefore, unless an abnormality (mischief, a fall, etc.) occurs in the vehicle V, the position and/or posture of the image of the bar handle 31 in the image 41 does not change.
  • therefore, in the present embodiment, the processing unit 11 (specific unit 11a) can determine the image of the bar handle 31 included in the image 41 as the target object and identify the straight line Fa1 connecting both ends of the bar handle 31 as a characteristic portion of the image 41.
  • similarly, when the camera 13b is fixed to the vehicle body frame, the positional relationship between the camera 13b and the seat 32 is fixed, so the processing unit 11 (specific unit 11a) may determine the image of the seat 32 included in the image 41 as the target object and identify a part Fa2 of the image of the seat 32 as a characteristic portion of the image 41.
  • FIG. 5A shows an example of the image 42 obtained by the photographing unit 13 (for example, the camera 13c).
  • the image 42 shown in FIG. 5A is an image around the vehicle V, and as an example, shows an image obtained by the camera 13c when the vehicle V is parked on the breakwater.
  • the processing unit 11 (specific unit 11a) determines a subject image whose position in the image 42 does not change for a predetermined time as the target object, and can identify locations Fb1 to Fb3 having relatively large feature quantities in the subject image as characteristic portions of the image 42.
  • the processing unit 11 may also use a straight-line portion in the image 42 (for example, the straight line (horizon) connecting locations Fb1 and Fb2) as a characteristic portion of the image 42.
  • the subject image as the target object may be a natural object or an artificial object.
  • natural objects that can be target objects include horizon, trees, and rocks.
  • artificial objects that can be target objects include ground (white lines on roads), signs, buildings (buildings, etc.), and the like.
  • in step S13, the processing unit 11 (determination unit 11b) determines whether a change in the characteristic portion identified in step S12 is detected in the image newly obtained by the photographing unit 13 while the vehicle V is parked.
  • for example, using at least one of the position and the posture of the characteristic portion in the image used in step S12 as a reference, the processing unit 11 determines whether the position or the posture of the characteristic portion in the image newly obtained by the photographing unit 13 has changed from that reference. When a change in the characteristic portion is detected in the newly obtained image, it is determined that an abnormality has occurred in the vehicle V, and the process proceeds to step S14.
  • when a change in the characteristic portion is detected, the processing unit 11 may cause the audio output unit 14 to output a sound (for example, a warning sound).
  • in step S14, the processing unit 11 (determination unit 11b) identifies the mode of change of the characteristic portion detected in step S13 and determines the type of abnormality of the vehicle V based on the identified mode of change.
  • for example, information indicating the correspondence between the type of characteristic portion, the mode of change of the characteristic portion, and the type of abnormality of the vehicle V is stored in advance in the storage unit 12, and the processing unit 11 can determine the type of abnormality of the vehicle V by referring to this information.
  • in step S15, the processing unit 11 (notification unit 11c) notifies the user via the communication unit 15 that an abnormality has occurred in the parked vehicle V (specifically, it transmits information indicating that an abnormality has occurred in the parked vehicle V to the information terminal 20).
  • here, the types of characteristic portions are, as described above with reference to FIGS. 4A and 5A, the straight line Fa1 connecting both ends of the bar handle 31, the part Fa2 of the seat 32, and locations with relatively large feature quantities in the subject image such as Fb1 to Fb3.
  • the mode of change of the characteristic portion may include a change in at least one of the position and the posture of the characteristic portion in the image, a change in the positional relationship of a plurality of characteristic portions in the image, and a characteristic portion that was included in the image no longer being included.
  • the types of abnormality of the vehicle V may include release of the steering lock or the seat lock not intended by the user (that is, rotation of the handlebar or opening/closing of the seat), a fall of the vehicle V, and the like.
  • in step S13, if no change in the characteristic portion is detected in the newly obtained image, the process proceeds to step S16.
  • in step S16, the processing unit 11 determines whether to end the monitoring process. For example, the processing unit 11 can determine that the monitoring process is to be ended when the ignition (ignition switch) of the vehicle V is turned on. Alternatively, the processing unit 11 may determine that the monitoring process is to be ended when a certain period has elapsed from the start of parking of the vehicle V (for example, from when the ignition was turned off or from when the user instructed the start of the monitoring process).
  • this is because, when a certain period has elapsed since parking of the vehicle V started, it can be determined that the vehicle V is parked in a safe place (that is, a place where the possibility of mischief, theft, or a fall is low).
  • the certain period can be set arbitrarily, but as an example it can be set to one day, or within a range of several days to several weeks. If it is determined in step S16 that the monitoring process is not to be ended, the process returns to step S13; if it is determined that the monitoring process is to be ended, this flowchart ends.
  • FIG. 4B shows an example of the image 41 newly obtained by the photographing unit 13 (camera 13b) in step S13.
  • in the image 41 shown in FIG. 4B, compared with the image 41 shown in FIG. 4A, the straight line Fa1 (the straight line connecting both ends of the bar handle 31) identified as the characteristic portion in step S12 has changed to the straight line Fa1'; that is, its position and posture (inclination) have changed.
  • in this case, in step S14, the processing unit 11 (determination unit 11b) can determine, as the type of abnormality of the vehicle V, that the steering lock has been released and the bar handle has been rotated.
  • then, in step S15, the processing unit 11 (notification unit 11c) transmits notification information for notifying the user of the abnormality of the vehicle V, namely that the steering lock has been released, to the user's information terminal 20 via the communication unit 15.
  • the notification information may include the video data (image data) in which the change in the characteristic portion was detected.
  • as shown in FIG. 6A, the processing unit 21 of the information terminal 20 that has received the notification information can display a notification screen 23a for notifying the user of the abnormality of the vehicle V on the display unit 23 (display).
  • the notification screen 23a shown in FIG. 6A is provided with a display field 23a1 for the image in which the change in the characteristic portion was detected and a display field 23a2 for a comment that the steering lock has been released.
  • although not shown in FIG. 4B, when at least one of the position and the posture of the part Fa2 of the image of the seat 32 identified as a characteristic portion in step S12 changes, the processing unit 11 (determination unit 11b) can determine, as the type of abnormality of the vehicle V, that the seat lock has been released and the seat 32 has been opened.
  • in this case as well, the processing unit 11 (notification unit 11c) can transmit notification information for notifying the user of the abnormality of the vehicle V, namely that the seat lock has been released, to the user's information terminal 20 via the communication unit 15.
  • FIG. 5B shows an example of the image 42 newly obtained by the photographing unit 13 (camera 13c) in step S13.
  • in the image 42 shown in FIG. 5B, compared with the image 42 shown in FIG. 5A, the positions of the locations Fb1 to Fb3 identified as characteristic portions in step S12 have changed to Fb1' to Fb3'; that is, the position and posture (inclination) of the shape (triangle) whose vertices are the locations Fb1 to Fb3 have changed.
  • in this case, in step S14, the processing unit 11 (determination unit 11b) can determine, as the type of abnormality of the vehicle V, that the parking position of the vehicle V has moved (specifically, that the vehicle V has fallen over) and that the camera 13c attached to the vehicle V has tilted. Then, in step S15, the processing unit 11 (notification unit 11c) transmits notification information for notifying the user of the abnormality of the vehicle V, namely that the vehicle V has fallen over, to the user's information terminal 20 via the communication unit 15. As shown in FIG. 6B, the processing unit 21 of the information terminal 20 that has received the notification information can display a notification screen 23b for notifying the user of the abnormality of the vehicle V on the display unit 23 (display).
  • the notification screen 23b shown in FIG. 6B is provided with a display field 23b1 for the image in which the change in the characteristic portion was detected and a display field 23b2 for a comment that the parking position of the vehicle V has moved.
  • as described above, in the present embodiment, a characteristic portion of the image obtained by the photographing unit 13 is identified, and an abnormality of the vehicle V is detected based on a change in the characteristic portion in the image.
  • since a sensor other than the photographing unit 13 (for example, a GPS sensor or a gyro sensor) need not be used to detect the abnormality of the vehicle V, this can be advantageous in reducing vehicle cost.
  • further, since the abnormality is notified to the user's information terminal 20, the user can quickly grasp the abnormality of the vehicle V.
  • the monitoring device of the above embodiment is a monitoring device (for example, 10) that monitors the state of a parked vehicle (for example, V), and comprises: photographing means (for example, 13) provided on the vehicle; specific means (for example, 11a) for identifying a characteristic portion (for example, Fa1, Fa2, Fb1 to Fb3) of an image (for example, 41, 42) obtained by the photographing means while the vehicle is parked; and determination means (for example, 11b) for determining that an abnormality has occurred in the vehicle when a change in the characteristic portion is detected in an image obtained by the photographing means while the vehicle is parked.
  • according to this configuration, it is possible to accurately detect an abnormality of the vehicle based on the image obtained by the photographing means. Further, since a sensor other than the photographing means (camera) need not be used to detect the abnormality of the vehicle, this can be advantageous in reducing vehicle cost.
  • the specific means identifies the characteristic portion from a target object whose position in the image does not change for a predetermined time among a plurality of objects included in the image obtained by the photographing means while the vehicle is parked. According to this embodiment, it is possible to avoid identifying the characteristic portion from a moving object (for example, a person or a car) in the image obtained by the photographing means, and to reduce false detection of an abnormality of the vehicle.
  • the photographing means is provided on the vehicle so as to photograph a part of the vehicle (for example, 31, 32).
  • the target object is an image of a part of the vehicle included in the image obtained by the photographing means. According to this embodiment, it is possible to detect an abnormality such as mischief or theft of the vehicle.
  • the vehicle is a saddle-type vehicle, and the portion of the vehicle photographed by the photographing means includes any one of a handlebar (for example, 31) and a seat (for example, 32). According to this embodiment, it is possible to accurately detect operation of the handlebar (release of the steering lock) and opening/closing of the seat (release of the seat lock) of the parked vehicle.
  • the photographing means is provided on the vehicle so as to photograph the surroundings of the vehicle.
  • the target object is a subject image around the vehicle included in the image obtained by the photographing means. According to this embodiment, in addition to mischief and theft of the vehicle, it is possible to detect an abnormality such as a movement of the parking position of the vehicle (including a fall of the vehicle).
  • the subject image includes an image of any one of the ground, a building, and a natural object. According to this embodiment, it is possible to accurately detect the movement of the parking position of the vehicle such as the fall of the vehicle.
  • the determination means determines the type of abnormality of the vehicle based on the mode of change of the feature portion in the image obtained by the photographing means. According to this embodiment, it is possible to make the user grasp detailed information about the abnormality of the vehicle, such as whether the steering lock has been released or the vehicle has fallen over.
  • the determination means determines that an abnormality has occurred in the vehicle when at least one of the position and the posture of the feature portion in the image obtained by the photographing means changes. According to this embodiment, it is possible to accurately detect an abnormality in the vehicle based on a change in at least one of the position and the posture of the feature portion in the image.
  • the monitoring device further includes notification means (for example, 11c) for notifying the user of the vehicle that an abnormality has occurred in the vehicle when the determination means determines that an abnormality has occurred in the vehicle.
  • the user of the vehicle can quickly grasp that an abnormality has occurred in the vehicle.
  • the determination means ends the abnormality determination process of the vehicle when a certain period of time has elapsed from the start of parking of the vehicle. According to this embodiment, if there is no abnormality in the vehicle (that is, change in the characteristic portion) in a certain period of time, it can be determined that the possibility of abnormality in the vehicle is low. By terminating the determination process (vehicle monitoring process), it is possible to reduce the decrease in the remaining battery level in the vehicle V.
  • the determination means starts the determination process, regarding parking of the vehicle as having started, when the ignition of the vehicle is turned off. According to this embodiment, turning off the ignition of the vehicle can be used as a trigger to determine the start of parking of the vehicle and to automatically start the determination process (the monitoring process of the vehicle).
  • the determination means starts the process of determining an abnormality of the vehicle when the ignition of the vehicle is turned off, and ends it when the ignition of the vehicle is turned on. According to this embodiment, turning the ignition of the vehicle off and on can be used as triggers to determine the start and end of parking of the vehicle and to automatically start and end the determination process (the monitoring process of the vehicle).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Emergency Alarm Devices (AREA)
  • Alarm Systems (AREA)

Abstract

A monitoring device (10) that monitors the state of a vehicle while parked comprises: a photographing means (13) provided to the vehicle; an identification means (11a) that identifies a feature part of an image obtained by the photographing means while the vehicle is parked; and a determination means (11b) that determines that an abnormality has occurred in the vehicle if a change in the feature part in the image obtained by the photographing means is detected while the vehicle is parked.

Description

Monitoring device, vehicle, monitoring method, and program
The present invention relates to a vehicle monitoring technique.
In recent years, there has been a demand for improving the security of parked vehicles. Patent Document 1 proposes a method of setting a security level according to the surrounding environment of a vehicle based on sensing data of that environment.
Japanese Unexamined Patent Publication No. 2008-9615
For example, in a saddle-type vehicle, vehicle abnormalities such as mischief, theft, and a fall are likely to occur while the vehicle is parked; from the viewpoint of suppressing an increase in vehicle cost, it is desirable to detect abnormalities of a parked vehicle with a simple configuration and processing.
Therefore, an object of the present invention is to provide a technique advantageous for detecting an abnormality of a parked vehicle with a simple configuration and processing.
In order to achieve the above object, a monitoring device as one aspect of the present invention is a monitoring device for monitoring the state of a parked vehicle, and comprises: photographing means provided on the vehicle; specific means for identifying a characteristic portion of an image obtained by the photographing means while the vehicle is parked; and determination means for determining that an abnormality has occurred in the vehicle when a change in the characteristic portion is detected in an image obtained by the photographing means while the vehicle is parked.
According to the present invention, for example, it is possible to provide a technique advantageous for detecting an abnormality of a parked vehicle with a simple configuration and processing.
Other features and advantages of the present invention will become apparent from the following description with reference to the accompanying drawings. In the accompanying drawings, the same or similar configurations are given the same reference numbers.
The accompanying drawings are included in and constitute a part of the specification, illustrate embodiments of the present invention, and are used together with the description to explain the principles of the present invention.
FIG. 1 is a block diagram showing a configuration example of the monitoring system. FIG. 2 is a diagram showing an arrangement example of the photographing unit (cameras) in the vehicle. FIG. 3 is a flowchart showing the monitoring process for monitoring a parked vehicle. FIGS. 4A, 4B, 5A, and 5B are diagrams each showing an example of an image obtained by the photographing unit. FIGS. 6A and 6B are diagrams each showing an example of a notification screen presented to the user.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. The following embodiments do not limit the invention according to the claims, and not all combinations of features described in the embodiments are necessarily essential to the invention. Two or more of the features described in the embodiments may be combined arbitrarily. The same or similar configurations are given the same reference numbers, and duplicated explanations are omitted.
<Configuration of the monitoring system>
An example in which the monitoring system (monitoring device) according to the present invention is applied to a saddle-type vehicle will be described below. The monitoring system according to the present invention is a system (device) for monitoring a parked vehicle and detecting an abnormality that has occurred in the vehicle. Examples of vehicle abnormalities include mischief to a parked vehicle and theft of the vehicle; in the case of a saddle-type vehicle, a fall of the vehicle is another example. In the following embodiments, a motorcycle is described as an example of a saddle-type vehicle, but the monitoring system according to the present invention can also be applied to other types of saddle-type vehicles such as tricycles, and to vehicles other than saddle-type vehicles such as four-wheeled vehicles.
FIG. 1 is a block diagram showing a configuration example of the monitoring system 100 of the present embodiment. The monitoring system 100 of the present embodiment may include a monitoring device 10 provided in a vehicle V and an information terminal 20 owned by a user of the vehicle V. The monitoring device 10 and the information terminal 20 can be communicably connected to each other using a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark), or may be connected so as to communicate with each other via the Internet.
The monitoring device 10 may include, for example, a processing unit 11, a storage unit 12, a photographing unit 13, an audio output unit 14, and a communication unit 15 that are communicably connected to each other via a system bus 16. The processing unit 11 is, for example, an ECU (Electronic Control Unit), and may include a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with external devices, and the like. The storage unit 12 stores programs executed by the processor and data used by the processor for processing, and the processing unit 11 can read a program stored in the storage unit 12 into a storage device such as a memory and execute it. In the present embodiment, the storage unit 12 stores an application program (monitoring program) for monitoring the parked vehicle V, and the processing unit 11 can read the monitoring program from the storage unit 12 into a storage device such as a memory and execute it. That is, the processing unit 11 can serve as a computer that executes the monitoring program.
The processing unit 11 of the present embodiment may be provided with, for example, a specific unit 11a, a determination unit 11b, and a notification unit 11c. The specific unit 11a identifies a characteristic portion of the image obtained by the photographing unit 13 while the vehicle V is parked, for example by using a known image processing technique. The determination unit 11b determines that an abnormality has occurred in the vehicle V when the characteristic portion identified by the specific unit 11a changes in the image obtained by the photographing unit 13 while the vehicle V is parked. When the determination unit 11b determines that an abnormality has occurred in the vehicle V, the notification unit 11c notifies the user of the vehicle V (specifically, the information terminal 20) of the abnormality via the communication unit 15.
The photographing unit 13 includes, for example, a camera, and is provided on the vehicle V so as to photograph a part of the vehicle V and/or the surroundings of the vehicle V and acquire images. For example, the photographing unit 13 may capture still images repeatedly (for example, periodically), or may capture a moving image. The photographing unit 13 of the present embodiment may include a plurality of cameras 13a to 13f arranged on the vehicle V as shown in FIG. 2. FIG. 2 is a top view of the vehicle V, which is a saddle-type vehicle, and shows an arrangement example of the photographing unit 13 (cameras 13a to 13f) on the vehicle V. In the figure, the broken-line fan shapes represent the shooting direction (shooting range) of each camera, "FR" indicates the front of the vehicle V, and "RR" indicates the rear of the vehicle. In the example shown in FIG. 2, the camera 13a is provided on the vehicle V so as to acquire an image behind the vehicle V including an image of a part of the vehicle V (for example, the seat 32). The camera 13b is provided on the vehicle V so as to acquire an image in front of the vehicle V including an image of a part of the vehicle V (for example, the bar handle 31 and the seat 32). The cameras 13c to 13f are provided on the vehicle V so as to acquire images of the surroundings of the vehicle V: the camera 13c can photograph the front of the vehicle V, the camera 13d the rear, the camera 13e the left side, and the camera 13f the right side.
The audio output unit 14 includes, for example, a speaker, and outputs sound to the outside of the vehicle V. The sound output from the audio output unit 14 can be set arbitrarily and may be, for example, a warning sound, music, or a human voice. The communication unit 15 is communicably connected to the information terminal 20 using a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark). Specifically, the communication unit 15 may have a function as a transmission unit for transmitting information to the information terminal 20 and a function as a reception unit for receiving information from the information terminal 20.
The information terminal 20 is a device capable of transmitting and receiving information to and from the vehicle V (monitoring device 10), such as a smartphone or tablet terminal, and may include, for example, a processing unit 21, a storage unit 22, a display unit 23, and a communication unit 24. The processing unit 21 includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with external devices, and the like. The storage unit 22 stores programs executed by the processor and data used by the processor for processing, and the processing unit 21 can read a program stored in the storage unit 22 into a storage device such as a memory and execute it. The display unit 23 is, for example, a display, and displays (outputs) various information such as information received from the monitoring device 10. The communication unit 24 is communicably connected to the vehicle V (monitoring device 10) using a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark). Specifically, the communication unit 24 may have a function as a transmission unit for transmitting information to the vehicle V (monitoring device 10) and a function as a reception unit for receiving information from the vehicle V (monitoring device 10).
<Monitoring process>
Next, the flow of the process (monitoring process) performed by the processing unit 11 when the monitoring program is executed will be described. FIG. 3 is a flowchart showing the monitoring process performed by the processing unit 11. In this flowchart, it is assumed that shooting by the photographing unit 13 is performed repeatedly (continuously or periodically); for example, a moving image is acquired by the photographing unit 13. The monitoring process of the vehicle V may also be understood as a determination process (detection process) that determines (detects) an abnormality of the vehicle V based on the image obtained by the photographing unit 13.
In step S11, the processing unit 11 determines whether to start the monitoring process. For example, when the ignition (ignition switch) of the vehicle V is turned off, the processing unit 11 can determine that the vehicle V has been parked (parking of the vehicle V has started) and that the monitoring process should be started. The processing unit 11 may also determine that the monitoring process should be started when it receives an instruction to start the monitoring process from the user (such as the driver) via an input unit (not shown) provided in the vehicle V. The processing unit 11 repeatedly executes step S11 until it determines to start the monitoring process, and proceeds to step S12 when it determines to start the monitoring process.
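As an illustrative sketch only (not part of the original disclosure), the start decision of step S11 could be expressed as follows; the accessor methods ignition_is_off() and user_requested_monitoring() are hypothetical placeholders for whatever signals the vehicle's ECU actually exposes.

    import time

    def wait_for_monitoring_start(vehicle, poll_interval_s=1.0):
        """Step S11 (sketch): block until monitoring should start.

        The embodiment names two triggers: the ignition being turned off, or
        an explicit start instruction from the user via an input unit.
        """
        while True:
            if vehicle.ignition_is_off() or vehicle.user_requested_monitoring():
                return  # parking has started; proceed to step S12
            time.sleep(poll_interval_s)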
In step S12, the processing unit 11 (specific unit 11a) identifies a characteristic portion of the image obtained by the photographing unit 13 while the vehicle V is parked (specifically, after it has been determined in step S11 that the monitoring process is to be started). For example, the processing unit 11 can use a known image processing technique when identifying the characteristic portion of the image. As an example, the processing unit 11 extracts portions (feature points) having feature quantities such as corners, curvature, changes in brightness, and changes in color from the image obtained by the photographing unit 13, and can identify (recognize) information indicating the feature quantities and positional relationships of the extracted feature points as the characteristic portion of the image.
Here, among the objects included in the image obtained by the photographing unit 13, the processing unit 11 preferably identifies the characteristic portion from an object whose position (and/or posture) in the image does not change for a predetermined time (hereinafter referred to as a target object). This makes it possible to avoid identifying the characteristic portion from a moving object (for example, a person or a car) in the image obtained by the photographing unit 13, and to reduce false detection of an abnormality of the vehicle V. The predetermined time can be set arbitrarily, but as an example it can be set within a range of several tens of seconds to several minutes.
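The embodiment leaves the feature extraction to known image processing techniques. A minimal sketch using OpenCV is shown below, assuming Shi-Tomasi corner detection and sparse optical flow to keep only points that stay still over the predetermined time; the choice of detector and all thresholds are illustrative assumptions, not values taken from the patent.

    import cv2
    import numpy as np

    def extract_stable_features(frames, max_corners=50, stability_px=2.0):
        """Step S12 (sketch): extract corner-like feature points from the first
        frame and keep only those that remain essentially still across `frames`
        (images captured over the predetermined time), approximating the
        'target object' of the embodiment.
        """
        grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
        pts = cv2.goodFeaturesToTrack(grays[0], maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=10)
        if pts is None:
            return np.empty((0, 2), dtype=np.float32)
        pts = pts.reshape(-1, 2)

        stable = []
        for p in pts:
            # Track the point through the remaining frames and keep it only if
            # it never moves by more than stability_px pixels.
            prev, cur = grays[0], p.reshape(1, 1, 2).astype(np.float32)
            moved = False
            for g in grays[1:]:
                nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, g, cur, None)
                if status[0][0] == 0 or np.linalg.norm(nxt - cur) > stability_px:
                    moved = True
                    break
                prev, cur = g, nxt
            if not moved:
                stable.append(p)
        return np.array(stable, dtype=np.float32)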
FIG. 4A shows an example of the image 41 obtained by the photographing unit 13 (camera 13b). The image 41 shown in FIG. 4A includes an image of the bar handle 31 and the seat 32 of the vehicle V. In the parked vehicle V, the steering is locked and the positional relationship between the camera 13b and the bar handle 31 is fixed; therefore, unless an abnormality (mischief, a fall, etc.) occurs in the vehicle V, the position and/or posture of the image of the bar handle 31 in the image 41 does not change. Therefore, in the present embodiment, the processing unit 11 (specific unit 11a) can determine the image of the bar handle 31 included in the image 41 as the target object and identify the straight line Fa1 connecting both ends of the bar handle 31 as a characteristic portion of the image 41. Further, when the camera 13b is fixed to the vehicle body frame, the positional relationship between the camera 13b and the seat 32 is fixed, so unless there is an abnormality in the vehicle V, the position and/or posture of the image of the seat 32 in the image 41 does not change. Therefore, the processing unit 11 (specific unit 11a) may determine the image of the seat 32 included in the image 41 as the target object and identify a part Fa2 of the image of the seat 32 as a characteristic portion of the image 41.
FIG. 5A shows an example of the image 42 obtained by the photographing unit 13 (for example, the camera 13c). The image 42 shown in FIG. 5A is an image of the surroundings of the vehicle V; as an example, it shows the image obtained by the camera 13c when the vehicle V is parked on a breakwater. In this case, the processing unit 11 (specific unit 11a) determines a subject image whose position in the image 42 does not change for a predetermined time as the target object, and can identify locations Fb1 to Fb3 having relatively large feature quantities in the subject image as characteristic portions of the image 42. The processing unit 11 may also use a straight-line portion in the image 42 (for example, the straight line (horizon) connecting locations Fb1 and Fb2) as a characteristic portion of the image 42. Here, the subject image serving as the target object may be a natural object or an artificial object. Natural objects that can serve as target objects include, for example, the horizon, trees, and rocks. Artificial objects that can serve as target objects include the ground (white lines on a road), signs, buildings, and the like.
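Where a straight-line portion such as the horizon is used as the characteristic portion, one possible realization (an assumption, not something the patent specifies) is edge detection followed by a Hough transform, sketched below with illustrative thresholds.

    import cv2
    import numpy as np

    def detect_line_feature(image_bgr):
        """Sketch: detect a straight-line characteristic portion (e.g. the horizon
        in FIG. 5A). Returns the longest line segment as (x1, y1, x2, y2), or None.
        """
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                                minLineLength=100, maxLineGap=10)
        if lines is None:
            return None
        # Keep the longest segment as the reference line feature.
        longest = max(lines[:, 0, :],
                      key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
        return tuple(int(v) for v in longest)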
In step S13, the processing unit 11 (determination unit 11b) determines whether a change in the characteristic portion identified in step S12 is detected in the image newly obtained by the photographing unit 13 while the vehicle V is parked. For example, using at least one of the position and the posture of the characteristic portion in the image used in step S12 as a reference, the processing unit 11 determines whether the position or the posture of the characteristic portion in the image newly obtained by the photographing unit 13 has changed from that reference. When a change in the characteristic portion is detected in the newly obtained image, it is determined that an abnormality has occurred in the vehicle V, and the process proceeds to step S14. When a change in the characteristic portion is detected, the processing unit 11 may cause the audio output unit 14 to output a sound (for example, a warning sound).
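A minimal sketch of the comparison in step S13 is given below: the feature-point coordinates measured in the newly obtained image are compared against the reference, and a change in either position or posture (estimated rotation of the point set) beyond a tolerance is treated as a detected change. The tolerances and the least-squares rotation estimate are illustrative assumptions.

    import numpy as np

    def feature_changed(ref_points, new_points, pos_tol_px=5.0, angle_tol_deg=3.0):
        """Step S13 (sketch): return True when the characteristic portion has
        changed in position and/or posture relative to the reference.
        ref_points and new_points are corresponding (N, 2) pixel coordinates;
        the posture estimate needs at least two points.
        """
        ref = np.asarray(ref_points, dtype=np.float64)
        new = np.asarray(new_points, dtype=np.float64)

        # Position change: mean displacement of the feature points.
        displacement = np.linalg.norm(new - ref, axis=1).mean()

        # Posture change: least-squares rotation of the point set about its centroid.
        ref_c, new_c = ref - ref.mean(axis=0), new - new.mean(axis=0)
        h = ref_c.T @ new_c
        angle = np.degrees(np.arctan2(h[0, 1] - h[1, 0], h[0, 0] + h[1, 1]))

        return displacement > pos_tol_px or abs(angle) > angle_tol_deg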
In step S14, the processing unit 11 (determination unit 11b) identifies the mode of change of the characteristic portion detected in step S13 and determines the type of abnormality of the vehicle V based on the identified mode of change. For example, information indicating the correspondence between the type of characteristic portion, the mode of change of the characteristic portion, and the type of abnormality of the vehicle V is stored in advance in the storage unit 12 of the monitoring device 10. By referring to this information, the processing unit 11 can determine the type of abnormality of the vehicle V based on the mode of change of the characteristic portion. Next, in step S15, the processing unit 11 (notification unit 11c) notifies the user via the communication unit 15 that an abnormality has occurred in the parked vehicle V (specifically, it transmits information indicating that an abnormality has occurred in the parked vehicle V to the information terminal 20).
Here, the types of characteristic portions are, as described above with reference to FIGS. 4A and 5A, the straight line Fa1 connecting both ends of the bar handle 31, the part Fa2 of the seat 32, and locations with relatively large feature quantities in the subject image such as Fb1 to Fb3. The mode of change of the characteristic portion may include a change in at least one of the position and the posture of the characteristic portion in the image, a change in the positional relationship of a plurality of characteristic portions in the image, and a characteristic portion that was included in the image no longer being included. The types of abnormality of the vehicle V may include release of the steering lock or the seat lock not intended by the user (that is, rotation of the handlebar or opening/closing of the seat), a fall of the vehicle V, and the like.
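The correspondence information that the embodiment stores in the storage unit 12 could, for example, take the form of a simple lookup table. The keys and labels below are illustrative assumptions, since the patent does not fix a concrete data format.

    # Step S14 (sketch): correspondence between characteristic portion, mode of
    # change, and abnormality type, stored in advance.
    ABNORMALITY_TABLE = {
        ("handlebar_line_Fa1", "position_or_posture_changed"):
            "steering lock released (handlebar rotated)",
        ("seat_part_Fa2", "position_or_posture_changed"):
            "seat lock released (seat opened)",
        ("surround_points_Fb", "position_or_posture_changed"):
            "parking position moved (vehicle may have fallen over)",
        ("surround_points_Fb", "feature_no_longer_visible"):
            "parking position moved (vehicle may have fallen over)",
    }

    def classify_abnormality(feature_type: str, change_mode: str) -> str:
        """Return the abnormality type for a detected change, or a generic label."""
        return ABNORMALITY_TABLE.get((feature_type, change_mode),
                                     "unclassified abnormality")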
If no change in the characteristic portion is detected in the newly obtained image in step S13, the process proceeds to step S16. In step S16, the processing unit 11 determines whether to end the monitoring process. For example, the processing unit 11 can determine that the monitoring process is to be ended when the ignition (ignition switch) of the vehicle V is turned on. Alternatively, the processing unit 11 may determine that the monitoring process is to be ended when a certain period has elapsed from the start of parking of the vehicle V (for example, from when the ignition was turned off or from when the user instructed the start of the monitoring process). This is because, when a certain period has elapsed since parking of the vehicle V started, it can be determined that the vehicle V is parked in a safe place (that is, a place where the possibility of mischief, theft, or a fall is low). Ending the monitoring process when an abnormality of the vehicle is unlikely in this way can reduce the decrease in the remaining battery level of the vehicle V. The certain period can be set arbitrarily, but as an example it can be set to one day, or within a range of several days to several weeks. If it is determined in step S16 that the monitoring process is not to be ended, the process returns to step S13; if it is determined that the monitoring process is to be ended, this flowchart ends.
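The end condition of step S16 can be sketched as follows; vehicle.ignition_is_on() is an assumed accessor, and the one-day timeout is just one of the example values the embodiment allows.

    import time

    def should_end_monitoring(vehicle, parking_started_at, timeout_s=24 * 3600):
        """Step S16 (sketch): end monitoring when the ignition is turned on, or
        when the fixed period has elapsed since parking started.
        """
        elapsed = time.time() - parking_started_at
        return vehicle.ignition_is_on() or elapsed >= timeout_s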
 Next, specific examples of the processing executed in steps S13 to S15 will be described.
 FIG. 4B shows an example of the image 41 newly obtained by the photographing unit 13 (camera 13b) in step S13. In the image 41 shown in FIG. 4B, compared with the image 41 shown in FIG. 4A, the straight line Fa1 (the straight line connecting both ends of the bar handle 31) specified as the feature portion in step S12 has changed to a straight line Fa1'. That is, the position and orientation (inclination) of the straight line Fa1 specified as the feature portion in step S12 have changed. In this case, in step S14, the processing unit 11 (determination unit 11b) can determine, as the type of abnormality of the vehicle V, that the steering lock has been released and the bar handle has been rotated. Then, in step S15, the processing unit 11 (notification unit 11c) transmits, via the communication unit 15, notification information for notifying the user of the abnormality of the vehicle V, namely that the steering lock has been released, to the information terminal 20 of that user. The notification information may include the video data (image data) in which the change of the feature portion was detected. As shown in FIG. 6A, the processing unit 21 of the information terminal 20 that has received the notification information can display, on the display unit 23 (display), a notification screen 23a for notifying the user of the abnormality of the vehicle V. The notification screen 23a shown in FIG. 6A is provided with a display field 23a1 for the image in which the change of the feature portion was detected and a display field 23a2 for a comment stating that the steering lock has been released.
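 One assumed way to quantify the change from the straight line Fa1 to Fa1' is to compare the angle and position of the line joining the two handlebar ends between the reference image and the newly obtained image; the sketch below is illustrative only, and the endpoint detection, function names, and thresholds are assumptions.

```python
# Illustrative sketch: compare the handlebar line Fa1 specified at the start of
# parking with the line found in a newly captured frame. Endpoints are given as
# (x, y) pixel coordinates; how they are detected, and the thresholds, are
# hypothetical placeholders.
import math

def line_angle_deg(p1, p2):
    """Angle of the line through p1 and p2, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def handlebar_changed(ref_endpoints, new_endpoints,
                      angle_thresh_deg=5.0, shift_thresh_px=10.0) -> bool:
    """True if the handlebar line rotated or moved beyond the thresholds."""
    d_angle = abs(line_angle_deg(*ref_endpoints) - line_angle_deg(*new_endpoints))
    d_angle = min(d_angle, 360.0 - d_angle)  # wrap-around safe comparison
    d_shift = max(math.dist(r, n) for r, n in zip(ref_endpoints, new_endpoints))
    return d_angle > angle_thresh_deg or d_shift > shift_thresh_px
```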
 Although not shown in FIG. 4B, when at least one of the position and the orientation of the part Fa2 of the image of the seat 32 specified as the feature portion in step S12 has changed, the processing unit 11 (determination unit 11b) can determine, as the type of abnormality of the vehicle V, that the seat lock has been released and the seat 32 has been opened. In this case as well, the processing unit 11 (notification unit 11c) can transmit, via the communication unit 15, notification information for notifying the user of the abnormality of the vehicle V, namely that the seat lock has been released, to the information terminal 20 of that user.
 FIG. 5B shows an example of the image 42 newly obtained by the photographing unit 13 (camera 13c) in step S13. In the image 42 shown in FIG. 5B, compared with the image 42 shown in FIG. 5A, the positions in the image of the locations Fb1 to Fb3 specified as feature portions in step S12 have changed to Fb1' to Fb3'. That is, the position and orientation (inclination) of the shape (triangle) whose vertices are the locations Fb1 to Fb3 specified as feature portions in step S12 have changed. In this case, in step S14, the processing unit 11 (determination unit 11b) can determine, as the type of abnormality of the vehicle V, that the parking position of the vehicle V has moved (specifically, that the vehicle V has fallen over) and the camera 13c attached to the vehicle V has tilted. Then, in step S15, the processing unit 11 (notification unit 11c) transmits, via the communication unit 15, notification information for notifying the user of the abnormality of the vehicle V, namely that the vehicle V has fallen over, to the information terminal 20 of that user. As shown in FIG. 6B, the processing unit 21 of the information terminal 20 that has received the notification information can display, on the display unit 23 (display), a notification screen 23b for notifying the user of the abnormality of the vehicle V. The notification screen 23b shown in FIG. 6B is provided with a display field 23b1 for the image in which the change of the feature portion was detected and a display field 23b2 for a comment stating that the parking position of the vehicle V has moved.
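 Similarly, the change of the triangle formed by Fb1 to Fb3 could be quantified, for illustration, by the rotation of its edges between the reference image and the new image; the sketch below is an assumption-based example, not the disclosed implementation, and its thresholds and the tracking of the points are hypothetical.

```python
# Illustrative sketch: estimate how far the triangle formed by Fb1..Fb3 has
# rotated between the reference frame and a new frame. A large rotation can be
# read as the camera (and hence the vehicle) having tilted. The threshold and
# the point correspondence are hypothetical.
import math

def triangle_rotation_deg(ref_pts, new_pts) -> float:
    """Mean rotation (degrees) of the triangle's edges from ref_pts to new_pts."""
    def edge_angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    diffs = []
    for i in range(3):
        a = edge_angle(ref_pts[i], ref_pts[(i + 1) % 3])
        b = edge_angle(new_pts[i], new_pts[(i + 1) % 3])
        d = math.degrees(b - a)
        diffs.append((d + 180.0) % 360.0 - 180.0)  # wrap into (-180, 180]
    return sum(diffs) / len(diffs)

def camera_tilted(ref_pts, new_pts, tilt_thresh_deg=15.0) -> bool:
    return abs(triangle_rotation_deg(ref_pts, new_pts)) > tilt_thresh_deg
```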
 As described above, in the monitoring system 100 (monitoring device 10) of the present embodiment, a feature portion of the image obtained by the photographing unit 13 is specified, and an abnormality of the vehicle V is detected in response to a change of that feature portion in the image. This makes it possible to accurately detect an abnormality of the vehicle V based on the image obtained by the photographing unit 13 and, since no sensor other than the photographing unit 13 (camera), such as a GPS sensor or a gyro sensor, needs to be used to detect the abnormality, it can be advantageous in reducing the vehicle cost. Furthermore, since an abnormality of the vehicle V detected by the monitoring device 10 is notified to the user (information terminal 20), the user can quickly grasp the abnormality of the vehicle V.
 <Summary of Embodiments>
 1. The monitoring device of the above embodiment is
 a monitoring device (e.g., 10) for monitoring the state of a parked vehicle (e.g., V), comprising:
 a photographing means (e.g., 13) provided on the vehicle;
 a specifying means (e.g., 11a) for specifying a feature portion (e.g., Fa1 to Fa2, Fb1 to Fb3) of an image (e.g., 41, 42) obtained by the photographing means while the vehicle is parked; and
 a determination means (e.g., 11b) for determining that an abnormality has occurred in the vehicle when a change of the feature portion is detected in an image obtained by the photographing means while the vehicle is parked.
 According to this embodiment, an abnormality of the vehicle can be accurately detected based on the image obtained by the photographing means. In addition, since no sensor other than the photographing means (camera) needs to be used to detect an abnormality of the vehicle, this can be advantageous in reducing the vehicle cost.
 2. In the above embodiment,
 the specifying means specifies the feature portion from, among a plurality of objects included in the image obtained by the photographing means while the vehicle is parked, a target object whose position in the image does not change for a predetermined time.
 According to this embodiment, specifying the feature portion from an object that is moving in the image obtained by the photographing means (for example, a person or a car) can be avoided, and erroneous detection of an abnormality of the vehicle can be reduced. A minimal illustrative sketch of this selection follows this item.
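 The sketch below assumes that object positions are already tracked frame by frame; the tracking interface, the observation window, and the displacement threshold are illustrative assumptions and not part of the disclosure.

```python
# Illustrative sketch: keep only objects whose image position stays (nearly)
# unchanged over a predetermined observation window, and take feature portions
# from those objects only. The tracking input and thresholds are hypothetical.
import math

def stationary_objects(tracks, window_frames=300, max_shift_px=3.0):
    """tracks: {object_id: [(x, y), ...]} positions per frame, newest last."""
    result = []
    for obj_id, positions in tracks.items():
        recent = positions[-window_frames:]
        if len(recent) < window_frames:
            continue                      # not observed long enough yet
        x0, y0 = recent[0]
        if all(math.dist((x0, y0), p) <= max_shift_px for p in recent):
            result.append(obj_id)         # position unchanged over the window
    return result
```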
 3. In the above embodiment,
 the photographing means is provided on the vehicle so as to photograph a part of the vehicle (e.g., 31, 32), and
 the target object is an image of the part of the vehicle included in the image obtained by the photographing means.
 According to this embodiment, an abnormality such as mischief to or theft of the vehicle can be detected.
 4. In the above embodiment,
 the vehicle is a saddle-ride type vehicle, and
 the part of the vehicle photographed by the photographing means includes any one of a handlebar (e.g., 31) and a seat (e.g., 32).
 According to this embodiment, operation of the handlebar (release of the steering lock) and opening/closing of the seat (release of the seat lock) on the parked vehicle can be accurately detected.
 5. In the above embodiment,
 the photographing means is provided on the vehicle so as to photograph the surroundings of the vehicle, and
 the target object is a subject image around the vehicle included in the image obtained by the photographing means.
 According to this embodiment, in addition to mischief to or theft of the vehicle, an abnormality such as movement of the parking position of the vehicle (including a fall of the vehicle) can be detected.
 6. In the above embodiment,
 the subject image includes an image of any one of the ground, a building, and a natural object.
 According to this embodiment, movement of the parking position of the vehicle, such as a fall of the vehicle, can be accurately detected.
 7. In the above embodiment,
 the determination means determines the type of abnormality of the vehicle based on the mode of change of the feature portion in the image obtained by the photographing means.
 According to this embodiment, the user can be informed of detailed information about the abnormality of the vehicle, such as whether the steering lock has been released or the vehicle has fallen over.
 8. In the above embodiment,
 the determination means determines that an abnormality has occurred in the vehicle when at least one of the position and the orientation of the feature portion in the image obtained by the photographing means has changed.
 According to this embodiment, an abnormality of the vehicle can be accurately detected based on a change in at least one of the position and the orientation of the feature portion in the image.
 9. In the above embodiment,
 the monitoring device further comprises a notification means (e.g., 11c) for notifying the user of the vehicle that an abnormality has occurred in the vehicle when the determination means determines that an abnormality has occurred in the vehicle.
 According to this embodiment, the user of the vehicle can quickly grasp that an abnormality has occurred in the vehicle.
 10. In the above embodiment,
 the determination means ends the process of determining an abnormality of the vehicle when a predetermined period has elapsed since parking of the vehicle started.
 According to this embodiment, when no abnormality of the vehicle (that is, no change of the feature portion) has occurred during the predetermined period, it can be judged that an abnormality is unlikely to occur in the vehicle; ending the determination process (vehicle monitoring process) in this case can reduce the drop in the remaining battery level of the vehicle V.
 11. In the above embodiment,
 the determination means starts the determination process on the assumption that parking of the vehicle has started when the ignition of the vehicle is turned off.
 According to this embodiment, the start of parking of the vehicle can be judged with the turning off of the ignition of the vehicle as a trigger, and the determination process (vehicle monitoring process) can be started automatically.
 12. In the above embodiment,
 the determination means starts the process of determining an abnormality of the vehicle when the ignition of the vehicle is turned off, and ends the process when the ignition of the vehicle is turned on.
 According to this embodiment, the start and end of parking of the vehicle can be judged with the turning off/on of the ignition of the vehicle as triggers, and the determination process (vehicle monitoring process) can be started and ended automatically.
 The invention is not limited to the above embodiments, and various modifications and changes can be made within the scope of the gist of the invention.
 This application claims priority based on Japanese Patent Application No. 2020-166008 filed on September 30, 2020, the entire contents of which are incorporated herein by reference.
10: monitoring device, 11: processing unit, 11a: specifying unit, 11b: determination unit, 11c: notification unit, 13: photographing unit

Claims (15)

  1.  A monitoring device for monitoring the state of a parked vehicle, characterized by comprising:
     a photographing means provided on the vehicle;
     a specifying means for specifying a feature portion of an image obtained by the photographing means while the vehicle is parked; and
     a determination means for determining that an abnormality has occurred in the vehicle when a change of the feature portion is detected in an image obtained by the photographing means while the vehicle is parked.
  2.  The monitoring device according to claim 1, characterized in that the specifying means specifies the feature portion from, among a plurality of objects included in the image obtained by the photographing means while the vehicle is parked, a target object whose position in the image does not change for a predetermined time.
  3.  The monitoring device according to claim 2, characterized in that the photographing means is provided on the vehicle so as to photograph a part of the vehicle, and
     the target object is an image of the part of the vehicle included in the image obtained by the photographing means.
  4.  The monitoring device according to claim 3, characterized in that the vehicle is a saddle-ride type vehicle, and
     the part of the vehicle photographed by the photographing means includes any one of a handlebar and a seat.
  5.  The monitoring device according to claim 2, characterized in that the photographing means is provided on the vehicle so as to photograph the surroundings of the vehicle, and
     the target object is a subject image around the vehicle included in the image obtained by the photographing means.
  6.  The monitoring device according to claim 5, characterized in that the subject image includes an image of any one of the ground, a building, and a natural object.
  7.  The monitoring device according to any one of claims 1 to 6, characterized in that the determination means determines the type of abnormality of the vehicle based on the mode of change of the feature portion in the image obtained by the photographing means.
  8.  The monitoring device according to any one of claims 1 to 7, characterized in that the determination means determines that an abnormality has occurred in the vehicle when at least one of the position and the orientation of the feature portion in the image obtained by the photographing means has changed.
  9.  The monitoring device according to any one of claims 1 to 8, characterized by further comprising a notification means for notifying a user of the vehicle that an abnormality has occurred in the vehicle when the determination means determines that an abnormality has occurred in the vehicle.
  10.  The monitoring device according to any one of claims 1 to 9, characterized in that the determination means ends the process of determining an abnormality of the vehicle when a predetermined period has elapsed since parking of the vehicle started.
  11.  The monitoring device according to claim 10, characterized in that the determination means starts the determination process on the assumption that parking of the vehicle has started when the ignition of the vehicle is turned off.
  12.  The monitoring device according to any one of claims 1 to 9, characterized in that the determination means starts the process of determining an abnormality of the vehicle when the ignition of the vehicle is turned off, and ends the process when the ignition of the vehicle is turned on.
  13.  A vehicle comprising the monitoring device according to any one of claims 1 to 12.
  14.  A monitoring method for monitoring a parked vehicle using a photographing means provided on the vehicle, characterized by comprising:
     a specifying step of specifying a feature portion of an image obtained by the photographing means while the vehicle is parked; and
     a determination step of determining that an abnormality has occurred in the vehicle when a change occurs in the feature portion in an image obtained by the photographing means while the vehicle is parked.
  15.  A program for causing a computer to execute each step of the monitoring method according to claim 14.
PCT/JP2021/029488 2020-09-30 2021-08-10 Monitoring device, vehicle, monitoring method, and program WO2022070616A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022553515A JP7488908B2 (en) 2020-09-30 2021-08-10 Monitoring device, vehicle, monitoring method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-166008 2020-09-30
JP2020166008 2020-09-30

Publications (1)

Publication Number Publication Date
WO2022070616A1 true WO2022070616A1 (en) 2022-04-07

Family

ID=80949893

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/029488 WO2022070616A1 (en) 2020-09-30 2021-08-10 Monitoring device, vehicle, monitoring method, and program

Country Status (2)

Country Link
JP (1) JP7488908B2 (en)
WO (1) WO2022070616A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003276568A (en) * 2002-03-26 2003-10-02 Denso Corp Improper movement assuming device of vehicle
JP2006290172A (en) * 2005-04-11 2006-10-26 Fujitsu Ten Ltd Anti-theft device for vehicle and anti-theft method for vehicle
US20070122000A1 (en) * 2005-11-29 2007-05-31 Objectvideo, Inc. Detection of stationary objects in video
JP2016203886A (en) * 2015-04-27 2016-12-08 富士通テン株式会社 Detection device, detection system, detection method, and program
CN106652542A (en) * 2017-02-21 2017-05-10 上海量明科技发展有限公司 Method for detecting condition of sharing vehicle, unmanned aerial vehicle and system
US20180308293A1 (en) * 2017-04-19 2018-10-25 Ford Global Technologies, Llc Control module activation to monitor vehicles in a key-off state
JP2019091347A (en) * 2017-11-16 2019-06-13 トヨタ自動車株式会社 Driverless transport system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020140401A (en) 2019-02-27 2020-09-03 株式会社Jvcケンウッド Abnormality monitoring system, and, abnormality monitoring method

Also Published As

Publication number Publication date
JPWO2022070616A1 (en) 2022-04-07
JP7488908B2 (en) 2024-05-22

Similar Documents

Publication Publication Date Title
US20190092345A1 (en) Driving method, vehicle-mounted driving control terminal, remote driving terminal, and storage medium
US11265508B2 (en) Recording control device, recording control system, recording control method, and recording control program
JP6771673B2 (en) System for monitoring below autonomous driving vehicles
JP4643860B2 (en) VISUAL SUPPORT DEVICE AND SUPPORT METHOD FOR VEHICLE
CN109421453A (en) Trailer with prediction mounting angle function falls back auxiliary system
US8115809B2 (en) Vehicle surroundings monitoring apparatus
US9868422B2 (en) Control apparatus of brake system and method of controlling the same
JP7168367B2 (en) accident reporting device
JP7259661B2 (en) VEHICLE RECORDING CONTROL DEVICE, VEHICLE RECORDING DEVICE, VEHICLE RECORDING CONTROL METHOD AND PROGRAM
JP6696558B1 (en) Vehicle recording control device, vehicle recording device, vehicle recording control method, and program
JP2018160732A (en) Image processing apparatus, camera deviation determination system, and image processing method
WO2022070616A1 (en) Monitoring device, vehicle, monitoring method, and program
KR20100057253A (en) Image recording equipment around a vehicle
JP2005311698A (en) Vehicle periphery visual recognition device
JP7069726B2 (en) Notification device and in-vehicle device
CN113627370A (en) A motorcycle, a method for detecting the posture of a sidecar, and a storage medium
KR100459584B1 (en) Image system having transmission function of the scene image
JP2006123650A (en) In-vehicle monitoring device
JP6664411B2 (en) Security device, security control method, program, and storage medium
US20240336197A1 (en) Driving assistance system and vehicle
JP4218441B2 (en) On-vehicle moving body detection device
CN216184804U (en) Driving assistance system and vehicle
JP2022186922A (en) Processing device
US11124122B2 (en) Method and device for controlling vehicle steering, vehicle-mounted controller and vehicle
KR102550200B1 (en) System for providing panorama impact image and method thereof

Legal Events

Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21874919; Country of ref document: EP; Kind code of ref document: A1)
ENP  Entry into the national phase (Ref document number: 2022553515; Country of ref document: JP; Kind code of ref document: A)
WWE  Wipo information: entry into national phase (Ref document number: 202317015254; Country of ref document: IN)
NENP Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 21874919; Country of ref document: EP; Kind code of ref document: A1)