
SE541541C2 - Method and system for theft detection in a vehicle - Google Patents

Method and system for theft detection in a vehicle

Info

Publication number
SE541541C2
Authority
SE
Sweden
Prior art keywords
vehicle
image
captured
sensor
predefined object
Prior art date
Application number
SE1650327A
Other languages
Swedish (sv)
Other versions
SE1650327A1 (en)
Inventor
Daniel Bemler
Marie Bemler
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1650327A priority Critical patent/SE541541C2/en
Priority to PCT/SE2017/050195 priority patent/WO2017155448A1/en
Publication of SE1650327A1 publication Critical patent/SE1650327A1/en
Publication of SE541541C2 publication Critical patent/SE541541C2/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
      • B60 VEHICLES IN GENERAL
        • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
          • B60R25/00 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
            • B60R25/10 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
              • B60R25/102 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device a signal being sent to a remote location, e.g. a radio signal being transmitted to a police station, a security company or the owner
            • B60R25/30 Detection related to theft or to other events relevant to anti-theft systems
              • B60R25/305 Detection related to theft or to other events relevant to anti-theft systems using a camera
              • B60R25/31 Detection related to theft or to other events relevant to anti-theft systems of human presence inside or outside the vehicle
    • G PHYSICS
      • G08 SIGNALLING
        • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
          • G08B13/00 Burglar, theft or intruder alarms
            • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
              • G08B13/189 Actuation using passive radiation detection systems
                • G08B13/194 Actuation using image scanning and comparing systems
                  • G08B13/196 Actuation using television cameras
                    • G08B13/19639 Details of the system layout
                      • G08B13/19647 Systems specially adapted for intrusion detection in or around a vehicle
                    • G08B13/19678 User interface
                      • G08B13/19684 Portable terminal, e.g. mobile phone, used for viewing video remotely
                    • G08B13/19695 Arrangements wherein non-video detectors start video recording or forwarding but do not generate an alarm themselves
    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N7/00 Television systems
            • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
              • H04N7/181 CCTV systems for receiving images from a plurality of remote sources
              • H04N7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
          • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
            • H04N2201/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
              • H04N2201/3201 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
                • H04N2201/3225 Display, printing, storage or transmission of additional information of data relating to an image, a page or a document
                  • H04N2201/3253 Position information, e.g. geographical position at time of capture, GPS data

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Burglar Alarm Systems (AREA)
  • Alarm Systems (AREA)

Abstract

Method (500) and control unit (410) for burglary detection in a vehicle (100). The method (500) comprises activating (501) at least one sensor (110, 120, 130, 140) of the vehicle (100), triggered by a vehicle alarm alert; capturing (502) a stream of images (200) with the sensor (110, 120, 130, 140) of the vehicle (100); detecting (503) a predefined object (160) in the captured (502) stream of images (200) by utilising a forwardly directed sensor (110) or other sensors (120) of the at least one sensor (110, 120, 130, 140) of the vehicle (100) for identifying objects as a predefined object (160); and storing (508) only any captured images (200) comprising the predefined object (160) when the predefined object (160) is detected (503) in the image (200).

Description

METHOD AND SYSTEM FOR THEFT DETECTION IN A VEHICLE TECHNICAL FIELD This document discloses methods and a control unit. More particularly, methods and a control unit are described, for burglary detection in a vehicle.
BACKGROUND A parked vehicle is a rewarding target for criminals, as it is normally unoccupied and has relatively low theft protection, at least in comparison with most buildings. Even if the vehicle per se often has protection systems making it more or less impossible to start the vehicle without a key, the fuel and/or freight may represent considerable value.
The vehicles as herein discussed may comprise a means for transportation in a broad sense, such as e.g. a truck, a bus, a car, a motorcycle, a bicycle, a terrain vehicle, a boat, an aircraft, a drone, a spacecraft, etc.
The cab and the fuel tank of the vehicle may be protected by a key lock. However, such locks are relatively easy for a malicious person to open. Putting a more complex lock on the cab may have little effect, as the thief may instead open the cab by breaking a window. In case the fuel tank is protected by a reinforced lock, the thief may instead make a hole in the fuel tank and drain the fuel.
The vehicle may further, besides the key lock, be protected by a vehicle alarm reacting to vibrations and/or a shell alarm of the cab and/or the fuel tank.
However, such an alarm may have little deterrent effect on a thief, as there may be no one in the vicinity to react to the alarm, or even hear it.
Further, vehicle alarms often generate false alarms, which is most disturbing for the surroundings and discourages the general public from reacting to vehicle alarms. Also, such a false alarm, or a genuine alarm left on for a long time, will drain the battery of the vehicle. Thereby the driver, besides the trouble caused by the burglary, also has to recharge the battery before being able to start the vehicle and continue driving, which may cause a transportation delay.
Parking the vehicle only in a locked and guarded garage may be a solution; however, such a guarded garage may not be available along a transportation route, where the professional driver is forced to stop for night rest by the tachograph, due to legal limitations on the working hours of professional drivers in many jurisdictions.
Further, it would also be desirable to identify places which are particularly affected by theft, and possibly warn drivers against parking at these places and/or draw the attention of the local police department to such parking places, so that they may be included in the surveillance route of a local police patrol.
The documents GB2343283 and WO2015035431 describe detection of fuel theft by placing a sensor in the cover of a vehicle fuel tank. Besides activation of an alarm, surveillance cameras situated on the vehicle are also activated when the cover is removed.
A disadvantage with this solution is that such filming generates a lot of data that needs to be taken care of. This may be solved by placing a high-capacity data storage device and potent data processing equipment in the vehicle to handle the data. However, such computing equipment is expensive and may in itself constitute an attractive object for theft, besides adding to the production cost of the vehicle. This is also true for additional surveillance cameras and sensors in the vehicle.
Document US2012105635 describes an interconnection between the vehicle alarm and vehicle cameras of the vehicle. When the alarm system is triggered, the vehicle cameras are activated and the captured images are stored in the vehicle. Again, this generates a large amount of data which has to be handled, requiring expensive computing equipment associated with the previously mentioned disadvantages.
Document DE102008006308 describes detection of people around a vehicle using a surveillance camera and, in a second step, activation of fuel level sensors for detecting fuel theft.
Activating the surveillance camera as soon as someone is situated around the vehicle will generate an enormous amount of data to be handled and stored, in particular in case a high-resolution camera is used, which is a prerequisite for being able to identify the intruder.
Consequently, there is a need for improvements in order to reduce vehicle burglary and enable catching the thief by being able to identify him/her.
SUMMARY It is therefore an object of this invention to solve at least some of the above problems and improve measures for reducing vehicle burglary.
According to a first aspect of the invention, this objective is achieved by a method for burglary detection in a vehicle. The method comprises activating at least one sensor of the vehicle, triggered by a vehicle alarm alert, wherein the at least one sensor comprises, or is connected to, a control unit configured for image recognition/computer vision and object recognition. Further, the method comprises capturing a stream of images with the sensor of the vehicle. In addition, the method comprises detecting a predefined object in the captured stream of images by utilising a forwardly directed sensor or other sensors of the at least one vehicle sensor for identifying objects as a predefined object. The method also comprises storing only any captured images comprising the predefined object when the predefined object is detected in the image.
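As an illustration only (not part of the patent), the claimed sequence of steps can be sketched as event-driven logic: activate on the alarm alert, then keep only frames containing the predefined object. The class, the `is_predefined_object` callable and the string "frames" are hypothetical stand-ins for the sensor and image-recognition machinery.

```python
from dataclasses import dataclass, field

@dataclass
class BurglaryDetector:
    """Sketch of the claimed method: activate sensors on a vehicle
    alarm alert, capture a stream of images, and store only frames
    that contain a predefined object."""
    is_predefined_object: callable       # hypothetical recognition step
    stored: list = field(default_factory=list)
    active: bool = False

    def on_alarm_alert(self):
        # Activate the sensor(s), triggered by the vehicle alarm.
        self.active = True

    def on_frame(self, frame):
        # While active, store only frames containing the object.
        if self.active and self.is_predefined_object(frame):
            self.stored.append(frame)

det = BurglaryDetector(is_predefined_object=lambda f: "person" in f)
det.on_frame("person near tank")   # ignored: alarm not yet triggered
det.on_alarm_alert()
det.on_frame("empty street")       # discarded: no predefined object
det.on_frame("person near tank")   # stored
print(det.stored)                  # ['person near tank']
```

The key property of the claim is visible in the output: of three captured frames, only the one containing the predefined object after the alarm alert is stored.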
According to a second aspect of the invention, this objective is achieved by a control unit for burglary detection in a vehicle, wherein the control unit is configured to activate at least one sensor of the vehicle, triggered by a vehicle alarm alert. Further the control unit is configured to generate a command to the sensor to capture a stream of images. The control unit is in addition configured to detect a predefined object in the captured stream of images by utilising a forwardly directed sensor or other sensors of the at least one vehicle sensor for identifying objects as a predefined object. Also, the control unit is configured to store, in a data storage device, only any captured images comprising the predefined object when the predefined object is detected in the image.
According to a third aspect of the invention, this objective is achieved by a method in a security centre, for compiling information related to burglary detection in vehicles. The method comprises receiving an image captured by a sensor in a vehicle. The method further comprises storing the received image.
Thanks to the described aspects, images and/or video sequences may be captured of suspected thieves/malicious people and/or their escape vehicle when the vehicle alarm is triggered, and stored in a database. Thereby, criminals are discouraged from theft and/or other criminal activity directed against the vehicle, as the risk of being caught is radically increased.
Further, places where fuel is stolen may be detected and drivers may be warned against parking at such places. This information may also, or in addition, be provided to the police for crime investigations, possibly together with images of a suspected thief.
Also, in case a false alarm is triggered by the vehicle alarm, the alarm may be switched off by the driver/owner, avoiding draining of the battery. Further, by eliminating or reducing false alarms, the general public becomes more attentive to real active vehicle alarms, making them more willing to check, observe and take appropriate action. Thereby, measures for reducing vehicle burglary are achieved.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which: Figure 1A illustrates a vehicle according to an embodiment of the invention; Figure 1B illustrates a vehicle according to an embodiment of the invention; Figure 2 illustrates image capturing and storage, according to an embodiment; Figure 3 illustrates a scenario where images are captured by sensors placed on several different structures; Figure 4A depicts an example of a vehicle interior, according to an embodiment; Figure 4B depicts an example of a vehicle interior, according to an embodiment; Figure 5 is a flow chart illustrating an embodiment of a method; Figure 6 is an illustration depicting a system according to an embodiment; and Figure 7 is a flow chart illustrating an embodiment of a method.
DETAILED DESCRIPTION Embodiments of the invention described herein are defined as methods and a control unit, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1A illustrates a scenario with a parked vehicle 100.
The vehicle 100 may comprise a means for transportation in broad sense such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance.
The vehicle 100 may be driver controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.
The vehicle 100 may comprise a fuel tank, e.g. for diesel, gasoline, ethanol, methanol, etc., which fuel may be attractive for thieves to steal.
Alternatively, or in addition in the case of a hybrid vehicle, the vehicle 100 may comprise a rechargeable battery, e.g. based on lithium-ion chemistry or other lithium-based variants such as lithium iron phosphate and lithium titanate, which may be attractive for thieves to steal. Lead-acid, nickel metal hydride (NiMH) and/or zinc-air batteries are other possible options.
In further alternatives, the vehicle 100 may comprise fuel cells, which may form a possible target for theft. Such a fuel cell may comprise e.g. polymer electrolyte membrane (PEM) fuel cells, direct methanol fuel cells, phosphoric acid fuel cells, molten carbonate fuel cells, solid oxide fuel cells, reformed methanol fuel cells and/or regenerative fuel cells, just to mention some non-limiting examples.
The vehicle 100 may comprise one or more sensors, such as e.g. a forwardly directed sensor 110 in some embodiments. In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed sensor 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100.
Mounting the forwardly directed sensor 110 behind the windshield has some advantages compared to externally mounted camera systems. These advantages include the possibility to use the windshield wipers for cleaning and to use the light from the headlights to illuminate objects in the camera's field of view. The sensor is also protected from dirt, snow and rain, and to some extent also from damage, vandalism and/or theft. Such a sensor 110 may also be used for a variety of other tasks.
The forwardly directed sensor 110 may be directed towards the front of the vehicle 100, in the driving direction. The sensor 110 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.
Further, the vehicle 100 may comprise one or two side view sensors 120. The side view sensors 120 may be situated at the left/right sides of the vehicle 100 (as regarded in the driving direction), arranged to detect objects at the respective side of the vehicle 100. The side view sensors 120 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device in different embodiments.
In some embodiments, the sensors 110, 120 may comprise e.g. a motion detector and/or be based on a Passive Infrared (PIR) sensor sensitive to a person's skin temperature, through emitted black-body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or by emitting a continuous wave of microwave radiation and detecting motion through the principle of Doppler radar; or by emitting an ultrasonic wave and detecting and analysing the reflections; or by a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations.
Instead of using traditional rear-view mirrors on the vehicle 100, side view sensors 120 in combination with one or more devices intended to display objects outside the driver's direct field of vision may be used. Such a presentational device may comprise e.g. a display, a projector, a Head-Up Display, a transparent display being part of the windshield, intelligent glasses of the driver, etc., which output an image, or stream of images, captured by a corresponding sensor 110, 120. Typically, the sensor 120 on the left side of the vehicle 100 may be associated with a presentational device on the left side of the cabin, while the sensor on the right side of the vehicle 100 may be associated with a presentational device on the right side of the cabin, even if other combinations are possible.
However, such a presentational device intended to display objects outside the driver's direct field of vision may in some embodiments comprise a reflecting element, such as e.g. a (rear-view) mirror.
The vehicle 100 may comprise one or several independent vehicle alarms monitoring the doors, the cargo compartment, a shell alarm of the fuel tank, etc., for unauthorised entry. The vehicle alarm may be triggered e.g. by vibrations transferred to a shock sensor associated with the vehicle alarm, by switches built into the door lock mechanisms, by an indication that the fuel level in the tank is changing, by a telephone call to the alarm system, or by any other trigger.
By connecting the vehicle alarm/-s with sensors 110, 120 of the vehicle 100, such as the sensors 120 of the digital rear-view mirrors, a more effective alarm for fuel theft, vehicle intrusion, assault, kidnapping and/or sabotage is achieved.
Subsequently, when the terminology "(suspected) thief" or "burglar" is used, it may as well imply involvement of any other criminal, such as a robber, kidnapper, saboteur, vandal, assassin, terrorist or any other similar malicious person.
When the alarm, or one of the vehicle alarms, is triggered, the alarm may wake up all sensors 110, 120 of the vehicle 100, and they are instructed to record what is happening. Thereby the suspected thief and his/her accomplices and their own escape vehicle may be caught on picture, also when they are situated a bit apart from the vehicle 100, or on the opposite side of the vehicle 100.
In other embodiments, only a subset of the vehicle sensors 110, 120 may be activated, such as e.g. the side view sensors 120/digital rear-view mirrors. Since the side view sensors 120 are placed on both sides of the vehicle 100, the fuel thief will be detected independently of whether the tank is placed on the left or the right side. By only activating a relevant subset of the sensors 110, 120, data memory requirements are reduced. Information collected via the sensors 110, 120 will then be saved and available for distribution.
Taking one or more pictures, or recording a video sequence, only when needed leads to less data, easier handling of information and a smaller infringement of integrity, as images will only be captured under specific circumstances, i.e. when a vehicle alarm is alerted.
Distribution of information can be made over the air, if the vehicle 100 is connected, and sent to a supervision centre, a telephone and/or a fleet management portal; or by reading out from the Electronic Control Unit (ECU).
Taking pictures or filming via the sensors 110, 120 may also be triggered from inside the cab, e.g. if something suspicious is seen/heard by someone resting in the cab (or for whatever reason), in some embodiments.
In some embodiments, the forwardly directed sensor 110 (which also may be used e.g. for lane departure warning etc.), or other sensors 120 on the vehicle 100, may be utilised for identifying objects, such as e.g. humans and active vehicles. Thereby, it may be determined whether there is any person in an image, and only images where a human is present and detected may be saved. Thus data handling and data storage may be further reduced.
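The save-only-frames-with-a-human idea above amounts to a filter over the captured stream. A minimal sketch follows; `person_present` is a hypothetical stand-in for a real pedestrian detector (e.g. a HOG or CNN model), and the synthetic labelled frames are illustrative only.

```python
def filter_stream(frames, person_present):
    """Keep only frames flagged by the detector; report the fraction
    of storage avoided by discarding the rest."""
    kept = [f for f in frames if person_present(f)]
    saved_fraction = 1 - len(kept) / len(frames)
    return kept, saved_fraction

# 100 synthetic frames; a person appears in 1 frame out of 10.
frames = [{"id": i, "person": i % 10 == 0} for i in range(100)]
kept, saved = filter_stream(frames, person_present=lambda f: f["person"])
print(len(kept), saved)  # 10 0.9
```

With these illustrative numbers, 90% of the raw stream never reaches the data storage device, which is the data-reduction effect the patent relies on.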
Further, according to some embodiments, several sensors 110, 120, such as e.g. cameras, may be directed in the same direction, or at the same object. Thereby, based upon the various sensor signals from the different sensors 110, 120, a three-dimensional image may be created, further enhancing identification of the suspected thief.
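In the simplest rectified two-camera case, recovering depth from two sensors aimed at the same object reduces to triangulation. A minimal sketch, with an assumed focal length and baseline (a real rig would obtain these from calibration; the patent does not specify any values):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a rectified stereo pair:
    f = focal length in pixels, B = baseline between the cameras in
    metres, d = horizontal disparity of the object in pixels."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: an object seen 40 px apart by two cameras
# mounted 0.5 m apart, with an 800 px focal length.
print(stereo_depth(800, 0.5, 40))  # 10.0 (metres)
```

Note how nearer objects produce larger disparities and hence smaller depths, which is what makes the combined signals usable for placing the suspected thief in 3D.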
By providing a vehicle alarm system taking images and/or sequences of images, such as a video sequence, the possibility of catching criminals will increase, which is likely to have a deterrent effect on potential thieves. Also, the driver's sense of safety in the cab is enhanced, providing a better and more relaxed working climate for the driver. Further, increased costs due to additional sensors may be avoided by using the already existing sensors 110, 120 on the vehicle 100.
Figure 1B schematically illustrates a scenario, similar to the previously discussed scenario illustrated in Figure 1A, but with the vehicle 100 seen from above and wherein a vehicle alarm 150 and an object 160 are depicted.
The object 160 may be predefined, such as a human, a human holding a tool for making a break-in, i.e. a burglary tool, a human holding a weapon, and/or an activated but static vehicle (which may be a potential escape vehicle of the thief), etc. The burglary tool may for example be a crowbar, a wrecking bar, a bolt cutter, a fuel hose, a fuel tank, etc.
In some embodiments, the predefined object 160 may be a human having a movement pattern which is characteristic of a thief, such as e.g. moving towards the fuel tank of the vehicle 100 while looking around and then stopping, facing the tank, etc. In other embodiments, when a human is discovered in a picture, a comparison may be made against a picture register of criminals. Such criminals may be convicted criminals or merely suspected criminals. The predefined object 160 may in some embodiments comprise a human included in the picture register of criminals.
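As a loose illustration of such a characteristic movement pattern (the thresholds and the decision logic below are assumptions for the sketch, not values from the patent), a tracked position sequence could be flagged roughly like this:

```python
def suspicious_approach(track, tank_pos, near=2.0, still=0.1):
    """Flag a track that moves towards the fuel tank and then stops
    next to it. track: list of (x, y) positions over time;
    tank_pos: (x, y) position of the tank. Units are arbitrary."""
    def dist(p):
        return ((p[0] - tank_pos[0])**2 + (p[1] - tank_pos[1])**2) ** 0.5

    dists = [dist(p) for p in track]
    approaching = dists[0] > dists[-1]            # net movement toward tank
    ends_near = dists[-1] < near                  # finishes close to the tank
    stopped = abs(dists[-1] - dists[-2]) < still  # barely moving at the end
    return approaching and ends_near and stopped

# A person walking up to the tank at (0, 0) and stopping beside it:
track = [(10, 0), (6, 0), (3, 0), (1.5, 0), (1.5, 0)]
print(suspicious_approach(track, tank_pos=(0, 0)))  # True
```

A real system would of course classify such patterns with a learned model rather than hand-set thresholds; the sketch only shows what "approaches the tank, then stops facing it" can mean in terms of tracked positions.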
In this bird's-eye view, also the left side sensor 130 is depicted. It may be noted that the vehicle 100 may have additional sensors, such as a reversing camera 140, in some embodiments, which may be utilised for the same purpose as the forwardly directed sensor 110 when driving backwards.
The sensors 110, 120, 130, 140 comprise, or are connected to, a control unit configured for image recognition/computer vision and object recognition.
Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
The image data of the sensors 110, 120, 130, 140 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.
Computer vision may comprise e.g. scene reconstruction, event detection, video tracking, object recognition, object pose estimation, learning, indexing, motion estimation, and image restoration, just to mention some examples.
When the predefined object/thief 160 triggers the vehicle alarm 150 (or when the vehicle alarm 150 is triggered for whatever reason), the sensors 110, 120, 130, 140 are activated and start taking pictures. In some embodiments, an image recognition procedure may be applied, and only images wherein the predefined object 160 is detected are saved. Images from the sensors 110, 130, 140 wherein no predefined object 160 is detected may not be saved in some embodiments, thereby saving data handling/storage capacity.
In some embodiments, the predefined object 160 may be recognised by deep learning (sometimes also referred to as deep structured learning, hierarchical learning and/ or deep machine learning); a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers with complex structures, or otherwise composed of multiple non-linear transformations. Deep learning is based on learning representations of data. An observation (e.g., an image) can be represented in many ways such as a vector of intensity values per pixel, or in a more abstract way as a set of edges, regions of particular shape, etc.
Deep learning typically uses a cascade of many layers of nonlinear processing units for feature extraction and transformation. Each successive layer uses the output from the previous layer as input. The algorithms may be supervised or unsupervised and applications may comprise pattern analysis (unsupervised) and classification (supervised). Further, deep learning may be based on the (unsupervised) learning of multiple levels of features or representations of the data. Higher level features may be derived from lower level features to form a hierarchical representation. By deep learning, multiple levels of representations that correspond to different levels of abstraction are learned; the levels form a hierarchy of concepts. The composition of a layer of nonlinear processing units used in a deep learning algorithm depends on the problem to be solved, i.e. recognising the predefined object 160.
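The cascade of nonlinear processing layers may be illustrated with a toy forward pass in pure Python; the weights, biases and the ReLU nonlinearity below are illustrative assumptions, not trained parameters of any actual recogniser:

```python
def relu(values):
    """Nonlinearity applied by each processing unit."""
    return [max(0.0, v) for v in values]

def layer(inputs, weights, biases):
    """One layer of nonlinear processing units: each successive layer
    uses the output from the previous layer as input."""
    return relu([sum(w * v for w, v in zip(row, inputs)) + b
                 for row, b in zip(weights, biases)])

x = [0.5, -1.0, 2.0]                       # e.g. pixel intensity values
h1 = layer(x, [[1.0, 0.0, 0.5], [0.0, 1.0, 1.0]], [0.0, 0.1])  # low-level features
h2 = layer(h1, [[0.5, 0.5]], [-0.2])       # higher-level feature derived from h1
```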
Figure 2 describes a scenario wherein the vehicle alarm, or alternatively the driver or other vehicle responsible person, has triggered activation of the vehicle sensors 110, 120, 130, 140. The vehicle sensor 120 captures an image 200 of the predefined object 160. The captured image 200 may be sent to and handled by an image recognition procedure confirming that the image 200 comprises the predefined object 160.
In some embodiments, a cropping may be made of the image 200, eliminating a part of the image 200 not comprising the predefined object 160, resulting in a cropped image 210.
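Such cropping may be sketched as follows; the image is modelled as nested lists of pixel values and the bounding box is assumed to come from a preceding detection step:

```python
def crop_image(image, box):
    """Eliminate the part of the image not comprising the predefined
    object: keep only the rows and columns inside the bounding box."""
    top, left, bottom, right = box
    return [row[left:right] for row in image[top:bottom]]

# A 4x6 'image' of pixel values; the detected object is assumed to
# occupy rows 1-2 and columns 2-3.
image = [[c + 10 * r for c in range(6)] for r in range(4)]
cropped = crop_image(image, (1, 2, 3, 4))
```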
The image 200, or the cropped image 210 as the case may be, which is confirmed to comprise an image of the predefined object 160, may then be stored in a data storage device 220, such as e.g. a database.
The data storage device 220 may be comprised locally in the vehicle 100, or alternatively be situated outside the vehicle 100 in a vehicle external structure, as will be further discussed in conjunction with presentation of Figure 4B.
Images or video sequences wherein no predefined object 160 is comprised may not be stored, as they do not provide any additional information concerning the activities of the thief 160.
Figure 3 describes a scenario wherein yet an embodiment of the invention is implemented, where the vehicle 100 is parked on a parking lot or similar, close to another vehicle 300.
Assume the thief 160 triggers the alarm 150 of the vehicle 100 at a first point in time t1. The sensors 110, 120, 130, 140 are then activated by the alarm 150. An image of the thief 160 is then captured at the time t1 by the sensor 120.
In this illustrative example, the thief 160 starts running behind the vehicle 100, and is then captured by the reverse sensor 140 at a second point in time t2.
In the illustrated embodiment, the vehicle alarm 150 of the vehicle 100 is also triggering sensors 310, 320, 330, 340 situated on another structure, such as the other vehicle 300.
The triggering of the sensors 310, 320, 330, 340 of the other vehicle 300 may be made over a wireless communication interface, such as e.g. Vehicle-to-Vehicle (V2V) communication, or Vehicle-to-Structure (V2X) communication.
In some embodiments, the communication between vehicles 100, 300 may be performed via V2V communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC works in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN Medium Access Control (MAC) layer and physical layer (PHY) specification.
Such wireless communication interface may comprise, or at least be inspired by wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), optical communication such as Infrared Data Association (IrDA) or infrared transmission to name but a few possible examples of wireless communications in some embodiments.
The communication may alternatively be made over a wireless interface comprising, or at least being inspired by radio access technologies such as e.g. 3GPP LTE, LTE-Advanced, E-UTRAN, UMTS, GSM, GSM/ EDGE, WCDMA, Time Division Multiple Access (TDMA) networks, Frequency Division Multiple Access (FDMA) networks, Orthogonal FDMA (OFDMA) networks, Single-Carrier FDMA (SC-FDMA) networks, Worldwide Interoperability for Microwave Access (WiMax), or Ultra Mobile Broadband (UMB), High Speed Packet Access (HSPA), Evolved Universal Terrestrial Radio Access (E-UTRA), Universal Terrestrial Radio Access (UTRA), GSM EDGE Radio Access Network (GERAN), 3GPP2 CDMA technologies, e.g., CDMA2000 1x RTT and High Rate Packet Data (HRPD), or similar, just to mention some few options, via a wireless communication network.
Thus all, or a subset of, neighbour vehicles 300, e.g. all vehicles 300 within a predefined or configurable distance of the vehicle 100 comprising the activated vehicle alarm 150, may be encouraged to activate their sensors 310, 320, 330, 340 for capturing images.
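Selecting which neighbour vehicles to encourage may be sketched as follows; positions are given in metres on flat ground, and the distance threshold is one of the configurable values mentioned above (all values illustrative):

```python
import math

def vehicles_to_trigger(alarm_position, neighbours, max_distance):
    """Select the neighbour vehicles within a predefined or
    configurable distance of the vehicle whose alarm was activated;
    these may be encouraged (e.g. over V2V) to start capturing."""
    return [vehicle_id for vehicle_id, position in neighbours.items()
            if math.dist(alarm_position, position) <= max_distance]

neighbours = {"vehicle_300": (8.0, 6.0),      # 10 m away
              "vehicle_301": (120.0, 90.0)}   # 150 m away
triggered = vehicles_to_trigger((0.0, 0.0), neighbours, max_distance=50.0)
```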
In this illustrated scenario, the thief 160 is captured by one of the sensors 330 of the neighbour vehicle 300 at a third point in time t3. The sensor 330 also captures an image of the escape vehicle 160-2 of the thief 160. Thanks to the high resolution of the image, the escape vehicle 160-2 may be identified via the licence plate. Also, a possible accomplice of the thief 160, driving the escape vehicle 160-2, may be identified, or at least captured for later identification.
The images or film sequences captured by the sensor 330 of the neighbour vehicle 300 may be transmitted to the data storage device 220, where the images/ film sequences captured by the sensors 110, 120, 130, 140 of the first vehicle 100 are stored, via e.g. the previously discussed wireless communication interface. The data storage device 220 may be situated in the vehicle 100, or be stored external to the vehicle 100.
Further, as previously mentioned, in case the vehicle alarm 150 becomes activated and no sensor 110, 120, 130, 140, 310, 320, 330, 340 detects any predefined object 160, it is likely that the vehicle alarm 150 was triggered by another reason than a break-in attempt, such as e.g. a heavy vehicle passing, another vehicle touching the vehicle 100 while parking, etc. The alarm may then be deactivated. Thereby the battery load level of the vehicle battery is maintained and the environment is saved from irritating and disturbing alarm noise.
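The deactivation logic may be sketched as follows; the `detect_object` stub is an assumed stand-in for the image recognition procedure, not part of the disclosure:

```python
def alarm_should_stay_active(images, detect_object):
    """Return True only when some sensor image comprises the
    predefined object; otherwise the alarm was likely triggered by
    e.g. a passing heavy vehicle and may be deactivated, saving
    battery and sparing the surroundings from alarm noise."""
    return any(detect_object(image) for image in images)

detect = lambda img: img == "person"
keep_active = alarm_should_stay_active(["empty", "empty"], detect)  # deactivate
```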
Figure 4A illustrates an example of how the previous scenario in Figure 1B and/ or Figure 3 may be perceived by the driver of the vehicle 100, according to an embodiment.
A control unit 410 may be configured for burglary detection in the vehicle 100, when parked or being stationary. The control unit 410 may comprise, or be connected to, a positioning unit 420, at least one sensor 110, 120, 130, 140, at least one vehicle alarm 150, a data storage device 220 and possibly a wireless transmitter 440.
The transmitter 440, or transceiver, or communication device, is configured for wired or wireless communication with a vehicle external structure 450 comprising a communication device 460.
The vehicle external structure 450 may comprise e.g. a security centre, a police department, a monitoring centre, a parking security office or similar. In some embodiments, the data storage device 220 may be comprised at the vehicle external structure 450, as illustrated in Figure 4B.
The communication between the control unit 410 and the vehicle external structure 450 may thus be made by the transmitter 440. The transmitter 440 may be configured for wireless communication, e.g. based on a Vehicle-to-Vehicle (V2V) signal, or any other wireless signal based on, or at least inspired by wireless communication technology such as Wi-Fi, Ultra Mobile Broadband (UMB), Wireless Local Area Network (WLAN), Bluetooth (BT), or infrared transmission to name but a few possible examples of wireless communications. The vehicle 100 may also comprise means for communication via a wireless communication interface, e.g. any of the previously mentioned, or at least partly based on or inspired by radio access technologies such as, e.g., 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE), LTE-Advanced, Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Universal Mobile Telecommunications System (UMTS), Wideband Code Division Multiple Access (WCDMA), etc.
The control unit 410 may communicate with the other vehicle internal units via e.g. a communication bus. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the datalink may alternatively be made over a wireless connection comprising, or at least be inspired by wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), optical communication such as Infrared Data Association (IrDA) or infrared transmission to name but a few possible examples of wireless communications in some embodiments.
The geographical position of the vehicle 100 may be determined by the positioning unit 420 in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
The determination of the geographical position of the positioning unit 420 (and thereby also of the vehicle 100) may be made continuously, with certain predetermined or configurable time intervals, according to various embodiments.
Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 430-1, 430-2, 430-3, 430-4. In this example, four satellites 430-1, 430-2, 430-3, 430-4 are depicted, but this is merely an example. More than four satellites 430-1, 430-2, 430-3, 430-4 may be used for enhancing the precision, or for creating redundancy. The satellites 430-1, 430-2, 430-3, 430-4 continuously transmit information about time and date (for example, in coded form), identity (which satellite 430-1, 430-2, 430-3, 430-4 broadcasts), status, and where the satellite 430-1, 430-2, 430-3, 430-4 is situated at any given time. The GPS satellites 430-1, 430-2, 430-3, 430-4 send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 430-1, 430-2, 430-3, 430-4 to be distinguished from the others' information, based on a unique code for each respective satellite 430-1, 430-2, 430-3, 430-4. This information can then be transmitted to be received by the appropriately adapted positioning device comprised in the vehicle 100.
Distance measurement can according to some embodiments comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 430-1, 430-2, 430-3, 430-4 to reach the positioning unit 420. As the radio signals travel at the speed of light, the distance to the respective satellite 430-1, 430-2, 430-3, 430-4 may be computed by measuring the signal propagation time.
The positions of the satellites 430-1, 430-2, 430-3, 430-4 are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 430-1, 430-2, 430-3, 430-4 through triangulation. For determination of altitude, signals from four satellites 430-1, 430-2, 430-3, 430-4 may be used according to some embodiments.
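The distance measurement and triangulation may be illustrated in a flat two-dimensional toy; real GPS receivers solve in three dimensions and also for the receiver clock bias, so the station positions and distances below are purely illustrative:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_propagation(seconds):
    """As the radio signals travel at the speed of light, the distance
    to a satellite follows from the signal propagation time."""
    return C * seconds

def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Recover (x, y) from distances to three known positions by
    subtracting the circle equations, leaving a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A receiver at (3, 4) with exact distances to three known stations:
pos = trilaterate_2d((0, 0), 5.0,
                     (10, 0), math.hypot(7, 4),
                     (0, 10), math.hypot(3, 6))
```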
The geographical position of the vehicle 100 may alternatively be determined, e.g. by having transponders positioned at known positions around the route and a dedicated sensor in the vehicle 100, for recognising the transponders and thereby determining the position; by detecting and recognising WiFi networks (WiFi networks along the route may be mapped with certain respective geographical positions in a database); by receiving a Bluetooth beaconing signal, associated with a geographical position, or other signal signatures of wireless signals such as e.g. by triangulation of signals emitted by a plurality of fixed base stations with known geographical positions. The position may alternatively be entered by the driver.
Having determined the geographical position of the positioning unit 420 (or in another way), it may be presented on a map, a screen or a display device where the position of the vehicle 100 may be marked, in some alternative embodiments.
In some embodiments, a triggered alarm signal or a tank level change detected when the vehicle 100 has been stationary, together with any captured images comprising the predefined object 160, may be saved and stored in the data storage device 220, together with the geographical position of the vehicle 100, possibly also together with an identification reference of the vehicle 100, and/ or a determined moment in time when the alarm was triggered.
It thereby becomes possible to detect and register the geographical positions where fuel is frequently stolen and/ or which are frequently visited by thieves, and measures may be taken for avoiding those geographical positions.
This information may be collected from a plurality of vehicles 100 and the geographical indications may be collected in the data storage device 220, possibly together with an identification reference of the vehicle 100 and/ or its driver.
Based on the geographical positions stored in the data storage device 220, a (visual) presentation of zones which are particularly affected by fuel theft may be made. This information may be distributed to e.g. the police, or to vehicle owners/ drivers. Further, in some embodiments, a warning may be emitted when a vehicle 100 is approaching such a zone, discouraging the driver from stopping/ parking the vehicle 100 there.
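The zone warning may be sketched as follows; the radius and threshold are illustrative configurable values, and positions are simplified to flat coordinates in metres:

```python
import math

def approaching_theft_zone(vehicle_position, theft_positions,
                           radius, threshold):
    """Warn when the number of registered thefts within `radius` of
    the vehicle exceeds `threshold`, discouraging the driver from
    stopping/ parking there."""
    nearby = sum(1 for p in theft_positions
                 if math.dist(vehicle_position, p) <= radius)
    return nearby > threshold

thefts = [(0.0, 0.0), (5.0, 5.0), (3.0, -2.0), (400.0, 400.0)]
warn = approaching_theft_zone((1.0, 1.0), thefts, radius=50.0, threshold=2)
```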
As many vehicles 100 comprise positioning units 420 and means for wireless communication 440, various data and information may be transmitted from the vehicle 100 to the vehicle external structure 450, such as e.g. identification reference of the vehicle 100, in some embodiments.
According to some alternative embodiments, the sensors 110, 120, 130, 140 of the vehicle 100, or possibly sensors 310, 320, 330, 340 of another vehicle 300 in the vicinity of the vehicle 100 may be activated/ deactivated by the driver via an activation unit 470. The activation unit 470 may communicate over a communication interface, which may be wired or wireless according to different embodiments.
Figure 4B illustrates an alternative example to the embodiment previously illustrated in Figure 4A of how the previous scenario in Figure 1B and/ or Figure 3 may be perceived by the driver of the vehicle 100, according to an embodiment.
In the illustrated embodiment, most devices are arranged and situated similar to or even identical with the embodiment illustrated in Figure 4A; however, the data storage device 220 is situated at the vehicle external structure 450.
An advantage therewith, in comparison with having the data storage device 220 at the vehicle 100 is that data from a plurality of vehicles 100, 300 may be collected and stored. Thereby reliable data may be collected and verified from various vehicles 100, 300. Further, by having the data storage device 220 external to the vehicle 100, it is saved also when the vehicle 100 is stolen, or partly damaged due to vandalism or an accident.
Figure 5 illustrates an example of a method 500 for burglary detection and burglar spotting in a vehicle 100.
The vehicle 100 may comprise a means for transportation in broad sense such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance.
The vehicle 100 may be stationary or parked in some embodiments.
In order to be able to detect the burglary, the method 500 may comprise a number of steps 501-510. However, some of these steps 501-510 may be performed solely in some alternative embodiments, like e.g. steps 504-507, 509 and 510. Further, the described steps 501-510 may be performed in a somewhat different chronological order than the numbering suggests. The method 500 may comprise the subsequent steps: Step 501 comprises activating at least one sensor 110, 120, 130, 140 of the vehicle 100, triggered by a vehicle alarm alert of a vehicle alarm 150 in the vehicle 100.
The sensor 110, 120, 130, 140 may be a camera, a video camera, an infrared camera, etc. The vehicle 100 may comprise one or several sensors 110, 120, 130, 140 of the same type, or different types.
The activation of the sensor 110, 120, 130, 140 may be triggered by a vehicle responsible person such as the driver or owner in some different embodiments.
Step 502 comprises capturing an image 200, or a stream of images, with the activated 501 sensor 110, 120, 130, 140 of the vehicle 100.
The image capturing may alternatively be triggered by a vehicle responsible person such as the driver or owner in some different embodiments, e.g. from the cab or at a distance via a wireless communication interface.
Step 503 comprises detecting a predefined object 160 in the captured 502 image 200, or a captured 502 stream of images 200, as the case may be.
The predefined object 160 may be a person, a thief, a suspected thief, a suspected or possible escape vehicle, etc.
Step 504, which only may be comprised in some embodiments, comprises cropping the captured 502 image 200 around the detected 503 predefined object 160.
Step 505, which only may be comprised in some embodiments, comprises transmitting the captured 502 image 200 when the predefined object 160 is detected 503 in the image 200 to a security centre 450.
The security centre 450 may be e.g. a police department, a security surveillance centre, etc.
Step 506, which only may be comprised in some embodiments, comprises observing the geographical position of the vehicle 100, triggered by the vehicle alarm alert.
Thereby, the geographical position of the vehicle 100 may be determined when the vehicle alarm 150 is activated.
Step 507, which only may be comprised in some embodiments wherein step 506 has been performed, comprises storing the observed 506 geographical position of the vehicle 100, associated with the captured 502 image 200 in the data storage device 220.
Thereby, places where theft in vehicles 100 has occurred may be detected and measures may be taken for warning drivers against parking at such places. Such warning may be emitted to e.g. all vehicles 100 having the same owner or the same freight forwarder; to all vehicles 100 made by the same manufacturer; to all vehicles 100 subscribing to such service, etc.
Step 508 comprises storing the captured 502 image 200 when the predefined object 160 is detected 503 in the image 200 in the data storage device 220.
In some embodiments wherein step 504 has been performed, the cropped 504 image 210 may be stored in the data storage device 220.
In some embodiments, date and time of the event of the captured 502 image 200 may be determined and stored. Thereby, the thief and the theft may be more precisely monitored and spotted, which may assist the police in investigating the theft and may possibly be used as evidence against suspected thieves.
In some embodiments, an identification reference of the vehicle 100 may be stored in the data storage device 220, in particular when the data storage device 220 is comprised in a vehicle external structure 450. The identification reference may be any unique reference value such as e.g. a registration number of the vehicle 100 or similar.
Step 509, which only may be comprised in some embodiments, comprises emitting a wireless signal for triggering image capturing of a sensor 310, 320, 330, 340 situated on another structure 300 than the vehicle 100 and transmission of such captured image 200 to a data storage device 220 used for storing 508 the captured 502 image 200.
The other structure 300 may be another vehicle or building within a certain distance from the vehicle 100, such as e.g. 10 meters, 30 meters, 50 meters, 100 meters, 200 meters, 400 meters, etc. or somewhere in between these values.
Step 510, which only may be comprised in some embodiments, comprises deactivating the vehicle alarm alert when no predefined object 160 is detected in the captured 502 image 200.
Thereby redundant false alarms may be omitted or at least reduced.
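The overall flow of steps 501-510 may be sketched as follows; the sensor callables, the recognition stub and the storage list are illustrative assumptions standing in for the actual sensors 110, 120, 130, 140, the image recognition procedure and the data storage device 220:

```python
def burglary_detection(alarm_triggered, sensors, detect_object,
                       storage, position=None):
    """Sketch of method 500: activate the sensors on a vehicle alarm
    alert (501) and capture images (502); detect the predefined object
    (503); store only images comprising it, together with the observed
    geographical position (506-508); deactivate the alert when no
    object is detected (510)."""
    if not alarm_triggered:
        return False                                       # sensors stay inactive
    images = [capture() for capture in sensors]            # steps 501-502
    detected = [img for img in images if detect_object(img)]  # step 503
    if not detected:
        return False                                       # step 510: deactivate
    for img in detected:                                   # steps 506-508
        storage.append({"image": img, "position": position})
    return True                                            # alert stays active

storage = []
sensors = [lambda: "thief at door", lambda: "empty scene"]
detect = lambda img: "thief" in img
active = burglary_detection(True, sensors, detect, storage,
                            position=(57.7, 11.9))
```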
Figure 6 illustrates an embodiment of a system 600 for burglary detection in a parked vehicle 100. The system 600 comprises at least one sensor 110, 120, 130, 140 arranged in the vehicle 100. The sensor 110, 120, 130, 140 may be e.g. a camera, a video camera, an infrared camera etc., which may be arranged primarily for other tasks than taking pictures of suspected thieves.
The system 600 in addition comprises one or more vehicle alarms 150.
The system 600 comprises a control unit 410. The control unit 410 is configured to perform at least some of the previously described method steps 501-510, described above and illustrated in Figure 5.
The control unit 410 is configured to activate at least one sensor 110, 120, 130, 140 of the vehicle 100, triggered by a vehicle alarm alert. Further the control unit 410 is configured to generate a command to the sensor 110, 120, 130, 140 to capture an image 200. In addition, the control unit 410 is further configured to detect a predefined object 160 in the captured image 200. Furthermore, the control unit 410 is also configured to store the captured image 200 when the predefined object 160 is detected in the image 200, in a data storage device 220.
In some embodiments, the control unit 410 may be configured to crop the captured image 200 around the detected predefined object 160. In addition, the control unit 410 may also be configured to store the cropped image 210 in the data storage device 220.
The control unit 410 may be configured, in some embodiments, to transmit the captured image 200 when the predefined object 160 is detected in the image 200 to a security centre 450, via a transmitter 440.
Furthermore, the control unit 410 may also be configured to generate a command to the transmitter 440, to emit a wireless signal for triggering image capturing of a sensor 310, 320, 330, 340 situated on another structure 300 than the vehicle 100 and transmission of such captured image 200 to a data storage device 220 used for storing the captured image 200.
In some embodiments, the control unit 410 may also be configured to deactivate the vehicle alarm alert when no predefined object 160 is detected in the captured image 200.
In further addition the control unit 410 may also be configured to observe geographical position of the vehicle 100 via a positioning device 420, triggered by the vehicle alarm alert. The control unit 410 may also be configured to store the observed geographical position of the vehicle 100, associated with the captured image 200 in the data storage device 220.
The system 600 also comprises a data storage device 220, for storing images 200, or streams of images, captured according to the method 500. The data storage device 220 may be situated in the vehicle 100, or in a vehicle external structure 450, such as a security centre or police department.
This information stored in the data storage device 220 may then be outputted e.g. to (a subset of) vehicle drivers, police authorities etc. In various embodiments, the data storage device 220 may also be additionally configured to store an identification reference of the vehicle 100, a time/ date reference, size of the tank level change, etc. The data storage device 220 is further configured to store the observed geographical position of the vehicle 100 where the tank level change has been detected.
The data storage device 220 may comprise data only from one vehicle 100, or from a plurality of vehicles 100, such as various vehicles 100 owned by the same owner, in different embodiments.
The above described control unit 410, as illustrated in Figure 6, may according to some embodiments comprise a processor for performing various computations required for performing the method 500 according to at least some of the previously described steps 501-510. Such processor may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The expression “processor” as utilised herein may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control unit 410 may comprise a memory in some embodiments. The optional memory may comprise a tangible, physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory may comprise integrated circuits comprising silicon-based transistors. The memory may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
The previously described steps 501-510 to be performed by the control unit 410 may be implemented through the one or more processors within the control unit 410, together with a computer program product for performing at least some of the functions of the steps 501-510. Thus a computer program product, comprising instructions for performing the steps 501-510 in the control unit 410, may perform the method 500 according to at least some of the steps 501-510, when the computer program is loaded into the one or more processors of the control unit 410.
The computer program product mentioned above may be provided for instance in the form of a tangible data carrier carrying computer program code for performing at least some of the steps 501-510 according to some embodiments when being loaded into the one or more processors of the control unit 410. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a nontransitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 410 remotely, e.g., over an Internet or an intranet connection.
Figure 7 illustrates an example of a method 700 in a security centre 450, for compiling information related to burglary detection in stationary vehicles 100. The security centre 450 may e.g. be a stationary or mobile police department, or a private business security section, surveillance centre or similar.
Such information may comprise images 200, 210 taken by sensors 110, 120, 130, 140, 310, 320, 330, 340 in, or in the vicinity of the vehicle 100. The information may further comprise identification of the vehicle 100, geographical position of the burglary, time information, etc.
The vehicle 100 may be stationary or parked in various embodiments.
In order to be able to compile the information related to burglary detection, the method 700 may comprise a number of steps 701-704. However, some of these steps 701-704 may be performed solely in some alternative embodiments, like e.g. steps 703 and 704. Further, the described steps 701-704 may be performed in a somewhat different chronological order than the numbering suggests. The method 700 may comprise the subsequent steps: Step 701 comprises receiving an image 200, 210, or a stream of images, captured by a sensor 110, 120, 130, 140, 310, 320, 330, 340 in a vehicle 100, 300, or in the vicinity of the vehicle 100.
The received images 200, 210 comprise a predefined object 160. The predefined object 160 may be a person, a thief, a suspected thief, a suspected or possible escape vehicle, etc.
Step 702 comprises storing the received 701 image 200, 210, e.g. in a data storage device 220.
In some embodiments, date and time of the event of the received 701 image 200, 210 may be determined and stored associated with the image 200, 210. Thereby, the thief and the theft may be more precisely monitored and spotted, which may assist the police in investigating the theft and may possibly be used as evidence against suspected thieves.
Information may then be extracted from the data storage device 220 when making an investigation concerning the burglary, for determining when, where and what has been stolen and possibly identify the thieves and/ or their vehicle.
Step 703 which only may be performed in some embodiments, comprises collecting information comprising observed geographical positions from a plurality of vehicles 100, 300, from which images 200 have been received 701.
Step 704 which only may be performed in some embodiments, comprises providing a recommendation to a vehicle 100, 300, to avoid parking on a geographical position where information from a number of vehicles 100, 300 exceeding a threshold value, has been collected 703.
Thereby, places where theft in vehicles 100 has occurred may be detected and measures may be taken for warning drivers against parking at such places. Such warning may be emitted to e.g. all vehicles 100 having the same owner or the same freight forwarder; to all vehicles 100 made by the same manufacturer; to all vehicles 100 subscribing to such service, etc.
The security centre 450 for compiling information related to burglary detection in vehicles 100, may further comprise a receiver 460 configured to receive an image 200 captured by a sensor 110, 120, 130, 140, 310, 320, 330, 340 in a vehicle 100, 300. Further, the security centre 450 may comprise a data storage device 220 configured to store the received image 200.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described methods 500, 700; the control unit 410, the data storage device 220, the system 600 and / or the security centre 450. Various changes, substitutions or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term “or” as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one”, thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" or "comprising", specify the presence of stated features, actions, integers, steps, operations, elements, or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via the Internet or other wired or wireless communication system.

Claims (16)

PATENT CLAIMS
1. A method (500) for burglary detection in a vehicle (100), which method (500) comprises: activating (501) at least one sensor (110, 120, 130, 140) of the vehicle (100), triggered by a vehicle alarm alert; wherein said at least one sensor (110, 120, 130, 140) comprises, or is connected to, a control unit (410) configured for image recognition/computer vision and object recognition; capturing (502) a stream of images (200) with the sensor (110, 120, 130, 140) of the vehicle (100); detecting (503) a predefined object (160) in the captured (502) stream of images (200) by utilising a forwardly directed sensor (110) or other sensors (120) of said at least one sensor (110, 120, 130, 140) of the vehicle (100) for identifying objects as a predefined object (160); and storing (508) only any captured images (200) comprising the predefined object (160) when the predefined object (160) is detected (503) in the image (200).
2. The method (500) according to claim 1, further comprising: cropping (504) the captured (502) image (200) around the detected (503) predefined object (160); and storing (508) the cropped (504) image (210).
3. The method (500) according to any of claim 1 or claim 2, further comprising: transmitting (505) the captured (502) image (200) to a security centre (450) when the predefined object (160) is detected (503) in the image (200).
4. The method (500) according to any of claims 1-3, wherein the activation (501) of the sensor (110, 120, 130, 140) and the image capturing (502) are triggered by a vehicle responsible person.
5. The method (500) according to any of claims 1-4, further comprising: emitting (509) a wireless signal for triggering image capturing of a sensor (310, 320, 330, 340) situated on a structure (300) other than the vehicle (100) and transmission of such captured image (200) to a data storage device (220) used for storing (508) the captured (502) image (200).
6. The method (500) according to any of claims 1-5, further comprising: deactivating (510) the vehicle alarm alert when no predefined object (160) is detected in the captured (502) image (200).
7. The method (500) according to any of claims 1-6, further comprising: observing (506) the geographical position of the vehicle (100), triggered by the vehicle alarm alert; and storing (507) the observed (506) geographical position of the vehicle (100), associated with the captured (502) image (200).
8. A control unit (410) for burglary detection in a vehicle (100), wherein the control unit (410) is configured to: activate at least one sensor (110, 120, 130, 140) of the vehicle (100), triggered by a vehicle alarm alert; generate a command to the sensor (110, 120, 130, 140) to capture a stream of images (200); detect a predefined object (160) in the captured stream of images (200) by utilising a forwardly directed sensor (110) or other sensors (120) of said at least one sensor (110, 120, 130, 140) of the vehicle (100) for identifying objects as a predefined object (160); and store only any captured images (200) of the captured stream of images (200) comprising the predefined object (160), in a data storage device (220).
9. The control unit (410) according to claim 8, configured to: crop the captured image (200) around the detected predefined object (160); and store the cropped image (210) in the data storage device (220).
10. The control unit (410) according to any of claim 8 or claim 9, configured to: transmit the captured image (200) to a security centre (450), via a transmitter (440), when the predefined object (160) is detected in the image (200).
11. The control unit (410) according to any of claims 8-10, further configured to: generate a command to the transmitter (440), to emit a wireless signal for triggering image capturing of a sensor (310, 320, 330, 340) situated on a structure (300) other than the vehicle (100) and transmission of such captured image (200) to a data storage device (220) used for storing the captured image (200).
12. The control unit (410) according to any of claims 8-11, further configured to: deactivate the vehicle alarm alert when no predefined object (160) is detected in the captured image (200).
13. The control unit (410) according to any of claims 8-12, further configured to: observe geographical position of the vehicle (100) via a positioning device (420), triggered by the vehicle alarm alert; and store the observed geographical position of the vehicle (100), associated with the captured image (200) in the data storage device (220).
14. A computer program comprising program code for performing a method (500) according to any of claims 1-7, when the computer program is executed in a computer.
15. A data storage device (220) for storing images (200), captured according to the method (500) according to any of claims 1-7, when the predefined object (160) is detected in the image (200).
16. A system (600) for burglary detection in a vehicle (100), comprising: at least one sensor (110, 120, 130, 140); a vehicle alarm (150); a control unit (410) according to any of claims 8-13; and a data storage device (220) according to claim 15.
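The core method of claim 1 — activate sensors on an alarm alert, capture a stream of images, detect a predefined object, and store only the images in which that object appears — can be sketched as follows. This is a hypothetical illustration, not the patented implementation: `BurglaryDetector`, `CapturedImage`, `on_alarm` and the pluggable `detect_object` callable (standing in for the claimed image/object recognition of control unit 410) are all assumed names.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

# A bounding box (x, y, width, height) returned by the object detector.
BBox = Tuple[int, int, int, int]


@dataclass
class CapturedImage:
    pixels: bytes                                   # stand-in for raw image data
    detections: List[BBox] = field(default_factory=list)


@dataclass
class BurglaryDetector:
    """Sketch of method 500: on a vehicle alarm alert, iterate over the
    captured image stream (step 502), run object recognition (step 503),
    and store only images containing the predefined object (step 508)."""

    detect_object: Callable[[CapturedImage], Optional[BBox]]
    storage: List[CapturedImage] = field(default_factory=list)  # stand-in for device 220

    def on_alarm(self, image_stream) -> List[CapturedImage]:
        for image in image_stream:                  # step 502: captured stream
            bbox = self.detect_object(image)        # step 503: detection
            if bbox is not None:
                image.detections.append(bbox)
                self.storage.append(image)          # step 508: store only hits
        return self.storage
```

In use, a real detector would wrap a computer-vision model; here a trivial stand-in detector suffices to show that images without the predefined object are discarded rather than stored.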
SE1650327A 2016-03-10 2016-03-10 Method and system for theft detection in a vehicle SE541541C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1650327A SE541541C2 (en) 2016-03-10 2016-03-10 Method and system for theft detection in a vehicle
PCT/SE2017/050195 WO2017155448A1 (en) 2016-03-10 2017-03-02 Method and system for theft detection in a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1650327A SE541541C2 (en) 2016-03-10 2016-03-10 Method and system for theft detection in a vehicle

Publications (2)

Publication Number Publication Date
SE1650327A1 (en) 2017-09-11
SE541541C2 true SE541541C2 (en) 2019-10-29

Family

Family ID: 59789774

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1650327A SE541541C2 (en) 2016-03-10 2016-03-10 Method and system for theft detection in a vehicle

Country Status (2)

Country Link
SE (1) SE541541C2 (en)
WO (1) WO2017155448A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022112244A1 (en) * 2020-11-24 2022-06-02 Bode - Die Tür Gmbh Vehicle, system and method using a vandalism detection device
WO2022167432A1 (en) * 2021-02-05 2022-08-11 Daimler Truck AG Method for preventing the theft of a load for an autonomously acting commercial vehicle

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10259427B1 (en) 2017-10-11 2019-04-16 Robert Bosch Gmbh Vehicle security system using sensor data
US10346790B1 (en) * 2018-03-06 2019-07-09 Blackberry Limited Chain of custody information for cargo transportation units
CN110316153B (en) * 2018-03-30 2021-04-20 比亚迪股份有限公司 Vehicle monitoring system and method based on vehicle-mounted display terminal and vehicle
EP3561786B1 (en) * 2018-04-24 2022-10-26 Dr. Ing. h.c. F. Porsche AG Method and device for operating a vehicle
US11080975B2 (en) * 2018-06-29 2021-08-03 Baidu Usa Llc Theft proof techniques for autonomous driving vehicles used for transporting goods
EP3633641A1 (en) 2018-10-04 2020-04-08 Volvo Car Corporation Method and vehicle system for handling parameters associated with surroundings of a vehicle
GB2582904B (en) * 2019-03-26 2021-04-14 Atsr Ltd Method and apparatus for controlling access to a vehicle
GB2588770B (en) * 2019-11-04 2022-04-20 Ford Global Tech Llc Vehicle to vehicle security
RU2748780C1 (en) * 2020-11-05 2021-05-31 Общество с ограниченной ответственностью "Скайтрэк" (ООО "Скайтрэк") Method and system for detecting alarm events occurring on vehicle during cargo transportation in real time
DE102021200794A1 (en) * 2021-01-28 2022-08-11 Saidinger Gmbh Transport vehicle with load securing system
IT202100009362A1 (en) * 2021-04-14 2022-10-14 Leonardo Holding S R L BREAK-IN DETECTOR DEVICE
EP4279394A1 (en) 2022-05-16 2023-11-22 KID-Systeme GmbH Dynamically adjustable surveillance system and method of dynamically adjusting operation modes of a surveillance system
EP4378764A1 (en) * 2022-11-29 2024-06-05 Ningbo Geely Automobile Research & Development Co. Ltd. Method for activating a vehicle alarm system for lighting up a target vehicle, vehicle alarm system and vehicle comprising an alarm system
TWI846392B (en) * 2023-03-20 2024-06-21 英屬開曼群島商鴻騰精密科技股份有限公司 Monitoring method, device, internet of vehicles server, and storage medium

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8825446D0 (en) * 1988-10-31 1988-11-30 Lawrence M J Vehicle security camera
US5027104A (en) * 1990-02-21 1991-06-25 Reid Donald J Vehicle security device
JP3664784B2 (en) * 1995-10-31 2005-06-29 松下電器産業株式会社 Wide area monitoring device
US20020097145A1 (en) * 1997-11-06 2002-07-25 David M. Tumey Integrated vehicle security system utilizing facial image verification
US6690411B2 (en) * 1999-07-20 2004-02-10 @Security Broadband Corp. Security system
CA2412389A1 (en) * 2000-06-09 2001-12-20 Robert Jeff Scaman Secure, vehicle mounted, surveillance system
US7064657B2 (en) * 2004-01-08 2006-06-20 International Business Machines Corporation Method and system for accessing and viewing images of a vehicle interior
US20060250501A1 (en) * 2005-05-06 2006-11-09 Widmann Glenn R Vehicle security monitor system and method
US7956735B2 (en) * 2006-05-15 2011-06-07 Cernium Corporation Automated, remotely-verified alarm system with intrusion and video surveillance and digital video recording
DE102008064034A1 (en) * 2008-12-22 2010-06-24 Deutsche Post Ag Method and system for monitoring the interior of a freight transport device
US9449482B2 (en) * 2010-07-14 2016-09-20 Honeywell International Inc. Method and apparatus for activating and deactivating video cameras in a security system
SE536729C2 (en) * 2011-02-14 2014-06-24 Scania Cv Ab Procedure and system for monitoring a motor vehicle from the point of view of intrusion
US9117371B2 (en) * 2012-06-22 2015-08-25 Harman International Industries, Inc. Mobile autonomous surveillance
US9619718B2 (en) * 2013-12-18 2017-04-11 Johnson Controls Technology Company In-vehicle camera and alert systems
US9329597B2 (en) * 2014-01-17 2016-05-03 Knightscope, Inc. Autonomous data machines and systems


Also Published As

Publication number Publication date
WO2017155448A1 (en) 2017-09-14
SE1650327A1 (en) 2017-09-11

Similar Documents

Publication Publication Date Title
WO2017155448A1 (en) Method and system for theft detection in a vehicle
US11453365B2 (en) Recording video of an operator and a surrounding visual field
US11713060B2 (en) Systems and methods for remote monitoring of a vehicle, robot or drone
US10421436B2 (en) Systems and methods for surveillance of a vehicle using camera images
US10300911B2 (en) Vehicle control apparatus and vehicle control method
US10789840B2 (en) Systems, apparatuses and methods for detecting driving behavior and triggering actions based on detected driving behavior
US10504302B1 (en) 360 degree vehicle camera accident monitoring system
US8836784B2 (en) Automotive imaging system for recording exception events
US11557125B2 (en) Method for monitoring the environment of a vehicle
US20150268338A1 (en) Tracking from a vehicle
US10647300B2 (en) Obtaining identifying information when intrusion is detected
US20180086307A1 (en) Device and method for monitoring a vehicle, particularly for the management of loss events
US20240246547A1 (en) Artificial intelligence-enabled alarm for detecting passengers locked in vehicle
CN115361653A (en) Providing safety via vehicle-based monitoring of neighboring vehicles
US10388132B2 (en) Systems and methods for surveillance-assisted patrol
JP7363838B2 (en) Abnormal behavior notification device, abnormal behavior notification system, abnormal behavior notification method, and program
US10967833B1 (en) Vehicle monitoring system using multiple security cameras
WO2019050448A1 (en) Method and control arrangement for estimating vehicle dimensions
US11151387B2 (en) Camera system to detect unusual activities
JP2017182347A (en) Vehicle communication system, vehicle peripheral information transmission method, and control program
US20230410525A1 (en) Vehicle off-guard monitoring system
KR100978879B1 (en) Method for tracing a vehicle
US11491952B2 (en) Vehicle monitoring and theft mitigation system
Ramprakaash et al. IoT based Tracking System for Two Wheelers
US20210122332A1 (en) System for providing vehicle security, monitoring and alerting