EP2447925B1 - Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle - Google Patents
- Publication number
- EP2447925B1 (application EP10189342.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image information
- display
- navigation system
- information
- road
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Not-in-force
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Description
- The present invention concerns a method for providing information assisting a driver of a vehicle which is equipped with a navigation system. It also concerns a corresponding system for use in a vehicle.
- There is a need to improve current navigation systems. Even today these systems are not able to deal properly with traffic congestion in urban areas. There is considerable potential to reduce traffic jams and, at the same time, the pollution caused by cars waiting in queues, e.g. in front of traffic lights, tunnel portals, parking decks, shopping centers, and so forth.
- There is a vehicle-to-vehicle communication system known as "TrafficView" which enables the spreading of traffic-related information from car to car using short-range wireless communication. Details can be found in "TrafficView: A Driver Assistant Device for Traffic Monitoring based on Car-to-Car Communication", S. Dashtinezhad et al., cf. http://www.cs.rutgers.edu/~iftode/vtcsp04.pdf.
- There is a long-felt desire to provide real-time information or even images while driving inside a car. It is thus not surprising to see that a number of approaches and schemes have been developed which employ road-side cameras and which link the cameras with a display inside the car.
- The following presentations also address this topic:
- "Embedded Systems", WSITC FORUM, AN INTERNATIONAL CES 2007 PERSPECTIVE, which was held by Robert Mitchell, AEEMA, 20 February 2007,
- "What's Hot in ICT?", WSITC FORUM, AN INTERNATIONAL CES 2007 PERSPECTIVE, which was held by Angus M Robinson, AEEMA, 20 February 2007
- For details see also:
- http://interdependent.com.au/wsitc/documents/WSITC_Forum_20Feb07_Program_PPTs.pdf
- The use of road-side cameras near intersections is proposed and discussed in "Visual Assistance for Drivers by Mixed Reality", Y. Kameda et al., 14th World Congress on ITS, 2007, Beijing. The road-side cameras are used in order to provide visual assistance and a better overview for a driver who has a receiver in his car. For details see: http://www.kamedalab.org/research/publication/2007/200710_ITSWC/ITSWC2007-3210-kameda.pdf
- Yet another approach, called AHS (Automated Highway System), is discussed in the following paper: "NaviView: Bird's-Eye View for Highway Drivers Using Roadside Cameras," Eitaro Ichihara, Hiroyuki Takao, Yuichi Ohta, ICMCS, vol. 2, p. 559, 1999 IEEE International Conference on Multimedia Computing and Systems (ICMCS'99). The AHS is believed to increase the traffic potential of highways because it enables drivers to shorten the distance between cars. AHS may make drivers uncomfortable because it presents them with a reduced field of vision of the car ahead and may induce anxiety over the reliability of the control system. In order to improve the comfort of drivers under the control of AHS, it is necessary to recover the driver's field of vision. In this paper, a NaviView system for the purpose of recovery is proposed. In AHS, the entire traffic load on the highway is visually monitored by many roadside video cameras. NaviView utilizes these video images for the recovery of the driver's visual field. It enables the driver to observe the behavior of other cars by means of a virtual view from a point just above his car. The virtual view presented to the driver is generated by combining the video images of the fixed roadside cameras.
- Another approach is presented in the following publication: "Car navigation system with image recognition," Kohei Ito, Naohiko Ichihara, Hiroto Inoue, Ryujiro Fujita, Mitsuo Yasushi, ICCE, pp. 1-2, 2009 Digest of Technical Papers International Conference on Consumer Electronics, 2009. The authors state that many cars are now equipped with on-board cameras and that many kinds of driver-support systems using image recognition are being developed. The authors themselves claim to have developed a car navigation system with image recognition that enhances drivers' safety, convenience, and entertainment.
- There are GPS (global positioning system) based navigation systems with backup camera display. The camera and transmitter are powered by the cables that power the vehicle's reverse lights. When the vehicle is put into reverse gear, the reversing lights turn on and power is sent to the camera and the transmitter, sending out a video signal. The GPS unit then automatically switches over to the camera view. When the vehicle is taken out of reverse gear, the reverse lights go out and the camera stops sending an image. At the same time the unit will stop showing the camera's image and return to the previous function. See for instance: http://www.navsgo.com/GO740RV.html
- Another navigation system with integration of live images is disclosed in WO 01/82261 A1.
- Another navigation system is known from the published patent application WO 2007/057696 A1. This system is designed to show symbols on a display which represent cameras. The driver can pick a camera by touching the respective symbol on the display. This selection causes a playback of audio or visual information previously recorded by a camera. This application also addresses bandwidth-related issues; in order to overcome bandwidth limitations, it is proposed to use a compression scheme.
- All these systems mentioned above confirm that there is a desire for establishing a connection between cameras and cars. It is a disadvantage of the known approaches that they merely provide overview information which is not linked at all to the capabilities of a navigation system. These systems have the serious disadvantage that their use while driving might distract the driver. It is yet another disadvantage of known systems that they require considerable bandwidth for the transmission of image information. Any system offered to all users of navigation systems in an urban area would lead to a communication or capacity overload.
- It is an objective of the present invention to provide a camera-assisted or camera-based navigation system which offers real-time or close-to-real-time information.
- It is another objective to provide a system which is well suited for large-scale operation, e.g. in urban areas.
- The method, according to the present invention, provides information assisting a driver of a vehicle equipped with a navigation system in accordance with claim 1.
- The inventive system is designed for use in a vehicle. The system comprises the features of claim 10.
- The features of advantageous embodiments are presented in the dependent method and apparatus claims. The respective advantages are addressed or become apparent from the following detailed description.
- For a more complete description of the present invention and for further objects and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying drawings:
- Fig. 1 shows a schematic structural diagram of a system in accordance with the present invention;
- Fig. 2 shows a schematic diagram of a display of a system in accordance with the present invention;
- Fig. 3 shows a schematic diagram of a display of a system in accordance with the present invention;
- Fig. 4 shows a schematic diagram of a display of a system in accordance with the present invention;
- Fig. 5A shows a schematic diagram of a real traffic situation;
- Fig. 5B shows a schematic diagram of a display of a system in accordance with the present invention where an image of the real traffic situation of Fig. 5A is shown;
- Fig. 6 shows a schematic flow-chart in accordance with the present invention;
- Fig. 7 shows a schematic flow-chart in accordance with the present invention;
- Fig. 8 shows a schematic flow-chart in accordance with the present invention;
- Fig. 9 shows a schematic map with two objects representing two road-side cameras in accordance with the present invention;
- Fig. 10A shows a schematic diagram of a real traffic situation;
- Fig. 10B shows a schematic diagram of the relevant static image content of Fig. 10A;
- Fig. 10C shows a schematic diagram of the relevant dynamic image content of Fig. 10A;
- Fig. 10D shows a schematic diagram of the "reconstructed" image which is based on or derived from the image of Fig. 10A;
- Fig. 11 shows several different objects which could be used to show on a display the position of a road-side camera;
- Fig. 12 shows a schematic illustration of an image obtained by a road-side camera to which a filter is attached.
- Fig. 1 shows a schematic structural diagram of a system 200 in accordance with the present invention. The system 200 comprises a navigation system 2 inside a vehicle 1 (e.g. a car), a remote system 100 and at least one road-side camera C1. The system 200 furthermore comprises communication means or channels, as indicated in Fig. 1 by arrows. The image information which is transmitted from the road-side camera C1 to the remote system 100 is represented by the reference sign I1. The image information I2 is the information which is directly or indirectly transmitted from the remote system 100 to the navigation system 2. According to the invention, the data size of I2 is smaller than the data size of I1 or than the data size of the image recorded by the road-side camera C1. If the compression is carried out inside the road-side camera C1, the data size of I2 might be about the same as the data size of I1.
- A "navigation system" 2, as herein used, is an electronic processor-based system which for instance enables or supports the driver of a vehicle 1 to find a certain route or to find a location. The navigation system 2 typically has a GPS or other positioning feature which is included or connected. The navigation system 2 may be integrated into the vehicle 1 (pre-installed system), or it may be a portable system (add-on system) which can be used, as needed, in different vehicles. Any portable computer-based system (e.g. a smart phone) in which a navigation application is implemented, or which is able to connect to a server-based or cloud-based navigation system, is also considered to be a navigation system for the purposes of the present invention. A web-based navigation system is also considered to be a navigation system for the purposes of the present invention. In other words, a navigation system 2, as herein used, may be entirely onboard the vehicle 1, or it may be located elsewhere and communicate via radio or other signals with the vehicle, or it may use a combination of these methods.
- The navigation system 2 can be a system presenting 2-dimensional or 3-dimensional maps. It could also be a system 2 where real images are used to provide a more realistic look. The invention can be used in connection with any of these systems 2.
- A "remote system 100" is a single server (i.e. a physical computer dedicated to running a special service) or host, a set-up where two or more servers (e.g. a series or network of computers) are connected, or a cloud-computing environment (e.g. an arrangement where shared resources, software, and information are provided to computers and other devices, such as the navigation system 2). The remote system 100 is a stationary system. At least a part of the navigation system 2 (for instance the receiver and the display 4) is mobile.
- The remote system 100 is configured and programmed to provide essential inventive services across a communication link or network to at least one vehicle 1 comprising a navigation system 2.
- A "road-side camera C1" is either a camera at a fixed location, e.g. attached to a pole, lamp, tree or building, or a vehicle-based camera which keeps a pre-defined position for a certain period of time, e.g. during rush hours. A camera C1 is herein considered to be a device that records images. The camera C1 preferably comprises or is connected to communication means for establishing a direct or indirect communication link to the remote system 100. The road-side camera C1 preferably provides digital image information.
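- To make the data relationship described above concrete, the following minimal sketch models the road-side camera C1, the camera-side image information I1 and the transmitted image information I2, together with the requirement that I2 is not larger than I1 (all class and field names are illustrative assumptions):

```python
from dataclasses import dataclass


@dataclass
class RoadsideCamera:
    camera_id: str
    lat: float
    lon: float
    heading_deg: float        # viewing direction of the camera


@dataclass
class ImageInfo:
    camera_id: str
    payload: bytes            # encoded image data


def prepare_transmission(i1: ImageInfo, compress) -> ImageInfo:
    """Derive the image information I2 (sent to the navigation system 2) from the
    camera-side image information I1; I2 must not be larger than I1."""
    i2 = ImageInfo(camera_id=i1.camera_id, payload=compress(i1.payload))
    assert len(i2.payload) <= len(i1.payload), "I2 must not exceed I1 in size"
    return i2
```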
- Figures 6, 7, and 8 show exemplary schematic flow-charts of the various aspects of the inventive method.
- The invention concerns a method which provides information assisting a driver of a vehicle 1 which is equipped with a navigation system 2. The method comprises the following steps:
- Providing real-time image information I1 to the remote system 100, the real-time image information I1 having been obtained from at least one road-side camera C1 (as illustrated in Fig. 1, for instance).
- Displaying an object 3 (see for instance Figures 2, 3, 4, 5B, 9) on a display 4 of the navigation system 2 which represents the at least one road-side camera C1.
- Selecting a corresponding road-side camera C1 using a user interface of the navigation system 2 (e.g. by touching the object 3 on a touch-sensitive display 4, or by using a keyboard, or by using voice control). This step enables the user of the system 2 to select a particular (corresponding) road-side camera C1 (cf. step S1 in Fig. 6).
- Transmitting image information I2 which is based on the real-time image information I1 provided by the corresponding road-side camera C1 which has been selected (cf. step S3 in Fig. 6). The transmission takes place between the remote system 100, or a relay unit attached or connected to the remote system 100, and the navigation system 2 (as illustrated in Fig. 1, for instance).
- Displaying an image I3 based on the image information I2 on the display 4 in order to provide visual information of a specific section or part of the route R1 (cf. step S6 in Fig. 6). In Fig. 5B a situation or embodiment is shown where an image 5 within an image is shown on the display 4. A split-screen solution could also be used.
- The road-side camera C1 is typically selected by the driver of the vehicle 1 before a specific section or part of a route R1 is approached.
- Before the image information I2 is transmitted, it is made available by the remote system 100 (cf. step S2 in Fig. 6). Before the image information I3 is displayed on the navigation system's display 4, the image information I2 is directly or indirectly (e.g. via a cellular phone) received (cf. step S4 in Fig. 6) and processed (cf. step S5 in Fig. 6) by the navigation system 2.
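- By way of illustration, the client-side flow of steps S1 - S6 (Fig. 6) could be sketched as follows; the object methods and the way the request reaches the remote system 100 are assumptions made only for this sketch:

```python
def run_camera_view(navigation_system, remote_system, display):
    # S1: the driver selects a road-side camera via the object 3 on the display 4
    camera_id = navigation_system.wait_for_camera_selection()

    # S2/S3: the remote system 100 makes the image information I2 available and
    # transmits it (directly or indirectly) to the navigation system 2
    i2 = remote_system.request_image_information(camera_id)

    # S4/S5: the navigation system 2 receives and processes I2
    # (e.g. decoding, scaling to the display resolution, matching the map orientation)
    i3 = navigation_system.process_image_information(i2)

    # S6: the image I3 is shown on the display 4 for the selected route section
    display.show(i3)
```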
- The driver typically does not know where the road-side cameras C1 are positioned and in which direction they are pointing. The object 3, which is displayed on the display 4, thus indicates one or more of the following:
- F1: a viewing direction 102 of a corresponding road-side camera C1 (cf. Fig. 3),
- F2: a line of sight of a corresponding road-side camera C1,
- F3: field of view 101 of a corresponding road-side camera C1 (cf. Fig. 2),
- F4: a range 103 of a corresponding road-side camera C1 (cf. Fig. 4).
- This feature or these features F1 - F4 is/are important since the driver should not be distracted when selecting the right camera C1. In addition, it is important that the driver is able to match, relate or connect the information I3 displayed on the display 4 to the real world. Due to the fact that the object 3 includes one or more of the features F1 - F4, it is easy to interpret the visual information provided by the road-side camera C1.
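- A feature such as F3 (the field of view 101) can be reduced to a small geometric computation before the object 3 is drawn on the map; a minimal sketch, assuming a flat x/y map projection and an aperture of 60 degrees:

```python
import math

def field_of_view_polygon(cam_x, cam_y, heading_deg, fov_deg=60.0,
                          range_m=80.0, steps=8):
    """Return a wedge-shaped polygon (list of (x, y) points) approximating the
    field of view 101 of a road-side camera C1 for rendering on the display 4."""
    points = [(cam_x, cam_y)]
    start = math.radians(heading_deg - fov_deg / 2.0)
    end = math.radians(heading_deg + fov_deg / 2.0)
    for i in range(steps + 1):
        a = start + (end - start) * i / steps
        points.append((cam_x + range_m * math.sin(a),
                       cam_y + range_m * math.cos(a)))
    return points

# Example: a camera at map position (120, 45) looking roughly east
wedge = field_of_view_polygon(120.0, 45.0, heading_deg=90.0)
```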
- The object 3, which is displayed on the display 4, may be temporarily magnified if the driver touches or pre-selects the object 3. This feature can be used in connection with all embodiments.
- Fig. 11 shows several different objects 3.1 through 3.5 which could be used to show on a display 4 the position of a road-side camera C1. Objects 3 are preferred where the object as such indicates a viewing direction. It is obvious when looking at the different objects 3.1 - 3.5 in Fig. 11 that all of them are pointing to the right-hand side. These objects 3.1 - 3.5 are preferred. They can be used in connection with any of the embodiments.
- All embodiments of the invention may comprise additional features, objects or means which help the driver to "read" an image I3. The following features could be used as a single feature, or two or more of these features can be combined:
- F5: when reaching the position of the road-side camera C1, a visual or audible cue can be provided by the navigation system 2.
- F6: when reaching the zone or area which is within the viewing field of a road-side camera C1, the viewing direction 102 and/or the line of sight and/or the field of view 101 and/or the range 103 is/are highlighted or put in the foreground on the display 4.
- F7: in a 3-D navigation system 2 or in a navigation system 2 using real images, the image information I3 (at least the dynamic content) is mapped onto the 3-D image or real image (called overlay mode).
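- Feature F5, for instance, amounts to a simple distance check against the camera position; a minimal sketch, assuming WGS84 coordinates and a threshold of 150 m (both are illustrative assumptions):

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in metres between two WGS84 positions."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_cue_driver(vehicle_pos, camera_pos, threshold_m=150.0):
    """F5: return True when the vehicle 1 reaches the position of the camera C1."""
    return distance_m(*vehicle_pos, *camera_pos) <= threshold_m
```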
- The invention is facilitated by a number of measures which have been taken in order to control the bandwidth or to keep it low. The respective measures are listed below. The actual embodiments of the invention may comprise one or more of these measures.
- The most important inventive measures M1 through M6, which help to keep the bandwidth requirement low, are listed below:
- M1: Real-time image information I1 is only offered locally, that is, a driver can only obtain images of cameras C1 which are along his route R1 or within a certain local area. This measure M1 is preferably combined with an active route-finding process being carried out by the navigation system 2.
- M2: In addition to any of the other measures or instead of the other measures, a distinction is made between static (motionless) image content and dynamic or quasi-dynamic image content (i.e. changing content). This can be done by means of software-implemented edge detection, for instance, which ensures that only limited image data are transmitted for static portions or areas of an image. These data are then used in the vehicle 1 to build or reconstruct an image.
- M3: In addition to any of the other measures or instead of the other measures, software or hardware-based techniques (such as data compression) are employed which ensure that only limited image data or a reduced data volume are transmitted. These data are then used in the vehicle 1 to build or reconstruct an image.
- M4: In addition to any of the other measures or instead of the other measures, software or hardware-based techniques are employed which ensure that dynamic or quasi-dynamic image content is transmitted on request only.
- M5: In addition to any of the other measures or instead of the other measures, software or hardware-based techniques are employed which ensure that real-time image information is transmitted on request only. In this case either the user of the inventive system or the system itself requests real-time image information relevant for or related to a certain route R1.
- M6: A distinction can be made between road-side cameras C2 which currently show important traffic information and road-side cameras C3 where no relevant information is available. This can be achieved in that either an automated software-based process run by the remote system 100 or an operator distinguishes relevant information from irrelevant information. The automated software-based process either uses a pattern recognition scheme, or it processes additional information. The additional information can be provided by radar cameras, induction loops integrated into the road, motion sensors and other auxiliary units. The object 3 could for instance be highlighted using a special color scheme (e.g. a green object versus a red object), or the shape of the object 3 could change. A highlighted object 3 would then indicate that there is relevant image information available.
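- Measure M6 could, for example, be approximated on the remote system 100 by a frame-difference activity score; a minimal sketch, assuming consecutive greyscale frames are available as NumPy arrays and with thresholds chosen purely for illustration:

```python
import numpy as np

def has_relevant_traffic(prev_frame: np.ndarray, curr_frame: np.ndarray,
                         pixel_threshold: int = 25,
                         activity_ratio: float = 0.02) -> bool:
    """Flag a road-side camera as currently showing relevant (i.e. changing)
    traffic when enough pixels differ between two consecutive frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_threshold)
    return changed / diff.size >= activity_ratio
```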
- The real-time image information I1 can be sent through different communication channels from the remote system 100 to the vehicle 1. In the following, a distinction is made between push approaches and pull approaches. In both cases the remote system 100 is considered to be the source and the vehicle 1, respectively the inventive system 300 in the vehicle 1, is considered to be the destination or receiver. The real-time image information I1 is not necessarily sent from the remote system 100 right to the vehicle 1. It is also conceivable that there are systems (relays, routers, switches etc.) in between, such as a computing environment of a mobile phone company or a server of a specialized service provider.
- The following embodiments are based on a series of assumptions. A typical high-resolution road-side camera C1 provides still images which have a size of about 1 MB or more. The ideal size of a still image is, for the purposes of the present invention, considered to be between 4 kB and 500 kB. If one assumes that in a certain city there are 50 road-side cameras C1 - C50 and that the images are processed (e.g. by means of compression or by a separation of real-time image data or dynamic image content from static image content) so that they have 10 kB each, a total of 50 times 10 kB is to be broadcast if all inventive systems 200, no matter where they are in the city, are to receive the images of all 50 road-side cameras C1 - C50. If pictures are obtained from the cameras C1 - C50 or the remote system 100 once per minute, then in a broadcast push approach 500 kB per minute have to be transmitted to all vehicles 1 in a certain region. In this example every system 200, at least in this region, receives new images of all cameras C1 - C50 once per minute. The corresponding transmission frequency can be higher or lower.
- The road-side camera C1 should have a resolution which is better than 0.2 megapixels and preferably more than 0.5 megapixels. This rule applies to all embodiments.
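- The broadcast estimate above can be reproduced in a few lines (the figures are exactly the assumptions stated in the preceding paragraph):

```python
cameras = 50            # road-side cameras C1 - C50 in the city
image_size_kb = 10      # size of one processed image I2
updates_per_minute = 1  # one new picture per camera per minute

broadcast_kb_per_minute = cameras * image_size_kb * updates_per_minute
print(broadcast_kb_per_minute)  # 500 kB per minute to every vehicle in the region
```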
- In a preferred embodiment it is possible to compress, filter or process the image data at the camera side or at the remote system 100 so that the size of the images gets smaller (reduced data volume). It is, however, to be kept in mind that features of an image, when displayed in the vehicle 1, have to be visibly resolved. Such a compression scheme can be applied to all embodiments.
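- One possible realisation of such a compression step is to re-encode the camera frame at decreasing JPEG quality until it falls inside the 4 kB - 500 kB band mentioned above; the sketch below uses the Pillow library, and the quality ladder is an assumption:

```python
from io import BytesIO
from PIL import Image

def compress_to_band(image: Image.Image, max_kb: int = 500, min_quality: int = 20) -> bytes:
    """Re-encode a camera image as JPEG, lowering the quality step by step until
    the encoded size is at most max_kb (reduced data volume, cf. measure M3)."""
    data = b""
    for quality in range(85, min_quality - 1, -5):
        buf = BytesIO()
        image.save(buf, format="JPEG", quality=quality)
        data = buf.getvalue()
        if len(data) <= max_kb * 1024:
            break
    return data  # best effort: lowest allowed quality if the target was not reached
```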
- In a preferred embodiment it is possible to process the image data at the camera side or at the remote system 100 so that license plates or the faces of people are blanked out.
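- Once the regions containing license plates or faces have been located (the detection step itself is outside the scope of this sketch), blanking them out can be as simple as zeroing the corresponding pixel rectangles:

```python
import numpy as np

def blank_regions(frame: np.ndarray, regions) -> np.ndarray:
    """Black out rectangular regions (x, y, width, height) containing license
    plates or faces before the image leaves the camera side or the remote system 100."""
    out = frame.copy()
    for x, y, w, h in regions:
        out[y:y + h, x:x + w] = 0
    return out
```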
- Even more preferred is an embodiment where a separation of real-time image data or dynamic image content from static image content is carried out at the camera side and/or at the remote system 100, so that relevant image information is always made visible whereas less important (e.g. static) image information is less well visible on a display 4. Such a scheme can be applied to all embodiments.
- Even more preferred is an embodiment where the road-side camera carries an optical filter or screen in order to block information which is not considered relevant or which for legal reasons has to be blanked out. A corresponding illustration is provided in Fig. 12. The hashed field 400 represents a filter attached to a road-side camera. This approach also helps to reduce the bandwidth requirements.
- The dynamic image content is what matters the most in the context of the present invention, since the view of a road crossing, for instance, which does not show any moving objects (such as pedestrians or cars) is not as interesting to the driver of the vehicle 1 as an image which shows, for instance, the stop-and-go of vehicles in front of a traffic light. This is one of the reasons why the measure M2 is considered advantageous.
- An example of a possible implementation or embodiment of the measure M2 is schematically illustrated in the sequence of Figures 10A through 10D. A corresponding flow chart is presented in Fig. 7. Fig. 10A shows the real traffic situation, similar to the situation of Fig. 5A. The corresponding image information I1, including details such as windows, doors, trees, pedestrians and the like, can be transmitted from the camera C1 to the remote system 100. An image compression, e.g. a separation of static and dynamic content (measure M2) or a regular data compression (measure M3), could also be carried out by the camera C1 or by a module attached to the camera C1. More preferred is an embodiment where the image compression is carried out by the remote system 100 or by a special hardware and/or software module of the remote system 100 (steps S1.1 and S1.2, Fig. 7). The principle on which measure M2 is based is schematically illustrated in Fig. 10B and 10C. Fig. 10B shows the static image content, which is here reduced to some very basic shapes and elements, such as road markings, outlines of buildings and landmarks, for instance. In the present example, elements and features (such as plants, windows, doors, etc.) which are not considered important are removed. Fig. 10C shows the dynamic image content, such as cars, traffic lights, changing traffic signs, pedestrians, etc.
- The transmission of the static image content does not require much bandwidth. In a vector-based system it would be sufficient to just transmit vector information for lines and edges. In one embodiment of the invention the respective static image content is actually transmitted from the remote system 100 to the navigation system 2. In another embodiment of the invention the respective static image content is stored in the navigation system 2 (e.g. on a CD-ROM or another storage medium). In this case the static information is not required to be transmitted.
- In both embodiments the dynamic content is transmitted to the navigation system 2. The system 2, or a special module attached thereto, maps the dynamic content onto the static content (step S1.3, Fig. 7), no matter whether the static content is locally available or transmitted from the remote system 100. Fig. 10D shows the display 4 with a "reconstructed" image where static and dynamic content has been merged.
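- A minimal sketch of measure M2 as illustrated in Figures 10A - 10D, using simple frame differencing in place of the edge detection mentioned above (the threshold and the data layout are assumptions): the dynamic content is estimated against a static reference image, only the changed pixels are transmitted, and the navigation system 2 pastes them back onto its locally available static content.

```python
import numpy as np

def extract_dynamic_content(frame: np.ndarray, static_reference: np.ndarray,
                            threshold: int = 30):
    """Camera/remote-system side (steps S1.1, S1.2): keep only pixels that differ
    from the static reference, e.g. cars, pedestrians, traffic lights."""
    mask = np.abs(frame.astype(np.int16) - static_reference.astype(np.int16)) > threshold
    return mask, frame * mask          # sparse dynamic content to transmit

def reconstruct_image(static_content: np.ndarray, mask: np.ndarray,
                      dynamic_content: np.ndarray) -> np.ndarray:
    """Navigation-system side (step S1.3): map the dynamic content onto the
    static content to obtain the "reconstructed" image of Fig. 10D."""
    merged = static_content.copy()
    merged[mask] = dynamic_content[mask]
    return merged
```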
- In addition to or instead of any of the other measures M1, M3 - M6, a distinction is made between static (motionless) image content and dynamic or quasi-dynamic image content (i.e. changing content), as described above in connection with Figures 10A - 10D.
- A flow chart is presented in Fig. 7 where the static and dynamic image content are separately handled and transmitted (steps S1.1 and S1.2, Fig. 7). The static and dynamic image content are merged or combined by the navigation system 2, or by a module attached or connected to the system 2 (step S1.3, Fig. 7).
- A flow chart is presented in Fig. 8 where the static image content is not transmitted because it is available at the navigation system 2 (step S2.1, Fig. 8). The static image content could be retrieved from a local storage medium, for instance. The dynamic image content is separately handled and transmitted (steps S2.2 and S2.3, Fig. 8). The static and dynamic image content are merged or combined by the navigation system 2, or by a module attached or connected to the system 2 (step S2.4, Fig. 8).
- In a preferred embodiment, instead of broadcasting all images to all systems 300 in a general push approach, one could divide the whole city area into smaller cells (herein called the cellular or smart push approach). In this case only images of road-side cameras (e.g. the cameras C1 - C10) within a particular cell are transmitted to systems 300 inside the cell or close to this cell. This helps to reduce the overall bandwidth requirement drastically.
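- The cellular or smart push approach can be sketched with a simple rectangular grid over the city area; the cell size and the degree-based grid are assumptions, and a real deployment might instead reuse the cells of the mobile phone network:

```python
def cell_of(lat: float, lon: float, cell_deg: float = 0.01):
    """Map a position to a grid cell of roughly 1 km x 1 km (at mid latitudes)."""
    return (int(lat / cell_deg), int(lon / cell_deg))

def cameras_to_push(vehicle_pos, cameras, cell_deg: float = 0.01):
    """Smart push: only cameras in the vehicle's cell or a neighbouring cell."""
    vr, vc = cell_of(*vehicle_pos, cell_deg)
    selected = []
    for cam_id, lat, lon in cameras:
        cr, cc = cell_of(lat, lon, cell_deg)
        if abs(cr - vr) <= 1 and abs(cc - vc) <= 1:
            selected.append(cam_id)
    return selected
```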
- In another preferred embodiment, only images of road-side cameras (e.g. the cameras C5 - C10) which lie along a route defined or programmed in the navigation system 2 (i.e. if a route-finding process of the navigation system 2 is active) are transmitted to systems 300 inside a vehicle 1. This helps to reduce the overall bandwidth requirement drastically. This means that a driver who is driving from location A to location B using the navigation system 2 and following the route R1, as shown in Fig. 9, would only be enabled to request and receive image information I2 from the road-side camera C2. Image information I2 from the road-side camera C3 is only requestable by a driver whose route passes by the position of the road-side camera C3.
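- Restricting the offer to cameras along the programmed route R1 can be sketched as a distance test against the route polyline; the flat x/y coordinates and the corridor width are assumptions:

```python
import math

def point_to_segment_m(p, a, b):
    """Distance from point p to the segment a-b (all given as (x, y) in metres)."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def cameras_along_route(route, cameras, corridor_m=100.0):
    """Keep only cameras (id, (x, y)) within corridor_m of the route R1 polyline."""
    result = []
    for cam_id, pos in cameras:
        if any(point_to_segment_m(pos, route[i], route[i + 1]) <= corridor_m
               for i in range(len(route) - 1)):
            result.append(cam_id)
    return result
```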
- In a further preferred embodiment, the cell or micro-cell structure of a cellular mobile phone network is used in order to determine whether a system 200 currently is in a certain cell and whether there are any cameras (e.g. the cameras C1 - C10) in the same or in a neighboring cell which are providing real-time image information I1 which could be useful to the driver of a certain vehicle 1. This approach could be used together with the cellular or smart broadcast push approach, or it could be used in connection with a pull approach. According to this pull approach, a system 200 inside or close to a certain cell is enabled to request real-time image information I1 from a certain camera C1 - C10 within or close to the same cell, e.g. by using the system's user interface.
- In another embodiment the pull approach is used without making a pre-selection or the like based on the current vehicle's position inside a cell. This approach is herein called the user-specific pull approach. Here the user is enabled, by a software module of the system 300, to request real-time image information I1 provided by a certain camera CX, no matter where he or his vehicle 1 is located. This real-time image information I1 is directly or indirectly requested from the remote system 100 and sent to the user, vehicle 1 or system 300 using a dedicated downlink, e.g. a mobile phone connection.
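- The user-specific pull approach can use any dedicated downlink; the following sketch issues a plain HTTPS request to a hypothetical endpoint of the remote system 100 (URL, path and response format are purely illustrative assumptions):

```python
import urllib.request

def pull_image_information(camera_id: str,
                           base_url: str = "https://example.invalid/remote-system") -> bytes:
    """User-specific pull: request the image information I2 of camera CX from the
    remote system 100 over a mobile phone connection."""
    url = f"{base_url}/cameras/{camera_id}/image"
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read()   # encoded image information I2
```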
- With current radio broadcast methods, traffic information is transmitted in some countries (e.g. using the Traffic Message Channel: TMC) to radio receivers. TMC is a specific application of the FM Radio Data System (RDS) used for broadcasting real-time traffic and weather information. The TMC information is used in order to allow a dynamic route calculation in case of traffic jams and the like. Real-time image information I1 could be transmitted together with radio signals in a broadcast fashion, but this service should be limited to local radio transmitters, because it would not make much sense for a vehicle 1 in one city to receive images from cameras in other cities.
- With the deployment of digital radio, high-bandwidth channels become available, in particular in urban areas. Digital radio describes radio communications technologies which carry information as a digital signal. Since a digital modulation method is used for the transmission, the vehicle 1 or navigation system 2 in this case comprises a digital demodulator in order to be able to receive and process the digital signals. The digital radio service could be used to transmit real-time image information I1 in a broadcast fashion.
- It is also possible to use mobile phones and stationary transmitters to transmit the real-time image information I1. For this purpose either a regular phone is programmed to receive the information I1, or a separate communications module (e.g. comprising a SIM card) is implemented inside the inventive system 300 or is connectable to the system 300. Such an approach is preferably used for realizing the cellular or smart broadcast push approach or the user-specific pull approach.
- According to the invention, image data messages (image information I2) are received silently and decoded by a car radio, a mobile phone, a PDA, a smart phone or a navigation system 2, and delivered (e.g. made visible) to the driver in a variety of ways.
- The system 300 in all embodiments includes a display 4 or other means or indicators which can be dedicated or shared with the ones already existing in the vehicle 1. Besides visual indicators, the navigation system 2 may also include audible means or other means to inform the driver.
Claims (13)
- Method for providing information assisting a driver of a vehicle (1) which is equipped with a navigation system (2),
  - providing real-time image information (I1) to a remote system (100), said real-time image information (I1) having been obtained from at least one road-side camera (C1),
  - displaying an object (3) on a display (4) of said navigation system (2) which represents at least one road-side camera (C1),
  - selecting a corresponding road-side camera (C1) using a user interface of said navigation system (2),
  - transmitting image information (I2) based on said real-time image information (I1) provided by the corresponding road-side camera (C1) which has been selected, from said remote system (100) to said navigation system (2),
  - providing static image content by using vector information for lines and edges or by using basic shapes and elements,
  - displaying reconstructed image information (I3) based on said image information (I2) and said static image content on said display (4) in order to provide visual information of said specific section of a route (R1).
- Method according to claim 1, wherein said visual information is provided if a route finding process of said navigation system (2) is active.
- Method according to claim 1 or 2, wherein said object (3) which is displayed on said display (4) indicates or represents one or more of the following:- a viewing direction (102) of a corresponding road-side camera (C1),- a line of sight of a corresponding road-side camera (C1),- field of view (101) of a corresponding road-side camera (C1),- a range (103) of a corresponding road-side camera (C1).
- Method according to claim 1, 2 or 3, wherein said display (4) is a touch-sensitive display (4) and wherein said object (3), which is displayed on said display (4), is temporarily magnified if the driver touches or pre-selects the object.
- Method according to one of the claims 1 through 4, wherein said image information (I2) is displayed in an orientation which matches the actual orientation of a map of said specific section of the route (R1) shown on said display (4).
- Method according to one of the claims 1 through 5, comprising the step
  - displaying said image information (I2) in an overlay mode above an artificial map on said display (4), or,
  - in a dual-screen application, displaying said image information (I2) in a separate window (5) or frame of said display (4) while displaying a map in another window or frame of said display (4).
- Method according to claim 1, wherein said static image content is provided by said navigation system (2) or by a portable computing system connected to said navigation system (2), and wherein said image information (I2) is provided via said remote system (100).
- Method according to claim 1, wherein said static image content and said real-time image data information are provided via said remote system (100).
- Method according to one of the preceding claims, wherein said image information (I2) is obtained by the application of a compression scheme on said real-time image information (I1).
- System (200) for use in a vehicle (1), said system comprising:
  - a navigation system (2),
  - a display (4) for displaying information provided by said navigation system (2),
  - a receiver for receiving image information (I2) from a remote system (100) based on real-time image information (I1),
  - a software module causing an object (3) representing a road-side camera (C1) to be displayed on said display (4),
  - a user interface for interaction of a driver with said system (200), said user interface enabling the driver to select a road-side camera (C1), said system (200) being enabled to receive said image information (I2) from said remote system (100), wherein said display (4) displays a reconstructed image information (I3) where static image content provided by the navigation system (2) and the real-time image information (I1) were merged.
- System of claim 10, wherein said image information (I2) is requestable by means of a manual interaction with said user interface.
- System of claim 10 or 11, wherein said display (4) is a touch-sensitive display (4) and wherein said software module enables a user to request said image information (I2) by activation or selection of said object (3).
- A remote system (100) for offering information assisting a driver of a vehicle (1) which is equipped with a navigation system (2), said system (100) comprising
  - a computing server,
  - a storage system,
  - a communication link for receiving image information (I1) from at least one road-side camera,
  - a communication link for sending image information (I2) based on real-time image information (I1), such as cars, traffic lights, changing traffic signs, pedestrians, etc., to a navigation system (2),
  - a software module for processing said image information (I2),
  - wherein said image information (I2) representing said real-time image information (I1) is transmitted via said communication link to said navigation system (2),
  - and wherein static image content is provided to said navigation system (2) by using vector information for lines and edges or by using basic shapes and elements.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10189342.8A EP2447925B1 (en) | 2010-10-29 | 2010-10-29 | Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10189342.8A EP2447925B1 (en) | 2010-10-29 | 2010-10-29 | Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2447925A1 (en) | 2012-05-02 |
EP2447925B1 (en) | 2017-05-17 |
Family
ID=43629057
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP10189342.8A Not-in-force EP2447925B1 (en) | 2010-10-29 | 2010-10-29 | Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle |
Country Status (1)
Country | Link |
---|---|
EP (1) | EP2447925B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11351932B1 (en) * | 2021-01-22 | 2022-06-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Vehicles and methods for generating and displaying composite images |
CN114666382A (en) * | 2022-03-17 | 2022-06-24 | 北京斯年智驾科技有限公司 | Parallel driving system for automatic driving semi-mounted collecting card |
CN115731707B (en) * | 2022-11-14 | 2024-03-19 | 东南大学 | Highway vehicle traffic control method and system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100386752B1 (en) * | 2000-04-24 | 2003-06-09 | 김석배 | Navigation system of vehicle using live image |
GB0523512D0 (en) * | 2005-11-18 | 2005-12-28 | Applied Generics Ltd | Enhancing traffic and navigation information with visual and audio data |
- 2010-10-29: EP application EP10189342.8A, patent EP2447925B1 (en), not in force
Also Published As
Publication number | Publication date |
---|---|
EP2447925A1 (en) | 2012-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9406225B2 (en) | Traffic data services without navigation system | |
KR101177386B1 (en) | Method and apparatus for providing transportation status information and using it | |
CN103218910B (en) | System and method for operation of traffic information | |
NL1030393C2 (en) | Map data updating method, for use in navigation system, involves converting location information into same format as map data, and determining whether map data has changed using location information | |
US7880645B2 (en) | Method and apparatus for providing and using public transportation information containing bus stop-connected information | |
US8392099B2 (en) | Method of providing detail information using multimedia based traffic and travel information message and terminal for executing the same | |
JP2009539173A (en) | Method and apparatus for providing traffic information by lane and using the information | |
JP2009541884A (en) | Method and apparatus for transmitting vehicle related information in and from a vehicle | |
CN102881157A (en) | Individualized traffic guidance method and individualized traffic guidance system on basis of mobile terminal display | |
KR20070077020A (en) | Method and apparatus for providing transport information and using it thereof | |
CN106104566A (en) | By system in a vehicle | |
CN102436737A (en) | Road condition sharing system and method based on wireless network and photos | |
JP2021500642A (en) | Methods, computer programs, and systems for transferring image data captured by in-vehicle cameras | |
WO2016138942A1 (en) | A vehicle assistance system | |
EP2447925B1 (en) | Method for providing information assisting a driver of a vehicle and a corresponding system for use in and remote of a vehicle | |
MX2007015218A (en) | Providing traffic information including composite links. | |
WO2015001677A1 (en) | Safety assistance system and safety assistance device | |
KR100873191B1 (en) | Method and apparatus for providing traffic and travel information by synopsis map | |
CN105091895A (en) | Concern prompt system and method | |
JP4581674B2 (en) | Route guidance system and route guidance method | |
JP6712753B2 (en) | Communication device and communication system | |
CN116972873A (en) | Navigation information display method, apparatus, device, storage medium and program product | |
KR101448895B1 (en) | Traffic light control system based on the TPEG information | |
Chu et al. | Traffic and navigation support through an automobile heads up display (a-HUD) | |
CN113034943A (en) | Holographic intersection video display system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
17P | Request for examination filed |
Effective date: 20121029 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: HEUSCH, CHRISTIAN |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: HEUSCH, CHRISTIAN |
|
17Q | First examination report despatched |
Effective date: 20160210 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20161212 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 895104 Country of ref document: AT Kind code of ref document: T Effective date: 20170615 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602010042378 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20170517 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 895104 Country of ref document: AT Kind code of ref document: T Effective date: 20170517 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: AT, Effective date: 20170517
Ref country code: LT, Effective date: 20170517
Ref country code: ES, Effective date: 20170517
Ref country code: HR, Effective date: 20170517
Ref country code: NO, Effective date: 20170817
Ref country code: GR, Effective date: 20170818
Ref country code: FI, Effective date: 20170517 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: SE, Effective date: 20170517
Ref country code: BG, Effective date: 20170817
Ref country code: RS, Effective date: 20170517
Ref country code: LV, Effective date: 20170517
Ref country code: NL, Effective date: 20170517
Ref country code: PL, Effective date: 20170517
Ref country code: IS, Effective date: 20170917 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: SK, Effective date: 20170517
Ref country code: CZ, Effective date: 20170517
Ref country code: DK, Effective date: 20170517
Ref country code: EE, Effective date: 20170517
Ref country code: RO, Effective date: 20170517 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602010042378 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: IT, Effective date: 20170517
Ref country code: SM, Effective date: 20170517 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20180220 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: MC, Effective date: 20170517
Ref country code: SI, Effective date: 20170517 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Ref country code: LU, Effective date: 20171029
Ref country code: LI, Effective date: 20171031
Ref country code: CH, Effective date: 20171031 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20171031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20171031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20171029 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 9 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20171029 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20101029 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170517 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170517 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170517 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170517 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170517 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20201022 Year of fee payment: 11 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB, Payment date: 20211022, Year of fee payment: 12
Ref country code: DE, Payment date: 20211020, Year of fee payment: 12 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211031 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602010042378 Country of ref document: DE |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20221029 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230503 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20221029 |