AU2010355231B2 - Method and control unit for controlling a display of a proximity warning system - Google Patents
- Publication number
- AU2010355231B2
- Authority
- AU
- Australia
- Prior art keywords
- cameras
- display
- positional information
- camera
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The present invention relates to a method and a control unit for controlling a display (19) of a proximity warning system. Vehicles and other objects (4a, 4b, 4c, 5, 6, 7, 8), for example in a surface mine (1), are equipped with cameras (12) for providing images of different scenes. A control unit (13) of such an object (4a, 4b, 4c, 5, 6, 7, 8) receives a signal representing positional information of such object from a radio based positioning receiver (11). Dependent on the positional information, a subset of at least one camera (12) is selected, and a control signal is provided for the display (19) to display images provided by the selected subset of one or more cameras (12). By such a method, the most relevant scene in terms of collision avoidance can be displayed to the operator.
Description
Method and control unit for controlling a display of a proximity warning system

Field of Invention

The present invention relates to a method for controlling a display of a proximity warning system; a computer program element; a control unit for controlling a display of a proximity warning system; a monitoring system; and a movable object. For example, the invention relates to a method and a control unit for controlling a display of a proximity warning system.

Background of Invention

Surface mines and similar sites or areas are generally operated by means of a large number of vehicles, some of which may be exceedingly large and difficult to maneuver and have very limited visibility for the operator. Collision and/or proximity warning systems are established for conventional automobiles as well as for extra-large vehicles.

Proximity warning systems in the form of park distance control systems make use of ultrasonic sensors located in the bumpers of a car. More sophisticated systems may rely on different sensors such as three-dimensional distance cameras as proposed in WO 2004/021546 A2. There, it is suggested to provide at least a forward, a backward and a sideward looking camera at a passenger car.

For extra-large vehicles used in mining, WO 2004/047047 A2 suggests using satellite supported radio positioning receivers on board the vehicles and other objects, such as cranes, for generating proximity warnings in order to reduce the risk of collisions. Another approach based on GNSS receivers is disclosed in the International Application No. PCT/CH2009/000200, incorporated herein by reference.
Other approaches for extra-large vehicles are introduced in "Avoiding accidents with mining vehicles", retrieved from http://www.flir.com/uploadedFiles/Eurasia/MMC/ApplStories/AS_0020EN.pdf on February 2, 2010. Sensors for avoiding collisions may include radar systems, conventional cameras or thermal imaging cameras.

In non-conventional types of vehicles such as the vehicles used in mining, each camera may display its image on a display installed in the driving cab. The more cameras there are available, the more image information the driver is exposed to, such that the driver may be distracted by images not relevant for collision avoidance. Or, the driver may be overstrained by monitoring the output of all cameras available.

It is generally desirable to overcome or ameliorate one or more of the above described difficulties, or to at least provide a useful alternative.

Summary of Invention

According to the present invention, there is provided a method for controlling a display of a proximity warning system, comprising:

receiving a signal representing first positional information of an object from a radio based positioning receiver;

dependent on the positional information, selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes; and

providing a control signal for the display to display images provided by the selected subset of one or more cameras,

wherein the subset of one or more cameras is selected subject to the positional information and subject to location information of stationary objects stored in an electronic map.

According to the present invention, there is also provided a computer program element comprising computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform the above-described method.

According to the present invention, there is also provided a control unit for controlling a display of a proximity warning system, comprising a receiving unit for receiving a signal representing positional information of an object from a radio based positioning receiver, a selection unit for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes dependent on the positional information and subject to location information of stationary objects stored in an electronic map, and an output for providing a control signal to a display for displaying images provided by the selected subset of one or more cameras.

According to the present invention, there is also provided a monitoring system, comprising the above-described control unit, a display, and at least two cameras for providing images of different scenes.

According to the present invention, there is also provided a movable object comprising the above-described monitoring system, wherein the cameras are attached to different locations of the object, and wherein the movable object is one of a vehicle, a crane, a dragline, a haul truck, a digger and a shovel.

In this respect, it is desired to improve the means in a multi-camera based proximity warning system for drawing the attention of the operator to the most relevant camera output(s).

According to a preferred embodiment of the present invention, a method is provided for controlling a display according to the features of independent claim 1.
Accordingly, a signal representing positional information is received from a radio based positioning receiver. A subset of at least one camera out of at least two cameras for providing images of different scenes is selected dependent on the positional information. A control signal is provided for the display to display images provided by the selected subset of one or more cameras.

According to another preferred embodiment of the present invention, a control unit is provided for controlling a display according to the features of independent claim 20. Such control unit comprises a receiving unit for receiving a signal representing positional information of an object from a radio based positioning receiver. A selection unit is designed for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes subject to the positional information. At an output of the control unit, a control signal is provided to display images provided by the selected subset of one or more cameras.

The basic idea of the present invention is to provide an aid to the operator as to which of the camera outputs to look at by means of prioritizing such camera(s). For this reason, a GNSS receiver is used for determining the present location of the object the cameras are assigned to and/or the location of an object different to the one the cameras are assigned to. The location of an object, specifically presented as coordinates in a chosen coordinate system, may advantageously be subsumed under the term "positional information" as presently used.

In an advantageous scenario, the GNSS receiver and the cameras are attached to the same object. An electronic map of preferably stationary objects being critical to traffic on a site may be stored, and the current position of the object, as identified by the GNSS receiver, may be compared or otherwise put into relation to the position of one or more objects listed in such map. For example, in case of the distance between the object and a stationary object listed in the map being or possibly becoming critical, e.g. as determined by subtracting the two location data from each other, it is decided to which one of the cameras to draw the operator's attention, which preferably is the camera that looks into the direction the critical object is located at.

In another advantageous scenario, the GNSS receiver and the cameras are attached to the same object, and other objects including movable objects may be equipped with GNSS receivers, too, for determining their respective positions and/or trajectories. Such positional information is broadcast or individually transmitted by these objects to other objects on the site being equipped with a corresponding receiver. By means of such positional information shared amongst objects on the site, the direction and distance, and also any approaching velocity, may be determined at the present object with respect to one or more other objects around. As soon as one or more of these parameters becomes critical in terms of proximity and/or a collision scenario, it is determined again to which of the cameras the operator's attention should be drawn, which preferably is the camera that looks into the direction the critical object is located at. Summarizing, by means of other objects, e.g.
operated and located on the same site, being equipped with GNSS receivers, too, and an infrastructure enabling these objects to exchange information about their current location, information about the existence, the distance to, and the direction of such other objects in the vicinity can be generated.

In another advantageous scenario, the GNSS receiver and the cameras are attached to different objects, the object comprising the cameras not necessarily including a GNSS receiver. However, other objects on a site may be equipped with a GNSS receiver and broadcast or otherwise transmit their positional information to other objects on the site being equipped with a corresponding receiving unit. In case the object equipped with the cameras receives such positional information via its receiving unit, the positional information may be evaluated, and the direction and/or the distance and/or the approaching velocity of near-by or distant objects may be considered as critical in terms of a collision or a pre-collision scenario. Again, the selection step can be implemented the same way as described above, and the display is controlled such that, for the operator of the object the cameras are attached to, emphasis is put on the one or more cameras looking into the direction in which another object is detected.
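As an illustration only, the evaluation of such shared positional information could look as in the following minimal Python sketch, which derives distance, bearing and approaching velocity from two consecutive position fixes of the present object and of another object. The function name, the planar east/north coordinate frame and the fixed sampling interval are assumptions for this sketch and are not prescribed by the patent.

```python
import math

def relative_kinematics(own_now, own_prev, other_now, other_prev, dt):
    """Distance, bearing and approaching velocity of another object,
    computed from (east, north) position fixes in metres taken dt
    seconds apart. Hypothetical helper, not taken from the patent."""
    dx = other_now[0] - own_now[0]                        # east offset
    dy = other_now[1] - own_now[1]                        # north offset
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0    # 0 deg = due north
    prev_distance = math.hypot(other_prev[0] - own_prev[0],
                               other_prev[1] - own_prev[1])
    approaching_velocity = (prev_distance - distance) / dt  # > 0 when closing
    return distance, bearing, approaching_velocity
```

Any of the three returned parameters becoming critical could then trigger the camera selection described below.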
The general purpose of the selection step is to make the operator focus on the one or more cameras by which a potential danger is currently being filmed, under the assumption that there are at least two cameras available filming different scenes, i.e. preferably different scenes around the object the cameras are attached to. Consequently, it is ensured that the image information being most relevant, especially in terms of proximity warning including collision avoidance, is displayed on the display. The determination as to which one(s) of the cameras currently monitor(s) the most relevant scene is performed by a selection unit comprised in the control unit. In particular, the location information provided by a GNSS receiver is analyzed in terms of proximity to potentially dangerous objects.

By automatically selecting the camera currently monitoring the scene of most interest and by displaying image information delivered from this camera, the personnel in charge of a safe operation of such object, being e.g. a vehicle, may not be distracted by a multitude of image information but instead may focus on the most relevant image information being displayed.

For advantageous embodiments, reference is made to the dependent claims. It is noted that embodiments referred to or claimed only in connection with the method are deemed to be disclosed in connection with the apparatus, too, and vice versa.

Brief Description of the Drawings

Preferred embodiments of the present invention are hereafter described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

Figure 1 shows a schematic representation of a mining site,

Figure 2 shows a block diagram of a monitoring system according to an embodiment of the present invention,

Figure 3 shows a top view on a schematic vehicle with cameras mounted according to an embodiment of the present invention,

Figure 4 shows a display,

Figure 5 shows a block diagram of another monitoring system according to an embodiment of the present invention, and

Figure 6 shows another display.

Detailed Description of Preferred Embodiments of the Invention

In the present application, an "image" is understood as being the output of a camera filming a scene. This can be a camera working in the visible range of light but also a camera working in the infrared range. Such image typically visualizes the scene on a display. When talking about different images, it is inherently understood that these images are generated by different cameras, typically simultaneously. In this respect, "image information" may include any information provided by such camera, and, in particular, the image itself.

The cameras used provide images of "different scenes". A scene is "different" to another scene whenever the cameras involved do not shoot or scan the same perspective. Cameras may not shoot the same perspective, for example, when they are attached to different locations of an object. In the context of the present application, it is preferred that the at least two cameras are mounted on the same object, which may be a movable object such as a vehicle, or a stationary object such as a building. In such scenario, it is preferred that the cameras are arranged such that they scan different sides of the object they are attached to in order to detect other objects in proximity at different or even all sides of the object.

A "section" assigned to a camera is understood as - e.g.
when shooting with a camera horizontally - the horizontal area in front of the camera in which the camera is able to monitor scenes, and that consequently can be displayed in an image. Typically, a section of a camera may include a sector.

The "control unit" may be embodied in hardware, in software or both, and may be embodied in a single device, or its functions may be decentralized. Its functional building block, the "selection unit", may also be embodied in hardware, in software or both.

The "display" may have the form of a single display for displaying images from a single source, or may allow displaying images from many different sources, i.e. cameras, simultaneously. The "display" also encompasses the totality of a multitude of separated displays which are, for example, distributed in the driver's cab of a vehicle. Summarizing, the display includes any displaying means for displaying image information delivered by the cameras.

The "control signal to display images" triggers at least displaying the image selected for displaying. The control signal may evoke additional action subject to what is displayed during the normal mode of operation, i.e. the default mode when there is no object in the vicinity detectable: If, for example, the display regularly shows images of a single camera source only, the control signal may cause the display to switch from displaying images from the current camera source to displaying images from the camera source selected according to the present idea. If the current image source by chance coincides with the selected image source, there may be no change visible to the monitoring person. In some embodiments, the control signal causes the selected images to be highlighted for drawing attention to the subject images, e.g. by a flashing frame, or other means. If, for example, the display by default displays images from various sources, the control signal may cause the entire display to now display images only from the selected source. Or, the control signal may cause images from other sources to be shut down or completely masked or visually downsized in order to emphasize the selected images. The selected image may, as indicated, claim the entire display screen, or remain in an unchanged image size on the screen. The control signal may include zooming in on the selected image. The control signal may additionally cause acoustic warnings to be issued. The control signal may either comprise the selected image itself, provided the images are supplied by the cameras to the control unit, or it may cause the subject cameras to directly route the requested image to the display, or it may cause the display to accept only display information from the camera as selected. All the above holds true also for the selection of multiple images if appropriate.

A "warning system" and a corresponding "warning" activity may refer to any suitable activity for drawing the attention of the driver or operator to what might be identified as a scene that may become critical in terms of collision or proximity, including selecting an image to be displayed. Such warning system may primarily include the display which the cameras supply with images, but may additionally include acoustic means such as a horn, a diaphone or a speaker, and possibly other visual means such as one or more LEDs, a flashlight, etc.
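The possible reactions of the display to the control signal, as described above, could be encoded for instance as follows. This is a sketch under assumed names (DisplayAction, build_control_signal) and an assumed two-threshold policy; the patent leaves the concrete encoding open.

```python
from enum import Enum, auto

class DisplayAction(Enum):
    SHOW_ALL = auto()    # default mode: all camera feeds displayed
    HIGHLIGHT = auto()   # selected feed emphasized, e.g. by a flashing frame
    EXCLUSIVE = auto()   # selected feed claims the entire screen

def build_control_signal(selected_cameras, critical):
    """Compose a control signal for the display (illustrative sketch).

    selected_cameras -- identifiers of the selected subset, possibly empty
    critical         -- True once a second, closer-range threshold is met
    """
    if not selected_cameras:
        return {"action": DisplayAction.SHOW_ALL,
                "cameras": [], "acoustic_warning": False}
    return {"action": DisplayAction.EXCLUSIVE if critical
                      else DisplayAction.HIGHLIGHT,
            "cameras": list(selected_cameras),
            "acoustic_warning": critical}   # additional acoustic warning
```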
The warning character of the images displayed may be especially emphasized by displaying the images intermittently, or by alternating between the image information and its inverse colors, or by overlaying the image information with visual warning symbols. Any warning in addition to the warning provided by the bare display of the images or selected images may be issued in combination with the control signal, such that the control signal may activate such additional warnings, too. In other embodiments, a control signal separate from the control signal for the display may be issued subject to range information derived from the positional information delivered by the one or more GNSS receivers. For example, a first control signal for the display may be issued based on a first threshold condition for the object still being distant with respect to the present object, and a separate control signal for an acoustic warning element may be issued based on a second threshold condition for the object being very close with respect to the present object.

The term "radio based positioning system" stands for a GNSS or for any other type of positioning system based on radio signals, such as a pseudolite system. The term "GNSS" stands for "Global Navigation Satellite System" and encompasses all satellite based navigation systems, including GPS and Galileo. A "receiver" is a receiver designed for receiving information from satellites and for determining its position subject to the signals received.

A "movable object" is any object that can change and is expected to change its position and/or orientation or configuration in space. It may e.g. be a truck or any other vehicle that moves from place to place and changes its orientation with respect to the general north-south direction, e.g. by steering, or it may be an object positioned at a fixed location but able to rotate about its axis or to change its physical configuration, e.g. by extending an arm, in such a manner that the volume of safety space attributed to it varies in significant manner.

Fig. 1 schematically depicts a site 1, such as a surface mine. Typically, such a site covers a large area, in the case of a surface mine e.g. in the range of square kilometers, with a network of roads 2 and other traffic ways, such as rails 3. A plurality of objects is present in the mine, such as:

- Large vehicles, such as haul trucks 4a, cranes 4b or diggers 4c. Vehicles of this type may easily weigh several 100 tons, and they are generally difficult to control, have very large braking distances, and a large number of blind spots that the driver is unable to visually monitor without monitoring cameras.

- Medium sized vehicles 5, such as regular trucks. These vehicles are easier to control, but they still have several blind spots and require a skilled driver.

- Small vehicles 6. Typically, vehicles of this type weigh 3 tons or less. They comprise passenger vehicles and small lorries.

- Trains 7.

A further type of object within the mine is comprised of stationary obstacles, such as temporary or permanent buildings 9, open pits, boulders, non-movable excavators, stationary cranes, deposits, etc.

The risk of accidents in such an environment is high. In particular, the large sized vehicles can easily collide with other vehicles, or obstacles.
For this reason, objects according to an embodiment present in a mine 1 and subject to potential collision may be equipped with at least one GNSS receiver 11, a control unit per object, at least two cameras (not shown in Fig. 1) and a display per object (not shown in Fig. 1). Large objects may provide more than one GNSS receiver 11 per object, as shown in Fig. 1. The entirety of these elements per object for generating a proximity warning is called a monitoring system. The GNSS receivers 11 interact with satellites 30 for determining the positional information of the object they are mounted to.

Figure 2 illustrates a block diagram of a monitoring system including a control unit 13 according to an embodiment of the present invention. A receiver 17 of the control unit 13 is connected to cameras 12. An output 16 of the control unit 13 is connected to a display 19 and a beeper as warning means. Both connections may be implemented as wireless connections or as wired connections. One or more connections can be implemented via bus connections. Each camera 12 delivers a series of images with respect to the scene monitored by the respective camera 12. Preferably, each of the cameras 12 looks into a different direction for monitoring different scenes with respect to the object these cameras are attached to.

The monitoring system further comprises a radio based positioning receiver 11, attached to the present object. The receiver 11 provides a signal comprising positional information, i.e. the position of the present object, determined in combination with satellites 30 as shown in Figure 1. Such signal may be received by a receiving unit 15 in the control unit 13.

The control unit 13 comprises a microprocessor system 14, which controls the operations of the control unit 13. A memory 18 comprises programs as well as various parameters, such as unique identifiers of the cameras. Such programs may comprise instructions for evaluating the positional information, and for selecting a subset of cameras currently providing the most significant image information.

The radio based positioning receiver 11 may provide positional information of the location it is situated at, which represents the location of the object it is attached to. Provided that other moving or stationary objects on the site are equipped with such receivers 11, too, the positional information related to the various objects may be shared between the control units of these objects, such that by comparing positional information stemming from positioning receivers located on different objects, proximity and even an approach can be detected. For further details, reference is made to PCT/CH2009/000394, which is incorporated herein by reference.

Position information of the present object provided by the radio based positioning receiver 11 is transferred to the control unit 13 and evaluated there. Advantageously, such evaluation takes into account positional information received from other objects, gathered by their own radio based positioning receivers and transmitted e.g. by a wireless interface not shown in Figure 2. By way of evaluating the positional information from these different sources, a proximity situation may be detected.
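A minimal sketch of such an evaluation step, assuming planar (east, north) coordinates in metres, is given below; detect_proximity and warn_radius are illustrative names, and the patent does not prescribe particular data structures.

```python
import math

def detect_proximity(own_fix, received_fixes, map_locations, warn_radius):
    """Return (distance, position) pairs, nearest first, for every object
    within warn_radius of the present object. received_fixes stem from
    other objects' positioning receivers; map_locations are stationary
    objects taken from the electronic map (map 40 in Figure 2)."""
    hazards = []
    for pos in list(received_fixes) + list(map_locations):
        distance = math.hypot(pos[0] - own_fix[0], pos[1] - own_fix[1])
        if distance <= warn_radius:
            hazards.append((distance, pos))
    return sorted(hazards)
```

The nearest entries, if any, would then drive the camera selection described next.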
If such a proximity situation is detected by means of the positional information, a control signal may be issued which activates displaying the image from the camera looking into the direction where the proximate object is located. The selected image represents the output of the camera that currently films the proximate object, which is of most interest to be monitored by the operator in order to avoid a collision. This is why image information stemming from this camera is emphasized in being presented to the personnel via the display.

Figure 2 shows an electronic map 40 stored in the control unit 13, which holds location information significant of stationary objects located on the site where the monitoring system of Figure 2 is in use. The positional information supplied by the GNSS receiver 11 is compared or otherwise put in relation to the location information of the stationary objects. In case sufficient proximity or an approach is detected between these objects, the camera 12 looking into the direction of the stationary object is selected for displaying its image exclusively on the display 19.

In another embodiment, the object is equipped with another sensor (not shown) for measuring the distance to another object, such as a radio detection and ranging device, a light detection and ranging device, or a sound detection and ranging device. A signal is received from such sensor, and the subset of one or more cameras additionally may be selected based on the distance information provided by such sensor. There may be multiple sensors arranged at different sides of a vehicle. These sensors may operate for detecting near-by objects, and in particular objects not tagged with a GNSS receiver, by that providing additional information on the surroundings of the vehicle. Such sensors may individually trigger the selection of the camera(s) through the control unit 13 and preferably cover similar sectors as the cameras.

Figure 3 illustrates a schematic top view on a vehicle 6 equipped with four cameras 12, one located at each side of the vehicle 6, and a single GNSS receiver 11. Sections monitored by each camera 12 are indicated by sector lines and referred to by 121. This makes each camera 12 scan a different scene at each side of the vehicle 6. Alternatively, the cameras 12 can be located at the edges of the vehicle 6. Both arrangements are beneficial for covering a large area in the vicinity of the object for proximity including collision detection purposes.

Provided that second positional information is received from an object different to the present vehicle 6, the selection of the camera may be based on the positional information with respect to the present vehicle 6 and such second positional information. Analyzing the positional information of both of the objects may allow identification of the direction the other object is located at with respect to the vehicle 6, and the distance between the vehicle and the other object. In case the other object is located at a position 200 to the left hand side of the vehicle 6, the section 121 of the left hand camera 12 is identified as the relevant section 121 when mapping the position of the other object 200 to the sections 121 of the cameras 12 of the vehicle. For such mapping, it is beneficial to permanently monitor the orientation of the vehicle 6, which may alter when moving the vehicle. This may involve e.g.
a compass or any other means for determining the orientation of the vehicle with respect to the coordinate system the GNSS makes use of. The identified section 121 makes the associated camera 12 the preferred camera for selection. As a result, this camera 12 will exclusively provide images of this proximity situation to the operator, provided the distance to the object 200 is not so large that any selection is suppressed. The first and second positional information may be used for determining the distance between the other object and the vehicle. The distance information may be included in the selection step, and the image of the camera corresponding to the identified section may only be selected when the determined distance between the objects is below a given threshold. Otherwise, it is assumed that the other object still is too far away for justifying a warning to the operator.

Given that a third object 300 is in proximity to the vehicle 6, and given that the second object 200 still is at its position as illustrated in Figure 3, the position of the third object 300 may be determined with respect to the sections 121 of the cameras 12 of the vehicle 6. Hence, the section 121 to the right hand side of the vehicle 6 is identified as the section the object 300 maps/falls into. The right hand side camera 12 is associated to this section 121. For this example, the object 200 now is assumed to be at a distance from the vehicle 6 which justifies issuing a warning to the operator.

Subject to the display/warning strategy, both cameras, i.e. the left hand and the right hand camera 12, may be selected for delivering images to the display, e.g. compared to a default display mode where all four cameras 12 show their images on the display. However, following another strategy, only the object closest to the vehicle 6 shall be displayed. By determining the distances between the vehicle 6 and the objects 200 and 300, only the camera looking toward the closest object will be allowed to display the scene it monitors, i.e. the camera 12 to the right hand side, as the object 300 is closer to the vehicle 6 than the object 200.

In the above examples, the radio based positioning receiver 11 always is present at the vehicle 6 / object the cameras are attached to. In another embodiment, no such radio based positioning receiver 11 is attached to the object holding the cameras. Instead, the selection of cameras only relies on positional information received from other objects. Such positional information may be sufficient for selecting the one or more cameras by a mapping step equivalent to the one described in connection with the embodiment above. This holds for other objects providing their position information not in an absolute measure but e.g. in relation to the present object, or to any other known object. Or, preferably, means other than radio based positioning means may be provided for allowing an assessment of the position of the other object with respect to the own position. In case the own position is a priori rather limited to a small area, e.g. when the vehicle may move only within a limited radius, even no such additional means are needed, as the own position may be known in advance, stored in the control unit and used for putting the position of the other object, provided in absolute coordinates, into relation with its own position.
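To make the mapping step concrete, the following sketch maps the positions of other objects into one of the four camera sections of Figure 3, corrects for the vehicle orientation as delivered e.g. by a compass, and applies the distance threshold and the "closest object only" strategy discussed above. The 90-degree section boundaries and all names (SECTIONS, select_camera, etc.) are assumptions for this sketch, not taken from the patent.

```python
import math

# Four cameras as in Figure 3, one per side of the vehicle; sections are
# given as bearing ranges in the vehicle frame (illustrative boundaries).
SECTIONS = {
    "front": (315.0, 45.0),
    "right": (45.0, 135.0),
    "rear":  (135.0, 225.0),
    "left":  (225.0, 315.0),
}

def bearing_in_vehicle_frame(own_pos, heading_deg, other_pos):
    """Bearing of other_pos seen from own_pos, corrected by the vehicle
    heading so that 0 degrees means straight ahead."""
    dx, dy = other_pos[0] - own_pos[0], other_pos[1] - own_pos[1]
    absolute = math.degrees(math.atan2(dx, dy)) % 360.0   # 0 deg = north
    return (absolute - heading_deg) % 360.0

def select_camera(own_pos, heading_deg, other_positions, threshold):
    """Select the camera whose section the closest object maps into,
    or None if every object is farther away than threshold metres."""
    candidates = []
    for pos in other_positions:
        distance = math.hypot(pos[0] - own_pos[0], pos[1] - own_pos[1])
        if distance < threshold:           # farther objects: no warning yet
            candidates.append(
                (distance, bearing_in_vehicle_frame(own_pos, heading_deg, pos)))
    if not candidates:
        return None                        # stay in the default display mode
    _, rel = min(candidates)               # "closest object only" strategy
    for camera, (lo, hi) in SECTIONS.items():
        if (lo <= rel < hi) if lo < hi else (rel >= lo or rel < hi):
            return camera
```

With the constellation of Figure 3, i.e. object 200 to the left and object 300 closer on the right hand side, select_camera would return the right hand camera.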
The display 19 in Figure 4 represents a flat panel display offering display of images from e.g. 8 cameras across its screen. Once the control signal is received from the control unit 13, and provided the control signal identifies only one camera 12 for providing the image information most relevant to be displayed, the entire screen of Figure 4 may be reserved for showing the subject image information. In Figure 4, the screen of the display 19 is divided, and the image information selected is displayed on portion 20 of the display. Portion 21 may be reserved for issuing visual warnings, such as a bold "DANGER" symbol or text or other kinds of visual warnings.

The block diagram of Figure 5 differs from the block diagram of Figure 2 only in the way the control signal affects the control of the display. Instead of the control signal carrying the image information itself, the control signal now acts on AND gates 22, each of which AND gates is connected with one of the cameras 12. By activating one of the AND gates by a corresponding control signal, the subject AND gate allows the associated camera 12 to provide image information to the display 19, while, for example, all the other AND gates are blocked and do not allow for displaying image information from other cameras 12. There is no need for providing a receiver 17 for the image information in the control unit 13.

Figure 6 provides another schematic representation of a display 19, which display 19 is divided into four sub-displays 191 - 194, each sub-display 191 - 194 permanently displaying information from a camera assigned. In this embodiment, the control signal only highlights the sub-display 192, which displays image information from the camera 12 selected to be most critical in terms of a potential collision, by a blinking frame or similar.

While presently preferred embodiments of the invention are shown and described, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.

Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.

The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as, an acknowledgment or admission or any form of suggestion that that prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
Claims (25)
1. A method for controlling a display of a proximity warning system, comprising: receiving a signal representing first positional information of an object from a radio based positioning receiver; dependent on the positional information selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes; and providing a control signal for the display to display images provided by the selected subset of one or more cameras, wherein the subset of one or more cameras is selected subject to the positional information and subject to location information of stationary objects stored in an electronic map.

2. Method according to claim 1, wherein the cameras available for selection are arranged to provide images of different sections around the object.

3. Method according to claim 2, wherein the cameras and the radio based positioning receiver are attached to the same object.

4. Method according to any one of the preceding claims, wherein second positional information is received with respect to a second object, wherein the second positional information originates from a radio based positioning receiver of the second object, and wherein the subset of at least one camera is selected based on the first positional information and the second positional information.

5. Method according to claim 4, wherein a distance between the objects is determined from the positional information and the second positional information, and wherein the subset of at least one camera is selected based on the distance.

6. Method according to any one of the preceding claims 4 to 5 in combination with claim 2, wherein one of the sections is identified for the second positional information to map into, and wherein the camera associated with the identified section is selected in the selection step.

7. Method according to claim 6 in combination with claim 5, wherein the camera associated with the identified section is selected provided at least one of the distance between the objects and the distance to the crossing point of their trajectories is below a threshold.
8. Method according to claim 2, wherein one of the sections is identified for the stationary object location information to map into, and wherein the camera associated with the identified section is selected in the selection step.
9. Method according to claim 8, wherein a distance between the object and the stationary object is determined from the positional information and the stationary object location information, and wherein the camera associated with the identified section is selected provided the determined distance between the object and the stationary object is below a threshold.

10. Method according to claim 1, wherein the object the radio based positioning receiver is assigned to is different to a second object the cameras available for selection are attached to, wherein the cameras available for selection are arranged to provide images of different sections around the second object, wherein one of the sections is identified for the positional information to map into, and wherein the camera associated with the identified section is selected in the selection step.

11. Method according to any one of the preceding claims, wherein a signal is received from at least one sensor for measuring the distance to another object by means different to those of the radio based positioning receiver, wherein the subset of at least one camera is selected based on the positional information and the distance information provided by the sensor, and wherein the sensor includes at least one of a radio detection and ranging device, a light detection and ranging device, and a sound detection and ranging device.

12. Method according to any one of the preceding claims, wherein in a default mode the control signal is designed for allowing images provided by all the cameras available to be displayed, and wherein based on the selection step the control signal is modified for allowing images provided by the one or more selected cameras only to be displayed.

13. Method according to any one of the preceding claims, wherein the control signal is provided for the display to display and highlight the images from the selected subset of one or more cameras.
14. Method according to any one of the preceding claims, wherein the control signal is designed for triggering one of an acoustic and a visual warning.
15. Computer program element comprising computer program code means which, when loaded in a processor unit of a control unit, configures the control unit to perform a method as claimed in any one of the preceding claims.

16. A control unit for controlling a display of a proximity warning system, comprising a receiving unit for receiving a signal representing positional information of an object from a radio based positioning receiver, a selection unit for selecting a subset of at least one camera out of at least two cameras available for providing images of different scenes dependent on the positional information and subject to location information of stationary objects stored in an electronic map, and an output for providing a control signal to a display for displaying images provided by the selected subset of one or more cameras.

17. A monitoring system, comprising a control unit according to claim 16, a display, and at least two cameras for providing images of different scenes.

18. A monitoring system according to claim 17, comprising the radio based positioning receiver, wherein the receiving unit is designed for receiving positional information of a second object.

19. A monitoring system according to any one of the preceding claims 17 or 18, comprising a log for logging at least one of the positional information and the selected camera signal.

20. A movable object, comprising a monitoring system according to any one of the preceding claims 17 to 19, wherein the cameras are attached to different locations of the object, and wherein the movable object is one of a vehicle, a crane, a dragline, a haul truck, a digger and a shovel.

21. A method for controlling a display of a proximity warning system, substantially as hereinbefore described with reference to the accompanying drawings.
22. Computer program element, substantially as hereinbefore described with reference to the accompanying drawings.
23. A control unit for controlling a display of a proximity warning system, substantially as hereinbefore described with reference to the accompanying drawings.

24. A monitoring system, substantially as hereinbefore described with reference to the accompanying drawings.
25. A movable object, substantially as hereinbefore described with reference to the accompanying drawings.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CH2010/000152 WO2011153646A1 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2010355231A1 (en) | 2013-01-10 |
AU2010355231B2 (en) | 2014-11-20 |
Family
ID=43431943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2010355231A Active AU2010355231B2 (en) | 2010-06-10 | 2010-06-10 | Method and control unit for controlling a display of a proximity warning system |
Country Status (3)
Country | Link |
---|---|
AU (1) | AU2010355231B2 (en) |
CA (1) | CA2802122C (en) |
WO (1) | WO2011153646A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8779934B2 (en) | 2009-06-12 | 2014-07-15 | Safemine Ag | Movable object proximity warning system |
CA2783888C (en) | 2009-12-11 | 2017-02-28 | Safemine Ag | Modular collision warning apparatus and method for operating the same |
US10800329B2 (en) | 2010-04-19 | 2020-10-13 | SMR Patents S.à.r.l. | Rear view mirror simulation |
US10703299B2 (en) | 2010-04-19 | 2020-07-07 | SMR Patents S.à.r.l. | Rear view mirror simulation |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10392601D2 (en) | 2002-08-09 | 2005-02-03 | Conti Temic Microelectronic | Transportation with a 3D range camera and method of operation |
DE10253192A1 (en) | 2002-11-15 | 2004-05-27 | Philips Intellectual Property & Standards Gmbh | Anti-collision system for use with road vehicle has position determining computer with GPS receiver and has radio transmitter ending signals to equipment carried by pedestrians |
- 2010-06-10 AU AU2010355231A patent/AU2010355231B2/en active Active
- 2010-06-10 CA CA2802122A patent/CA2802122C/en active Active
- 2010-06-10 WO PCT/CH2010/000152 patent/WO2011153646A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11220726A (en) * | 1998-01-30 | 1999-08-10 | Niles Parts Co Ltd | Vehicle surrounding monitoring device |
US20040217851A1 (en) * | 2003-04-29 | 2004-11-04 | Reinhart James W. | Obstacle detection and alerting system |
WO2006079165A1 (en) * | 2005-01-25 | 2006-08-03 | Alert Systems Pty Ltd | Proximity warning system |
GB2452829A (en) * | 2007-09-12 | 2009-03-18 | Spillard Safety Systems Ltd | Decentralised GPS based anti-collision system for vehicles and pedestrians |
US20090259400A1 (en) * | 2008-04-15 | 2009-10-15 | Caterpillar Inc. | Vehicle collision avoidance system |
Also Published As
Publication number | Publication date |
---|---|
WO2011153646A1 (en) | 2011-12-15 |
CA2802122A1 (en) | 2011-12-15 |
CA2802122C (en) | 2016-05-31 |
AU2010355231A1 (en) | 2013-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3339999B1 (en) | Information processing apparatus, information processing method, and recording medium storing programm | |
US9797247B1 (en) | Command for underground | |
US9457718B2 (en) | Obstacle detection system | |
KR101622028B1 (en) | Apparatus and Method for controlling Vehicle using Vehicle Communication | |
US20180149491A1 (en) | Automobile periphery information display system | |
JP5200568B2 (en) | In-vehicle device, vehicle running support system | |
AU2010351500B2 (en) | Object proximity warning system and method | |
JP4719590B2 (en) | In-vehicle peripheral status presentation device | |
US20150070498A1 (en) | Image Display System | |
JP5841235B2 (en) | Navigation system | |
JP2016213815A (en) | Method of providing vehicle surrounding view image, and device and computer readable recording medium therefor | |
WO2006106685A1 (en) | Surrounding monitor device for construction machine | |
JP7232287B2 (en) | ship navigation system | |
US20180357903A1 (en) | Vehicle control device | |
AU2010355231B2 (en) | Method and control unit for controlling a display of a proximity warning system | |
US20160148421A1 (en) | Integrated Bird's Eye View with Situational Awareness | |
JP2008097279A (en) | Vehicle exterior information display device | |
JP4225189B2 (en) | Vehicle driving support device | |
US11697425B1 (en) | Method and system for assisting drivers in locating objects that may move into their vehicle path | |
CN111645705A (en) | Method for issuing driving route adjustment and server | |
JP2015153208A (en) | Alarm system | |
US20120249342A1 (en) | Machine display system | |
AU2018201213A1 (en) | Command for underground | |
AU2011264358B2 (en) | Method and control unit for controlling a display | |
CN109774705B (en) | Object detector configuration for human override based on automated vehicle control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) |