
US20200216152A1 - Collision avoidance assistance system - Google Patents

Collision avoidance assistance system

Info

Publication number
US20200216152A1
US20200216152A1
Authority
US
United States
Prior art keywords
ship
obstacle
evaluation
collision avoidance
sensor data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/637,587
Inventor
Yoshiro Mizuno
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Publication of US20200216152A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/937 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of marine craft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B 43/00 Improving safety of vessels, e.g. damage control, not otherwise provided for
    • B63B 43/18 Improving safety of vessels, e.g. damage control, not otherwise provided for preventing collision or grounding; reducing collision damage
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B 49/00 Arrangements of nautical instruments or navigational aids
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/862 Combination of radar systems with sonar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/865 Combination of radar systems with lidar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 3/00 Traffic control systems for marine craft
    • G08G 3/02 Anti-collision systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/95 Radar or analogous systems specially adapted for specific applications for meteorological use
    • G01S 13/953 Radar or analogous systems specially adapted for specific applications for meteorological use mounted on aircraft
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • FIG. 2 is a schematic plane view illustrating the ship equipped with the system according to the embodiment.
  • The ship body 2001 includes a right pole 2002, a left pole 2003, and a middle pole 2004, and a sensor is attached to each of these poles.
  • FIG. 3 is a schematic side view illustrating the ship equipped with the system according to the embodiment. Only the right pole 2002 of the two poles symmetrically arranged on the ship body is illustrated in FIG. 3.
  • The middle pole 2004 is disposed on a roof of a room 3001 of the ship.
  • A sonar 3003 is disposed on a bottom 3002 of the ship.
  • A video camera, a color meter, a spectrometer, and an infrared sensor are mounted on upper portions of the right pole 2002 and the left pole 2003.
  • A radar, a device for LIDAR, an air-speed meter, a thermometer, a ship-speed meter, and a GNSS receiver are mounted on the middle pole 2004.
  • The thermometer is an infrared thermometer that enables contactless measurement of the temperature of an obstacle area.
  • FIG. 4 is a block diagram illustrating the collision avoidance assistance system included in the ship, sensors connected to the collision avoidance assistance system, and various systems.
  • The collision avoidance assistance system 4000 is connected to a sensor group 4001 installed in the ship and to a ship handling system 4002 of the ship. Additionally, the collision avoidance assistance system is connected, via a communication network 4003, to (i) a database 4004 about information on ship handling and (ii) a system 4005 of another ship.
  • Functional components of the collision avoidance assistance system 4000 are a three dimensional viewing field acquirer 4006, a collector 4007, a learning unit 4008, an assisting unit 4009, and a communicator 4010.
  • The sensor group 4001 represents all of the sensors attached to the ship 1001 and the drone 1008, and a data set obtained by the sensor group is transmitted to the collision avoidance assistance system.
  • The three dimensional viewing field acquirer 4006 forms data on a three dimensional region in the direction of the navigation route of the ship from (i) image data obtained by the video camera and the infrared sensor disposed on the right and left poles and included in the sensor group 4001 and (ii) data on an object in the sea obtained by the sonar, and generates a three dimensional viewing field including an area of the waterway and an area of any obstacle outside the area of the waterway.
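The acquirer described above can be illustrated with a minimal sketch. The data model below (labeled detections in ship-relative x/y/z coordinates, a simple rectangular region ahead of the ship) is an assumption for illustration; the patent does not specify how the integrated viewing field is represented:

```python
from dataclasses import dataclass

# Hypothetical sketch of the three dimensional viewing field acquirer
# (block 4006). Detections are labeled points in ship-relative
# coordinates: x = starboard, y = up (negative is below the waterline),
# z = ahead of the bow.

@dataclass
class Detection:
    source: str   # "camera", "infrared", "lidar", or "sonar"
    label: str    # e.g. "buoy", "seaweed", "hidden rock"
    x: float
    y: float
    z: float

def acquire_viewing_field(detections, half_width=50.0, depth=500.0):
    """Integrate sensor detections into one viewing field by keeping
    only detections inside the region ahead of the ship."""
    field = []
    for d in detections:
        if abs(d.x) <= half_width and 0.0 <= d.z <= depth:
            field.append(d)
    return field

detections = [
    Detection("camera", "buoy", 10.0, 1.0, 120.0),
    Detection("sonar", "hidden rock", -5.0, -3.0, 200.0),
    Detection("camera", "other ship", 300.0, 2.0, 150.0),  # outside region
]
field = acquire_viewing_field(detections)
```

Above-water detections (camera, infrared, LIDAR) and below-water detections (sonar) share one coordinate frame, which is what lets the field contain both the buoy and the hidden rock.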
  • FIG. 5 is a schematic view illustrating data on a three dimensional viewing field formed by a three dimensional region and an area of an obstacle included in the three dimensional region.
  • The three dimensional region includes the two straight lines 1007 and extends in depth, along the Z-direction denoting the direction in which the ship sails, from the four points 5002, 5003, 5004, and 5005 on the XY-plane.
  • The collector 4007 collects a pair of (i) a data set regarding the three dimensional viewing field in which an obstacle such as driftwood, the seaweed, the other ship, the buoy, the hidden rock, or a shoal is located and (ii) the user's operation of the ship handling system 4002 in response to the three dimensional viewing field, or an evaluation of the three dimensional viewing field inputted by the user.
  • A data set of the present disclosure may include (i) data on the type and position of a structural object located on the sea as indicated by nautical chart data and/or (ii) data on the type and position of a floating object or the like indicated in sea warning information.
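A minimal sketch of how the collector might pair a data set with the pilot's response follows; the `Collector` class and the dictionary field names are illustrative assumptions, not the patent's data format:

```python
# Sketch of the collector (block 4007): it stores each observed viewing
# field together with either the pilot's steering response or an entered
# evaluation, forming the (sensor data, evaluation) learning pairs.

class Collector:
    def __init__(self):
        self.pairs = []

    def collect(self, sensor_data, response):
        # response: e.g. {"operation": "steer starboard"} from the ship
        # handling system, or {"evaluation": "avoid"} entered by the pilot
        self.pairs.append((sensor_data, response))

collector = Collector()
collector.collect({"obstacle": "seaweed", "range_m": 80},
                  {"evaluation": "avoid"})
collector.collect({"obstacle": "buoy", "range_m": 150},
                  {"operation": "maintain course"})
```

Each stored pair is one training example for the learning unit, so the collector is effectively building the labeled data set as the ship operates.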
  • FIG. 6 is a schematic view illustrating assistance information displayed on a screen.
  • A message 6001 indicating the existence of an obstacle and a recommended ship handling operation is displayed on the screen 6000 within a radar screen of the ship handling system.
  • FIG. 7 is a schematic view illustrating the three dimensional viewing field generated on the head-up display and the assistance information displayed on the head-up display.
  • The head-up display 7004A is disposed on the windshield 7002 of the cockpit 7001 such that the head-up display 7004A is positioned in the viewing field of the ship pilot 7003.
  • The reference numeral 7004B denotes an enlarged image of the head-up display.
  • A wave 7005, a floating object 7006, and a buoy 7007 that are detected and identified are displayed in the three dimensional viewing field.
  • The floating object 7006, drawn with a dotted line, is colored on the display and presented as a dangerous object that should be avoided.
  • The reference numerals 7008, 7009, and 7010 denote indicator labels indicating the names of the identified obstacles, namely the wave, the floating object, and the buoy.
  • The reference numeral 7011 denotes the bow of the ship and indicates the current direction of travel of the ship.
  • The reference numeral 7012 denotes the ship handling recommended by the system in the current situation.
  • The method in which the assistance information is displayed on a screen disposed on the ship and the method in which it is displayed on a head-up display are presented in this example.
  • However, methods of displaying the assistance information that are applicable to the present disclosure are not limited to these; the assistance information may, as necessary, be displayed on a terminal device, a head mounted display, or a visor of the ship pilot.
  • The system of the present embodiment also learns from (i) a pair of the data set collected during the operation for the above-described assistance and the output on which the assistance information is based, and (ii) the ship handling operation actually selected by the ship handling system.
  • The communicator 4010 may receive a pair of (i) a data set collected by the system 4005 of the other ship and (ii) an output produced in the system 4005, and the system may use the received pair of the data set and the output from the other ship as learning data for the learning unit 4008.
  • Alternatively, the system may use, as learning data, (i) a data set of test data formed by simulating the three dimensional viewing field in which an obstacle is located and (ii) an output expressing an evaluation of the situation indicated by that data set.
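The exchange of (data set, output) pairs between ships could be sketched as follows; the JSON wire format and the function names are assumptions, since the patent only states that pairs are shared over communication:

```python
import json

# Sketch of the communicator (block 4010) exchanging learning pairs
# between ships. The wire format is an illustrative assumption.

def encode_pairs(pairs):
    """Serialize (sensor_data, output) pairs for transmission."""
    return json.dumps([{"sensor_data": s, "output": o} for s, o in pairs])

def decode_pairs(message):
    """Recover the pairs on the receiving ship's system."""
    return [(rec["sensor_data"], rec["output"]) for rec in json.loads(message)]

local_pairs = [({"obstacle": "driftwood", "range_m": 60}, "avoid to port")]
message = encode_pairs(local_pairs)   # sent to the other ship
received = decode_pairs(message)      # merged into the local learning data
```

The same encoding would serve for uploading pairs to the database 4004 on the network, since both paths carry the identical (data set, output) structure.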
  • A system for assisting collision avoidance in a situation in which an obstacle is located on a ship travel route can thus be provided.
  • The present disclosure is applicable to the ship industry, the ship-handling industry, and industries such as agencies providing ships with information.


Abstract

The objective of the present disclosure is to provide a collision avoidance assistance system for assisting avoidance of collision between a ship and an obstacle using a result obtained by (i) detecting the obstacle on a ship travel route using sensors, (ii) identifying the detected obstacle, (iii) collecting sensor data, and (iv) analyzing the collected sensor data. By using the system, which includes as components a sensor group, a three dimensional viewing field acquirer, an obstacle identifier, a database regarding obstacles, an analyzer, a deep learning unit, a learning unit, and a communicator, the obstacle on the ship travel route is detected and identified, and avoidance of collision between the ship and the obstacle is assisted using the results of the detection, identification, collection, and analysis.

Description

    TECHNICAL FIELD
  • The present disclosure relates to a collision avoidance assistance system for ships that assists collision avoidance using, in particular, data sensed by sensors (hereinafter referred to as "sensor data").
  • BACKGROUND ART
  • Radar devices are known as systems for collision avoidance assistance for ships. The radar devices assist safe ship navigation by displaying a bird's eye view, in which land, a building, another ship, or the like is viewed from high above, to supplement visual information.
  • However, obstacles such as seaweed, driftwood, and floating objects cannot be detected, identified, and evaluated using only the information obtained when the radar devices detect them, and thus the information obtained by the radar devices is insufficient for safe ship navigation.
  • CITATION LIST
  • Patent Literature
  • Patent Literature 1 discloses a system for assisting safe ship navigation by utilizing various sensors other than the radar devices.
  • Patent Literature 1 discloses a system that generates a scenery image viewed from a predetermined position in a predetermined direction based on (i) information outputted by a global positioning system (GPS), a gyroscope device, an attitude sensor device, a radar device, a television (TV) camera device, a night vision device, an echo sounding device, a sonar, a device for logs, and a steering device, (ii) a database regarding nautical chart information, and (iii) a database regarding information on ports and harbors, and displays the scenery image on an image displaying means. This enables the generation and display of a scenery image viewed from any position in any direction, such as a scenery image viewed from the cockpit of a ship in the direction of forward movement or a scenery image viewed from a rear portion of the deck in the backward direction.
  • Additionally, Patent Literature 1 discloses that the system (i) makes a collision avoidance calculation for predicting a collision risk, a stranding avoidance calculation for predicting a risk that the ship will be stranded, a dangerous sea area calculation for calculating a dangerous sea area, and a recommended course calculation for calculating a recommended ship course, and (ii) displays the calculated recommended ship course on the image displaying means, thereby securing the safety of the ship's route even if the information obtained by the ship pilot's visual observation is insufficient.
  • However, since the system of Patent Literature 1 does not detect, identify, or evaluate an obstacle, the ship pilot must identify the obstacle on the basis of the scenery image displayed in the image information.
  • Patent Literature 2 discloses a ship motion control system including sensors for collecting sea condition data. In the system of Patent Literature 2, analyzer software receives the sea condition data and predicts occurrence of a sea condition event. Calculator software calculates an operation command in preparation for the occurrence of the sea condition event. Interface software transmits the operation command to the ship control system.
  • However, the sea condition data collected by the system of Patent Literature 2 mainly concern wave conditions, and the system of Patent Literature 2 does not detect or identify obstacles such as seaweed.
  • Patent Literature 1: Unexamined Japanese Patent Application Kokai Publication No. H06-301897
  • Patent Literature 2: U.S. Patent Application Publication No. 2014-114509
  • SUMMARY OF INVENTION
  • Technical Problem
  • An objective of the present disclosure is to provide a system for assisting avoidance of collision between a ship and an obstacle by collecting various types of sensor data in a ship travel route using sensors, learning from the collected sensor data, and detecting, identifying, and evaluating an obstacle area as a result of the learning.
  • Solution to Problem
  • In a first aspect of the present disclosure, a collision avoidance assistance system is provided for assisting avoidance of collision between a ship and an obstacle using sensors selected from among a video camera, a color meter, a spectrometer, an infrared sensor, a temperature sensor, a radar, a device for LIDAR, a sonar, an air-speed meter, a ship-speed meter, and a global navigation satellite system (GNSS) receiver. The system is characterized by including (a) three dimensional viewing field acquisition means for acquiring a three dimensional viewing field in a ship travel route by integrating sensor data obtained via sensors selected from among the video camera, the infrared sensor, and the device for LIDAR, and (b) learning means for (i) collecting the sensor data obtained via the sensors in the three dimensional viewing field in which the obstacle is located, and (ii) outputting an evaluation of the obstacle based on the collected sensor data by using a deep learning method in which a pair of the collected sensor data and the evaluation of the obstacle is used as learning data. The evaluation of the obstacle includes identification information about a floating object on the sea, and the system provides, as information for assisting avoidance of collision with the obstacle, an evaluation of the current obstacle that is outputted by the learning means based on the sensor data on the current three dimensional viewing field obtained via the sensors.
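The aspect above names a deep learning method but fixes no architecture. As a hedged stand-in for the trained evaluator, the sketch below maps a sensor feature vector to the evaluation of the nearest stored (sensor data, evaluation) pair; the feature names are invented for illustration and are not from the patent:

```python
import math

# Stand-in for the learning means: a nearest-neighbour evaluator over
# numeric sensor feature vectors. Each training pair is
# (features, evaluation); the real system would use a trained deep model.

def evaluate_obstacle(training_pairs, features):
    """Return the evaluation of the stored example closest to `features`."""
    nearest = min(training_pairs, key=lambda p: math.dist(p[0], features))
    return nearest[1]

# Hypothetical features: (echo_strength, surface_temp_delta, height_above_water)
training_pairs = [
    ((0.9, 0.0, 2.0), "buoy"),
    ((0.2, 0.5, 0.1), "seaweed"),
    ((0.8, 0.0, 0.0), "hidden rock"),
]
print(evaluate_obstacle(training_pairs, (0.25, 0.4, 0.0)))  # prints "seaweed"
```

The interface matters more than the model here: given current sensor data for the viewing field, the learning means returns an evaluation, which is exactly what the assistance output consumes.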
  • The term, “LIDAR”, means “Light Detection and Ranging”, and the device for LIDAR is a sensor for detecting light and measuring a distance.
  • The sensors may be fixed to poles included in the ship.
  • Alternatively, the sensors may be attached to a drone for obtaining an observation altitude and an observation distance, and the drone may be connected to the ship with a wired or wireless connection.
  • The evaluation of the obstacle may further include identification information about an object on or in the sea, such as another ship different from the ship, a buoy, or a hidden rock.
  • The evaluation may be displayed on at least one selected from among (i) a screen installed on the ship in order to warn a ship pilot, (ii) a head-up display installed on the ship, (iii) a terminal device for the ship pilot, (iv) a head mounted display for the ship pilot, and (v) a visor for the ship pilot.
  • The evaluation may be transmitted to an autopilot system installed in the ship and used for avoiding potential collisions.
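Handing the evaluation to an autopilot might look like the following sketch; the command vocabulary and the set of evaluations treated as dangerous are assumptions not found in the patent, which only states that the evaluation may be used for avoiding potential collisions:

```python
# Sketch of mapping an obstacle evaluation to a simple avoidance command
# for an onboard autopilot. bearing_deg: obstacle bearing relative to the
# bow, positive to starboard (an assumed convention).

def autopilot_command(evaluation, bearing_deg):
    """Return an avoidance command for the given obstacle evaluation."""
    dangerous = {"hidden rock", "other ship", "floating object", "driftwood"}
    if evaluation in dangerous:
        # turn away from the side on which the obstacle lies
        return "steer port" if bearing_deg >= 0 else "steer starboard"
    return "maintain course"
```

A real integration would also weigh range, closing speed, and the rules of the road; this sketch only shows where the evaluation enters the control path.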
  • The evaluation may be transmitted by a device for long distance communication and may be shared with a database on a network or a system of another ship sailing near the ship.
  • The learning data may include (i) a pair of: sensor data collected in another collision avoidance assistance system different from the present system; and an evaluation of the obstacle or (ii) a pair of: sensor data generated as test data; and an evaluation of the obstacle.
  • The learning means may collect sensor data in a plurality of collision avoidance assistance systems connected to one another via communication and may use, as learning data, a pair of the collected sensor data and an evaluation of the obstacle.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a schematic plane view illustrating a sea area on which a ship equipped with a collision avoidance assistance system according to an embodiment of the present disclosure sails;
  • FIG. 2 is a schematic plane view illustrating the ship equipped with the system according to the embodiment;
  • FIG. 3 is a schematic side view illustrating the ship equipped with the system according to the embodiment;
  • FIG. 4 is a block diagram illustrating the collision avoidance assistance system included in the ship, sensors connected to the collision avoidance assistance system, and various systems;
  • FIG. 5 is a schematic view illustrating data on a three dimensional viewing field formed by a three dimensional region and an area of an obstacle included in the three dimensional region;
  • FIG. 6 is a schematic view illustrating assistance information displayed on a screen; and
  • FIG. 7 is a schematic view illustrating the three dimensional viewing field projected on a head-up display.
  • DESCRIPTION OF EMBODIMENTS
  • A concrete example of the present disclosure is described below in an embodiment with reference to the drawings.
  • Operations of functional components of the embodiment are achieved by (i) executing a pre-installed control program such as firmware on a processor of a system circuit and (ii) making the processor cooperate with various devices that are components of the system. Also, the program is stored in a computer-readable recording medium, is read from the recording medium by the processor, and is executed in response to an operation by a user such as a ship pilot or to a signal received from a device included in the system.
  • Embodiment 1
  • Overall View Including a Network
  • FIG. 1 is a schematic plane view illustrating a water surface on which a ship equipped with a collision avoidance assistance system according to an embodiment of the present disclosure sails. FIG. 1 illustrates (i) a ship 1001 including a sensor group 1002 and sailing on a waterway represented by an area between two straight lines 1000, and (ii) seaweed 1003, a buoy 1004, and a hidden rock 1005 that are obstacles on or in the sea. An arrow 1006 denotes a direction of a navigation route of the ship, and an area between two straight lines 1007 represents a plan region of a three dimensional viewing field in which the obstacles are detected using the sensor group 1002 of the ship. A drone 1008 is connected to the ship 1001 via wireless communication, includes a video camera, approaches an obstacle located in the region of the three dimensional viewing field from the air to closely examine the obstacle, and transmits sensor data to the system of the ship. Although the video camera is used as the sensor installed in the drone in the present embodiment, the video camera may be replaced with a color meter, a spectrometer, an infrared sensor, a thermometer, or the like in accordance with the degree to which examination of the obstacle is necessary. Also, in the present disclosure, the connection between the drone and the ship is not limited to a wireless connection, and the drone may instead be connected to the ship via a wired connection.
  • FIG. 2 is a schematic plane view illustrating the ship equipped with the system according to the embodiment. A ship body 2001 of the ship includes a right pole 2002, a left pole 2003 and a middle pole 2004, and a sensor is attached to each of these poles.
  • FIG. 3 is a schematic side view illustrating the ship equipped with the system according to the embodiment. Only the right pole 2002 among the left and right poles symmetrically arranged on the ship body is illustrated in FIG. 3. The middle pole 2004 is disposed on a roof of a room 3001 of the ship. A sonar 3003 is disposed on a bottom 3002 of the ship.
  • A video camera, a color meter, a spectrometer, and an infrared sensor are mounted on upper portions of the right pole 2002 and the left pole 2003. A radar, a device for LIDAR, an air-speed meter, a thermometer, a ship-speed meter, and a GNSS receiver are mounted on the middle pole 2004.
  • In the present embodiment, the thermometer is an infrared thermometer enabling measurement of a temperature of an area of an obstacle without contact.
  • FIG. 4 is a block diagram illustrating the collision avoidance assistance system included in the ship, sensors connected to the collision avoidance assistance system, and various systems.
  • The collision avoidance assistance system 4000 is connected to a sensor group 4001 installed in the ship and a ship handling system 4002 of the ship. Additionally, the collision avoidance assistance system is connected, via a communication network 4003, to (i) a database 4004 about information on ship handling and (ii) a system 4005 of another ship.
  • Functional components of the collision avoidance assistance system 4000 are a three dimensional viewing field acquirer 4006, a collector 4007, a learning unit 4008, an assisting unit 4009, and a communicator 4010.
  • The sensor group 4001 expresses all of sensors attached to the ship 1001 and the drone 1008, and a data set obtained by the sensor group is transmitted to the collision avoidance assistance system.
  • The three dimensional viewing field acquirer 4006 forms data on a three dimensional region in the direction of the navigation route of the ship from (i) image data obtained by the video camera and the infrared sensor disposed on the right and left poles and included in the sensor group 4001 and (ii) data on an object in the sea obtained by the sonar, and generates a three dimensional viewing field including an area of the waterway and an area of the obstacle other than the area of the waterway. FIG. 5 is a schematic view illustrating data on a three dimensional viewing field formed by a three dimensional region and an area of an obstacle included in the three dimensional region. In the present embodiment, a three dimensional region 5001 including the areas of the seaweed 1003, the buoy 1004, and the hidden rock 1005 that are obstacles is formed in a three dimensional coordinate space in which the point on the water surface onto which the bow of the ship 5000 is projected is defined as the origin (X, Y, Z)=(0, 0, 0).
  • In this case, the three dimensional region is a region bounded by the two straight lines 1007 and extending in the Z-direction, which denotes the direction in which the ship sails, from the four points 5002, 5003, 5004, and 5005 on the XY-plane.
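As a rough illustration of the region just described, the following sketch tests whether a detected point lies inside such a three dimensional viewing field, with the origin at the projection of the bow onto the water surface and Z extending in the direction of travel. The half-width and range limits are assumed values for illustration only, not dimensions from the disclosure.

```python
def in_viewing_field(point, half_width=20.0, z_max=500.0,
                     y_min=-30.0, y_max=40.0):
    """Return True if a point (x, y, z) lies inside the three dimensional region.

    x: lateral offset from the route centerline, in meters
    y: height above (+) or depth below (-) the water surface, in meters
    z: distance ahead of the bow along the navigation route, in meters
    """
    x, y, z = point
    return (0.0 <= z <= z_max          # ahead of the bow only
            and abs(x) <= half_width   # between the two straight lines 1007
            and y_min <= y <= y_max)   # from sonar depth up to mast height

# Example: a hidden rock 3 m below the surface, 120 m ahead of the bow.
print(in_viewing_field((4.0, -3.0, 120.0)))
# The same rock astern of the ship falls outside the field.
print(in_viewing_field((4.0, -3.0, -10.0)))
```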
  • The collector 4007 collects a pair of (i) a data set regarding the three dimensional viewing field in which an obstacle such as driftwood, the seaweed, the other ship, the buoy, the hidden rock, or a shoal is located and (ii) a user's operation of the ship handling system 4002 in response to the three dimensional viewing field or a user's inputted evaluation of the three dimensional viewing field.
  • In addition to data detected by the sensor group, such as a shape, a color, a temperature, and a speed, a data set of the present disclosure may include (i) data on a type of and a position of a structural object located on the sea as indicated by nautical chart data and/or (ii) data on a type of and a position of the floating object or the like indicated in sea warning information.
  • During assistance of collision avoidance, the data set regarding a current three dimensional viewing field is inputted into the learning unit 4008 for the purpose of generating assistance information, and the learning unit 4008 outputs an evaluation of an obstacle located in the current three dimensional viewing field. The evaluation is outputted to the assisting unit 4009 and is transmitted to the ship handling system as assistance information displayed on the screen disposed in the ship. Although the current evaluation outputted by the learning unit is transmitted as assistance information to be displayed in the present embodiment, assistance information of the present disclosure is not limited to this form, and the evaluation outputted by the learning unit may instead be transmitted, unchanged, to an autopilot system incorporated into the ship handling system. FIG. 6 is a schematic view illustrating assistance information displayed on a screen. A message 6001 indicating the existence of an obstacle and a recommended ship handling is displayed on the screen 6000 within a radar screen of the ship handling system.
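As a hedged illustration of how an evaluation outputted by the learning unit could become a displayed message such as the message 6001, the sketch below maps an obstacle class to a recommended ship handling. The mapping and the wording are assumptions for illustration, not the disclosure's actual rules.

```python
# Assumed mapping from obstacle evaluation to recommended ship handling.
RECOMMENDATIONS = {
    "floating_object": "Reduce speed; the object may drift into the route.",
    "buoy": "Keep the buoy to starboard per the channel markings.",
    "hidden_rock": "Alter course; shallow hazard ahead.",
}

def assistance_message(evaluation, distance_m):
    """Compose the on-screen warning from an evaluation and a measured range."""
    action = RECOMMENDATIONS.get(evaluation, "Maintain lookout.")
    return f"Obstacle: {evaluation} at {distance_m:.0f} m. {action}"

print(assistance_message("hidden_rock", 120.0))
```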
  • In the present disclosure, a method in which a three dimensional viewing field is generated on the head-up display disposed on a windshield of the cockpit of the ship and the assistance information is displayed on the head-up display may be used as a manner of displaying the assistance information on a screen. FIG. 7 is a schematic view illustrating the three dimensional viewing field generated on the head-up display and the assistance information displayed on the head-up display. At the bow 7000 of the ship, the head-up display 7004A is disposed on the windshield 7002 of the cockpit 7001 such that the head-up display 7004A is positioned in a viewing field of the ship pilot 7003. The reference numeral 7004B denotes an enlarged image of the head-up display. In FIG. 7, a wave 7005, a floating object 7006, and a buoy 7007 that have been detected and identified are displayed in the three dimensional viewing field. The floating object 7006, drawn with a dotted line, is displayed in color as a dangerous object that should be avoided. The reference numerals 7008, 7009, and 7010 denote indicator labels indicating the names of the identified obstacles, namely the wave, the floating object, and the buoy. The reference numeral 7011 denotes the bow of the ship and indicates the current direction of travel of the ship. The reference numeral 7012 denotes the ship handling recommended by the system in the current situation. As methods of displaying assistance information, the method in which the assistance information is displayed on the screen disposed on the ship and the method in which it is displayed on the head-up display are presented in this example. However, methods of displaying the assistance information that are applicable to the present disclosure are not limited to these, and a method in which the assistance information is displayed on a terminal device of the ship pilot, a head mounted display of the ship pilot, or a visor of the ship pilot is applicable, as necessary.
  • The system of the present embodiment also learns from (i) a pair of the data set collected during the operation for the above-described assistance and the output on which the assistance information is based, and (ii) the ship handling operation actually selected by the ship handling system. However, in order to efficiently improve the functions of the learning unit 4008, the communicator 4010 may receive a pair of (i) a data set collected by the system 4005 of the other ship and (ii) an output produced in the system 4005, and the system may use, as learning data for the learning unit 4008, the received pair of the data set and the output from the other ship. Additionally, the system may use, as learning data, (i) a data set of test data formed by simulating the three dimensional viewing field in which an obstacle is located and (ii) an output expressing an evaluation of a situation indicated by that data set.
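The pooling of learning data from the present system, the system 4005 of the other ship, and simulated test data, as described above, might be organized as in the following sketch; the record layout and source tags are assumptions for illustration.

```python
def pool_learning_data(own_pairs, other_ship_pairs, test_pairs):
    """Merge (sensor_data, evaluation) pairs, tagging each with its source."""
    pooled = []
    for source, pairs in (("own", own_pairs),
                          ("other_ship", other_ship_pairs),
                          ("test", test_pairs)):
        for sensor_data, evaluation in pairs:
            pooled.append({"source": source,
                           "sensor_data": sensor_data,
                           "evaluation": evaluation})
    return pooled

# Illustrative pairs: features are assumed normalized sensor summaries.
own = [([0.9, 0.2], "floating_object")]
other = [([0.1, 0.8], "buoy")]       # received via the communicator 4010
test = [([0.5, 0.5], "hidden_rock")] # generated by simulating a viewing field
data = pool_learning_data(own, other, test)
print(len(data))
print(data[1]["source"])
```

Keeping the source tag lets the learning unit weight or audit data received from other systems separately from locally collected pairs.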
  • By using the above-described system, a system for assisting collision avoidance in a situation in which an obstacle is located on a ship travel route can be provided.
  • This application claims the benefit of Japanese Patent Application No. 2017-155291, the entire disclosure of which is incorporated by reference herein.
  • INDUSTRIAL APPLICABILITY
  • The present disclosure is applicable to a ship industry, a ship-handling industry, and an industry such as an agency for providing ships with information.
  • REFERENCE SIGNS LIST
      • 1001 Ship
      • 1002 Sensor group
      • 1003 Seaweed
      • 1004 Buoy
      • 1005 Hidden rock
      • 1006 Direction of navigation route
      • 1007 Three dimensional viewing field
      • 1008 Drone

Claims (9)

1: A collision avoidance assistance system for assisting avoidance of collision between a ship and an obstacle using sensors selected from among a video camera, a color meter, a spectrometer, an infrared sensor, a temperature sensor, a radar, a device for LIDAR, a sonar, an air-speed meter, a ship-speed meter, and a global navigation satellite system (GNSS) receiver, the system comprising:
(a) three dimensional viewing field acquisition means for acquiring a three dimensional viewing field in a ship travel route by integrating sensor data obtained via sensors selected from among the video camera, the infrared sensor, and the device for LIDAR; and
(b) learning means (i) for collecting the sensor data obtained via the sensors in the three dimensional viewing field in which the obstacle is located and (ii) for outputting an evaluation of the obstacle based on the collected sensor data by using a deep learning method in which a pair of the collected sensor data and the evaluation of the obstacle is used as learning data, wherein
the evaluation of the obstacle includes identification information about a floating object on a sea, and
the system provides, as information for assisting avoidance of collision with the obstacle, an evaluation of a current obstacle that is outputted by the learning means based on the sensor data on a current three dimensional viewing field obtained via the sensors.
2: The collision avoidance assistance system according to claim 1, wherein the sensors are fixed to poles included in the ship.
3: The collision avoidance assistance system according to claim 1, wherein
the evaluation of the obstacle further includes identification information about an object on or in the sea, and
the object on or in the sea is another ship different from the ship, a buoy, or a hidden rock.
4: The collision avoidance assistance system according to claim 1, wherein the sensors are attached to a drone for obtaining an observation altitude and an observation distance, and the drone is connected to the ship with a wired or wireless connection.
5: The collision avoidance assistance system according to claim 1, wherein
the evaluation is displayed on at least one selected from among (i) a screen installed on the ship in order to warn a ship pilot, (ii) a head-up display installed on the ship, (iii) a terminal device for the ship pilot, (iv) a head mounted display for the ship pilot, and (v) a visor for the ship pilot.
6: The collision avoidance assistance system according to claim 1, wherein the evaluation is transmitted to an autopilot system installed in the ship and used for avoiding potential collisions.
7: The collision avoidance assistance system according to claim 1, wherein the evaluation is transmitted by a device for long distance communication and is shared with a database on a network or a system of another ship sailing near the ship.
8: The collision avoidance assistance system according to claim 1, wherein the learning data includes
a pair of (i) sensor data collected in another collision avoidance assistance system different from the system and (ii) an evaluation of the obstacle, or
a pair of (i) sensor data generated as test data and (ii) an evaluation of the obstacle.
9: The collision avoidance assistance system according to claim 1, wherein the learning means (i) collects sensor data in a plurality of collision avoidance assistance systems connected to one another via communication and (ii) uses, as learning data, a pair of the collected sensor data and an evaluation of the obstacle.
US16/637,587 2017-08-10 2018-07-04 Collision avoidance assistance system Abandoned US20200216152A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017155291A JP6293960B1 (en) 2017-08-10 2017-08-10 Collision avoidance support system
JP2017-155291 2017-08-10
PCT/JP2018/025288 WO2019031115A1 (en) 2017-08-10 2018-07-04 Collision avoidance assistance system

Publications (1)

Publication Number Publication Date
US20200216152A1 true US20200216152A1 (en) 2020-07-09

Family

ID=61628701

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/637,587 Abandoned US20200216152A1 (en) 2017-08-10 2018-07-04 Collision avoidance assistance system

Country Status (4)

Country Link
US (1) US20200216152A1 (en)
EP (1) EP3667642A4 (en)
JP (1) JP6293960B1 (en)
WO (1) WO2019031115A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT202000028616A1 (en) * 2020-11-26 2022-05-26 Accurami S R L ASSISTANCE SYSTEM FOR VESSELS DOCKING IN THE PORT
US11495028B2 (en) * 2018-09-28 2022-11-08 Intel Corporation Obstacle analyzer, vehicle control system, and methods thereof
US20220404387A1 (en) * 2021-06-21 2022-12-22 Honda Motor Co., Ltd. Object detection device
TWI835431B (en) * 2022-11-28 2024-03-11 財團法人金屬工業研究發展中心 Ship docking system and ship docking method

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019197606A1 (en) * 2018-04-13 2019-10-17 Hamburger Hafen & Logistik Ag Monitoring unit
CN108922247B (en) * 2018-07-25 2020-11-06 重庆大学 Ship-navigation mark collision risk degree estimation method based on AIS
CN110927798B (en) * 2018-09-20 2021-12-31 中国石油化工股份有限公司 Logging curve prediction method and system based on deep learning
CN109523833A (en) * 2018-11-05 2019-03-26 中设设计集团股份有限公司 A kind of evidence-obtaining system and evidence collecting method of inland navigation craft and small bridge collision
KR102073675B1 (en) * 2019-07-26 2020-02-05 주식회사 우리아이씨티 Searching method for pipes and valves are located dangerous facility using multidimensional measure information based on GNSS, Total station, LiDAR, Drone
CN110764080B (en) * 2019-10-30 2023-08-15 武汉理工大学 Method for detecting navigation-following ship formation target in ship lock based on millimeter wave radar
CN111145595B (en) * 2020-02-20 2021-06-25 智慧航海(青岛)科技有限公司 Method for confirming key avoidance of autonomous driving ship based on projection pursuit method
CN113009590B (en) * 2021-02-01 2022-04-08 西南科技大学 Three-dimensional foreign matter detection system and method in vehicle bottom security inspection system
CN113112871B (en) * 2021-04-14 2022-06-24 上海海事大学 Ship-bridge collision risk calculation method considering ship dimension
CN114394206B (en) * 2022-01-07 2024-01-23 苏州天炯信息科技有限公司 Intelligent anti-collision alarm device for ship
KR102633461B1 (en) * 2023-07-05 2024-02-02 한성진 Drone Device for Ship Navigation Guidance and Driving Method Thereof
CN116935699B (en) * 2023-09-15 2023-12-12 天津港(集团)有限公司 Intelligent seaport channel integrated monitoring system and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3045625B2 (en) 1993-04-16 2000-05-29 川崎重工業株式会社 Ship navigation support device
JP2004354170A (en) * 2003-05-28 2004-12-16 Asahi Kasei Corp Three-dimensional information display device and array type ultrasonic sensor module
FR2864249B1 (en) * 2003-12-19 2006-02-03 Thales Sa OBSTACLE REMOVAL SYSTEM FOR FAST MULTI-SHIP VESSELS
NO332432B1 (en) * 2008-08-12 2012-09-17 Kongsberg Seatex As System for detection and imaging of objects in the trajectory of marine vessels
EP2911935B1 (en) 2012-10-24 2020-10-07 Naiad Maritime Group, Inc. Predictive sea state mapping for ship motion control
JP2016035707A (en) * 2014-08-04 2016-03-17 日立建機株式会社 Conveyance vehicle for mine
US10931934B2 (en) * 2014-09-02 2021-02-23 FLIR Belgium BVBA Watercraft thermal monitoring systems and methods
WO2016163559A1 (en) * 2015-04-09 2016-10-13 ヤマハ発動機株式会社 Small vessel and small vessel trailer system
JP2017155291A (en) 2016-03-02 2017-09-07 株式会社コイワイ Manufacturing method of high strength aluminum alloy laminate molded body
EP4209856A1 (en) * 2017-06-16 2023-07-12 Flir Belgium BVBA Autonomous and assisted docking systems and methods

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11495028B2 (en) * 2018-09-28 2022-11-08 Intel Corporation Obstacle analyzer, vehicle control system, and methods thereof
IT202000028616A1 (en) * 2020-11-26 2022-05-26 Accurami S R L ASSISTANCE SYSTEM FOR VESSELS DOCKING IN THE PORT
US20220404387A1 (en) * 2021-06-21 2022-12-22 Honda Motor Co., Ltd. Object detection device
US11782069B2 (en) * 2021-06-21 2023-10-10 Honda Motor Co., Ltd. Object detection device
TWI835431B (en) * 2022-11-28 2024-03-11 財團法人金屬工業研究發展中心 Ship docking system and ship docking method

Also Published As

Publication number Publication date
EP3667642A1 (en) 2020-06-17
EP3667642A4 (en) 2021-05-12
JP6293960B1 (en) 2018-03-14
WO2019031115A1 (en) 2019-02-14
JP2019036010A (en) 2019-03-07

Similar Documents

Publication Publication Date Title
US20200216152A1 (en) Collision avoidance assistance system
KR102240839B1 (en) Autonomous navigation method using image segmentation
CN108281043B (en) Ship collision risk early warning system and early warning method
KR101289349B1 (en) Marine navigation system
KR102661171B1 (en) System for predicting degree of collision risk and guiding safe voyage route through fusing navigation sensor inside ship and image information
US11514668B2 (en) Method and device for situation awareness
US7646313B2 (en) Method and device for assisting in the piloting of an aircraft
KR102466804B1 (en) Autonomous navigation method using image segmentation
US11892298B2 (en) Navigational danger identification and feedback systems and methods
KR101288953B1 (en) Black box system for leisure vessel
KR102265980B1 (en) Device and method for monitoring ship and port
EP4089660A1 (en) Method and device for monitoring port and ship in consideration of sea level
JP5102886B2 (en) Image display system, image display method, and program
JP2021152917A (en) Information processing device, control method, program and storage medium
CN111290410A (en) Millimeter wave radar-based automatic ship berthing and departing system and method
US12118777B2 (en) Method and device for situation awareness
CN115723919B (en) Auxiliary navigation method and device for ship yaw
Douguet et al. Multimodal perception for obstacle detection for flying boats-Unmanned Surface Vehicle (USV)
US11022441B2 (en) Marine electronic device for generating a route based on water depth
KR20180076936A (en) Optimum Ocean Route guide system and method using low cost estimation device
KR20140137233A (en) System and Method for Shipping lookout using the 3D spatial
EP3905223A1 (en) Aircraft display systems and methods for identifying target traffic
EP2946225A1 (en) Method to control a space including a plurality of mobile or not mobile stations
EP4173942A1 (en) Navigation assistance device using augmented reality and 3d images
CN114253278A (en) Ship harbor-entering berthing auxiliary system and method based on multiple antennas

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION