WO2022113243A1 - Visibility information generation device, visibility information generation method, and visibility information generation program
- Publication number
- WO2022113243A1 (PCT/JP2020/044080)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line
- sight
- determination
- unit
- dimensional
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61L—GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
- B61L23/00—Control, warning or like safety means along the route or between vehicles or trains
- B61L23/04—Control, warning or like safety means along the route or between vehicles or trains for monitoring the mechanical state of the route
Description
- The present disclosure relates to a line-of-sight information generation device, a line-of-sight information generation method, and a line-of-sight information generation program that generate line-of-sight information, which is information indicating the line-of-sight status of target equipment as viewed from a moving body moving on a moving path.
- As a light-emitting signal device, a special signal light emitter that indicates that a situation interfering with train operation has occurred is known.
- The special signal light emitter is provided with an emergency stop button, and outputs a light emission signal when the emergency stop button is pressed.
- When the train crew confirms the light emission signal of the special signal light emitter, the train must be stopped. Therefore, the train crew must be able to confirm the light emission signal of the special signal light emitter while the train is located at a point a certain distance or more in front of the special signal light emitter.
- Patent Document 1 proposes a special signal detection device that detects light emission of a special signal light emitter.
- The special signal detection device divides a continuous image captured by an imaging unit into image regions, extracts amplitude data with respect to luminance frequency from a partial image composed of a plurality of divided frames, and detects the presence or absence of a light emission signal from the special signal light emitter based on the extraction result.
- However, the technique described in Patent Document 1 detects the light emission signal of a special signal light emitter and notifies the train crew of it; it is not a technique for providing information indicating the line-of-sight status of the target equipment from a point a certain distance or more in front of the target equipment, as seen from a moving body moving on a moving path.
- Moreover, with the technique described in Patent Document 1, if the visibility of the target equipment from a point a certain distance or more in front of it is poor, there is a possibility that the target equipment cannot be detected from that point.
- The present disclosure has been made in view of the above, and an object of the present disclosure is to obtain a line-of-sight information generation device capable of accurately presenting the line-of-sight status of target equipment from a moving body moving on a moving path.
- The line-of-sight information generation device of the present disclosure includes a data acquisition unit, a line-of-sight information generation unit, and an output processing unit.
- The data acquisition unit acquires three-dimensional point cloud data that represents, as a three-dimensional point cloud, objects in the peripheral region of the moving path on which the moving body moves.
- The line-of-sight information generation unit generates line-of-sight information, which is information indicating the line-of-sight status of the target equipment from a viewpoint position away from the target equipment in the peripheral region, based on the three-dimensional point cloud data acquired by the data acquisition unit.
- The output processing unit outputs the line-of-sight information generated by the line-of-sight information generation unit.
- A diagram showing an example of a first two-dimensional image of the determination target area in which the three-dimensional points included in the determination target area are color-coded according to position by the target area image generation unit according to the first embodiment.
- A diagram showing an example of a second two-dimensional image of the determination target area in which the three-dimensional points included in the determination target area are color-coded according to position by the target area image generation unit according to the first embodiment.
- A diagram showing an example of the line-of-sight information according to the first embodiment.
- A flowchart showing an example of processing by the processing unit of the line-of-sight information generation device according to the first embodiment.
- Hereinafter, the line-of-sight information generation device, the line-of-sight information generation method, and the line-of-sight information generation program according to an embodiment will be described in detail with reference to the drawings.
- FIG. 1 is a diagram showing an example of the configuration of an information providing system including the line-of-sight information generation device according to the first embodiment.
- The information providing system 100 according to the first embodiment includes the line-of-sight information generation device 1 and a three-dimensional point cloud measuring device 2 that is arranged on a moving body 4 moving on a moving path 3 and measures the peripheral area of the moving path 3 in three dimensions.
- In the first embodiment, the moving path 3 is a railroad track, and the moving body 4 is a train traveling on the railroad track.
- The moving path 3 may instead be a moving path other than a railroad track, such as a road, and the moving body 4 may be a moving body other than a train, such as an automobile traveling on a road.
- The three-dimensional point cloud measuring device 2 measures the peripheral area of the moving path 3 while the moving body 4 is moving on the moving path 3, and generates three-dimensional point cloud data in which the three-dimensional shapes of objects existing in the peripheral area of the moving path 3 are represented by a three-dimensional point cloud.
- The three-dimensional point cloud measuring device 2 includes, for example, a laser scanner or a light-section sensor.
- The laser scanner irradiates an object with a laser beam, measures the time until the laser beam reflected by the object returns, and converts the measured time into a distance to measure the three-dimensional shape of the object.
- The light-section sensor measures the three-dimensional shape of an object by the light-section method.
- The three-dimensional point cloud data generated by the three-dimensional point cloud measuring device 2 is acquired by the line-of-sight information generation device 1.
- The line-of-sight information generation device 1 can acquire the three-dimensional point cloud data generated by the three-dimensional point cloud measuring device 2 from the three-dimensional point cloud measuring device 2 by wireless or wired communication. The line-of-sight information generation device 1 can also acquire the three-dimensional point cloud data from a recording medium on which the data generated by the three-dimensional point cloud measuring device 2 is recorded.
- Based on the three-dimensional point cloud data acquired from the three-dimensional point cloud measuring device 2, the line-of-sight information generation device 1 generates line-of-sight information indicating the line-of-sight status of the target equipment from a viewpoint position away from the target equipment in the peripheral area of the moving path 3.
- The target equipment is, for example, a special signal light emitter, but may be other equipment along the railway line, such as a traffic light or a railroad crossing.
- The line-of-sight information generated by the line-of-sight information generation device 1 includes, for example, information on the three-dimensional points located in a preset area in the field of view from the viewpoint position, among the plurality of three-dimensional points constituting the three-dimensional point cloud represented by the three-dimensional point cloud data.
- The line-of-sight information generation device 1 uses three-dimensional point cloud data obtained by three-dimensional measurement; therefore, compared with the case of using an image captured by an image pickup apparatus, it can accurately present the line-of-sight status of the target equipment from the moving body 4.
- FIG. 2 is a diagram showing an example of the configuration of the line-of-sight information generation device according to the first embodiment.
- The line-of-sight information generation device 1 includes a communication unit 10, a storage unit 11, and a processing unit 12.
- The communication unit 10 is communicably connected to a network (not shown), and transmits and receives information to and from an external device such as the three-dimensional point cloud measuring device 2 via the network.
- The network (not shown) is, for example, a WAN (Wide Area Network) such as the Internet, or a LAN (Local Area Network).
- The storage unit 11 stores the three-dimensional point cloud data 20 transmitted from the three-dimensional point cloud measuring device 2 and received by the communication unit 10, and reference line data 21, which is data on a reference line along the moving path 3.
- The reference line data 21 may be generated by the processing unit 12 through analysis of the three-dimensional point cloud data 20 and stored in the storage unit 11, or may be transmitted from an external device, received by the communication unit 10, and stored in the storage unit 11 by the processing unit 12.
- The processing unit 12 can detect the moving path 3 from the three-dimensional point cloud indicated by the three-dimensional point cloud data 20 and use a line along the detected moving path 3 as the reference line.
- FIG. 3 is a diagram showing an example of the three-dimensional point cloud data according to the first embodiment.
- The three-dimensional point cloud data 20 includes data of a plurality of three-dimensional points.
- The data of each three-dimensional point includes data indicating the coordinates of the three-dimensional point in a three-dimensional Cartesian coordinate system.
- The three-dimensional point cloud data 20 shown in FIG. 3 includes, as data of individual three-dimensional points, data with coordinates "X1, Y1, Z1", data with coordinates "X2, Y2, Z2", data with coordinates "X3, Y3, Z3", and so on.
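- As a concrete illustration of the above, the three-dimensional point cloud data 20 can be pictured as a plain list of coordinate triples. The following minimal Python sketch shows one possible in-memory representation; the type name, field names, and numeric values are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    """One three-dimensional point in the 3D Cartesian coordinate system."""
    x: float
    y: float
    z: float

# Three-dimensional point cloud data 20: a collection of 3D points.
# The coordinate values stand in for "X1, Y1, Z1", "X2, Y2, Z2", ...
point_cloud_20 = [
    Point3D(1.0, 2.0, 0.5),
    Point3D(1.1, 2.4, 0.6),
    Point3D(1.2, 2.8, 0.7),
]
```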
- FIG. 4 is a diagram showing an example of the reference line data according to the first embodiment.
- The reference line data 21 includes data of a plurality of reference points.
- A reference point is a three-dimensional point on the reference line.
- The data of each reference point includes data indicating the coordinates of the reference point, a roll value, and an arrangement order.
- The coordinates of the reference point are the coordinates of the reference point in the three-dimensional Cartesian coordinate system.
- The roll value is data indicating the inclination of the moving path 3. The roll value can be regarded as data indicating the inclination of the moving body 4 when the moving body 4 is at the reference point on the moving path 3.
- The arrangement order is a value indicating the moving direction of the moving body 4: the direction from a reference point with a smaller arrangement order to a reference point with a larger arrangement order is the moving direction of the moving body 4.
- The reference line data 21 shown in FIG. 4 includes, as data of individual reference points, data with coordinates "BX1, BY1, BZ1", roll value "R1", and arrangement order "1"; data with coordinates "BX2, BY2, BZ2", roll value "R2", and arrangement order "2"; data with coordinates "BX3, BY3, BZ3", roll value "R3", and arrangement order "3"; and so on.
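- Likewise, the reference line data 21 can be pictured as a list of reference points carrying coordinates, a roll value, and an arrangement order. A minimal sketch, with illustrative names and values only:

```python
from dataclasses import dataclass

@dataclass
class ReferencePoint:
    """One reference point on the reference line along the moving path 3."""
    x: float
    y: float
    z: float      # coordinates in the 3D Cartesian coordinate system
    roll: float   # inclination of the moving path 3 at this point
    order: int    # arrangement order; increasing order = moving direction

# Reference line data 21: reference points sorted by arrangement order.
reference_line_21 = [
    ReferencePoint(0.0, 0.0, 0.0, 0.010, 1),  # "BX1, BY1, BZ1", "R1", 1
    ReferencePoint(0.0, 1.0, 0.0, 0.012, 2),  # "BX2, BY2, BZ2", "R2", 2
    ReferencePoint(0.0, 2.0, 0.0, 0.011, 3),  # "BX3, BY3, BZ3", "R3", 3
]
```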
- The processing unit 12 shown in FIG. 2 includes a data acquisition unit 30, a line-of-sight information generation unit 31, and an output processing unit 32.
- The data acquisition unit 30 acquires the three-dimensional point cloud data 20 transmitted from the three-dimensional point cloud measuring device 2 and received by the communication unit 10, and stores the acquired three-dimensional point cloud data 20 in the storage unit 11.
- The data acquisition unit 30 also acquires the three-dimensional point cloud data 20 and the reference line data 21 from the storage unit 11, and notifies the line-of-sight information generation unit 31 of the acquired three-dimensional point cloud data 20 and reference line data 21.
- The line-of-sight information generation unit 31 generates line-of-sight information, which is information indicating the line-of-sight status of the target equipment from a viewpoint position away from the target equipment in the peripheral area of the moving path 3, based on the three-dimensional point cloud data 20 acquired by the data acquisition unit 30.
- The line-of-sight information generation unit 31 includes a viewpoint position determination unit 40, a determination area determination unit 41, and a generation processing unit 42.
- The viewpoint position determination unit 40 determines the viewpoint position based on the distance from the target equipment and the inclination of the moving path 3. For example, the viewpoint position determination unit 40 determines the viewpoint position based on the distance from the target equipment along the reference line indicated by the reference line data 21.
- The viewpoint position is, for example, the standard eye position of the driver who drives the moving body 4.
- FIG. 5 is a diagram showing an example of a plurality of reference points on the reference line indicated by the reference line data according to the first embodiment. As shown in FIG. 5, the plurality of reference points on the reference line Lr indicated by the reference line data 21 are arranged at the center between the pair of rails constituting the moving path 3 of the moving body 4.
- FIG. 6 is a diagram showing an example of the process of determining a position a designated distance in front of the target equipment position by the viewpoint position determination unit according to the first embodiment.
- In FIG. 6, TP is the position of the target equipment, hereinafter referred to as the target equipment position TP.
- The target equipment position TP is represented by the coordinates "TX, TY, TZ" in the three-dimensional Cartesian coordinate system.
- The viewpoint position determination unit 40 performs a first calculation process of calculating the position BT on the reference line Lr corresponding to the target equipment position TP, and a second calculation process of calculating the position BE on the reference line Lr that is separated from the position BT by a designated distance Dt, described later, along the reference line Lr.
- In the first calculation process, the viewpoint position determination unit 40 selects the reference point closest to the target equipment position TP and the next closest reference point as selection reference points. The viewpoint position determination unit 40 then calculates, as the position BT, the intersection of the plane that is orthogonal to the straight line connecting the two selection reference points and includes the target equipment position TP with the straight line connecting the two selection reference points.
- In the example shown in FIG. 6, the reference point closest to the target equipment position TP is the reference point P23, and the next closest reference point is the reference point P24.
- The viewpoint position determination unit 40 therefore calculates, as the position BT, the intersection of the plane that is orthogonal to the straight line connecting the reference points P23 and P24 and includes the target equipment position TP with the straight line connecting the reference points P23 and P24.
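- Geometrically, the intersection described above is the orthogonal projection of the target equipment position TP onto the straight line through the two selection reference points. A minimal sketch of this first calculation process, assuming NumPy and treating positions as 3-element vectors (function and variable names are ours, not the disclosure's):

```python
import numpy as np

def position_bt(tp, p_near, p_next):
    """Intersect the line through the two selection reference points with the
    plane that is orthogonal to that line and contains TP; equivalently,
    project TP orthogonally onto the line."""
    tp, p_near, p_next = (np.asarray(v, dtype=float) for v in (tp, p_near, p_next))
    d = p_next - p_near                        # direction of the line
    t = np.dot(tp - p_near, d) / np.dot(d, d)  # scalar position along the line
    return p_near + t * d                      # position BT

# e.g. TP beside the track, P23 and P24 one meter apart along the rail
bt = position_bt((3.0, 10.2, 0.0), (0.0, 10.0, 0.0), (0.0, 11.0, 0.0))
```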
- In the second calculation process, the viewpoint position determination unit 40 accumulates the distances between successive reference points in order starting from the position BT, finds the point on the reference line Lr at which the accumulated length reaches the designated distance Dt, and determines that point as the position BE. As a result, the position that is the designated distance Dt in front of the position BT along the reference line Lr is determined as the position BE.
- Specifically, the viewpoint position determination unit 40 first calculates the distance d1 between the position BT and the reference point P23.
- Next, the viewpoint position determination unit 40 calculates the distance d2 between the reference point P23 and the reference point P22, and adds the calculated distance d2 to the distance d1 to obtain an integrated distance D.
- Next, the viewpoint position determination unit 40 calculates the distance d3 between the reference point P22 and the reference point P21, and adds the calculated distance d3 to the integrated distance D to obtain a new integrated distance D.
- The viewpoint position determination unit 40 repeats the same process.
- The viewpoint position determination unit 40 then identifies the two reference points between which the integrated distance D first reaches or exceeds the designated distance Dt, and calculates the difference Δd between the designated distance Dt and the integrated distance D up to the reference point having the larger arrangement order of the two. The viewpoint position determination unit 40 determines, as the position BE, the position on the straight line connecting the two identified reference points whose distance from the reference point having the larger arrangement order is the difference Δd.
- In the example shown in FIG. 6, the position BE lies on the straight line connecting the two reference points P5 and P6; the viewpoint position determination unit 40 therefore determines, as the position BE, the position whose distance from the reference point P6, which has the larger arrangement order, is the difference Δd.
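- The second calculation process amounts to walking back along the reference line from BT, accumulating segment lengths until the designated distance Dt is reached, and interpolating on the final segment. A minimal sketch under that reading (names are illustrative):

```python
import numpy as np

def position_be(bt, points_behind, dt):
    """points_behind: reference points in decreasing arrangement order,
    starting with the one nearest BT (e.g. P23, P22, ..., P6, P5)."""
    prev = np.asarray(bt, dtype=float)
    integrated = 0.0                            # integrated distance D
    for p in (np.asarray(q, dtype=float) for q in points_behind):
        seg = float(np.linalg.norm(p - prev))
        if integrated + seg >= dt:              # Dt is reached on this segment
            delta = dt - integrated             # the difference Δd from prev
            return prev + (p - prev) * (delta / seg)  # position BE
        integrated += seg
        prev = p
    raise ValueError("reference line is shorter than the designated distance Dt")
```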
- Although the designated distance Dt is stored in advance in the storage unit 11, it may instead be received by the communication unit 10, acquired by the data acquisition unit 30 from the communication unit 10, and stored in the storage unit 11.
- Next, the viewpoint position determination unit 40 selects the reference point closest to the position BE and the next closest reference point. The viewpoint position determination unit 40 then determines the roll value of the position BE from the roll values of the two selected reference points.
- The roll value of a reference point is the data included in the above-mentioned reference line data 21.
- The viewpoint position determination unit 40 determines, for example, the roll value of the reference point having the larger arrangement order of the two reference points as the roll value of the position BE. Alternatively, the viewpoint position determination unit 40 can determine the roll value of the reference point having the smaller arrangement order of the two as the roll value of the position BE, or can determine the average of the roll values of the two reference points as the roll value of the position BE.
- The viewpoint position determination unit 40 calculates a plane that is orthogonal to the straight line connecting the two selected reference points and includes the position BE, and calculates, as a first straight line, the straight line on the calculated plane that includes the position BE and has an inclination corresponding to the roll value of the position BE.
- The viewpoint position determination unit 40 calculates a first position on the first straight line that is separated from the position BE by a distance L, and calculates a second position that is separated by a distance H along a second straight line that is orthogonal to the first straight line and includes the first position.
- The viewpoint position determination unit 40 determines the calculated second position as the viewpoint position EP.
- The viewpoint position EP is represented by the coordinates "EX, EY, EZ" in the three-dimensional Cartesian coordinate system.
- FIG. 7 is a diagram showing an example of the viewpoint position determined by the viewpoint position determination unit according to the first embodiment when the inclination of the first straight line is small.
- FIG. 8 is a diagram showing an example of the viewpoint position determined by the viewpoint position determination unit according to the first embodiment when the inclination of the first straight line is large.
- As shown in FIGS. 7 and 8, the viewpoint position determination unit 40 calculates the first position PB1, which is separated from the position BE by the distance L on the first straight line SL1 that includes the position BE and has the inclination of the roll value of the position BE, and calculates the second position PB2, which is separated by the distance H on the second straight line SL2 that is orthogonal to the first straight line SL1 and includes the first position PB1.
- The second straight line SL2 is a straight line on the plane that is orthogonal to the first straight line SL1 and includes the first position PB1.
- The viewpoint position determination unit 40 determines the calculated second position PB2 as the viewpoint position EP. Although the distance L and the distance H are stored in advance in the storage unit 11, they may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
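- Putting the pieces together, the viewpoint position EP can be computed from the position BE, the track direction at BE, the roll value, and the offsets L and H. The sketch below assumes a z-up coordinate system, treats the roll value as an angle in radians, and assumes the track direction is not vertical; it is one plausible reading, not the disclosure's exact formulation:

```python
import numpy as np

def viewpoint_ep(be, track_dir, roll, L, H):
    """In the plane orthogonal to the track direction at BE, move a distance L
    along the first straight line SL1 (the lateral axis tilted by the roll
    value) to PB1, then a distance H along the second straight line SL2
    orthogonal to SL1, giving PB2 = viewpoint position EP."""
    be = np.asarray(be, dtype=float)
    t = np.asarray(track_dir, dtype=float)
    t /= np.linalg.norm(t)
    lateral = np.cross((0.0, 0.0, 1.0), t)      # horizontal, orthogonal to track
    lateral /= np.linalg.norm(lateral)
    up = np.cross(t, lateral)                   # completes the local frame
    u1 = np.cos(roll) * lateral + np.sin(roll) * up  # direction of SL1
    u2 = np.cross(t, u1)                             # direction of SL2 (⊥ SL1)
    pb1 = be + L * u1                           # first position PB1
    return pb1 + H * u2                         # second position PB2 = EP
```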
- The reference line data 21 is not limited to the example shown in FIG. 4, and may be data that does not include the roll values of the reference points.
- In that case, the reference line data 21 is data of a pair of reference lines Lr corresponding to the pair of rails, and includes the coordinates and arrangement order of a plurality of reference points on each reference line Lr.
- The viewpoint position determination unit 40 determines the position BE from the plurality of reference points on one reference line Lr of the pair, and selects the reference point closest to the position BE and the next closest reference point as selection reference points.
- The viewpoint position determination unit 40 determines, as the position OBP, the position at which a straight line orthogonal to the straight line connecting the two selected reference points intersects one of the straight lines connecting adjacent pairs of reference points on the other reference line Lr.
- The viewpoint position determination unit 40 then determines the straight line connecting the position BE and the position OBP as the above-described first straight line SL1.
- FIG. 9 is a diagram showing another example of the process of determining a position a designated distance in front of the target equipment position by the viewpoint position determination unit according to the first embodiment.
- FIG. 10 is a diagram showing another example of the viewpoint position determination method by the viewpoint position determination unit according to the first embodiment.
- In the example shown in FIGS. 9 and 10, the viewpoint position determination unit 40 selects the reference point P5 closest to the position BE and the next closest reference point P6.
- The viewpoint position determination unit 40 determines, as the position OBP, the position at which the straight line orthogonal to the straight line connecting the two reference points P5 and P6 intersects the straight line connecting the two adjacent reference points P5' and P6' on the other reference line Lr.
- The viewpoint position determination unit 40 determines the straight line connecting the position BE and the position OBP as the above-described first straight line SL1.
- The viewpoint position determination unit 40 calculates the first position PB1 on the first straight line SL1 that is separated from the position BE by the distance L, and calculates the second position PB2 that is separated by the distance H on the second straight line SL2 that is orthogonal to the first straight line SL1 and includes the first position PB1.
- The viewpoint position determination unit 40 determines the calculated second position PB2 as the viewpoint position EP.
- The determination area determination unit 41 determines a preset area around the straight line connecting the target equipment position TP and the viewpoint position EP as a determination target area.
- FIG. 11 is a diagram showing an example of the determination target area determined by the determination area determination unit according to the first embodiment.
- As shown in FIG. 11, the determination area determination unit 41 determines a preset area around the straight line SL3 connecting the target equipment position TP and the viewpoint position EP as the determination target area AR.
- The determination target area AR shown in FIG. 11 is a cylindrical region of radius r whose center line is the straight line SL3 connecting the target equipment position TP and the viewpoint position EP.
- The determination target area AR determined by the determination area determination unit 41 is not limited to the example shown in FIG. 11.
- The determination area determination unit 41 may determine a polygonal columnar region as the determination target area AR instead of the cylindrical region, or may determine an elliptic cylindrical region as the determination target area AR.
- FIG. 12 is a diagram showing another example of the method of determining the determination target area by the determination area determination unit according to the first embodiment.
- In the example shown in FIG. 12, the determination target area AR determined by the determination area determination unit 41 is a square columnar region whose center line is the straight line SL3 connecting the target equipment position TP and the viewpoint position EP, and is a region surrounded by two surfaces AS1 and AS2 parallel to the first straight line SL1 and two surfaces AS3 and AS4 parallel to the second straight line SL2.
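- For the cylindrical case of FIG. 11, deciding whether a three-dimensional point belongs to the determination target area AR reduces to a point-to-segment distance test against the straight line SL3. A minimal sketch, assuming NumPy (names are ours):

```python
import numpy as np

def in_cylindrical_ar(point, ep, tp, r):
    """True if `point` lies inside the cylindrical determination target area AR
    of radius r whose center line is the straight line SL3 from EP to TP."""
    p, a, b = (np.asarray(v, dtype=float) for v in (point, ep, tp))
    axis = b - a
    t = np.dot(p - a, axis) / np.dot(axis, axis)
    if not 0.0 <= t <= 1.0:          # outside the span between EP and TP
        return False
    foot = a + t * axis              # foot of the perpendicular on SL3
    return float(np.linalg.norm(p - foot)) <= r

# e.g. keep only the points of a cloud that fall inside AR:
# points_in_ar = [q for q in point_cloud if in_cylindrical_ar(q, ep, tp, r)]
```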
- FIG. 13 is a diagram showing an example of the relationship between the three-dimensional point cloud indicated by the three-dimensional point cloud data according to the first embodiment and the determination target area.
- FIG. 14 is a diagram showing an example of the determination target area as seen from the traveling direction of the moving body according to the first embodiment.
- The example shown in FIG. 13 shows the three-dimensional point cloud and the determination target area AR viewed from above.
- The example shown in FIG. 14 shows the determination target area AR viewed from the vicinity of the viewpoint position EP; in FIG. 14, the determination target area AR is the area surrounded by the broken line.
- In the line-of-sight information generation device 1, a preset area around the straight line connecting the target equipment position TP and the viewpoint position EP is thus determined as the determination target area AR.
- The generation processing unit 42 generates the line-of-sight information based on the three-dimensional point cloud data 20 acquired by the data acquisition unit 30 and the determination target area AR determined by the determination area determination unit 41.
- The line-of-sight information includes, for example, when there are three-dimensional points included in the determination target area AR among the plurality of three-dimensional points constituting the three-dimensional point cloud indicated by the three-dimensional point cloud data 20, information corresponding to those three-dimensional points.
- The generation processing unit 42 includes a target area image generation unit 50, a line-of-sight determination unit 51, and an interfering object image generation unit 52.
- The target area image generation unit 50 generates, as line-of-sight information, information including a two-dimensional image of the determination target area AR obtained by projecting the three-dimensional points included in the determination target area AR, among the three-dimensional point cloud indicated by the three-dimensional point cloud data 20, onto a reference plane.
- FIG. 15 is a diagram showing an example of interfering objects existing in the determination target area according to the first embodiment.
- In the example shown in FIG. 15, an interfering object I1 and an interfering object I2 are partially included in the determination target area AR.
- The interfering object I1 is an overhead wire pillar, and the interfering object I2 is a tree. Therefore, in the example shown in FIG. 15, the determination target area AR includes part of the three-dimensional point cloud representing the interfering object I1 and part of the three-dimensional point cloud representing the interfering object I2.
- The target area image generation unit 50 generates a two-dimensional image of the determination target area AR by projecting the three-dimensional points included in the determination target area AR, among the three-dimensional point cloud indicated by the three-dimensional point cloud data 20, each with a size of PL1 [mm²], onto a plane perpendicular to the straight line SL3 shown in FIG. 11. Further, the target area image generation unit 50 generates another two-dimensional image of the determination target area AR by projecting the three-dimensional points included in the determination target area AR, each with a size of PL1 [mm²], onto a horizontal plane.
- Although PL1 is stored in advance in the storage unit 11, it may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
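- Projecting the extracted points onto the plane perpendicular to the straight line SL3 yields the two-dimensional points of the first image; each point is then drawn with the size PL1. The sketch below assumes a z-up frame and that SL3 is not vertical; the basis construction is our choice, not the disclosure's:

```python
import numpy as np

def project_onto_plane_perpendicular_to_sl3(points, ep, tp):
    """Return 2D coordinates (relative to EP) of 3D points projected onto the
    plane perpendicular to SL3, as used for the first two-dimensional image 60."""
    a, b = np.asarray(ep, dtype=float), np.asarray(tp, dtype=float)
    axis = b - a
    axis /= np.linalg.norm(axis)                # unit vector along SL3
    u = np.cross((0.0, 0.0, 1.0), axis)         # first in-plane basis vector
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)                       # second in-plane basis vector
    rel = np.asarray(points, dtype=float) - a
    return np.stack([rel @ u, rel @ v], axis=1) # one 2D point per 3D point
```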
- FIG. 16 is a diagram showing an example of the first two-dimensional image of the determination target area generated by the target area image generation unit according to the first embodiment.
- The first two-dimensional image 60 shown in FIG. 16 is a two-dimensional image of the determination target area AR obtained by projecting the three-dimensional points included in the determination target area AR onto the plane perpendicular to the straight line SL3 shown in FIG. 11.
- The first two-dimensional image 60 includes the three-dimensional points included in the determination target area AR and projected as two-dimensional points onto the reference plane, the viewpoint position EP, a circular frame line of radius r1 centered on the viewpoint position EP, a circular frame line of radius r2 centered on the viewpoint position EP, and a circular frame line of radius r centered on the viewpoint position EP.
- The circular frame line of radius r indicates the outer edge of the determination target area AR.
- The radius r1 is shorter than the radius r and the radius r2, and the radius r2 is shorter than the radius r.
- The radius r1 is, for example, 1/3 of the radius r, and the radius r2 is, for example, 2/3 of the radius r.
- The first two-dimensional image 60 is an image showing whether an interfering object is present when a crew member of the moving body 4 looks toward the target equipment with the moving body 4 located the designated distance Dt in front of the target equipment.
- In the first two-dimensional image 60, the three-dimensional points included in the determination target area AR are arranged as two-dimensional points within the circle of radius r centered on the viewpoint position EP. Therefore, the first two-dimensional image 60 can accurately represent the line-of-sight status of the target equipment from the viewpoint position EP.
- The first two-dimensional image 60 described above is an example for the case where the determination target area AR is a cylindrical region; when the determination target area AR is a square columnar region, the target area image generation unit 50 likewise generates the first two-dimensional image 60.
- FIG. 17 is a diagram showing another example of the first two-dimensional image of the determination target area generated by the target area image generation unit according to the first embodiment.
- The first two-dimensional image 60 shown in FIG. 17 includes the three-dimensional points included in the determination target area AR, which in this case is a square columnar region, projected as two-dimensional points onto the reference plane, the viewpoint position EP, and square frame lines of different sizes centered on the viewpoint position EP.
- Since the central region of the determination target area AR can be set near the line of sight of both eyes of the driver, the line-of-sight status can be accurately represented in a situation closer to reality.
- FIG. 18 is a diagram showing an example of the second two-dimensional image of the determination target area generated by the target area image generation unit according to the first embodiment.
- The second two-dimensional image 61 shown in FIG. 18 is a two-dimensional image obtained by projecting the three-dimensional points included in the determination target area AR, among the three-dimensional point cloud indicated by the three-dimensional point cloud data 20, onto a horizontal plane.
- The second two-dimensional image 61 includes the three-dimensional points included in the determination target area AR and projected as two-dimensional points onto the reference plane, a rectangular frame, the viewpoint position EP, the target equipment position TP, and the designated distance Dt, which is the distance between the viewpoint position EP and the target equipment position TP.
- The second two-dimensional image 61 is a two-dimensional image of the determination target area AR viewed from above, and can indicate how far an interfering object is from the viewpoint position EP.
- The target area image generation unit 50 can also generate, as the first two-dimensional image 60 and the second two-dimensional image 61, two-dimensional images in which the three-dimensional points included in the determination target area AR are projected onto the reference plane and color-coded according to their distance from the straight line SL3.
- FIG. 19 is a diagram showing an example of the first two-dimensional image of the determination target area in which the three-dimensional points included in the determination target area are color-coded according to position by the target area image generation unit according to the first embodiment.
- Similar to the first two-dimensional image 60 shown in FIG. 16, the first two-dimensional image 60 shown in FIG. 19 includes the three-dimensional points included in the determination target area AR and projected as two-dimensional points onto the reference plane, the viewpoint position EP, a circle of radius r1, a circle of radius r2, and a circle of radius r.
- In the first two-dimensional image 60 shown in FIG. 19, the three-dimensional points included in the determination target area AR are color-coded according to their distance from the straight line SL3 including the viewpoint position EP.
- In FIG. 19, the difference in color is represented by the difference in the fill pattern of the circles representing the three-dimensional points.
- In the example shown in FIG. 19, the three-dimensional points in the region inside the circle of radius r1, the three-dimensional points outside the circle of radius r1 and inside the circle of radius r2, and the three-dimensional points outside the circle of radius r2 and inside the circle of radius r are represented by different colors.
- FIG. 20 is a diagram showing an example of the second two-dimensional image of the determination target area in which the three-dimensional points included in the determination target area are color-coded according to position by the target area image generation unit according to the first embodiment. Similar to the second two-dimensional image 61 shown in FIG. 18, the second two-dimensional image 61 shown in FIG. 20 includes the three-dimensional points included in the determination target area AR and projected as two-dimensional points onto the reference plane, a rectangular frame, the viewpoint position EP, the target equipment position TP, and the designated distance Dt.
- In the second two-dimensional image 61 shown in FIG. 20, the three-dimensional points included in the determination target area AR and projected as two-dimensional points onto the reference plane are color-coded according to their distance from the straight line SL3 including the viewpoint position EP.
- When the determination target area AR is a square columnar region, the target area image generation unit 50 similarly generates the first two-dimensional image 60 and the second two-dimensional image 61 of the determination target area AR in which the three-dimensional points projected as two-dimensional points onto the reference plane are color-coded according to their distance from the straight line SL3.
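- The color-coding itself only needs the distance of each three-dimensional point from the straight line SL3, banded by the radii r1, r2, and r. A minimal sketch of one such banding (the band indices stand in for the colors of FIGS. 19 and 20):

```python
import numpy as np

def color_band(point, ep, tp, r1, r2):
    """Return 0, 1, or 2 for the three differently colored regions:
    inside radius r1, between r1 and r2, and beyond r2 (up to r)."""
    p, a, b = (np.asarray(v, dtype=float) for v in (point, ep, tp))
    axis = b - a
    t = np.dot(p - a, axis) / np.dot(axis, axis)
    dist = float(np.linalg.norm(p - (a + t * axis)))  # distance from SL3
    if dist <= r1:
        return 0
    return 1 if dist <= r2 else 2
```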
- The line-of-sight determination unit 51 determines the degree of visibility based on the distribution of the three-dimensional points included in the determination target area AR.
- The degree of visibility indicates how well the target equipment can be seen from the viewpoint position EP.
- Information indicating the degree of visibility determined by the line-of-sight determination unit 51 is added to the line-of-sight information by the target area image generation unit 50.
- The line-of-sight determination unit 51 generates a two-dimensional image of the determination target area AR by, for example, projecting the three-dimensional points included in the determination target area AR, among the three-dimensional point cloud indicated by the three-dimensional point cloud data 20, each with a size of PL2 [mm²], onto a horizontal plane.
- Although PL2 is stored in advance in the storage unit 11, it may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
- The line-of-sight determination unit 51 calculates, as a total area, the total of the areas of the three-dimensional points included in the two-dimensional image of the determination target area AR, and calculates an area ratio Sr, which is the ratio of the total area to the area of the determination target area AR.
- The line-of-sight determination unit 51 determines the line-of-sight status according to the magnitude of the area ratio Sr. For example, the line-of-sight determination unit 51 determines that the visibility is good when 0 ≤ Sr < Sth1, that there is some interference but the target equipment can be seen when Sth1 ≤ Sr < Sth2, and that the visibility is poor when Sth2 ≤ Sr.
- Sth1 and Sth2 are threshold values.
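- The three-way determination is a simple thresholding of the area ratio Sr. A minimal sketch; the return strings paraphrase the three states described above, and Sth1 and Sth2 are the thresholds:

```python
def judge_visibility(total_point_area, ar_area, sth1, sth2):
    """Classify the line-of-sight status from the area ratio Sr."""
    sr = total_point_area / ar_area          # area ratio Sr
    if sr < sth1:                            # 0 <= Sr < Sth1
        return "good visibility"
    if sr < sth2:                            # Sth1 <= Sr < Sth2
        return "some interference, but the target equipment is visible"
    return "poor visibility"                 # Sth2 <= Sr
```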
- The line-of-sight determination unit 51 can also weight the determination of the line-of-sight status according to the distance from the straight line SL3 connecting the target equipment position TP and the viewpoint position EP.
- In this case, the region within the distance r1 from the straight line SL3 is defined as a first region, the region obtained by excluding the first region from the region within the distance r2 from the straight line SL3 is defined as a second region, and the region obtained by excluding the first region and the second region from the region within the distance r from the straight line SL3 is defined as a third region.
- Hereinafter, the first region is referred to as the first region AR1, the second region as the second region AR2, and the third region as the third region AR3.
- The line-of-sight determination unit 51 calculates a first area value by multiplying the total area of the three-dimensional points included in the first region AR1 of the two-dimensional image of the determination target area AR by a coefficient k1, and calculates a second area value by multiplying the total area of the three-dimensional points included in the second region AR2 of the two-dimensional image of the determination target area AR by a coefficient k2.
- The line-of-sight determination unit 51 also calculates a third area value by multiplying the total area of the three-dimensional points included in the third region AR3 of the two-dimensional image of the determination target area AR by a coefficient k3.
- The line-of-sight determination unit 51 can use the value obtained by summing the first area value, the second area value, and the third area value as the total area in the above-mentioned area ratio Sr.
- The coefficients k1, k2, and k3 have the relationship k1 > k2 > k3. Although they are stored in advance in the storage unit 11, they may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
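- Under one plausible reading of the weighting above, the per-region point areas are scaled by k1 > k2 > k3 before forming the ratio, so interference near the line of sight dominates. A sketch with placeholder coefficient values:

```python
def weighted_area_ratio(area_ar1, area_ar2, area_ar3, ar_area,
                        k1=1.0, k2=0.6, k3=0.3):
    """Weighted area ratio Sr: total point areas in the first, second, and
    third regions (AR1, AR2, AR3) weighted by k1 > k2 > k3. The coefficient
    values here are illustrative, not from the disclosure."""
    weighted_total = k1 * area_ar1 + k2 * area_ar2 + k3 * area_ar3
    return weighted_total / ar_area

# e.g. sr = weighted_area_ratio(0.02, 0.05, 0.10, 1.0), then threshold as above
```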
- Further, the line-of-sight determination unit 51 may project, for example, the three-dimensional points included in the first region AR1 onto the horizontal plane each with a size of PL2 × k4 [mm²], the three-dimensional points included in the second region AR2 onto the horizontal plane each with a size of PL2 × k5 [mm²], and the three-dimensional points included in the third region AR3 onto the horizontal plane each with a size of PL2 × k6 [mm²].
- k4, k5, and k6 are coefficients with the relationship k4 > k5 > k6.
- The line-of-sight determination unit 51 calculates, as the total area, the total of the areas of the three-dimensional points included in the determination target area AR, and calculates the area ratio Sr, which is the ratio of the total area to the area of the determination target area AR. The line-of-sight determination unit 51 then determines the line-of-sight status according to the magnitude of the area ratio Sr by the same method as the determination method described above.
- By weighting the three-dimensional points so that the weight decreases as the distance from the straight line SL3 increases, the line-of-sight determination unit 51 treats an interfering object closer to the line of sight of the crew member looking at the target equipment as an element that makes the line-of-sight status worse, and can therefore make a more accurate determination.
- In the above description, the line-of-sight determination unit 51 divides the determination target area AR projected onto the horizontal plane into three regions to determine the line-of-sight status, but it may instead divide the determination target area AR projected onto the horizontal plane into two regions, or into four or more regions, to determine the line-of-sight status.
- The interfering object image generation unit 52 generates, as an interfering object image, an image of the three-dimensional point cloud in which the three-dimensional points included in the determination target area AR, among the three-dimensional point cloud indicated by the three-dimensional point cloud data 20, are colored.
- The interfering object image generated by the interfering object image generation unit 52 is added to the line-of-sight information by the target area image generation unit 50.
- The method of coloring the three-dimensional points by the interfering object image generation unit 52 is the same as the method of coloring the three-dimensional points by the target area image generation unit 50: the interfering object image generation unit 52 color-codes the three-dimensional points included in the determination target area AR according to their distance from the straight line SL3.
- That is, the interfering object image generation unit 52 represents the three-dimensional points in the region inside the circle of radius r1, the three-dimensional points outside the circle of radius r1 and inside the circle of radius r2, and the three-dimensional points outside the circle of radius r2 in different colors.
- FIG. 21 is a diagram showing an example of the interfering object image generated by the interfering object image generation unit according to the first embodiment, and FIG. 22 is a diagram showing another example of the interfering object image generated by the interfering object image generation unit according to the first embodiment.
- As shown in FIGS. 21 and 22, the interfering object image generation unit 52 generates, as interfering object images 70 and 71, images of the three-dimensional point cloud including the three-dimensional points of an interfering object viewed from a position close to the interfering object.
- The interfering object image 70 shown in FIG. 21 includes an image of the three-dimensional point cloud representing the interfering object I1; among the plurality of three-dimensional points constituting the three-dimensional point cloud representing the interfering object I1, the three-dimensional points included in the determination target area AR are colored.
- The interfering object image 70 can thus indicate in what manner the interfering object I1 interferes with the determination target area AR.
- The interfering object image 71 shown in FIG. 22 includes an image of the three-dimensional point cloud representing the interfering object I2; among the plurality of three-dimensional points constituting the three-dimensional point cloud representing the interfering object I2, the three-dimensional points included in the determination target area AR are colored.
- The interfering object image 71 can thus indicate in what manner the interfering object I2 interferes with the determination target area AR.
- In FIGS. 21 and 22, the three-dimensional points included in the determination target area AR are displayed enlarged, and these three-dimensional points are colored.
- The difference in color is represented by the difference in the fill pattern of the circles representing the three-dimensional points.
- The output processing unit 32 shown in FIG. 2 outputs the line-of-sight information generated by the line-of-sight information generation unit 31.
- The output processing unit 32 outputs the line-of-sight information by, for example, causing the communication unit 10 to transmit the line-of-sight information generated by the line-of-sight information generation unit 31 to an external device via the network (not shown). The output processing unit 32 can also output the line-of-sight information by, for example, displaying the line-of-sight information generated by the line-of-sight information generation unit 31 on a display device 5.
- The line-of-sight information generated by the line-of-sight information generation unit 31 includes at least one of the first two-dimensional image 60 and the second two-dimensional image 61 generated by the target area image generation unit 50, the line-of-sight status determined by the line-of-sight determination unit 51, and the interfering object images 70 and 71 generated by the interfering object image generation unit 52.
- The output processing unit 32 can also output line-of-sight information including all of the first two-dimensional image 60 and the second two-dimensional image 61 generated by the target area image generation unit 50, the line-of-sight status determined by the line-of-sight determination unit 51, and the interfering object images 70 and 71 generated by the interfering object image generation unit 52.
- FIG. 23 is a diagram showing an example of the line-of-sight information according to the first embodiment.
- The line-of-sight information 80 shown in FIG. 23 includes the first two-dimensional image 60, the second two-dimensional image 61, the interfering object image 70, the interfering object image 71, a confirmation point 82, and a determination result 83 of the line-of-sight status.
- The confirmation point 82 indicates the designated distance Dt, that is, the distance along the moving path 3 between the viewpoint position EP and the target equipment position TP.
- The determination result 83 of the line-of-sight status is information indicating the degree of visibility determined by the line-of-sight determination unit 51.
- Position information indicating the positions of the three-dimensional points included in the determination target area AR, among the three-dimensional points of the interfering object I1, is arranged at a position associated with the interfering object image 70.
- Similarly, position information indicating the positions of the three-dimensional points included in the determination target area AR, among the three-dimensional points of the interfering object I2, is arranged at a position associated with the interfering object image 71.
- Such position information is information detected by the line-of-sight information generation unit 31.
- Since the line-of-sight information 80 includes the first two-dimensional image 60, the second two-dimensional image 61, the interfering object image 70, the interfering object image 71, and the determination result 83 of the line-of-sight status, a worker who checks the line-of-sight information 80 can accurately grasp the line-of-sight status.
- FIG. 24 is a flowchart showing an example of processing by the processing unit of the line-of-sight information generation device according to the first embodiment.
- The processing unit 12 of the line-of-sight information generation device 1 acquires the three-dimensional point cloud data 20 from the storage unit 11 or the communication unit 10 (step S10).
- Next, the processing unit 12 performs the viewpoint position determination process of determining the viewpoint position EP (step S11).
- The process of step S11 is the process of steps S20 to S22 shown in FIG. 25, and will be described in detail later.
- Next, the processing unit 12 determines the determination target area based on the viewpoint position EP determined in step S11 (step S12). The processing unit 12 then performs the line-of-sight information generation process of generating the line-of-sight information 80 based on the three-dimensional point cloud data 20 acquired in step S10 and the determination target area determined in step S12 (step S13).
- The process of step S13 is the process of steps S30 to S34 shown in FIG. 26, and will be described in detail later.
- Next, the processing unit 12 outputs the line-of-sight information generated in step S13 (step S14), and ends the processing shown in FIG. 24.
- FIG. 25 is a flowchart showing an example of the viewpoint position determination process by the processing unit of the line-of-sight information generation device according to the first embodiment.
- The processing unit 12 calculates the position BT on the reference line Lr corresponding to the target equipment position TP (step S20).
- Next, the processing unit 12 calculates the position BE on the reference line Lr separated from the position BT by the designated distance Dt along the reference line Lr (step S21), calculates the viewpoint position EP based on the calculated position BE (step S22), and ends the process shown in FIG. 25.
- FIG. 26 is a flowchart showing an example of the line-of-sight information generation process by the processing unit of the line-of-sight information generation device according to the first embodiment.
- The processing unit 12 projects the three-dimensional points included in the determination target area AR onto the reference plane (step S30). The processing unit 12 then color-codes each three-dimensional point projected onto the reference plane according to its distance from the straight line SL3 to generate the two-dimensional images of the determination target area AR (step S31).
- Next, the processing unit 12 determines the degree of visibility based on the distribution of the three-dimensional points included in the determination target area AR (step S32). Further, the processing unit 12 generates, as the interfering object images 70 and 71, images of the three-dimensional point cloud in which the three-dimensional points included in the determination target area AR are colored (step S33). The processing unit 12 then outputs the line-of-sight information 80 (step S34), and ends the processing of FIG. 26.
- FIG. 27 is a diagram showing an example of the hardware configuration of the line-of-sight information generation device according to the first embodiment.
- The line-of-sight information generation device 1 includes a computer having a processor 101, a memory 102, and a communication device 103.
- The processor 101, the memory 102, and the communication device 103 can exchange information with one another via, for example, a bus 104.
- The storage unit 11 is realized by the memory 102.
- The communication unit 10 is realized by the communication device 103.
- The processor 101 executes the functions of the data acquisition unit 30, the line-of-sight information generation unit 31, the output processing unit 32, and the like by reading and executing a program stored in the memory 102.
- The processor 101 is an example of a processing circuit, and includes one or more of a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a system LSI (Large Scale Integration).
- The memory 102 includes one or more of RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), and EEPROM (registered trademark) (Electrically Erasable Programmable Read Only Memory). The memory 102 also includes a recording medium on which a computer-readable program is recorded. Such a recording medium includes one or more of a nonvolatile or volatile semiconductor memory, a magnetic disk, a flexible disk, an optical disc, a compact disc, and a DVD (Digital Versatile Disc).
- The line-of-sight information generation device 1 may also include an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
- the line-of-sight information generation device 1 includes a data acquisition unit 30, a line-of-sight information generation unit 31, and an output processing unit 32.
- the data acquisition unit 30 acquires three-dimensional point cloud data 20 that represents an object in the peripheral region of the movement path 3 to which the moving body 4 moves as a three-dimensional point cloud.
- Based on the three-dimensional point cloud data 20 acquired by the data acquisition unit 30, the line-of-sight information generation unit 31 generates the line-of-sight information 80, which is information indicating the line-of-sight status of the target equipment from the viewpoint position EP away from the target equipment in the peripheral region.
- The output processing unit 32 outputs the line-of-sight information 80 generated by the line-of-sight information generation unit 31.
- As a result, the line-of-sight information generation device 1 can accurately present the line-of-sight status of the target equipment from the moving body 4 moving on the moving path 3.
- the data acquisition unit 30 acquires the reference line data 21 which is the data of the reference line Lr along the movement path 3.
- the line-of-sight information generation unit 31 includes a viewpoint position determination unit 40 that determines the viewpoint position EP based on the distance from the target equipment along the reference line Lr indicated by the reference line data 21.
- As a result, the line-of-sight information generation device 1 can, for example, automatically set a position separated from the target equipment by the designated distance Dt as the viewpoint position EP, and can accurately present the line-of-sight status of the target equipment from a point a certain distance before it.
- the reference line data 21 includes data indicating the inclination of the moving path 3 or data for calculating the inclination of the moving path 3.
- the viewpoint position determination unit 40 determines the viewpoint position EP based on the distance from the target equipment and the inclination of the movement path 3. As a result, the line-of-sight information generation device 1 can accurately present the line-of-sight status of the target equipment from the viewpoint of the crew of the moving body 4, for example.
- the line-of-sight information generation unit 31 includes a determination area determination unit 41 and a generation processing unit 42.
- the determination area determination unit 41 determines a preset area around the straight line SL3 connecting the target equipment position TP and the viewpoint position EP as the determination target area AR.
- the generation processing unit 42 generates the line-of-sight information 80 based on the three-dimensional point cloud data 20 and the determination target area AR. Thereby, the line-of-sight information generation device 1 can accurately present, for example, the line-of-sight status of the target equipment in a preset area around the straight line SL3 connecting the target equipment position TP and the viewpoint position EP.
- the generation processing unit 42 includes a target area image generation unit 50.
- The target area image generation unit 50 generates information including a two-dimensional image of the determination target area AR obtained by projecting the three-dimensional points included in the determination target area AR, among the plurality of three-dimensional points constituting the three-dimensional point cloud represented by the three-dimensional point cloud data 20, onto a reference plane. As a result, the line-of-sight information generation device 1 can present the line-of-sight status of the target equipment as a two-dimensional image.
- the reference plane includes a plane perpendicular to the straight line SL3.
- the line-of-sight information generation device 1 can present the line-of-sight status of the target equipment from the viewpoint position EP as a two-dimensional image.
- the reference plane includes a horizontal plane.
- The line-of-sight information generation device 1 can present a two-dimensional image from which the distance from the viewpoint position EP in the three-dimensional space where the determination target area AR exists can be grasped.
- The target area image generation unit 50 generates, as the two-dimensional image of the determination target area AR, a two-dimensional image in which the three-dimensional points included in the determination target area AR are color-coded according to their distance from the straight line SL3.
- The line-of-sight information generation device 1 can thus present a two-dimensional image from which the line-of-sight status of the target equipment from the viewpoint position EP can be grasped from the colors of the three-dimensional points included in the image.
- The generation processing unit 42 includes a line-of-sight determination unit 51.
- The line-of-sight determination unit 51 determines the degree of visibility based on the distribution of the three-dimensional points included in the determination target area AR.
- The line-of-sight information 80 further includes information indicating the degree of visibility determined by the line-of-sight determination unit 51. As a result, the line-of-sight information generation device 1 can present information indicating the degree of visibility.
- the generation processing unit 42 includes an interfering object image generation unit 52.
- The interfering object image generation unit 52 generates images of the three-dimensional point cloud, with the three-dimensional points included in the determination target area AR colored, as the interfering object images 70 and 71.
- The line-of-sight information 80 further includes the interfering object images 70 and 71 generated by the interfering object image generation unit 52.
- As a result, the line-of-sight information generation device 1 can present the interfering object images 70 and 71, which are images of the three-dimensional point cloud with the three-dimensional points included in the determination target area AR colored.
- The determination area determination unit 41 determines a cylindrical area centered on the straight line SL3 as the determination target area AR. As a result, the line-of-sight information generation device 1 can set the determination target area AR simply and easily.
- The determination area determination unit 41 may instead determine a rectangular prism area centered on the straight line SL3 as the determination target area AR.
- In that case, the line-of-sight information generation device 1 can set the central area of the determination target area AR near the line of sight of the driver's eyes, so the line-of-sight status can be represented accurately in a situation closer to reality.
- The configuration shown in the above embodiment is an example; it can be combined with another known technique, and a part of the configuration can be omitted or changed without departing from the gist of the present disclosure.
- 1 line-of-sight information generation device, 2 three-dimensional point cloud measuring device, 3 moving path, 4 moving body, 5 display device, 10 communication unit, 11 storage unit, 12 processing unit, 20 three-dimensional point cloud data, 21 reference line data, 30 data acquisition unit, 31 line-of-sight information generation unit, 32 output processing unit, 40 viewpoint position determination unit, 41 determination area determination unit, 42 generation processing unit, 50 target area image generation unit, 51 line-of-sight determination unit, 52 interfering object image generation unit, 60 first two-dimensional image, 61 second two-dimensional image, 70, 71 interfering object image, 80 line-of-sight information, 82 confirmation point, 83 line-of-sight status determination result, 100 information providing system, 101 processor, 102 memory, 103 communication device, 104 bus, AR determination target area, AR1 first area, AR2 second area, AR3 third area, AS1, AS2, AS3, AS4 surface, BT position, D integrated distance, Dt designated distance, EP viewpoint position, H distance, I1, I2 interfering object, L distance, Lr reference line, P5, P5', P
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Image Analysis (AREA)
- Processing Or Creating Images (AREA)
Abstract
This visibility information generation device (1) comprises a data acquisition unit (30), a visibility information generation unit (31), and an output processing unit (32). The data acquisition unit (30) acquires three-dimensional point cloud data representing, as a three-dimensional point cloud, an object in the peripheral region of a travel path on which a mobile body travels. On the basis of the three-dimensional point cloud data acquired by the data acquisition unit (30), the visibility information generation unit (31) generates visibility information which is information indicating the visibility state of target equipment in the peripheral region from a viewpoint position distanced from the target equipment. The output processing unit (32) outputs the visibility information generated by the visibility information generation unit (31).
Description
The present disclosure relates to a line-of-sight information generation device, a line-of-sight information generation method, and a line-of-sight information generation program that generate line-of-sight information, which is information indicating the line-of-sight status of target equipment from a moving body moving on a moving path.
Conventionally, a special signal light emitter, which indicates by a light emission signal that a situation interfering with train operation has occurred, is known as railway wayside equipment installed along a railway line. The special signal light emitter is provided with an emergency stop button and outputs a light emission signal when, for example, the emergency stop button is pressed. Since the train crew must perform a stop operation upon confirming the light emission signal of the special signal light emitter, the crew needs to be able to confirm the light emission signal while the train is located at a point a certain distance or more before the special signal light emitter.
Along a railway line, there are generally wayside facilities such as buildings, signboards, and overhead wire pillars, as well as trees. When trees grow or additional wayside facilities are installed, the visibility of a special signal light emitter may deteriorate. It is therefore necessary to confirm in advance the line-of-sight status of the special signal light emitter from a point a certain distance or more before it; at present, this confirmation is performed visually by multiple workers during periodic inspections and the like, which requires a great deal of labor.
Therefore, Patent Document 1 proposes a special signal detection device that detects light emission of a special signal light emitter. The special signal detection device divides a continuous image captured by an imaging unit into image regions, extracts amplitude data with respect to luminance frequency from partial images composed of a plurality of divided frames, and detects the presence or absence of a light emission signal of the special signal light emitter based on the extraction result.
However, the technique described in Patent Document 1 detects the light emission signal of a special signal light emitter and notifies the train crew; it is not a technique for providing information indicating the line-of-sight status of target equipment from a point on a moving path a certain distance or more before the target equipment, as seen from a moving body moving on that path. With the technique described in Patent Document 1, if the visibility of the target equipment from such a point is poor, the target equipment may not be detectable from that point.
The present disclosure has been made in view of the above, and an object of the present disclosure is to obtain a line-of-sight information generation device capable of accurately presenting the line-of-sight status of target equipment from a moving body moving on a moving path.
In order to solve the above-described problems and achieve the object, the line-of-sight information generation device of the present disclosure includes a data acquisition unit, a line-of-sight information generation unit, and an output processing unit. The data acquisition unit acquires three-dimensional point cloud data representing objects in the peripheral region of the moving path on which the moving body moves as a three-dimensional point cloud. The line-of-sight information generation unit generates, based on the three-dimensional point cloud data acquired by the data acquisition unit, line-of-sight information, which is information indicating the line-of-sight status of the target equipment from a viewpoint position away from the target equipment in the peripheral region. The output processing unit outputs the line-of-sight information generated by the line-of-sight information generation unit.
According to the present disclosure, there is an effect that the line-of-sight status of target equipment from a moving body moving on a moving path can be presented accurately.
Hereinafter, the line-of-sight information generation device, the line-of-sight information generation method, and the line-of-sight information generation program according to an embodiment will be described in detail with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram showing an example of the configuration of an information providing system including the line-of-sight information generation device according to the first embodiment. As shown in FIG. 1, the information providing system 100 according to the first embodiment includes the line-of-sight information generation device 1 and a three-dimensional point cloud measuring device 2 that is mounted on a moving body 4 moving on a moving path 3 and measures the peripheral region of the moving path 3.
In the example shown in FIG. 1, the moving path 3 is a railroad track, and the moving body 4 is a train traveling on the railroad track. The moving path 3 may be a path other than a railroad track, such as a road, and the moving body 4 may be a moving body other than a train, such as an automobile traveling on a road. In the following, an example in which the three-dimensional point cloud measuring device 2 is mounted on a train will be described.
The three-dimensional point cloud measuring device 2 measures the peripheral region of the moving path 3 while the moving body 4 is moving on the moving path 3, and generates three-dimensional point cloud data representing the three-dimensional shapes of objects existing in the peripheral region of the moving path 3 as a three-dimensional point cloud. The three-dimensional point cloud measuring device 2 includes, for example, a laser scanner or a light-section sensor. The laser scanner irradiates an object with a laser beam, measures the time until the laser beam reflected by the object returns, and measures the three-dimensional shape of the object by converting the measured time into a distance. The light-section sensor measures the three-dimensional shape of an object by the light-section method.
The three-dimensional point cloud data generated by the three-dimensional point cloud measuring device 2 is acquired by the line-of-sight information generation device 1. The line-of-sight information generation device 1 can acquire the generated three-dimensional point cloud data from the three-dimensional point cloud measuring device 2 by wireless or wired communication. The line-of-sight information generation device 1 can also acquire the data from a recording medium on which the three-dimensional point cloud data generated by the three-dimensional point cloud measuring device 2 is recorded.
Based on the three-dimensional point cloud data acquired from the three-dimensional point cloud measuring device 2, the line-of-sight information generation device 1 generates line-of-sight information, which is information indicating the line-of-sight status of target equipment from a viewpoint position away from the target equipment in the peripheral region of the moving path 3, and outputs the generated line-of-sight information. The target equipment is, for example, a special signal light emitter, but may be wayside equipment other than a special signal light emitter, such as a signal or a railroad crossing.
The line-of-sight information generated by the line-of-sight information generation device 1 includes, for example, information on the three-dimensional points located in a preset region within the field of view from the viewpoint position, among the plurality of three-dimensional points constituting the three-dimensional point cloud represented by the three-dimensional point cloud data. With an image captured by an imaging device, it can be difficult to judge visibility because the image may be unclear depending on light or weather; since the line-of-sight information generation device 1 uses three-dimensional point cloud data obtained by three-dimensional measurement, it can present the line-of-sight status of the target equipment from the moving body 4 more accurately than when a captured image is used.
Hereinafter, the line-of-sight information generation device 1 will be described more specifically. FIG. 2 is a diagram showing an example of the configuration of the line-of-sight information generation device according to the first embodiment. As shown in FIG. 2, the line-of-sight information generation device 1 includes a communication unit 10, a storage unit 11, and a processing unit 12.
The communication unit 10 is communicably connected to a network (not shown) and transmits and receives information to and from external devices such as the three-dimensional point cloud measuring device 2 via the network. The network is, for example, a WAN (Wide Area Network) such as the Internet, or a LAN (Local Area Network).
The storage unit 11 stores the three-dimensional point cloud data 20 transmitted from the three-dimensional point cloud measuring device 2 and received by the communication unit 10, and the reference line data 21, which is data on a reference line along the moving path 3. The reference line data 21 may be generated by the processing unit 12 through analysis of the three-dimensional point cloud data 20 and stored in the storage unit 11, or may be transmitted from an external device, received by the communication unit 10, and stored in the storage unit 11 by the processing unit 12. For example, the processing unit 12 can detect the moving path 3 from the three-dimensional point cloud represented by the three-dimensional point cloud data 20 and use a line along the detected moving path 3 as the reference line.
FIG. 3 is a diagram showing an example of the three-dimensional point cloud data according to the first embodiment. As shown in FIG. 3, the three-dimensional point cloud data 20 includes data on a plurality of three-dimensional points. The data on each three-dimensional point includes data indicating the coordinates of the point in a three-dimensional Cartesian coordinate system. The three-dimensional point cloud data 20 shown in FIG. 3 includes, as three-dimensional point data, data with coordinates "X1, Y1, Z1", data with coordinates "X2, Y2, Z2", data with coordinates "X3, Y3, Z3", and so on.
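By way of illustration only, the point data of FIG. 3 could be held as in the following minimal Python sketch; the class and field names are assumptions for illustration and are not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    """One entry of the three-dimensional point cloud data 20 (FIG. 3):
    the coordinates of a point in a three-dimensional Cartesian system."""
    x: float
    y: float
    z: float

# The point cloud data 20 is then simply a list of such points,
# e.g. the entries (X1, Y1, Z1), (X2, Y2, Z2), (X3, Y3, Z3), ...
point_cloud_20 = [Point3D(1.0, 2.0, 0.5), Point3D(1.1, 2.0, 0.6)]
print(len(point_cloud_20))  # -> 2
```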
FIG. 4 is a diagram showing an example of the reference line data according to the first embodiment. As shown in FIG. 4, the reference line data 21 includes data on a plurality of reference points. A reference point is a three-dimensional point on the reference line. The data on each reference point includes data indicating the coordinates, the roll value, and the order of the reference point.
The coordinates of a reference point are its coordinates in the three-dimensional Cartesian coordinate system. The roll value is data indicating the inclination of the moving path 3; it can also be said to be data indicating the inclination of the moving body 4 when the moving body 4 is on that reference point of the moving path 3. The order is a value indicating the moving direction of the moving body 4: the direction from a reference point with a smaller order value toward a reference point with a larger order value is the moving direction of the moving body 4.
The reference line data 21 shown in FIG. 4 includes, as reference point data, data with coordinates "BX1, BY1, BZ1", roll value "R1", and order "1"; data with coordinates "BX2, BY2, BZ2", roll value "R2", and order "2"; data with coordinates "BX3, BY3, BZ3", roll value "R3", and order "3"; and so on.
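For illustration, the reference point entries of FIG. 4 could be sketched in Python as follows; again the names are hypothetical, and sorting by order recovers the reference line Lr as a polyline.

```python
from dataclasses import dataclass

@dataclass
class ReferencePoint:
    """One entry of the reference line data 21 (FIG. 4): the coordinates of a
    reference point on the reference line Lr, the roll value indicating the
    inclination of the moving path 3, and the order along the moving direction."""
    x: float
    y: float
    z: float
    roll: float   # roll value Rn
    order: int    # order n; increasing order = moving direction of the moving body 4

# Reference points sorted by order approximate the reference line Lr as a polyline.
reference_line_21 = sorted(
    [ReferencePoint(0.0, 1.0, 0.0, 0.02, 2), ReferencePoint(0.0, 0.0, 0.0, 0.01, 1)],
    key=lambda p: p.order)
print([p.order for p in reference_line_21])  # -> [1, 2]
```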
The processing unit 12 shown in FIG. 2 includes the data acquisition unit 30, the line-of-sight information generation unit 31, and the output processing unit 32. The data acquisition unit 30 acquires from the communication unit 10 the three-dimensional point cloud data 20 transmitted from the three-dimensional point cloud measuring device 2 and received by the communication unit 10, and stores the acquired data in the storage unit 11. When the line-of-sight information generation timing arrives, the data acquisition unit 30 acquires the three-dimensional point cloud data 20 and the reference line data 21 from the storage unit 11 and notifies the line-of-sight information generation unit 31 of the acquired data.
Based on the three-dimensional point cloud data 20 acquired by the data acquisition unit 30, the line-of-sight information generation unit 31 generates line-of-sight information, which is information indicating the line-of-sight status of the target equipment from a viewpoint position away from the target equipment in the peripheral region of the moving path 3. The line-of-sight information generation unit 31 includes a viewpoint position determination unit 40, a determination area determination unit 41, and a generation processing unit 42.
The viewpoint position determination unit 40 determines the viewpoint position based on the distance from the target equipment and the inclination of the moving path 3. For example, the viewpoint position determination unit 40 determines the viewpoint position based on the distance from the target equipment along the reference line indicated by the reference line data 21. The viewpoint position is, for example, the standard eye position of the driver of the moving body 4.
FIG. 5 is a diagram showing an example of a plurality of reference points on the reference line indicated by the reference line data according to the first embodiment. As shown in FIG. 5, the reference points on the reference line Lr indicated by the reference line data 21 are arranged at the center between the pair of rails forming the moving path 3 of the moving body 4.
FIG. 6 is a diagram showing an example of the process by which the viewpoint position determination unit according to the first embodiment determines a position a designated distance before the target equipment position. In FIG. 6, "TP" is the position of the target equipment, hereinafter referred to as the target equipment position TP. The target equipment position TP is represented by coordinates "TX, TY, TZ" in the three-dimensional Cartesian coordinate system.
The viewpoint position determination unit 40 performs a first calculation process of calculating a position BT on the reference line Lr corresponding to the target equipment position TP, a second calculation process of calculating a position BE on the reference line Lr that is separated from the position BT by a designated distance Dt (described later) in terms of distance along the reference line Lr, and a third calculation process of calculating a viewpoint position EP (described later).
First, the first calculation process will be described. The viewpoint position determination unit 40 selects the reference point closest to the target equipment position TP and the next closest reference point as selected reference points. Then, the viewpoint position determination unit 40 calculates, as the position BT, the intersection of the straight line connecting the two selected reference points with the plane that is orthogonal to that line and contains the target equipment position TP.
In the example shown in FIG. 6, the reference point closest to the target equipment position TP is the reference point P23, and the next closest is the reference point P24. The viewpoint position determination unit 40 calculates, as the position BT, the intersection of the straight line connecting the reference points P23 and P24 with the plane that is orthogonal to that line and contains the target equipment position TP.
Next, the second calculation process will be described. The viewpoint position determination unit 40 calculates the point on the reference line Lr at which the length of the polyline passing through the reference points, traced from the position BT in order of decreasing order value, reaches the designated distance Dt, and determines the position of the calculated point as the position BE. As a result, the position that is the designated distance Dt before the position BT along the reference line is determined as the position BE.
In the example shown in FIG. 6, the viewpoint position determination unit 40 calculates the distance d1 between the position BT and the reference point P23. The viewpoint position determination unit 40 calculates the distance d2 between the reference points P23 and P22 and adds it to the distance d1 to obtain the integrated distance D. It then calculates the distance d3 between the reference points P22 and P21 and adds it to the integrated distance D to obtain a new integrated distance D. The viewpoint position determination unit 40 repeats this process.
The viewpoint position determination unit 40 identifies the two reference points at which the integrated distance D becomes equal to or greater than the designated distance Dt, and calculates the difference Δd between the designated distance Dt and the integrated distance D up to the reference point with the larger order value of the two. The viewpoint position determination unit 40 then determines, as the position BE, the position on the straight line connecting the two identified reference points whose distance from the reference point with the larger order value is Δd.
In the example shown in FIG. 6, the integrated distance D becomes equal to or greater than the designated distance Dt when the distances d1 through d19 are accumulated, so the viewpoint position determination unit 40 determines, as the position BE, the position on the straight line connecting the two reference points P5 and P6 whose distance from the reference point P6, which has the larger order value, is Δd. The designated distance Dt is stored in advance in the storage unit 11, but it may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
Next, the third calculation process will be described. The viewpoint position determination unit 40 selects the reference point closest to the position BE and the next closest reference point. Next, the viewpoint position determination unit 40 determines the roll value of the position BE from the roll values of the two selected reference points. The roll value of a reference point is the data included in the reference line data 21 described above.
For example, the viewpoint position determination unit 40 determines the roll value of the reference point with the larger order value of the two as the roll value of the position BE. Alternatively, the viewpoint position determination unit 40 may determine the roll value of the reference point with the smaller order value, or the average of the roll values of the two reference points, as the roll value of the position BE.
Next, the viewpoint position determination unit 40 calculates the plane that is orthogonal to the straight line connecting the two selected reference points and contains the position BE, and calculates, as a first straight line, the straight line on that plane that contains the position BE and has the inclination given by the roll value of the position BE. Then, the viewpoint position determination unit 40 calculates a first position separated from the position BE by a distance L on the first straight line, and a second position separated by a distance H on a second straight line that is orthogonal to the first straight line and contains the first position. The viewpoint position determination unit 40 determines the calculated second position as the viewpoint position EP. The viewpoint position EP is represented by coordinates "EX, EY, EZ" in the three-dimensional Cartesian coordinate system.
FIG. 7 is a diagram showing an example of the viewpoint position determined by the viewpoint position determination unit according to the first embodiment when the inclination of the first straight line is small. FIG. 8 is a diagram showing an example of the viewpoint position determined by the viewpoint position determination unit according to the first embodiment when the inclination of the first straight line is large.
In the examples shown in FIGS. 7 and 8, the viewpoint position determination unit 40 calculates the first position PB1 separated from the position BE by the distance L on the first straight line SL1, which contains the position BE and has the inclination given by the roll value of the position BE, and calculates the second position PB2 separated by the distance H on the second straight line SL2, which is orthogonal to the first straight line SL1 and contains the first position PB1. The second straight line SL2 is a straight line on a plane that is orthogonal to the first straight line SL1 and contains the first position PB1.
The viewpoint position determination unit 40 determines the calculated second position PB2 as the viewpoint position EP. The distances L and H are stored in advance in the storage unit 11, but they may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
The reference line data 21 is not limited to the example shown in FIG. 4 and may be data that does not include roll values of the reference points. When the moving body 4 is a train, the moving body 4 travels on a pair of rails serving as the moving path 3. In this case, the reference line data 21 is data on a pair of reference lines Lr corresponding to the pair of rails, and includes the coordinates and order data of a plurality of reference points on each reference line Lr.
The viewpoint position determination unit 40 determines the position BE from the reference points on one of the pair of reference lines Lr, and selects the reference point closest to the position BE and the next closest reference point as selected reference points. The viewpoint position determination unit 40 then identifies, as the position OBP, the position at which a straight line orthogonal to the line connecting the two selected reference points intersects one of the straight lines connecting adjacent pairs of reference points on the other reference line Lr. The viewpoint position determination unit 40 determines the straight line connecting the position BE and the position OBP as the first straight line SL1 described above.
FIG. 9 is a diagram showing another example of the process by which the viewpoint position determination unit according to the first embodiment determines a position a designated distance before the target equipment position. FIG. 10 is a diagram showing another example of the viewpoint position determination method by the viewpoint position determination unit according to the first embodiment.
In the example shown in FIG. 9, the viewpoint position determination unit 40 selects the reference point P5 closest to the position BE and the next closest reference point P6. The viewpoint position determination unit 40 identifies, as the position OBP, the position at which the straight line orthogonal to the line connecting the two reference points P5 and P6 intersects the straight line connecting the two adjacent reference points P5' and P6' on the other reference line Lr. The viewpoint position determination unit 40 determines the straight line connecting the position BE and the position OBP as the first straight line SL1 described above.
Then, as shown in FIG. 10, the viewpoint position determination unit 40 calculates the first position PB1 separated from the position BE by the distance L on the first straight line SL1, and the second position PB2 separated by the distance H on the second straight line SL2, which is orthogonal to the first straight line SL1 and contains the first position PB1. The viewpoint position determination unit 40 determines the calculated second position PB2 as the viewpoint position EP.
Next, the determination area determination unit 41 shown in FIG. 2 will be described. The determination area determination unit 41 determines a preset area around the straight line connecting the target equipment position TP and the viewpoint position EP as the determination target area. FIG. 11 is a diagram showing an example of the determination target area determined by the determination area determination unit according to the first embodiment.
As shown in FIG. 11, the determination area determination unit 41 determines a preset area around the straight line SL3 connecting the target equipment position TP and the viewpoint position EP as the determination target area AR. The determination target area AR shown in FIG. 11 is a cylindrical area of radius r whose center line is the straight line SL3 connecting the target equipment position TP and the viewpoint position EP.
The determination target area AR determined by the determination area determination unit 41 is not limited to the example shown in FIG. 11. For example, the determination area determination unit 41 may determine a polygonal prism area or an elliptic cylinder area as the determination target area AR instead of a cylindrical area.
FIG. 12 is a diagram showing another example of the method of determining the determination target area by the determination area determination unit according to the first embodiment. In the example shown in FIG. 12, the determination target area AR determined by the determination area determination unit 41 is a rectangular prism area whose center line is the straight line SL3 connecting the target equipment position TP and the viewpoint position EP, and is the area enclosed by two surfaces AS1 and AS2 parallel to the first straight line SL1 and two surfaces AS3 and AS4 parallel to the second straight line SL2.
FIG. 13 is a diagram showing an example of the relationship between the three-dimensional point cloud represented by the three-dimensional point cloud data according to the first embodiment and the determination target area, and FIG. 14 is a diagram showing an example of the determination target area as seen from the traveling direction of the moving body according to the first embodiment.
The example shown in FIG. 13 shows the three-dimensional point cloud and the determination target area AR as viewed from above. The example shown in FIG. 14 shows the determination target area AR as viewed from near the viewpoint position EP; in FIG. 14, the determination target area AR is the area enclosed by the broken line. In this way, the line-of-sight information generation device 1 determines, as the determination target area AR, a preset area around the straight line connecting the target equipment position TP and the viewpoint position EP.
Next, the generation processing unit 42 shown in FIG. 2 will be described. The generation processing unit 42 generates the line-of-sight information based on the three-dimensional point cloud data 20 acquired by the data acquisition unit 30 and the determination target area AR determined by the determination area determination unit 41. For example, when some of the plurality of three-dimensional points constituting the three-dimensional point cloud represented by the three-dimensional point cloud data 20 are included in the determination target area AR, the line-of-sight information includes information corresponding to those three-dimensional points.
As shown in FIG. 2, the generation processing unit 42 includes the target area image generation unit 50, the line-of-sight determination unit 51, and the interfering object image generation unit 52. The target area image generation unit 50 generates, as the line-of-sight information, information including a two-dimensional image of the determination target area AR obtained by projecting the three-dimensional points of the point cloud represented by the three-dimensional point cloud data 20 that are included in the determination target area AR onto a reference plane.
FIG. 15 is a diagram showing an example of interfering objects existing in the determination target area according to the first embodiment. In the example shown in FIG. 15, the determination target area AR partially contains an interfering object I1 and an interfering object I2. The interfering object I1 is an overhead wire pillar, and the interfering object I2 is a tree. Therefore, in the example shown in FIG. 15, the determination target area AR contains part of the three-dimensional point cloud representing the interfering object I1 and part of the three-dimensional point cloud representing the interfering object I2.
The target area image generation unit 50 projects the three-dimensional points of the point cloud represented by the three-dimensional point cloud data 20 that are included in the determination target area AR, each with a size of PL1 [mm²], onto the plane perpendicular to the straight line SL3 shown in FIG. 11 to generate a two-dimensional image of the determination target area AR. The target area image generation unit 50 also projects the same three-dimensional points, each with a size of PL1 [mm²], onto a horizontal plane to generate another two-dimensional image of the determination target area AR. PL1 is stored in advance in the storage unit 11, but may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
FIG. 16 is a diagram showing an example of the first two-dimensional image of the determination target area generated by the target area image generation unit according to the first embodiment. The first two-dimensional image 60 shown in FIG. 16 is a two-dimensional image of the determination target area AR obtained by projecting the three-dimensional points included in the determination target area AR onto the plane perpendicular to the straight line SL3 shown in FIG. 11.
The first two-dimensional image 60 contains the three-dimensional points that are included in the determination target area AR and projected as two-dimensional points on the reference plane, the viewpoint position EP, a circular frame of radius r1 centered on the viewpoint position EP, a circular frame of radius r2 centered on the viewpoint position EP, and a circular frame of radius r centered on the viewpoint position EP. The circular frame of radius r indicates the outer edge of the determination target area AR. The radius r1 is shorter than the radius r and the radius r2, and the radius r2 is shorter than the radius r. The radius r1 is, for example, one third of the radius r, and the radius r2 is, for example, two thirds of the radius r.
The first two-dimensional image 60 is an image showing, when the crew of the moving body 4 looks at the target equipment with the moving body 4 located the designated distance Dt before it, how far from the position of the target equipment interfering objects are located. As shown in FIG. 16, in the first two-dimensional image 60, the three-dimensional points included in the determination target area AR are arranged as two-dimensional points within the circle of radius r centered on the viewpoint position EP, so the first two-dimensional image 60 can accurately represent the line-of-sight status of the target equipment from the viewpoint position EP.
The first two-dimensional image 60 described above is an example for the case where the determination target area AR is a cylindrical area, but when the determination target area AR is a rectangular prism area, the target area image generation unit 50 similarly generates a first two-dimensional image 60. FIG. 17 is a diagram showing another example of the first two-dimensional image of the determination target area generated by the target area image generation unit according to the first embodiment.
The first two-dimensional image 60 shown in FIG. 17 contains the three-dimensional points that are included in the rectangular prism determination target area AR and projected as two-dimensional points on the reference plane, the viewpoint position EP, and rectangular frames of different sizes centered on the viewpoint position EP. In the first two-dimensional image 60 shown in FIG. 17, the central area of the determination target area AR can be set near the line of sight of the driver's eyes, so the line-of-sight status can be represented accurately in a situation closer to reality.
FIG. 18 is a diagram showing an example of the second two-dimensional image of the determination target area generated by the target area image generation unit according to the first embodiment. The second two-dimensional image 61 shown in FIG. 18 is a two-dimensional image obtained by projecting the three-dimensional points of the point cloud represented by the three-dimensional point cloud data 20 that are included in the determination target area AR onto a horizontal plane.
The second two-dimensional image 61 contains the three-dimensional points that are included in the determination target area AR and projected as two-dimensional points on the reference plane, a rectangular frame, the viewpoint position EP, the target equipment position TP, and the designated distance Dt, which is the distance between the viewpoint position EP and the target equipment position TP. The second two-dimensional image 61 is a two-dimensional image of the determination target area AR viewed from above, and can show at what distance from the viewpoint position EP interfering objects are present.
The target area image generation unit 50 can also generate, as the first two-dimensional image 60 and the second two-dimensional image 61, two-dimensional images obtained by projecting the three-dimensional points included in the determination target area AR onto the reference planes with the points color-coded according to their distance from the straight line SL3. FIG. 19 is a diagram showing an example of the first two-dimensional image of the determination target area in which the three-dimensional points included in the determination target area are color-coded according to their positions by the target area image generation unit according to the first embodiment.
Like the first two-dimensional image 60 shown in FIG. 16, the first two-dimensional image 60 shown in FIG. 19 contains the three-dimensional points that are included in the determination target area AR and projected as two-dimensional points on the reference plane, the viewpoint position EP, a circle of radius r1, a circle of radius r2, and a circle of radius r. In the first two-dimensional image 60 shown in FIG. 19, the three-dimensional points included in the determination target area AR are color-coded according to their distance from the straight line SL3, which contains the viewpoint position EP. In FIG. 19, the differences in color are represented by differences in the fill pattern of the circles representing the three-dimensional points. In the example shown in FIG. 19, the three-dimensional points in the region inside the circle of radius r1, those outside the circle of radius r1 but inside the circle of radius r2, and those outside the circle of radius r2 but inside the circle of radius r are each represented in a different color.
FIG. 20 is a diagram showing an example of the second two-dimensional image of the determination target area in which the three-dimensional points included in the determination target area are color-coded according to their positions by the target area image generation unit according to the first embodiment. Like the second two-dimensional image 61 shown in FIG. 18, the second two-dimensional image 61 shown in FIG. 20 includes the three-dimensional points that are included in the determination target area AR and projected onto the reference plane as two-dimensional points, a rectangular frame, the viewpoint position EP, the target equipment position TP, and the designated distance Dt.
In the second two-dimensional image 61 shown in FIG. 20, the three-dimensional points that are included in the determination target area AR and projected onto the reference plane as two-dimensional points, like those in the second two-dimensional image 61 shown in FIG. 18, are color-coded according to their distance from the straight line SL3 passing through the viewpoint position EP.
Although not shown, when the determination target area AR is a polygonal columnar region, the target area image generation unit 50 likewise generates the first two-dimensional image 60 and the second two-dimensional image 61 of the determination target area AR, in which the three-dimensional points that are included in the determination target area AR and projected onto the reference plane as two-dimensional points are color-coded according to their distance from the straight line SL3.
Next, the line-of-sight determination unit 51 of the generation processing unit 42 will be described. The line-of-sight determination unit 51 determines the degree of visibility based on the distribution of the three-dimensional points included in the determination target area AR. The degree of visibility indicates how well the target equipment can be seen from the viewpoint position EP. Information indicating the degree of visibility determined by the line-of-sight determination unit 51 is added to the line-of-sight information by the target area image generation unit 50.
The line-of-sight determination unit 51, for example, projects the three-dimensional points of the three-dimensional point cloud represented by the three-dimensional point cloud data 20 that are included in the determination target area AR onto a horizontal plane, each with a footprint of PL2 [mm²], to generate a two-dimensional image of the determination target area AR. PL2 is stored in advance in the storage unit 11, but it may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
The line-of-sight determination unit 51 calculates the sum of the areas of the three-dimensional points included in the two-dimensional image of the determination target area AR as a total area, and calculates the area ratio Sr, which is the ratio of the total area to the area of the determination target area AR. The line-of-sight determination unit 51 then judges the visibility status according to the magnitude of the area ratio Sr. For example, the line-of-sight determination unit 51 judges that visibility is good when 0 ≤ Sr < Sth1, that there is some interference but the target can still be seen when Sth1 ≤ Sr < Sth2, and that visibility is poor when Sth2 ≤ Sr, where Sth1 and Sth2 are threshold values.
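Expressed as code, the threshold test is straightforward. A sketch assuming Sr has already been computed from the projected image; the label strings are illustrative, while the inequalities follow the text above.

```python
def classify_visibility(sr: float, sth1: float, sth2: float) -> str:
    """Map the area ratio Sr to a visibility status (0 < Sth1 < Sth2)."""
    if sr < sth1:
        return "good visibility"                  # 0 <= Sr < Sth1
    if sr < sth2:
        return "visible with some interference"   # Sth1 <= Sr < Sth2
    return "poor visibility"                      # Sth2 <= Sr
```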
The line-of-sight determination unit 51 can also weight the determination according to the distance from the straight line SL3 connecting the target equipment position TP and the viewpoint position EP. For example, the region within the radius r1 of the straight line SL3 is taken as a first region, the region within the radius r2 of the straight line SL3 excluding the first region is taken as a second region, and the region within the radius r of the straight line SL3 excluding the first and second regions is taken as a third region. Hereinafter, for convenience of explanation, the first region is denoted as first region AR1, the second region as second region AR2, and the third region as third region AR3.
The line-of-sight determination unit 51 calculates a first area value by multiplying the total area of the three-dimensional points included in the first region AR1 of the two-dimensional image of the determination target area AR by a coefficient k1, and calculates a second area value by multiplying the total area of the three-dimensional points included in the second region AR2 by a coefficient k2. The line-of-sight determination unit 51 also calculates a third area value by multiplying the total area of the three-dimensional points included in the third region AR3 by a coefficient k3.
The line-of-sight determination unit 51 can treat the value obtained by summing the first, second, and third area values as the area ratio Sr described above. The coefficients k1, k2, and k3 satisfy k1 > k2 > k3 and are stored in advance in the storage unit 11, but they may instead be received by the communication unit 10, acquired from the communication unit 10 by the data acquisition unit 30, and stored in the storage unit 11.
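A sketch of the weighted computation, assuming the per-point distances from SL3 are available (for example from band_by_distance_from_sl3 above) and that each projected point has footprint PL2; the coefficient values are placeholders, since the specification only requires k1 > k2 > k3.

```python
import numpy as np

def weighted_area_ratio(dist, pl2, region_area, r1, r2,
                        k1=3.0, k2=2.0, k3=1.0):
    """Weighted area ratio Sr over the regions AR1, AR2, and AR3.

    dist: (N,) perpendicular distances of the points from SL3;
    pl2: footprint of one projected point; region_area: area of the
    projected determination target area AR (same units as pl2).
    """
    band = np.digitize(np.asarray(dist, float), [r1, r2])
    weights = np.array([k1, k2, k3])[band]   # k1: AR1, k2: AR2, k3: AR3
    return float(weights.sum() * pl2 / region_area)
```

Note that this sketch sums the weighted footprints directly; the image-based computation described in the text would additionally merge overlapping points.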
Alternatively, the line-of-sight determination unit 51 may, for example, project the three-dimensional points included in the first region AR1 onto the horizontal plane with a footprint of PL2 × k4 [mm²], the three-dimensional points included in the second region AR2 with a footprint of PL2 × k5 [mm²], and the three-dimensional points included in the third region AR3 with a footprint of PL2 × k6 [mm²], where k4, k5, and k6 are coefficients satisfying k4 > k5 > k6.
The line-of-sight determination unit 51 then calculates the sum of the areas of the three-dimensional points included in the determination target area AR as a total area, and calculates the area ratio Sr, which is the ratio of the total area to the area of the determination target area AR. The line-of-sight determination unit 51 judges the visibility status according to the magnitude of the area ratio Sr in the same manner as the determination method described above.
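The same idea can be written with per-band footprints instead of per-band weights. A short sketch (coefficient values are again placeholders; only k4 > k5 > k6 is required, and overlap of projected points is ignored here):

```python
import numpy as np

def footprint_weighted_ratio(dist, pl2, region_area, r1, r2,
                             k4=3.0, k5=2.0, k6=1.0):
    """Area ratio Sr with footprints PL2*k4, PL2*k5, and PL2*k6 for the
    points in AR1, AR2, and AR3, respectively."""
    band = np.digitize(np.asarray(dist, float), [r1, r2])
    footprints = pl2 * np.array([k4, k5, k6])[band]
    return float(footprints.sum() / region_area)
```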
By weighting points more heavily the closer they are to the straight line SL3, the line-of-sight determination unit 51 treats interfering objects near the line of sight of a crew member looking at the target equipment as factors that degrade the visibility status more strongly, and can therefore make a more accurate determination.
In the example described above, the line-of-sight determination unit 51 divides the determination target area AR projected onto the horizontal plane into three regions to judge the visibility status, but it may instead divide the projected area into two regions, or into four or more regions.
Next, the interfering object image generation unit 52 of the generation processing unit 42 will be described. The interfering object image generation unit 52 generates, as an interfering object image, an image of the three-dimensional point cloud represented by the three-dimensional point cloud data 20 in which the three-dimensional points included in the determination target area AR are colored. The interfering object image generated by the interfering object image generation unit 52 is added to the line-of-sight information by the target area image generation unit 50.
The interfering object image generation unit 52 colors the three-dimensional points in the same manner as the target area image generation unit 50: the three-dimensional points included in the determination target area AR are color-coded according to their distance from the straight line SL3.
For example, the interfering object image generation unit 52 assigns different colors to the three-dimensional points in the region inside the circle of radius r1, those outside the circle of radius r1 but inside the circle of radius r2, and those outside the circle of radius r2 but inside the circle of radius r.
FIG. 21 is a diagram showing an example of an interfering object image generated by the interfering object image generation unit according to the first embodiment, and FIG. 22 is a diagram showing another example of an interfering object image generated by the interfering object image generation unit according to the first embodiment. As shown in FIGS. 21 and 22, the interfering object image generation unit 52 generates, as the interfering object images 70 and 71, images of the three-dimensional point cloud including the three-dimensional points of an interfering object, viewed from a position close to the interfering object.
The interfering object image 70 shown in FIG. 21 includes an image of the three-dimensional point cloud representing the interfering object I1, and among the three-dimensional points constituting that point cloud, those included in the determination target area AR are colored. The interfering object image 70 can thus show in what manner the interfering object I1 intrudes into the determination target area AR.
Likewise, the interfering object image 71 shown in FIG. 22 includes an image of the three-dimensional point cloud representing the interfering object I2, and among the three-dimensional points constituting that point cloud, those included in the determination target area AR are colored. The interfering object image 71 can thus show in what manner the interfering object I2 intrudes into the determination target area AR.
In the examples shown in FIGS. 21 and 22, for convenience of explanation, the three-dimensional points included in the determination target area AR are drawn enlarged, and those points are colored. In FIGS. 21 and 22, differences in color are represented by differences in the fill pattern of the circles that depict the three-dimensional points.
The output processing unit 32 shown in FIG. 2 outputs the line-of-sight information generated by the line-of-sight information generation unit 31. The output processing unit 32 outputs the line-of-sight information by, for example, causing the communication unit 10 to transmit the line-of-sight information generated by the line-of-sight information generation unit 31 to an external device via a network (not shown). The output processing unit 32 can also output the line-of-sight information by, for example, displaying it on the display device 5.
The line-of-sight information generated by the line-of-sight information generation unit 31 includes at least one of the first two-dimensional image 60 and the second two-dimensional image 61 generated by the target area image generation unit 50, the visibility status determined by the line-of-sight determination unit 51, and the interfering object images 70 and 71 generated by the interfering object image generation unit 52.
For example, the output processing unit 32 can output line-of-sight information that includes the first two-dimensional image 60 and the second two-dimensional image 61 generated by the target area image generation unit 50, the visibility status determined by the line-of-sight determination unit 51, and the interfering object images 70 and 71 generated by the interfering object image generation unit 52.
FIG. 23 is a diagram showing an example of the line-of-sight information according to the first embodiment. The line-of-sight information 80 shown in FIG. 23 includes the first two-dimensional image 60, the second two-dimensional image 61, the interfering object image 70, the interfering object image 71, a confirmation point 82, and a visibility status determination result 83. The confirmation point 82 is the designated distance Dt and indicates the distance along the movement path 3 between the viewpoint position EP and the target equipment position TP. The visibility status determination result 83 is information indicating the degree of visibility determined by the line-of-sight determination unit 51.
In the line-of-sight information 80, position information indicating the positions of those three-dimensional points of the interfering object I1 that are included in the determination target area AR is placed at a position associated with the interfering object image 70. Similarly, position information indicating the positions of those three-dimensional points of the interfering object I2 that are included in the determination target area AR is placed at a position associated with the interfering object image 71. This position information is detected by the line-of-sight information generation unit 31.
Because the line-of-sight information 80 thus includes the first two-dimensional image 60, the second two-dimensional image 61, the interfering object images 70 and 71, and the visibility status determination result 83, a worker who checks the line-of-sight information 80 can grasp the visibility status accurately.
Next, the processing performed by the processing unit 12 of the line-of-sight information generation device 1 will be described with reference to flowcharts. FIG. 24 is a flowchart showing an example of the processing performed by the processing unit of the line-of-sight information generation device according to the first embodiment.
As shown in FIG. 24, the processing unit 12 of the line-of-sight information generation device 1 acquires the three-dimensional point cloud data 20 from the storage unit 11 or the communication unit 10 (step S10). Next, the processing unit 12 performs a viewpoint position determination process for determining the viewpoint position EP (step S11). The process of step S11 corresponds to steps S20 to S22 shown in FIG. 25 and is described in detail later.
Next, the processing unit 12 determines the determination target area based on the viewpoint position EP determined in step S11 (step S12). The processing unit 12 then performs a line-of-sight information generation process for generating the line-of-sight information 80 based on the three-dimensional point cloud data 20 acquired in step S10 and the determination target area determined in step S12 (step S13). The process of step S13 corresponds to steps S30 to S34 shown in FIG. 26 and is described in detail later. Next, the processing unit 12 outputs the line-of-sight information generated in step S13 (step S14) and ends the processing shown in FIG. 24.
FIG. 25 is a flowchart showing an example of the viewpoint position determination process performed by the processing unit of the line-of-sight information generation device according to the first embodiment. As shown in FIG. 25, the processing unit 12 calculates the position BT on the reference line Lr corresponding to the target equipment position TP (step S20).
Next, the processing unit 12 calculates the position BE on the reference line Lr that is the designated distance Dt away from the position BT along the reference line Lr (step S21), calculates the viewpoint position EP based on the calculated position BE (step S22), and ends the processing shown in FIG. 25.
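The specification leaves the geometry of step S21 to the reference line data; one common way to realize it is to walk the polyline of Lr away from BT until the accumulated length reaches Dt. A sketch under that assumption (all names are illustrative):

```python
import numpy as np

def point_along_reference_line(vertices, start, dt):
    """Walk the reference line Lr from `start` (the position BT) through
    the polyline `vertices`, ordered away from the target equipment, and
    return the position BE that lies `dt` along the line.

    vertices: (N, 3) array of polyline vertices; dt: designated distance.
    """
    pos = np.asarray(start, float)
    remaining = float(dt)
    for vertex in np.asarray(vertices, float):
        seg = vertex - pos
        seg_len = float(np.linalg.norm(seg))
        if 0.0 < remaining <= seg_len:
            return pos + seg * (remaining / seg_len)
        remaining -= seg_len
        pos = vertex
    return pos  # Dt exceeds the available line; clamp to the last vertex
```

The viewpoint position EP of step S22 is then obtained by offsetting BE, for example vertically; the exact construction is given earlier in the specification.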
FIG. 26 is a flowchart showing an example of the line-of-sight information generation process performed by the processing unit of the line-of-sight information generation device according to the first embodiment. As shown in FIG. 26, the processing unit 12 projects the three-dimensional points included in the determination target area AR onto the reference plane (step S30). The processing unit 12 then color-codes each three-dimensional point projected onto the reference plane according to its distance from the straight line SL3 to generate a two-dimensional image of the determination target area AR (step S31).
Next, the processing unit 12 determines the degree of visibility based on the distribution of the three-dimensional points included in the determination target area AR (step S32). The processing unit 12 also generates images of the three-dimensional point cloud, with the three-dimensional points included in the determination target area AR colored, as the interfering object images 70 and 71 (step S33). The processing unit 12 then outputs the line-of-sight information 80 (step S34) and ends the processing of FIG. 26.
FIG. 27 is a diagram showing an example of the hardware configuration of the line-of-sight information generation device according to the first embodiment. As shown in FIG. 27, the line-of-sight information generation device 1 includes a computer comprising a processor 101, a memory 102, and a communication device 103.
The processor 101, the memory 102, and the communication device 103 can exchange information with one another via, for example, a bus 104. The storage unit 11 is realized by the memory 102, and the communication unit 10 is realized by the communication device 103. The processor 101 implements the functions of the data acquisition unit 30, the line-of-sight information generation unit 31, the output processing unit 32, and the like by reading and executing a program stored in the memory 102. The processor 101 is an example of a processing circuit and includes one or more of a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a system LSI (Large Scale Integration).
The memory 102 includes one or more of a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), and an EEPROM (registered trademark) (Electrically Erasable Programmable Read Only Memory). The memory 102 also includes a recording medium on which a computer-readable program is recorded. Such a recording medium includes one or more of a nonvolatile or volatile semiconductor memory, a magnetic disk, a flexible memory, an optical disc, a compact disc, and a DVD (Digital Versatile Disc). The line-of-sight information generation device 1 may also include integrated circuits such as an ASIC (Application Specific Integrated Circuit) and an FPGA (Field Programmable Gate Array).
As described above, the line-of-sight information generation device 1 according to the first embodiment includes the data acquisition unit 30, the line-of-sight information generation unit 31, and the output processing unit 32. The data acquisition unit 30 acquires the three-dimensional point cloud data 20, which represents objects in the peripheral area of the movement path 3 along which the moving body 4 moves as a three-dimensional point cloud. The line-of-sight information generation unit 31 generates, based on the three-dimensional point cloud data 20 acquired by the data acquisition unit 30, the line-of-sight information 80, which is information indicating the visibility status of target equipment in the peripheral area from a viewpoint position EP away from the target equipment. The output processing unit 32 outputs the line-of-sight information 80 generated by the line-of-sight information generation unit 31. The line-of-sight information generation device 1 can thereby present, with good accuracy, the visibility status of the target equipment from the moving body 4 moving along the movement path 3.
The data acquisition unit 30 also acquires the reference line data 21, which is data on the reference line Lr along the movement path 3. The line-of-sight information generation unit 31 includes the viewpoint position determination unit 40, which determines the viewpoint position EP based on the distance from the target equipment along the reference line Lr indicated by the reference line data 21. The line-of-sight information generation device 1 can thereby automatically set, for example, a position the designated distance Dt away from the target equipment as the viewpoint position EP, and can accurately present the visibility status of the target equipment from a point a certain distance short of it.
The reference line data 21 includes data indicating the gradient of the movement path 3 or data for calculating that gradient. The viewpoint position determination unit 40 determines the viewpoint position EP based on the distance from the target equipment and the gradient of the movement path 3. The line-of-sight information generation device 1 can thereby accurately present the visibility status of the target equipment from, for example, the viewpoint of a crew member of the moving body 4.
The line-of-sight information generation unit 31 also includes the determination area determination unit 41 and the generation processing unit 42. The determination area determination unit 41 determines a preset region around the straight line SL3 connecting the target equipment position TP and the viewpoint position EP as the determination target area AR. The generation processing unit 42 generates the line-of-sight information 80 based on the three-dimensional point cloud data 20 and the determination target area AR. The line-of-sight information generation device 1 can thereby accurately present the visibility status of the target equipment within, for example, a preset region around the straight line SL3 connecting the target equipment position TP and the viewpoint position EP.
The generation processing unit 42 includes the target area image generation unit 50. The target area image generation unit 50 generates information including a two-dimensional image of the determination target area AR, obtained by projecting onto the reference plane those three-dimensional points, among the plurality of three-dimensional points constituting the three-dimensional point cloud represented by the three-dimensional point cloud data 20, that are included in the determination target area AR. The line-of-sight information generation device 1 can thereby present the visibility status of the target equipment as a two-dimensional image.
The reference plane includes a plane perpendicular to the straight line SL3. The line-of-sight information generation device 1 can thereby present the visibility status of the target equipment from the viewpoint position EP as a two-dimensional image.
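For the first two-dimensional image 60, the points are projected onto a plane perpendicular to SL3. The specification does not fix the in-plane axes; one conventional construction is sketched below (the names and the choice of helper vector are illustrative):

```python
import numpy as np

def project_perpendicular_to_sl3(points, ep, tp):
    """Project 3D points onto the plane through EP perpendicular to the
    line SL3 from EP to TP, returning (N, 2) in-plane coordinates."""
    ep = np.asarray(ep, float)
    axis = np.asarray(tp, float) - ep
    axis /= np.linalg.norm(axis)
    helper = np.array([0.0, 0.0, 1.0])
    if abs(axis @ helper) > 0.9:        # SL3 nearly vertical: switch helper
        helper = np.array([1.0, 0.0, 0.0])
    u = np.cross(axis, helper)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)               # second in-plane unit vector
    rel = np.asarray(points, float) - ep
    return np.stack([rel @ u, rel @ v], axis=1)
```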
The reference plane also includes a horizontal plane. The line-of-sight information generation device 1 can thereby present a two-dimensional image from which the distance from the three-dimensional viewpoint position EP to objects within the determination target area AR can be grasped.
The target area image generation unit 50 generates, as the two-dimensional image of the determination target area AR, a two-dimensional image in which the three-dimensional points included in the determination target area AR are color-coded according to their distance from the straight line SL3. The line-of-sight information generation device 1 can thereby present a two-dimensional image in which the color of each three-dimensional point conveys the visibility status of the target equipment from the viewpoint position EP.
The generation processing unit 42 includes the line-of-sight determination unit 51. The line-of-sight determination unit 51 determines the degree of visibility based on the distribution of the three-dimensional points included in the determination target area AR. The line-of-sight information 80 further includes information indicating the degree of visibility determined by the line-of-sight determination unit 51. The line-of-sight information generation device 1 can thereby present information indicating the degree of visibility.
The generation processing unit 42 includes the interfering object image generation unit 52. The interfering object image generation unit 52 generates images of the three-dimensional point cloud, with the three-dimensional points included in the determination target area AR colored, as the interfering object images 70 and 71. The line-of-sight information 80 further includes the interfering object images 70 and 71 generated by the interfering object image generation unit 52. The line-of-sight information generation device 1 can thereby present the interfering object images 70 and 71, which are images of the three-dimensional point cloud in which the three-dimensional points included in the determination target area AR are colored.
The determination area determination unit 41 determines a cylindrical region centered on the straight line SL3 as the determination target area AR. The line-of-sight information generation device 1 can thereby set the determination target area AR easily and simply.
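Membership in the cylindrical determination target area AR reduces to a distance test against the segment from EP to TP. A sketch of one way this could be implemented, not taken verbatim from the specification:

```python
import numpy as np

def in_cylindrical_region(points, ep, tp, r):
    """Boolean mask of the points inside the cylinder of radius r whose
    axis is the segment from the viewpoint EP to the target TP."""
    ep = np.asarray(ep, float)
    tp = np.asarray(tp, float)
    axis = tp - ep
    length = float(np.linalg.norm(axis))
    axis /= length
    rel = np.asarray(points, float) - ep
    t = rel @ axis                                   # position along SL3
    radial = np.linalg.norm(rel - np.outer(t, axis), axis=1)
    return (t >= 0.0) & (t <= length) & (radial <= r)
```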
The determination area determination unit 41 may also determine a rectangular columnar region centered on the straight line SL3 as the determination target area AR. The line-of-sight information generation device 1 can then place the central part of the determination target area AR near the line of sight of both of the driver's eyes, and can therefore represent the visibility status accurately in a situation closer to reality.
The configurations described in the above embodiment are examples; they can be combined with other known techniques, and parts of them can be omitted or modified without departing from the gist of the disclosure.
1 line-of-sight information generation device, 2 three-dimensional point cloud measurement device, 3 movement path, 4 moving body, 5 display device, 10 communication unit, 11 storage unit, 12 processing unit, 20 three-dimensional point cloud data, 21 reference line data, 30 data acquisition unit, 31 line-of-sight information generation unit, 32 output processing unit, 40 viewpoint position determination unit, 41 determination area determination unit, 42 generation processing unit, 50 target area image generation unit, 51 line-of-sight determination unit, 52 interfering object image generation unit, 60 first two-dimensional image, 61 second two-dimensional image, 70, 71 interfering object images, 80 line-of-sight information, 82 confirmation point, 83 visibility status determination result, 100 information providing system, 101 processor, 102 memory, 103 communication device, 104 bus, AR determination target area, AR1 first region, AR2 second region, AR3 third region, AS1, AS2, AS3, AS4 surfaces, BT position, D cumulative distance, Dt designated distance, EP viewpoint position, H distance, I1, I2 interfering objects, L distance, Lr reference line, P5, P5', P6, P6', P21, P22, P23, P24 reference points, PB1 first position, PB2 second position, SL1 first straight line, SL2 second straight line, Sr area ratio, TP target equipment position, d1, d2, d3, d19 distances, k1, k2, k3, k4, k5, k6 coefficients, r, r1, r2 radii.
Claims (14)
- A line-of-sight information generation device comprising: a data acquisition unit to acquire three-dimensional point cloud data representing, as a three-dimensional point cloud, objects in a peripheral area of a movement path along which a moving body moves; a line-of-sight information generation unit to generate, based on the three-dimensional point cloud data acquired by the data acquisition unit, line-of-sight information, which is information indicating a visibility status of target equipment in the peripheral area from a viewpoint position away from the target equipment; and an output processing unit to output the line-of-sight information generated by the line-of-sight information generation unit.
- The line-of-sight information generation device according to claim 1, wherein the data acquisition unit acquires reference line data, which is data on a reference line along the movement path, and the line-of-sight information generation unit comprises a viewpoint position determination unit to determine the viewpoint position based on a distance from the target equipment along the reference line indicated by the reference line data.
- The line-of-sight information generation device according to claim 2, wherein the reference line data includes data indicating a gradient of the movement path or data for calculating the gradient of the movement path, and the viewpoint position determination unit determines the viewpoint position based on the distance from the target equipment and the gradient of the movement path.
- The line-of-sight information generation device according to any one of claims 1 to 3, wherein the line-of-sight information generation unit comprises: a determination area determination unit to determine a preset region around a straight line connecting a position of the target equipment and the viewpoint position as a determination target area; and a generation processing unit to generate the line-of-sight information based on the three-dimensional point cloud data and the determination target area.
- The line-of-sight information generation device according to claim 4, wherein the generation processing unit comprises a target area image generation unit to generate information including a two-dimensional image of the determination target area obtained by projecting, onto a reference plane, three-dimensional points that are included in the determination target area among a plurality of three-dimensional points constituting the three-dimensional point cloud represented by the three-dimensional point cloud data.
- The line-of-sight information generation device according to claim 5, wherein the reference plane includes a plane perpendicular to the straight line.
- The line-of-sight information generation device according to claim 5 or 6, wherein the reference plane includes a horizontal plane.
- The line-of-sight information generation device according to any one of claims 4 to 7, wherein the generation processing unit generates, as the two-dimensional image of the determination target area, a two-dimensional image in which the three-dimensional points included in the determination target area are color-coded according to their distance from the straight line.
- The line-of-sight information generation device according to any one of claims 4 to 8, wherein the generation processing unit comprises a line-of-sight determination unit to determine a degree of visibility of the target equipment from the viewpoint position based on a distribution of the three-dimensional points included in the determination target area, and the line-of-sight information further includes information indicating the degree of visibility determined by the line-of-sight determination unit.
- The line-of-sight information generation device according to any one of claims 4 to 9, wherein the generation processing unit comprises an interfering object image generation unit to generate, as an interfering object image, an image of the three-dimensional point cloud in which the three-dimensional points included in the determination target area are colored, and the line-of-sight information further includes the interfering object image generated by the interfering object image generation unit.
- The line-of-sight information generation device according to any one of claims 4 to 10, wherein the determination area determination unit determines a cylindrical region centered on the straight line as the determination target area.
- The line-of-sight information generation device according to any one of claims 4 to 11, wherein the determination area determination unit determines a rectangular columnar region centered on the straight line as the determination target area.
- A line-of-sight information generation method executed by a computer, the method comprising: a first step of acquiring three-dimensional point cloud data representing, as a three-dimensional point cloud, objects in a peripheral area of a movement path along which a moving body moves; a second step of generating, based on the three-dimensional point cloud data acquired in the first step, line-of-sight information, which is information indicating a visibility status of target equipment in the peripheral area from a viewpoint position away from the target equipment; and a third step of outputting the line-of-sight information generated in the second step.
- A line-of-sight information generation program causing a computer to execute: a first step of acquiring three-dimensional point cloud data representing, as a three-dimensional point cloud, objects in a peripheral area of a movement path along which a moving body moves; a second step of generating, based on the three-dimensional point cloud data acquired in the first step, line-of-sight information, which is information indicating a visibility status of target equipment in the peripheral area from a viewpoint position away from the target equipment; and a third step of outputting the line-of-sight information generated in the second step.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044080 WO2022113243A1 (en) | 2020-11-26 | 2020-11-26 | Visibility information generation device, visibility information generation method, and visibility information generation program |
JP2022564913A JP7209913B2 (en) | 2020-11-26 | 2020-11-26 | Outlook Information Generating Device, Outlook Information Generating Method, and Outlook Information Generating Program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044080 WO2022113243A1 (en) | 2020-11-26 | 2020-11-26 | Visibility information generation device, visibility information generation method, and visibility information generation program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022113243A1 (en) | 2022-06-02 |
Family
ID=81755378
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044080 WO2022113243A1 (en) | 2020-11-26 | 2020-11-26 | Visibility information generation device, visibility information generation method, and visibility information generation program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7209913B2 (en) |
WO (1) | WO2022113243A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010202017A (en) * | 2009-03-03 | 2010-09-16 | Mitsubishi Electric Corp | Data analysis device and method, and program |
US20180297621A1 (en) * | 2017-04-14 | 2018-10-18 | Bayer Cropscience Lp | Vegetation detection and alert system for a railway vehicle |
Also Published As
Publication number | Publication date |
---|---|
JP7209913B2 (en) | 2023-01-20 |
JPWO2022113243A1 (en) | 2022-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10408606B1 (en) | Quality inspection system and method of operation | |
US10501059B2 (en) | Stereo camera device | |
JP5936792B2 (en) | Building limit measurement drawing creation device, building limit measurement drawing data creation device, and building limit measurement drawing creation method | |
JP2018077254A (en) | Point group data utilization system | |
CN109782015A (en) | Laser velocimeter method, control device and laser velocimeter | |
CN104567758B (en) | Stereo imaging system and its method | |
JP6268126B2 (en) | Architectural limit point cloud determination system using laser point cloud, architectural limit point cloud determination method using laser point cloud, and building limit internal point cloud determination program using laser point cloud | |
AU2008241689A1 (en) | Method of and apparatus for producing road information | |
JP2016120892A (en) | Three-dimensional object detector, three-dimensional object detection method, and three-dimensional object detection program | |
US20210223363A1 (en) | Object detection on a path of travel and obstacle detection on railway tracks using free space information | |
CN106370884A (en) | Vehicle speed measurement method based on binocular camera computer vision technology | |
JP6465421B1 (en) | Structural deformation detector | |
JP2014163707A (en) | Road deformation detection device, road deformation detection method and program | |
CA2801885A1 (en) | Method for producing a digital photo wherein at least some of the pixels comprise position information, and such a digital photo | |
CN110956151A (en) | Rail transit foreign matter intrusion detection method and system based on structured light | |
RU2562368C1 (en) | Three-dimensional (3d) mapping method | |
JP6883995B2 (en) | Congestion level notification system, congestion level detection method, and program | |
WO2022113243A1 (en) | Visibility information generation device, visibility information generation method, and visibility information generation program | |
JP6929207B2 (en) | Road image processing device, road image processing method, road image processing program, and recording medium | |
JP5947666B2 (en) | Traveling road feature image generation method, traveling road feature image generation program, and traveling road feature image generation apparatus | |
CN106123864A (en) | Image distance-finding method based on image-forming principle and Data Regression Model | |
JP5952759B2 (en) | Overhead wire position measuring apparatus and method | |
RU2625091C1 (en) | Method of road surface cross cut smoothness (wheel tracking) determining | |
JPH11142124A (en) | Method and equipment for measuring sectional shape of rail | |
US20240257376A1 (en) | Method and system for detection a line above ground from a helicopter |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20963508; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2022564913; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 20963508; Country of ref document: EP; Kind code of ref document: A1 |