WO2024063078A1 - Method and system for generating 3-dimensional map data - Google Patents
Method and system for generating 3-dimensional map data
- Publication number
- WO2024063078A1 (PCT/JP2023/034040)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional
- positioning information
- range sensor
- measurement vehicle
- point cloud
- Prior art date
Classifications
- G01C7/04 — PHYSICS; Measuring distances, levels or bearings; Surveying; Navigation; Tracing profiles of land surfaces involving a vehicle which moves along the profile to be traced
- G05D1/43 — PHYSICS; Systems for controlling or regulating non-electric variables; Control of position, course, altitude or attitude of land, water, air or space vehicles; Control of position or course in two dimensions
- G06T17/05 — PHYSICS; Image data processing or generation, in general; Three-dimensional [3D] modelling; Geographic models
- G08G1/00 — PHYSICS; Traffic control systems for road vehicles
- G09B29/00 — PHYSICS; Educational or demonstration appliances; Maps; Plans; Charts; Diagrams, e.g. route diagram
Definitions
- the present invention relates to the generation of three-dimensional map data representing the three-dimensional structure of a vehicle driving environment.
- Driving support technologies include technologies that support driving by having the vehicle side take charge of a portion of vehicle control, such as brake control in an automatic braking function and steering control in a lane keeping function.
- Advanced driving support technologies have also been proposed that realize automatic driving by executing almost all vehicle controls, such as steering control and speed control, on the vehicle side, reducing the operational burden on the driver to near zero (see Patent Document 1 below).
- To generate three-dimensional map data representing a road environment, three-dimensional point cloud data that includes information on the distance and direction to each point of features such as curbs, guardrails, and signs is required.
- A mobile mapping system (MMS) acquires such three-dimensional point cloud data from distance images captured by, for example, a stereo camera or a laser scanner.
- Vehicles equipped with a mobile mapping system can acquire three-dimensional point cloud data while driving.
- Distance images tend to contain a large amount of information because they include distance information for each position within a two-dimensional area. When three-dimensional map data is generated from such distance images, the amount of calculation processing becomes enormous, so there is a high possibility that the three-dimensional map data cannot be generated efficiently.
- The present invention has been made in view of the above-mentioned conventional problems, and aims to provide a method or system for efficiently generating three-dimensional map data.
- One aspect of the present invention is a method for generating three-dimensional map data representing a three-dimensional structure of a vehicle's driving environment using a measurement vehicle that is movable on a moving plane that is a floor or ground. The method uses a first two-dimensional range sensor attached to the measurement vehicle so as to acquire three-dimensional point cloud data by measuring distances to one or more target points on a first plane that is perpendicular or oblique to the central axis of the measurement vehicle in the longitudinal direction, and a positioning information acquisition circuit for acquiring positioning information capable of identifying the position and direction of the measurement vehicle.
- The method acquires the three-dimensional point cloud data with the first two-dimensional range sensor while acquiring the positioning information with the positioning information acquisition circuit, converts the three-dimensional point cloud data obtained by the first two-dimensional range sensor into three-dimensional point cloud data in which a reference position and orientation are specified by performing a coordinate transformation process based on the acquired positioning information, and generates the three-dimensional map data by mapping the three-dimensional point cloud data after the coordinate transformation into a three-dimensional space.
- One aspect of the present invention is a system that generates three-dimensional map data representing a three-dimensional structure of a vehicle driving environment using a measurement vehicle that can move on a moving plane such as a floor or ground surface.
- The system comprises a first two-dimensional range sensor attached to the measurement vehicle so that three-dimensional point cloud data can be obtained by measuring the distance to one or more target points on a first plane perpendicular or oblique to the central axis in the longitudinal direction of the measurement vehicle, and a positioning information acquisition circuit that acquires positioning information that can identify the position and direction of the measurement vehicle.
- The system further comprises a coordinate conversion circuit that applies a coordinate conversion process based on the positioning information acquired by the positioning information acquisition circuit to the three-dimensional point cloud data obtained by the first two-dimensional range sensor, converting it into three-dimensional point cloud data whose reference position and orientation are specified, and a mapping circuit that generates a three-dimensional map by mapping the three-dimensional point cloud data after the coordinate conversion onto a three-dimensional space.
- The three-dimensional map generation method and generation system of the present invention are a method and a system for generating a three-dimensional map based on the three-dimensional point cloud data acquired by a first two-dimensional range sensor attached to a measuring vehicle.
- The first two-dimensional range sensor in the present invention is attached to the measuring vehicle so as to be able to acquire three-dimensional point cloud data on a first plane that is perpendicular or oblique to the longitudinal central axis of the measuring vehicle. Acquiring three-dimensional point cloud data with the first two-dimensional range sensor while the measurement vehicle is moving corresponds to scanning the driving environment with the first two-dimensional range sensor.
- coordinate transformation is performed on the three-dimensional point group data obtained by the first two-dimensional range sensor based on the positioning information acquired by the positioning information acquisition circuit. Then, a three-dimensional map is generated by mapping the three-dimensional point group data after coordinate conversion onto a three-dimensional space.
- A three-dimensional range sensor is a sensor that acquires, as a distance image, three-dimensional information of a three-dimensional space in which the depth of each point within a two-dimensional area is taken into account.
- A two-dimensional range sensor, in contrast, is a sensor that acquires three-dimensional information on a plane, in which the depth of each point along a one-dimensional line is taken into account.
- Compared with the distance images of a three-dimensional range sensor, the 3D point cloud data acquired by a 2D range sensor contains a much smaller amount of information. Therefore, by using the three-dimensional point cloud data acquired by the two-dimensional range sensor, the amount of calculation processing required to generate the three-dimensional map can be kept down. According to the present invention, a three-dimensional map can be generated efficiently.
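- A rough order-of-magnitude comparison, under illustrative assumptions (a 640 × 480 distance image versus a single two-dimensional scan of about 1,000 range points; neither figure comes from the embodiments):

  640 × 480 points per distance image / 1,000 points per 2-D scan ≈ 307

- so each measurement cycle of a two-dimensional range sensor yields on the order of a few hundred times fewer points to coordinate-convert and map.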
- FIG. 1 is a perspective view of a truck that is an example of a measurement vehicle in Example 1.
- FIG. 2 is a system diagram showing the configuration of the three-dimensional map generation system in Example 1.
- FIG. 3 is a diagram illustrating the relationship between the central axis CL of the cart and the plane DS on which the two-dimensional range sensor detects an object in Example 1.
- FIG. 4 is a flow diagram showing the flow of the three-dimensional map generation processing in Example 1.
- FIG. 5 is an explanatory diagram showing how the cart acquires three-dimensional point group data in Example 1.
- FIG. 6 is a perspective view of another truck in Example 1.
- FIG. 7 is an explanatory diagram of a magnetic marker in Example 2.
- FIG. 8 is an explanatory diagram of a truck in Example 2.
- FIG. 9 is a system diagram showing the configuration of the three-dimensional map generation system in Example 2.
- FIG. 10 is a flow diagram showing the flow of the three-dimensional map generation processing in Example 2.
- FIG. 11 is a diagram illustrating the relationship between the central axis CL of the cart and the planes DS on which the first and second two-dimensional range sensors detect objects in Example 3.
- FIG. 12 is a system diagram showing the configuration of the three-dimensional map generation system in Example 3.
- FIG. 13 is a flowchart showing the flow of the three-dimensional map generation processing in Example 3.
- FIG. 14 is an explanatory diagram of a calibration post in Example 4.
- FIG. 15 is a flowchart showing the flow of the calibration processing in Example 4.
- FIG. 16 is an explanatory diagram showing how the trolley observes the calibration post in Example 4.
- FIG. 17 is an explanatory diagram showing a truck in Example 5.
- FIG. 18 is an explanatory diagram of the truck seen from the rear side in Example 5.
- Example 1 This example relates to a method and system for generating three-dimensional map data using the two-dimensional range sensor 11. The contents will be explained with reference to FIGS. 1 to 6.
- In this example, a generation system 1 and a generation method will be described in which a trolley 10 (FIG. 1), which is an example of a measurement vehicle, moves on a floor surface (moving plane) and generates three-dimensional map data of the driving environment.
- the driving environment in this example is a driving environment inside a facility such as a factory.
- the generated three-dimensional map data is used by an automated transport vehicle (not shown) to move within the facility.
- The generation system 1 (FIG. 2) of this example includes the trolley 10, a two-dimensional range sensor 11 that acquires three-dimensional point cloud data, an IMU (Inertial Measurement Unit) 22 that enables inertial navigation, a coordinate conversion circuit 131, a mapping circuit 133, and a storage section 135 that stores the three-dimensional map data.
- a two-dimensional range sensor 11, an IMU 22, a coordinate conversion circuit 131, a mapping circuit 133, and a storage unit 135 for storing three-dimensional map data are provided in a trolley 10, which is an example of a measurement vehicle.
- the trolley 10 is a four-wheeled vehicle configured to automatically travel along a pre-programmed predetermined route.
- the size of the trolley 10 is approximately 2 m in total length, 1 m in width, and 1 m in height.
- the trolley 10 is equipped with a steering unit for steering wheels, a drive motor for rotationally driving the drive wheels, a control unit 13, and the like to enable automatic travel.
- the control unit 13 is a unit that controls the traveling of the cart 10 so that it can move along a predetermined route.
- the control unit 13 reads route data from a storage device that constitutes the memory section 135, and controls the cart 10.
- the control unit 13 mounted on the cart 10 realizes the functions of the coordinate conversion circuit 131 and mapping circuit 133 described above.
- Although the coordinate conversion circuit 131, the mapping circuit 133, and the storage unit 135 are necessary components of the generation system 1, it is not essential that they be incorporated into the trolley 10.
- the information acquired by the two-dimensional range sensor 11 and IMU 22 may be wirelessly transmitted to a server device or the like.
- the server device or the like can execute processing for generating a three-dimensional map.
- The two-dimensional range sensor 11 detects objects located on a plane DS (see FIG. 3) that obliquely intersects the central axis CL of the truck 10 in the front-rear direction, and acquires three-dimensional point cloud data by measuring the distances to target points on that plane.
- the plane DS on which the two-dimensional range sensor 11 acquires the three-dimensional point group data may be a plane orthogonal to the central axis CL.
- The IMU 22 (FIG. 2) is an inertial navigation unit that estimates the displacement and orientation change of the trolley 10 by inertial navigation, and is an example of a positioning information acquisition circuit.
- the IMU 22 is equipped with a magnetic sensor 221, an acceleration sensor 222, and a gyro sensor 223.
- The displacement estimated by the IMU 22 is the amount of change in the position of the trolley 10, and the orientation change amount is the amount of change in the direction in which the trolley 10 is facing.
- the two-dimensional range sensor 11 is fixed to the trolley 10.
- the displacement and orientation change amount estimated by the IMU 22 can be treated as the displacement or orientation change amount of the two-dimensional range sensor 11.
- The control unit 13 estimates the latest vehicle position and vehicle orientation based on the displacement amount and orientation change amount estimated by the IMU 22, and controls the traveling of the trolley 10 so that it moves along a predetermined route. Route data representing the predetermined route is stored in a storage device forming the storage section 135 described above so that the control unit 13 can read it. In the configuration of this example, initial values whose absolute values are known are set for the vehicle position and vehicle direction. The control unit 13 estimates the latest vehicle position and vehicle orientation by adding the displacement amount and orientation change amount estimated by the IMU 22 to the initial position and initial orientation.
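- As a rough illustration of how the latest vehicle position and orientation could be accumulated from the per-cycle IMU estimates, the following minimal planar dead-reckoning sketch is given; the function and variable names are illustrative assumptions and not taken from the embodiment.

```python
import math

def dead_reckon(pose, displacement, heading_change):
    """Integrate one IMU estimate into the vehicle pose.

    pose           : (x, y, heading) in the global frame, heading in radians
    displacement   : distance travelled along the vehicle's forward axis [m]
    heading_change : change of the vehicle heading during the same interval [rad]

    A minimal planar model: the displacement is applied along the mean heading
    of the interval, and the result becomes the reference for the next cycle.
    """
    x, y, heading = pose
    mean_heading = heading + heading_change / 2.0
    x += displacement * math.cos(mean_heading)
    y += displacement * math.sin(mean_heading)
    return (x, y, heading + heading_change)

# starting from a known initial pose (absolute position and heading known)
pose = (0.0, 0.0, 0.0)
pose = dead_reckon(pose, displacement=0.05, heading_change=0.001)
```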
- the coordinate conversion circuit 131 is a circuit that performs coordinate conversion processing on the three-dimensional point group data acquired by the two-dimensional range sensor 11.
- the three-dimensional point cloud data acquired by the two-dimensional range sensor 11 is point cloud data in a local coordinate system based on the direction of the two-dimensional range sensor 11.
- the coordinate conversion circuit 131 converts three-dimensional point group data belonging to the local coordinate system to three-dimensional point group data belonging to the global coordinate system. This conversion is realized by coordinate conversion processing based on the vehicle position and vehicle orientation estimated by inertial navigation.
- the mapping circuit 133 is a circuit that generates three-dimensional map data by mapping three-dimensional point cloud data onto a three-dimensional space.
- the three-dimensional point group data mapped by the mapping circuit 133 is three-dimensional point group data belonging to the global coordinate system after coordinate transformation.
- the mapping circuit 133 executes a process of additionally writing three-dimensional point cloud data to the three-dimensional map data stored in the storage unit 135 every time new three-dimensional point cloud data is acquired.
- The flowchart in FIG. 4 shows the flow of the three-dimensional map generation processing. A description of the process for automatically driving the trolley 10 is omitted here.
- the IMU 22 estimates the amount of displacement and the amount of change in direction (S101), and the two-dimensional range sensor 11 acquires three-dimensional point cloud data (S102). As shown in FIG. 5, the cart 10 acquires three-dimensional point group data while changing its position as it moves.
- the IMU 22 estimates the amount of displacement or the amount of change in orientation based on the initial position whose absolute position is known and the orientation whose absolute orientation is known.
- the control unit 13 estimates the latest vehicle position and vehicle orientation based on the displacement amount and orientation change amount estimated by the IMU 22 (S103).
- The coordinate conversion circuit 131 (control unit 13) performs coordinate conversion processing on the three-dimensional point cloud data belonging to the local coordinate system based on the two-dimensional range sensor 11, and converts it into three-dimensional point cloud data belonging to the global coordinate system (S104). Specifically, the coordinate conversion circuit 131 executes the above coordinate conversion process based on the vehicle position and vehicle orientation estimated by the control unit 13.
- the mapping circuit 133 maps the three-dimensional point group data of the global coordinate system after the coordinate transformation onto a three-dimensional space (S105), thereby generating a three-dimensional map.
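- The coordinate conversion (S104) and mapping (S105) steps can be pictured with the following simplified sketch. It models only the yaw component of the vehicle attitude and assumes a sensor mounting offset expressed in the vehicle frame; the names and the planar simplification are assumptions of this sketch, not details of the embodiment.

```python
import math

def scan_to_global(scan_points, pose, sensor_offset):
    """S104: convert sensor-local (x, y, z) points into the global frame.

    pose          : (x, y, heading) of the trolley in the global frame (heading in rad)
    sensor_offset : (dx, dy, dz, mount_yaw) of the sensor relative to the trolley.
    Only the yaw component of the attitude is modelled; the sensor's tilt and the
    vehicle's pitch/roll are ignored in this simplified planar sketch.
    """
    vx, vy, heading = pose
    dx, dy, dz, mount_yaw = sensor_offset
    yaw = heading + mount_yaw
    out = []
    for sx, sy, sz in scan_points:
        ox = dx * math.cos(heading) - dy * math.sin(heading)  # mounting offset in the global frame
        oy = dx * math.sin(heading) + dy * math.cos(heading)
        out.append((vx + ox + sx * math.cos(yaw) - sy * math.sin(yaw),
                    vy + oy + sx * math.sin(yaw) + sy * math.cos(yaw),
                    dz + sz))
    return out

def map_scan(map_points, scan_points, pose, sensor_offset):
    """S105: append the coordinate-converted scan to the accumulated 3-D map."""
    map_points.extend(scan_to_global(scan_points, pose, sensor_offset))

# one cycle with made-up values: a single point 1.2 m ahead of the sensor
cloud = []
map_scan(cloud, [(1.2, 0.0, 0.0)], pose=(0.0, 0.0, 0.0), sensor_offset=(0.5, 0.0, 0.8, 0.0))
```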
- This three-dimensional map is not yet complete and does not cover every part of the driving environment.
- the three-dimensional map includes three-dimensional information of an area of the driving environment that corresponds to the route traveled by the trolley 10. As shown in FIG. 5, the degree of completion of the three-dimensional map improves as the distance traveled by the cart 10 increases.
- an additional two-dimensional range sensor (an example of a third two-dimensional range sensor) 113 may be provided on the trolley 10.
- With the additional sensor, the amount of information in the three-dimensional map data can be increased.
- Depending on how the measurement surface is oriented, the timing and the manner in which three-dimensional information can be acquired differ.
- For example, a two-dimensional range sensor 113 whose measurement surface is a forward-rising plane DS (an example of a third plane) may be added, as shown in FIG. 6.
- The two-dimensional range sensor 113 can acquire three-dimensional information on surfaces that cannot be captured by the two-dimensional range sensor 11, whose measurement surface is a forward-descending plane DS.
- The three-dimensional point cloud data acquired by the two two-dimensional range sensors 11 and 113 in FIG. 6 can be handled in the same way; the three-dimensional point cloud data of each sensor may be coordinate-transformed and then mapped.
- A two-dimensional range sensor whose measurement surface is a vertical plane parallel to the central axis CL of the cart 10 (see FIG. 3) may also be provided.
- Using the three-dimensional point group data of this two-dimensional range sensor, it is possible to easily detect the angular change of the trolley 10 in the pitch direction. For example, by focusing on the displacement of one target point between two sets of three-dimensional point cloud data acquired at different times, the angular change of the trolley 10 in the pitch direction can be estimated with high accuracy.
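- A minimal sketch of that idea follows, assuming one stationary target point observed in the vertical measurement plane at two times and a known forward displacement between the scans; the formula and names are assumptions of this sketch, not the embodiment's processing.

```python
import math

def pitch_change(p1, p2, forward_displacement):
    """Estimate the change in vehicle pitch from one stationary target point
    observed in a vertical measurement plane at two different times.

    p1, p2               : (forward, up) coordinates of the same point in the
                           sensor frame at the first and second observation
    forward_displacement : distance the trolley advanced between the two scans [m]

    After compensating the known forward motion, the point's elevation angle
    changes only because the sensor (and hence the trolley) pitched.
    """
    x1, z1 = p1
    x2, z2 = p2
    predicted_angle = math.atan2(z1, x1 - forward_displacement)  # no-pitch prediction
    observed_angle = math.atan2(z2, x2)
    return predicted_angle - observed_angle  # positive = nose-up pitch (assumed sign convention)

# example: a point 5 m ahead and 1 m up, re-observed after advancing 0.5 m
theta = pitch_change((5.0, 1.0), (4.5, 1.0), 0.5)  # ~0 rad if no pitch occurred
```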
- In this example, a route inside a facility such as a factory is exemplified as the driving environment.
- The driving environment may instead be a road on which ordinary vehicles travel or a road used exclusively by buses.
- When the driving environment is a road, the measurement vehicle generates three-dimensional map data while moving on the road surface (moving plane).
- Example 2 This example is based on the generation system of Example 1, and improves the accuracy of estimating the vehicle position and vehicle orientation by using magnetic markers 100 disposed on the floor. The contents will be explained with reference to FIGS. 7 to 10.
- The magnetic marker 100 is a permanent magnet sheet that can be attached to the floor surface along the travel path, as shown in FIG. 7.
- the magnetic marker 100 has a circular sheet shape with a diameter of 100 mm and a thickness of about 2 mm.
- the permanent magnet sheet is a ferrite rubber magnet sheet made of a polymer material in which iron oxide magnetic powder is dispersed and formed into a sheet shape.
- An RFID (Radio Frequency IDentification) tag 15, which is a wireless tag that outputs information wirelessly, is attached to the surface 100S of the magnetic marker 100.
- the RFID tag 15 is a sheet-shaped electronic component in which an IC chip is mounted on the surface of a tag sheet cut out from, for example, a PET (Polyethylene terephthalate) film.
- The RFID tag 15, which constitutes an information providing unit, operates using power supplied wirelessly from the trolley (measurement vehicle) 10, and transmits a tag ID (identification information), which is an example of information unique to the magnetic marker 100.
- The storage unit 135 (FIG. 9) stores the position information of each magnetic marker 100 linked to its tag ID. If the trolley 10 side can obtain the tag ID of a magnetic marker 100, it can read out the position information of the corresponding magnetic marker 100.
- the ferrite rubber magnet forming the magnetic marker 100 has magnetic particles dispersed in a polymeric material, and therefore has an electrical characteristic of low conductivity. Therefore, the magnetic marker 100 can suppress the generation of eddy currents and the like in response to wireless power supply to the RFID tag 15.
- the RFID tag 15 attached to the magnetic marker 100 can efficiently receive wirelessly transmitted power.
- the trolley 10 includes a sensor array 21 including a magnetic sensor Cn and a tag reader 24 in addition to the configuration of the first embodiment.
- The sensor array 21 includes 15 magnetic sensors Cn (n is an integer from 1 to 15) arranged in a straight line, and a detection processing circuit 212 incorporating a CPU (not shown). In the sensor array 21, the 15 magnetic sensors Cn are arranged at equal intervals of 5 cm.
- The magnetic sensor Cn is a sensor that detects magnetism using the known MI effect (Magneto-Impedance effect).
- the MI effect is a magnetic effect in which the impedance of a magnetically sensitive material such as an amorphous wire changes sensitively in response to an external magnetic field.
- the magnetic sensor Cn is capable of detecting magnetism acting in the axial direction of the linearly incorporated amorphous wire.
- Each magnetic sensor Cn is incorporated into the sensor array 21 so that the magnetic detection direction is the same.
- the sensor array 21 is attached to the trolley 10 so that each magnetic sensor Cn can detect magnetism in the vertical direction. In the configuration of this example, this sensor array 21 is attached to the truck 10 along the vehicle width direction.
- the detection processing circuit 212 of the sensor array 21 is an arithmetic circuit that executes marker detection processing for detecting the magnetic marker 100 and the like.
- the detection processing circuit 212 is configured using a CPU (central processing unit) that executes various calculations, memory elements such as ROM (read only memory) and RAM (random access memory), and the like.
- the detection processing circuit 212 acquires sensor signals output from each magnetic sensor Cn at a cycle of, for example, 3 kHz, and executes marker detection processing.
- In the marker detection process, the amount of lateral deviation of the cart 10 with respect to the magnetic marker 100 is also measured.
- the detection processing circuit 212 measures the amount of lateral deviation, for example, by identifying the position in the vehicle width direction of the peak value of the distribution of magnetic measurement values of the magnetic sensors Cn arranged in the vehicle width direction.
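- A minimal sketch of how such a peak position could be turned into a lateral deviation is shown below; the three-point parabolic interpolation and all names are assumptions of this sketch, not the detection processing actually used by the circuit 212.

```python
def lateral_deviation(values, spacing=0.05):
    """Estimate the lateral deviation of the marker from the sensor-array centre.

    values  : magnetic measurement of each sensor C1..C15, ordered across the
              vehicle width
    spacing : distance between adjacent sensors [m] (5 cm in the embodiment)

    The peak of the distribution is located with a simple three-point parabolic
    interpolation around the strongest sensor; 0.0 means the marker lies under
    the centre of the array.
    """
    i = max(range(len(values)), key=lambda k: values[k])
    centre = (len(values) - 1) / 2.0
    if 0 < i < len(values) - 1:
        a, b, c = values[i - 1], values[i], values[i + 1]
        denom = a - 2 * b + c
        delta = 0.5 * (a - c) / denom if denom != 0 else 0.0
    else:
        delta = 0.0
    return (i + delta - centre) * spacing

# example: peak near the 8th sensor -> small positive deviation from the array centre
offset = lateral_deviation([0, 1, 2, 4, 7, 12, 20, 30, 28, 18, 11, 6, 3, 1, 0])
```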
- When the detection processing circuit 212 detects the magnetic marker 100, it inputs the detection result, including the amount of lateral deviation, to the control unit 13 together with the fact that the magnetic marker 100 has been detected.
- the tag reader 24 (FIGS. 8 and 9) is a communication unit that wirelessly communicates with the RFID tag 15 (FIG. 7) attached to the magnetic marker 100.
- the tag reader 24 wirelessly transmits the power necessary for the operation of the RFID tag 15 and receives the tag ID transmitted by the RFID tag 15.
- the tag reader 24 may also be configured to start operating in response to the detection of the magnetic marker 100.
- In step S111, it is determined whether the magnetic marker 100 has been detected.
- If the magnetic marker 100 has been detected (S111: YES), the processes in steps S112 to S115 are executed.
- If the determination in step S111 is NO, that is, if the magnetic marker 100 is not detected, the processes in steps S112 to S115 are bypassed and not executed.
- In this case, the processing contents are exactly the same as the processing according to the flowchart of FIG. 4 in Example 1.
- Step S111 of determining whether a magnetic marker has been detected is executed after three-dimensional point group data is acquired (S102) and the vehicle position and vehicle orientation are estimated by inertial navigation (S103).
- the presence or absence of detection of the magnetic marker 100 is determined using the detection result of the marker detection process input from the detection processing circuit 212. Note that, as described above, the detection result when the magnetic marker 100 is detected includes the fact that the magnetic marker 100 has been detected, the amount of lateral deviation of the trolley 10 with respect to the detected magnetic marker 100, and the like.
- If the magnetic marker 100 has been detected, the tag ID is read from the RFID tag 15 held by the magnetic marker 100 (S112).
- The control unit 13 uses the read tag ID to refer to the data stored in the storage unit 135 and acquires the position information of the magnetic marker 100 to which the tag ID is attached (S113). Then, based on the placement position of the magnetic marker 100 represented by the acquired position information, a position shifted by the amount of lateral deviation included in the detection result of the marker detection process is specified by calculation (S114, positioning information acquisition circuit). The control unit 13 then replaces the vehicle position estimated in step S103 with the vehicle position specified in step S114 (S115).
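- The calculation in S113 to S115 can be sketched as follows, assuming a planar global frame and a particular sign convention for the lateral deviation; the function name and the sign convention are assumptions of this sketch.

```python
import math

def corrected_vehicle_position(marker_position, lateral_shift, heading):
    """S113-S115: replace the dead-reckoned position with a marker-based one.

    marker_position : (x, y) of the detected magnetic marker in the global frame,
                      looked up from the storage unit 135 via the tag ID
    lateral_shift   : measured lateral deviation of the trolley from the marker [m];
                      the sign convention (positive = trolley to the left of the
                      marker) is an assumption of this sketch
    heading         : current vehicle heading in the global frame [rad]
    """
    mx, my = marker_position
    # unit vector pointing to the vehicle's left, perpendicular to the heading
    left_x, left_y = -math.sin(heading), math.cos(heading)
    return (mx + lateral_shift * left_x, my + lateral_shift * left_y)

# marker at (10.0, 2.0), trolley 3 cm to its left, heading along +x
vehicle_xy = corrected_vehicle_position((10.0, 2.0), 0.03, 0.0)  # -> (10.0, 2.03)
```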
- In this way, the vehicle position is estimated with high accuracy using the magnetic markers 100 arranged along the travel path.
- In inertial navigation, the accumulation of errors is in principle inevitable.
- By replacing the vehicle position with the position obtained from the magnetic marker 100, the error accumulated by inertial navigation can be reset.
- For inertial navigation after the vehicle position has been determined in response to the detection of the magnetic marker 100, it is preferable to estimate the amount of displacement with reference to that vehicle position.
- the acceleration sensor, gyro sensor, etc. included in the IMU are prone to offset errors due to temperature changes, changes over time, and the like.
- The offset error is essentially a so-called zero-point shift, in which the sensor output does not become zero in a situation where it should be zero.
- a zero point shift of an acceleration sensor or a gyro sensor causes an estimation error in inertial navigation.
- By using the magnetic marker 100, it is also possible to identify the positional deviation between the vehicle position determined by inertial navigation and the vehicle position based on the magnetic marker 100. The estimation error of inertial navigation can be identified from this positional deviation, which makes it possible to calibrate the sensors, such as the acceleration sensor, that acquire the inertial information.
- When a main magnetic marker and a sub magnetic marker are arranged as a pair, the distance between the main magnetic marker and the sub magnetic marker is preferably set to a short distance that is less than the entire length of the cart 10. With such a short spacing, changes in the orientation of the truck 10 between the two detections can be ignored, and the absolute heading (vehicle heading) of the truck 10 can be estimated with high accuracy (positioning information acquisition circuit).
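- One simple way such a marker pair could be turned into an absolute heading is sketched below: the heading is taken as the direction of the line connecting the two marker positions, corrected by the drift of the lateral deviation between the two detections. The correction formula and its sign convention are assumptions of this sketch, not the embodiment's calculation.

```python
import math

def vehicle_heading(main_marker, sub_marker, dev_main, dev_sub):
    """Estimate the absolute vehicle heading from a pair of magnetic markers.

    main_marker, sub_marker : (x, y) global positions of the two markers, whose
                              spacing is shorter than the trolley length so the
                              heading barely changes between the two detections
    dev_main, dev_sub       : lateral deviations measured when passing over each
                              marker [m]
    """
    dx = sub_marker[0] - main_marker[0]
    dy = sub_marker[1] - main_marker[1]
    marker_line_angle = math.atan2(dy, dx)
    spacing = math.hypot(dx, dy)
    # assumed sign convention: a growing deviation means the heading is rotated
    # away from the marker line by the corresponding small angle
    return marker_line_angle + math.atan2(dev_main - dev_sub, spacing)

# markers 1 m apart along +x; the deviation grew from 0.00 m to 0.02 m
heading = vehicle_heading((0.0, 0.0), (1.0, 0.0), 0.00, 0.02)  # slightly off the +x axis
```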
- three-dimensional point cloud data may be acquired in response to detection of the magnetic marker 100.
- the three-dimensional point cloud data may be subjected to coordinate conversion in response to the vehicle position and vehicle orientation estimated using the magnetic marker 100, and may be mapped into a three-dimensional space.
- a three-dimensional map including discrete three-dimensional data for each position of the magnetic marker 100 may be generated.
- an automated guided vehicle that uses the three-dimensional map may refer to the three-dimensional map every time it detects a magnetic marker 100.
- Other configurations and effects are the same as those of the first embodiment.
- Example 3 This example is based on the generation system of Example 1, and employs a two-dimensional range sensor 112 (an example of a second two-dimensional range sensor) instead of the IMU for estimating the vehicle position and vehicle direction. The contents will be explained with reference to FIGS. 11 to 13.
- The trolley 10 (FIG. 11) that constitutes the generation system 1 of this example is equipped with the two-dimensional range sensor 11, which acquires the three-dimensional point cloud data serving as the source data of the three-dimensional map data, and a two-dimensional range sensor 112, which acquires three-dimensional point cloud data for estimating the vehicle position and vehicle orientation.
- the plane DS (an example of a second plane) on which the two-dimensional range sensor 112 measures distance is a horizontal plane parallel to the center axis CL of the cart 10 in the front-rear direction.
- the control unit 13 (FIG. 12) is configured to estimate the vehicle position and vehicle orientation using time-series three-dimensional point group data acquired by the two-dimensional range sensor 112.
- The control unit 13 estimates the amount of displacement and the amount of change in direction of the truck 10 by processing this three-dimensional point group data. For example, if the changes over time in the distance and direction to any one stationary object are known, the amount of displacement and the amount of change in direction of the trolley 10 can be estimated.
- The control unit 13 then estimates the latest vehicle position and vehicle heading by adding the estimated displacement amount and azimuth change amount to a reference vehicle position whose absolute position is known and a reference vehicle direction whose absolute direction is known.
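- A minimal stand-in for this kind of scan-based motion estimation is shown below: a closed-form least-squares alignment of a few stationary landmarks seen in two consecutive horizontal scans. The embodiment does not specify its processing, so this is only an illustrative sketch with assumed names.

```python
import math

def rigid_motion_2d(prev_points, curr_points):
    """Estimate the planar motion of the trolley between two horizontal scans.

    prev_points, curr_points : matched (x, y) coordinates of the same stationary
                               landmarks in the sensor frame at the previous and
                               the current scan (two or more landmarks)

    Returns (heading_change, dx, dy): the change in heading and the translation
    of the sensor expressed in the previous sensor frame.
    """
    n = len(prev_points)
    pxm = sum(p[0] for p in prev_points) / n
    pym = sum(p[1] for p in prev_points) / n
    cxm = sum(c[0] for c in curr_points) / n
    cym = sum(c[1] for c in curr_points) / n
    s_dot = sum((c[0] - cxm) * (p[0] - pxm) + (c[1] - cym) * (p[1] - pym)
                for p, c in zip(prev_points, curr_points))
    s_cross = sum((c[0] - cxm) * (p[1] - pym) - (c[1] - cym) * (p[0] - pxm)
                  for p, c in zip(prev_points, curr_points))
    heading_change = math.atan2(s_cross, s_dot)
    # translation: previous-frame centroid minus the rotated current-frame centroid
    dx = pxm - (cxm * math.cos(heading_change) - cym * math.sin(heading_change))
    dy = pym - (cxm * math.sin(heading_change) + cym * math.cos(heading_change))
    return heading_change, dx, dy

# two landmarks re-observed after the trolley advanced and turned slightly
motion = rigid_motion_2d([(5.0, 0.0), (0.0, 5.0)], [(4.0, -0.4), (0.5, 4.9)])
```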
- The flow of the operation of the generation system 1 of this example will be explained with reference to the flow diagram of FIG. 13. While the trolley 10, which is an example of a measurement vehicle, is moving, three-dimensional point cloud data (A) from the two-dimensional range sensor 11 and three-dimensional point cloud data (B) from the two-dimensional range sensor 112 are acquired (S121).
- the control unit 13 estimates the amount of displacement and the amount of change in direction of the bogie 10 by processing the three-dimensional point group data (B) (S122), and further estimates the vehicle position and vehicle direction (S123, positioning information acquisition circuit).
- the control unit 13 performs coordinate transformation processing on the three-dimensional point group data (A) acquired by the two-dimensional range sensor 11 based on the estimated vehicle position and vehicle orientation (S124). Then, the control unit 13 generates three-dimensional map data by mapping the three-dimensional point group data after the coordinate transformation (S125).
- Three-dimensional point cloud data obtained by performing coordinate transformation processing on the 3D point cloud data from the 2D range sensor 112 may also be mapped.
- In addition to the two-dimensional range sensors 11 and 112, it is also possible to provide a two-dimensional range sensor whose measurement surface is a vertical plane parallel to the central axis CL of the trolley 10. In this case, the angular change of the truck 10 in the pitch direction can be detected using the three-dimensional point group data of that two-dimensional range sensor.
- Alternatively, the vehicle position and vehicle direction may be estimated by inertial navigation based on the positioning information acquired by the IMU, while the estimation error of the inertial navigation is identified using positioning information based on the 3D point cloud data from the 2D range sensor 112. If the estimation error of inertial navigation can be identified, corrections can be made to improve the estimation accuracy of the inertial navigation. The other configurations and effects are the same as in Example 1.
- Example 4 This example is based on the generation system of Example 1, and adds a mechanism that enables calibration of the sensor included in the IMU. The contents will be explained with reference to FIGS. 14 to 16.
- The generation system 1 of this example uses a calibration post (an example of a rod-shaped member) 101 installed in the driving environment to calibrate the sensors of the IMU that measure inertial information, such as the acceleration sensor and the gyro sensor.
- the calibration post 101 (FIG. 14) is a cylinder with a diameter of 10 cm that is erected on the floor along the vertical direction (an example of a known orientation).
- the height of the upper end of the calibration post 101 is 1 m.
- The calibration post 101 is installed, for example, at the side of the route along which the trolley 10 travels, or at the far side of a corner, that is, at a location that lies directly ahead when viewed from the approach direction.
- the calibration post 101 is installed at a location where the flatness of the road surface is sufficiently high. Therefore, the azimuth change in the pitch direction when the trolley 10 observes the calibration post 101 can be ignored.
- the calibration post 101 is provided at a location where there are no similar structures around it. Therefore, the calibration post 101 can be easily identified on the trolley 10 side. Note that it is also possible to provide a characteristic shape, such as providing a groove of a predetermined width extending in the vertical direction, on the surface of the side facing the cart 10 when it approaches. By measuring the cross-sectional shape of the above-mentioned surface of the calibration post 101, the calibration post 101 can be identified more easily.
- A calibration flag indicating the calibration period is set and managed by the control unit.
- When the calibration post 101 is observed, the calibration flag is set to 1 and the calibration period begins.
- the trolley 10 acquires three-dimensional point group data while moving (S201).
- The control unit processes the acquired three-dimensional point group data to determine whether it includes the three-dimensional data of the calibration post 101, that is, whether the calibration post 101 has been observed (S203). If the calibration post 101 has not been observed, the calibration flag remains zero, so the acquisition of three-dimensional point group data is repeated in accordance with the flow S201 → S202: YES → S203: NO.
- When the calibration post 101 is observed (S203: YES), the control unit executes the processes of step S204.
- The first process is a process of extracting the three-dimensional data of the observation point P1 of the calibration post 101 (see FIG. 16; an example of one target point) from the three-dimensional point cloud data acquired in step S201.
- The second process is a simulation calculation of the three-dimensional data of the observation point P2 of the calibration post 101 that would be obtained when the trolley 10 travels and is displaced by a predetermined amount dx.
- the third process is a process of setting the calibration flag to 1. When the calibration flag is set to 1, the calibration period begins.
- the control unit repeatedly acquires the three-dimensional point group data until the estimated displacement amount by the IMU reaches the predetermined amount dx (S205: NO ⁇ S201). At this time, since the calibration period is in progress and the flag value of the calibration flag is not zero (S202: NO), the processes of steps S203 and S204 described above are bypassed.
- the amount of displacement estimated by the IMU includes an estimation error caused by measurement errors of the acceleration sensor and the gyro sensor. Therefore, the actual displacement amount dxs when it is determined in step S205 that the estimated displacement amount has reached the predetermined amount dx is highly likely to be different from the predetermined amount dx.
- When it is determined that the estimated displacement amount has reached the predetermined amount dx (S205: YES), the control unit extracts the observation point P2s of the calibration post 101 (see FIG. 16; an example of another target point) from the three-dimensional point group data actually measured by the two-dimensional range sensor 11, and specifies the three-dimensional data of the observation point P2s (S206). Note that the estimated displacement amount of the truck 10 includes an estimation error due to inertial navigation. Therefore, as described above, when the estimated displacement amount of the truck 10 is determined to have reached the predetermined amount dx, the actual displacement amount is dxs.
- the above three-dimensional data at the observation point P2s is not the data at the time when the displacement amount of the trolley 10 reaches dx, but the data at the time when the displacement amount reaches dxs.
- The control unit compares the three-dimensional data of the observation point P2 of the calibration post 101 based on the simulation calculation with the three-dimensional data of the observation point P2s of the calibration post 101 based on the actual measurement, and identifies the error (estimation error) (S207).
- the three-dimensional data obtained by the simulation calculation is the three-dimensional data of the observation point P2 obtained in step S204 above.
- the three-dimensional data of the actual observation point P2s is the three-dimensional data of the observation point P2s specified in step S206 above.
- the control unit uses the error identified in step S207 to calibrate the acceleration sensor and gyro sensor (S208).
- Calibration is, for example, so-called bias adjustment that adjusts the zero point of each sensor.
- Calibration of the acceleration sensor can be performed mainly based on the vertical positional error of the observation point P2 and the observation point P2s.
- Calibration of the gyro sensor can be performed mainly based on the horizontal positional error of observation point P2 and observation point P2s. After such calibration is performed, the calibration flag is reset to zero (S208).
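- The way these observation errors could feed a bias (zero-point) adjustment is sketched below. The split into vertical and horizontal error components follows the text; the specific conversion formulas (constant-bias double integration, small-angle heading error) and all names are assumptions of this sketch, not the patent's calibration method.

```python
def calibration_errors(p2_simulated, p2s_measured):
    """S207: split the difference between the simulated and the actually measured
    observation point of the calibration post into horizontal and vertical parts.

    p2_simulated, p2s_measured : (horizontal, vertical) coordinates of the
                                 observation point in the sensor measurement plane
    """
    return (p2s_measured[0] - p2_simulated[0],
            p2s_measured[1] - p2_simulated[1])

def bias_corrections(horizontal_error, vertical_error, elapsed_time, range_to_post):
    """S208 (illustrative only): turn the observation errors into zero-point
    corrections, assuming each bias stayed constant while the trolley moved."""
    accel_bias = 2.0 * vertical_error / (elapsed_time ** 2)   # from e = 0.5 * b * t^2
    heading_error = horizontal_error / range_to_post          # small-angle approximation
    gyro_bias = heading_error / elapsed_time
    return accel_bias, gyro_bias

# made-up values: a 3 cm horizontal and -1 cm vertical error after 4 s of travel
h_err, v_err = calibration_errors((2.00, 1.00), (2.03, 0.99))
accel_bias, gyro_bias = bias_corrections(h_err, v_err, elapsed_time=4.0, range_to_post=2.0)
```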
- The above explanation is based on the assumption that the floor surface is horizontal and the orientation of the truck 10 (vehicle direction) does not change in the pitching direction.
- In practice, however, the orientation in the pitching direction may change depending on the acceleration or deceleration of the truck 10, the air pressure of the wheels, and so on. Such a situation, in which the pitching direction changes, is not suitable for sensor calibration.
- To deal with this, two two-dimensional range sensors that can simultaneously observe the calibration post 101 may be mounted on the trolley 10, and a marking such as a horizontal bar or reflective tape may be attached to the position on the calibration post that is observed simultaneously by the two two-dimensional range sensors when there is no change in the pitching direction of the cart 10.
- Alternatively, the two two-dimensional range sensors may be attached so that the upper end of the calibration post can be observed simultaneously. When a change occurs in the pitching direction of the truck 10, the marking or the upper end can no longer be observed simultaneously by the two two-dimensional range sensors. In such a case, there is a high possibility that high calibration accuracy cannot be ensured, so it is preferable not to perform sensor calibration.
- Two markings that can be detected by a two-dimensional range sensor may also be attached to the calibration post 101 at a predetermined interval. Since the two markings are separated by a known distance on the calibration post 101, their relative positional relationship in three-dimensional space is known.
- When one marking is observed, it is preferable to predict the amount of change (predicted amount of change) in the position and orientation of the trolley 10 that will have occurred by the time the other marking (an example of a second target point) is observed. Further, when the trolley 10 actually moves after the one marking is observed and the other marking is actually observed, the amount of change in the position and orientation of the trolley 10 relative to the reference position and orientation is preferably estimated based on the positioning information from the IMU (estimated amount of change). By comparing the predicted amount of change with the estimated amount of change, the estimation error of inertial navigation can be identified.
- the calibration post 101 is illustrated as extending in the vertical direction, but the direction in which the calibration post 101 extends does not have to be the vertical direction, and may be any direction as long as it is in a known direction.
- the above two markings do not need to be at two locations on the calibration post 101.
- the two markings may be provided on separate structures. It is only necessary that the relative positional relationship of each marking is known in three-dimensional space and that the two-dimensional range sensor can detect each marking. Note that the other configurations and effects are the same as in Example 1.
- Example 5 This example is based on the configuration of Example 1, and is configured such that the roll of the truck 10 around the center axis CL can be detected. The contents will be explained with reference to FIGS. 17 and 18.
- FIG. 17 is a side view of the truck 10
- FIG. 18 is an explanatory view looking forward from the rear side of the truck 10.
- The trolley 10 of this example includes a two-dimensional range sensor 11 disposed at the upper front side of the trolley 10, and a two-dimensional range sensor 114 (an example of a third two-dimensional range sensor) disposed at the lower front side.
- the measurement surface of the two-dimensional range sensor 11 is a plane DS that is parallel to the vehicle width direction and slopes downward toward the front.
- the measurement surface of the two-dimensional range sensor 114 is a plane DS (an example of a third plane) parallel to the vehicle width direction and rising forward.
- The forward-descending plane DS and the forward-rising plane DS intersect in front of the truck 10, and their intersection appears in FIG. 17.
- the intersection of these two planes DS forms an intersection line Xc along the vehicle width direction. Since the two planes DS are both planes parallel to the vehicle width direction, the intersection line Xc between the two planes DS is also parallel to the vehicle width direction.
- the intersection line Xc (FIG. 18) is a line that passes through two or more common target points among the target points observed by the two-dimensional range sensor 11 and the target points observed by the two-dimensional range sensor 114. It can be identified as As shown in FIG. 18, if the roll, which is the inclination of the cart 10 in the rotational direction around the central axis CL, is zero, the intersection line Xc is parallel to the floor surface 1S. The roll angle of the truck 10 can be estimated by examining the slope of the intersection line Xc with respect to the floor surface 1S. Note that the other configurations and effects are the same as in Example 1.
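- A minimal sketch of this roll estimation follows, assuming the common target points along the intersection line Xc are available as (vehicle-width, height) coordinates in the vehicle frame; the least-squares slope and the sign convention are assumptions of this sketch.

```python
import math

def roll_angle(common_points):
    """Estimate the trolley's roll about the central axis CL from two or more
    common target points observed by both range sensors (points on the
    intersection line Xc).

    common_points : list of (y, z) coordinates, where y is the position along the
                    vehicle-width direction and z the height above the floor.

    With zero roll the intersection line Xc is parallel to the floor, so the slope
    of the line fitted through the common points directly gives the roll angle.
    """
    n = len(common_points)
    my = sum(p[0] for p in common_points) / n
    mz = sum(p[1] for p in common_points) / n
    num = sum((y - my) * (z - mz) for y, z in common_points)
    den = sum((y - my) ** 2 for y, z in common_points)
    return math.atan2(num, den)  # slope of Xc relative to the floor, in radians

# two common points 1.0 m apart across the width, with a 2 cm height difference
roll = roll_angle([(-0.5, 0.80), (0.5, 0.82)])  # ~0.02 rad of roll
```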
- 1 Three-dimensional map data generation system
- 10 Trolley (measurement vehicle)
- 11 Two-dimensional range sensor (first two-dimensional range sensor)
- 112 Two-dimensional range sensor (second two-dimensional range sensor)
- 113, 114 Two-dimensional range sensor (third two-dimensional range sensor)
- 100 Magnetic marker
- 101 Calibration post (rod-shaped member)
- 13 Control unit (positioning information acquisition circuit)
- 131 Coordinate conversion circuit
- 133 Mapping circuit
- 135 Storage unit
- 15 RFID tag
- 21 Sensor array
- 212 Detection processing circuit (marker detection circuit)
- 22 IMU (positioning information acquisition circuit)
- 221 Magnetic sensor
- 222 Acceleration sensor
- 223 Gyro sensor
- 24 Tag reader
- CL Central axis
- Cn (n is an integer from 1 to 15) Magnetic sensor
- DS Plane
Abstract
Provided is a method for generating 3-dimensional map data representing the 3-dimensional structure of the drive environment of a vehicle, wherein: a first 2-dimensional range sensor (11) that acquires 3-dimensional point cloud data by measuring distances to target points on a plane diagonally intersecting the central axis CL in the front-rear direction of a dolly (10), and an IMU that acquires positioning information enabling identification of the position and orientation of the dolly (10), are utilized; a coordinate conversion process based on the positioning information acquired by the IMU is performed on the 3-dimensional point cloud data obtained by the 2-dimensional range sensor (11), thereby converting the data into 3-dimensional point cloud data in which a reference position and orientation are identified; and the 3-dimensional point cloud data after the coordinate conversion is mapped onto a 3-dimensional space to generate the 3-dimensional map data.
Description
The present invention relates to the generation of three-dimensional map data representing the three-dimensional structure of a vehicle driving environment.
In recent years, various driving support technologies have been proposed to reduce the burden of driving a vehicle. Examples of driving support technologies include technologies that support driving by having the vehicle side take charge of a portion of vehicle control, such as brake control in an automatic braking function and steering control in a lane keeping function. Furthermore, there are proposals for advanced driving support technology that realizes automatic driving by executing almost all vehicle controls, such as steering control and speed control, on the vehicle side, reducing the operational burden on the driver to near zero (see Patent Document 1 below).
Additionally, automated guided vehicles are widely used in factories, distribution warehouses, and the like. As a system for making an automated guided vehicle travel automatically, a system has been proposed in which the vehicle travels automatically while detecting the structure of, and obstacles in, its surrounding environment (see Patent Document 2 below).
In order to realize advanced vehicle control, such as advanced driving support technology for vehicle operation and the autonomous driving of automated guided vehicles that do not rely on floor markings such as magnetic tape, the vehicle side needs to be able to grasp the driving environment with high accuracy. For example, in order to realize autonomous vehicle driving, a highly accurate three-dimensional map containing three-dimensional information representing the three-dimensional structure of the surrounding environment is required.
For example, to generate three-dimensional map data representing the road environment, three-dimensional point cloud data that includes information on the distance and direction to each point of features such as curbs, guardrails, and signs is required. For example, a mobile mapping system (MMS) has been proposed that acquires three-dimensional point cloud data based on distance images acquired by a stereo camera, laser scanner, etc. (see, for example, Patent Document 3). Vehicles equipped with a mobile mapping system can acquire three-dimensional point cloud data while driving.
Distance images tend to contain a large amount of information because they include distance information for each position within a two-dimensional area. When three-dimensional map data is generated from such distance images, the amount of calculation processing becomes enormous, so there is a high possibility that the three-dimensional map data cannot be generated efficiently.
The present invention has been made in view of the above-mentioned conventional problems, and aims to provide a method or system for efficiently generating three-dimensional map data.
One aspect of the present invention is a method for generating three-dimensional map data representing a three-dimensional structure of a vehicle's driving environment using a measurement vehicle that is movable on a moving plane that is a floor or ground, the method using:
a first two-dimensional range sensor attached to the measurement vehicle so as to acquire three-dimensional point cloud data by measuring distances to one or more target points on a first plane that is perpendicular or oblique to a central axis of the measurement vehicle in a longitudinal direction; and
a positioning information acquisition circuit for acquiring positioning information capable of identifying the position and direction of the measurement vehicle;
the method comprising:
acquiring the three-dimensional point cloud data by the first two-dimensional range sensor while acquiring the positioning information by the positioning information acquisition circuit;
converting the three-dimensional point cloud data obtained by the first two-dimensional range sensor into three-dimensional point cloud data in which a reference position and orientation are specified, by performing a coordinate transformation process based on the positioning information acquired by the positioning information acquisition circuit; and
generating the three-dimensional map data by mapping the three-dimensional point cloud data after the coordinate transformation into a three-dimensional space.
One aspect of the present invention is a system that generates three-dimensional map data representing a three-dimensional structure of a vehicle driving environment using a measurement vehicle that can move on a moving plane such as a floor or ground surface, the system comprising:
a first two-dimensional range sensor attached to the measurement vehicle so that three-dimensional point cloud data can be obtained by measuring the distance to one or more target points on a first plane perpendicular or oblique to the central axis in the longitudinal direction of the measurement vehicle;
a positioning information acquisition circuit that acquires positioning information that can identify the position and direction of the measurement vehicle;
a coordinate conversion circuit that applies a coordinate conversion process based on the positioning information acquired by the positioning information acquisition circuit to the three-dimensional point cloud data obtained by the first two-dimensional range sensor, converting it into three-dimensional point cloud data whose reference position and orientation are specified; and
a mapping circuit that generates a three-dimensional map by mapping the three-dimensional point cloud data after the coordinate conversion onto a three-dimensional space.
The three-dimensional map generation method and generation system of the present invention are a method and a system for generating a three-dimensional map based on the three-dimensional point cloud data acquired by a first two-dimensional range sensor attached to a measurement vehicle.
The first two-dimensional range sensor in the present invention is attached to the measurement vehicle so that it can acquire three-dimensional point cloud data on a first plane that is perpendicular or oblique to the longitudinal central axis of the measurement vehicle. Acquiring three-dimensional point cloud data with the first two-dimensional range sensor while the measurement vehicle moves amounts to scanning the driving environment with that sensor.
In the present invention, a coordinate transformation based on the positioning information acquired by the positioning information acquisition circuit is applied to the three-dimensional point cloud data from the first two-dimensional range sensor. A three-dimensional map is then generated by mapping the coordinate-transformed three-dimensional point cloud data into a three-dimensional space.
Incidentally, a three-dimensional range sensor acquires, as a range image, three-dimensional information of a three-dimensional space in which the depth of every point in a two-dimensional area is taken into account. A two-dimensional range sensor, by contrast, acquires three-dimensional information of a plane in which the depth of each point along a one-dimensional line is taken into account. The three-dimensional point cloud data acquired by a two-dimensional range sensor therefore contains far less information than that acquired by a three-dimensional range sensor. Basing the map on point cloud data from a two-dimensional range sensor keeps the amount of computation needed to generate the three-dimensional map small, so the present invention can generate a three-dimensional map efficiently.
Embodiments of the present invention are described concretely using the following examples.
(Example 1)
This example relates to a method and a system for generating three-dimensional map data using a two-dimensional range sensor 11. The details are described with reference to FIGS. 1 to 6.
In this example, a generation system 1 and a generation method are described in which a cart 10 (FIG. 1), an example of a measurement vehicle, moves over a floor surface (moving plane) and generates three-dimensional map data of the driving environment. The driving environment in this example is the interior of a facility such as a factory. The generated three-dimensional map data is used by an automated guided vehicle (not shown) to move within the facility.
The generation system 1 of this example (FIG. 2) comprises the cart 10, a two-dimensional range sensor 11 that acquires three-dimensional point cloud data, an IMU (Inertial Measurement Unit) 22 that enables inertial navigation, a coordinate conversion circuit 131, a mapping circuit 133, and a storage unit 135 that stores the three-dimensional map data. In the configuration of this example, the two-dimensional range sensor 11, the IMU 22, the coordinate conversion circuit 131, the mapping circuit 133, and the storage unit 135 are all provided on the cart 10, which is an example of a measurement vehicle.
The cart 10 is a four-wheeled vehicle configured to travel automatically along a pre-programmed route. The cart 10 is roughly 2 m long, 1 m wide, and 1 m high. To enable automatic travel, the cart 10 is equipped with a steering unit for the steered wheels, a drive motor that rotates the drive wheels, a control unit 13, and so on.
The control unit 13 controls the travel of the cart 10 so that it moves along the predetermined route. The control unit 13 reads the route data from the storage device constituting the storage unit 135 and controls the cart 10. In the configuration of this example, the control unit 13 mounted on the cart 10 also realizes the functions of the coordinate conversion circuit 131 and the mapping circuit 133 described above.
Although the coordinate conversion circuit 131, the mapping circuit 133, and the storage unit 135 are necessary components of the generation system 1, it is not essential that they be built into the cart 10. For example, the information acquired by the two-dimensional range sensor 11 and the IMU 22 may be transmitted wirelessly to a server device or the like, in which case the server device can execute the processing for generating the three-dimensional map.
The two-dimensional range sensor 11 (an example of a first two-dimensional range sensor) acquires three-dimensional point cloud data by measuring distances to objects lying on a plane DS (see FIG. 3) that obliquely intersects the longitudinal central axis CL of the cart 10. The plane DS on which the two-dimensional range sensor 11 acquires the three-dimensional point cloud data may instead be a plane orthogonal to the central axis CL.
The IMU 22 (FIG. 2) is an inertial navigation unit that estimates the displacement and heading change of the cart 10 by inertial navigation, and is an example of a positioning information acquisition circuit. The IMU 22 includes a magnetic sensor 221, an acceleration sensor 222, and a gyro sensor 223. The displacement estimated by the IMU 22 is the change in the position of the cart 10; the heading change is likewise the change in the azimuth representing the direction the cart 10 faces. Because the two-dimensional range sensor 11 is fixed to the cart 10, the displacement and heading change estimated by the IMU 22 can be treated as the displacement and heading change of the two-dimensional range sensor 11.
In the configuration of this example, the control unit 13 estimates the latest vehicle position and vehicle heading from the displacement and heading change estimated by the IMU 22, and controls the travel of the cart 10 so that it moves along the predetermined route. The route data representing the predetermined route is stored in the storage device constituting the storage unit 135 so that the control unit 13 can read it. In this configuration, initial values whose absolute values are known are set for the vehicle position and vehicle heading. The control unit 13 estimates the latest vehicle position and heading by adding the displacement and heading change estimated by the IMU 22 to the initial position and initial heading.
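A purely illustrative sketch of this dead-reckoning update, reduced to 2D (the function name, frame convention, and numbers are assumptions, not taken from the embodiment):

```python
import math

def update_pose(x, y, yaw, d_forward, d_lateral, d_yaw):
    """Dead-reckoning update: add the displacement estimated in the vehicle
    frame and the heading change to the previous pose."""
    # Rotate the vehicle-frame displacement into the global frame.
    dx = d_forward * math.cos(yaw) - d_lateral * math.sin(yaw)
    dy = d_forward * math.sin(yaw) + d_lateral * math.cos(yaw)
    return x + dx, y + dy, yaw + d_yaw

# Example: start from a known initial pose and accumulate per-step estimates.
pose = (0.0, 0.0, 0.0)  # known initial position and heading
for d_fwd, d_lat, d_yaw in [(0.10, 0.0, 0.01), (0.10, 0.0, 0.02)]:
    pose = update_pose(*pose, d_fwd, d_lat, d_yaw)
print(pose)
```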
The coordinate conversion circuit 131 applies a coordinate transformation process to the three-dimensional point cloud data acquired by the two-dimensional range sensor 11. The point cloud data acquired by the two-dimensional range sensor 11 is expressed in a local coordinate system referenced to the orientation of the sensor. The coordinate conversion circuit 131 converts this point cloud data from the local coordinate system into a global coordinate system, by a coordinate transformation based on the vehicle position and vehicle heading estimated by inertial navigation.
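One way to picture this local-to-global conversion is the minimal 2D sketch below, which rotates and translates sensor-frame points using the estimated vehicle pose. The planar simplification, the fixed sensor mounting offset, and all names are illustrative assumptions; the actual transformation is three-dimensional.

```python
import numpy as np

def local_to_global(points_local, vehicle_xy, vehicle_yaw, sensor_offset_xy=(0.0, 0.0)):
    """Transform Nx2 sensor-frame points into the global frame using the
    estimated vehicle position and heading (2D simplification)."""
    c, s = np.cos(vehicle_yaw), np.sin(vehicle_yaw)
    R = np.array([[c, -s], [s, c]])          # rotation by the vehicle heading
    pts = np.asarray(points_local, dtype=float) + np.asarray(sensor_offset_xy)
    return (R @ pts.T).T + np.asarray(vehicle_xy)

scan = [[1.0, 0.2], [1.1, -0.3]]             # points measured in the sensor frame
print(local_to_global(scan, vehicle_xy=(5.0, 2.0), vehicle_yaw=np.pi / 2))
```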
The mapping circuit 133 generates three-dimensional map data by mapping the three-dimensional point cloud data into a three-dimensional space. The point cloud data mapped by the mapping circuit 133 is the coordinate-transformed data belonging to the global coordinate system. Each time new three-dimensional point cloud data is acquired, the mapping circuit 133 additionally writes it into the three-dimensional map data stored in the storage unit 135.
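The incremental writing of globally referenced points into a stored map can be sketched as below. The voxel-keyed de-duplication and the class name are assumptions added for illustration; the embodiment does not specify how duplicate points are handled.

```python
class PointCloudMap:
    """Accumulates coordinate-transformed points; points falling into the same
    voxel are stored only once (an illustrative simplification)."""

    def __init__(self, voxel_size=0.05):
        self.voxel_size = voxel_size
        self.voxels = set()

    def add_points(self, points_global):
        for x, y, z in points_global:
            key = (round(x / self.voxel_size),
                   round(y / self.voxel_size),
                   round(z / self.voxel_size))
            self.voxels.add(key)

map3d = PointCloudMap()
map3d.add_points([(5.0, 2.0, 0.3), (5.01, 2.0, 0.3)])  # near-duplicates merge
print(len(map3d.voxels))
```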
Next, the flow of the processing executed by the generation system 1 of this example is described with reference to the flow chart of FIG. 4, which shows the three-dimensional map generation processing. A description of the processing for driving the cart 10 automatically is omitted.
While the cart 10 is moving, the IMU 22 estimates the displacement and heading change (S101), and the two-dimensional range sensor 11 acquires three-dimensional point cloud data (S102). As shown in FIG. 5, the cart 10 acquires three-dimensional point cloud data while its position changes as it moves. The IMU 22 estimates the displacement and heading change relative to an initial position whose absolute position is known and a heading whose absolute azimuth is known. The control unit 13 estimates the latest vehicle position and vehicle heading from the displacement and heading change estimated by the IMU 22 (S103).
The coordinate conversion circuit 131 (control unit 13) applies a coordinate transformation to the three-dimensional point cloud data expressed in the local coordinate system referenced to the two-dimensional range sensor 11, converting it into three-dimensional point cloud data in the global coordinate system (S104). Specifically, the coordinate conversion circuit 131 executes this coordinate transformation based on the vehicle position and vehicle heading estimated by the control unit 13.
The mapping circuit 133 (control unit 13) maps the coordinate-transformed point cloud data of the global coordinate system into a three-dimensional space (S105), thereby generating a three-dimensional map. This three-dimensional map is not a complete map covering every part of the driving environment; it contains the three-dimensional information of the region of the driving environment corresponding to the route the cart 10 has travelled. As shown in FIG. 5, the completeness of the three-dimensional map improves as the travel distance of the cart 10 increases.
As shown in FIG. 6, an additional two-dimensional range sensor 113 (an example of a third two-dimensional range sensor) may be provided on the cart 10 in addition to the two-dimensional range sensor 11. This increases the amount of information in the three-dimensional map data. For example, with the two-dimensional range sensor 11 of FIG. 6, whose measurement surface is the forward-and-downward sloping plane DS, the timing at which three-dimensional information can be acquired, and the faces for which it can be acquired, differ depending on whether an object is below or above the mounting position of the sensor. For an object below the mounting position of the two-dimensional range sensor 11, the three-dimensional information of the front face facing the cart 10 can be acquired as the cart 10 approaches. For an object above the mounting height of the sensor, only the three-dimensional information of the rear face can be acquired, after the cart 10 has passed it. Thus, with only the two-dimensional range sensor 11 and its forward-and-downward sloping measurement plane DS, it is difficult to acquire the three-dimensional information of the rear face of an object lower than the sensor, and likewise difficult to acquire the three-dimensional information of the front face of an object higher than the sensor, even though its rear face can be captured.
It is therefore advantageous to add, as shown in FIG. 6, a two-dimensional range sensor 113 whose measurement surface is a forward-and-upward sloping plane DS (an example of a third plane). The two-dimensional range sensor 113 can acquire the three-dimensional information of the faces that cannot be captured by the two-dimensional range sensor 11 with its forward-and-downward sloping measurement plane. The three-dimensional point cloud data acquired by the two sensors 11 and 113 in FIG. 6 can be handled in the same way, and the coordinate-transformed point cloud data of each is mapped into the three-dimensional space.
In addition to the two-dimensional range sensor 11, a two-dimensional range sensor whose measurement surface is a vertical plane parallel to the central axis CL of the cart 10 (see FIG. 3) may also be provided. Using the three-dimensional point cloud data of this sensor, the angular change of the cart 10 in the pitch direction can easily be detected. For example, by focusing on the displacement of a single target point between two point clouds acquired at different times, the pitch-direction angular change of the cart 10 can be estimated with high accuracy.
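A minimal sketch of such a pitch estimate from one stationary target point seen in two scans of the vertical-plane sensor. It assumes the forward travel between the scans is known (e.g. from odometry), that the rotation is about the sensor origin, and an illustrative sign convention; none of these details are taken from the embodiment.

```python
import math

def estimate_pitch_change(p1, p2, forward_travel):
    """Estimate the pitch change between two scans from one stationary target.

    p1, p2: (forward, up) coordinates of the same target in the sensor frame
    forward_travel: distance travelled between the scans
    """
    x1, z1 = p1
    x2, z2 = p2
    expected = math.atan2(z1, x1 - forward_travel)  # bearing if pitch were unchanged
    observed = math.atan2(z2, x2)                   # bearing actually measured
    return expected - observed                      # sign convention is illustrative

print(estimate_pitch_change((4.0, 1.0), (3.45, 1.05), forward_travel=0.5))
```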
In this example, a route inside a facility such as a factory is used as the driving environment. The driving environment may also be a road on which ordinary vehicles travel or a dedicated bus road. When the driving environment is a road, the measurement vehicle generates the three-dimensional map data while moving over the road surface (moving plane).
(Example 2)
This example builds on the generation system of Example 1 and improves the accuracy of estimating the vehicle position and vehicle heading by using magnetic markers 100 laid on the floor surface. The details are described with reference to FIGS. 7 to 10.
The magnetic marker 100 is a permanent-magnet sheet that can be affixed to the floor of the track, as shown in FIG. 7. The magnetic marker 100 is a circular sheet 100 mm in diameter and about 2 mm thick. The permanent-magnet sheet is a ferrite rubber magnet sheet formed by molding, into sheet form, a polymer material in which iron-oxide magnetic powder is dispersed.
An RFID (Radio Frequency IDentification) tag 15, a wireless tag that outputs information by radio, is affixed to the surface 100S of the magnetic marker 100. The RFID tag 15 is a sheet-like electronic component in which an IC chip is mounted on the surface of a tag sheet cut out from, for example, PET (polyethylene terephthalate) film.
The RFID tag 15, which serves as an information providing unit, operates on power supplied wirelessly from the cart (measurement vehicle) 10 and transmits a tag ID (identification information), an example of information unique to the magnetic marker 100. On the cart 10 side, the storage unit 135 (FIG. 9) stores the position information of each magnetic marker 100 linked to its tag ID. Once the cart 10 obtains the tag ID of a magnetic marker 100, it can read out the position information of the corresponding marker.
Because the ferrite rubber magnet forming the magnetic marker 100 consists of magnetic powder dispersed in a polymer material, it has the electrical property of low conductivity. The magnetic marker 100 can therefore suppress the generation of eddy currents and the like in response to the wireless power supply to the RFID tag 15, and the RFID tag 15 attached to the magnetic marker 100 can receive the wirelessly transmitted power efficiently.
As shown in FIGS. 8 and 9, the cart 10 includes, in addition to the configuration of Example 1, a sensor array 21 containing magnetic sensors Cn and a tag reader 24. As shown in FIG. 9, the sensor array 21 comprises 15 magnetic sensors Cn (n is an integer from 1 to 15) arranged in a straight line and a detection processing circuit 212 incorporating a CPU and the like (not shown). In the sensor array 21, the 15 magnetic sensors Cn are arranged at equal intervals of 5 cm.
The magnetic sensors Cn detect magnetism using the known MI effect (Magneto-Impedance effect). The MI effect is a magnetic effect in which the impedance of a magneto-sensitive body such as an amorphous wire changes sensitively in response to an external magnetic field. A magnetic sensor Cn can detect magnetism acting in the axial direction of the amorphous wire built into it in a straight line. The magnetic sensors Cn are incorporated into the sensor array 21 so that their magnetic detection directions coincide, and the sensor array 21 is attached to the cart 10 so that each magnetic sensor Cn detects magnetism in the vertical direction. In the configuration of this example, the sensor array 21 is attached to the cart 10 so as to extend along the vehicle width direction.
The detection processing circuit 212 of the sensor array 21 is an arithmetic circuit that executes marker detection processing for detecting the magnetic markers 100, among other processing. The detection processing circuit 212 is configured using a CPU (central processing unit) that executes various calculations, together with memory elements such as ROM (read-only memory) and RAM (random-access memory).
The detection processing circuit 212 acquires the sensor signal output by each magnetic sensor Cn at a period of, for example, 3 kHz and executes the marker detection processing. In the marker detection processing, in addition to detecting a magnetic marker 100, the lateral deviation of the cart 10 relative to the magnetic marker 100 is measured. The detection processing circuit 212 measures the lateral deviation by, for example, identifying the position along the vehicle width direction of the peak value in the distribution of magnetic measurements from the magnetic sensors Cn arranged across the vehicle width. When the detection processing circuit 212 detects a magnetic marker 100, it inputs to the control unit 13 a detection result containing the fact of detection together with the lateral deviation.
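A minimal sketch of this peak-based lateral deviation measurement over the 15-sensor array with 5 cm pitch. The parabolic interpolation between neighbouring sensors is a common refinement added for illustration only; the embodiment merely states that the peak position is identified.

```python
def lateral_offset_from_peak(values, spacing_m=0.05):
    """Estimate the lateral offset of the marker from the cross-vehicle
    position of the peak of the magnetic measurements.

    values: one measurement per sensor, ordered across the vehicle width
    spacing_m: sensor pitch (5 cm in the embodiment)
    Returns the offset of the peak from the centre of the array, in metres.
    """
    centre = (len(values) - 1) / 2.0
    i = max(range(len(values)), key=lambda k: values[k])  # peak sensor index
    # Optional sub-sensor refinement by parabolic interpolation (assumption).
    if 0 < i < len(values) - 1:
        a, b, c = values[i - 1], values[i], values[i + 1]
        denom = a - 2 * b + c
        frac = 0.5 * (a - c) / denom if denom != 0 else 0.0
    else:
        frac = 0.0
    return ((i + frac) - centre) * spacing_m

readings = [0.1, 0.1, 0.2, 0.4, 0.9, 1.8, 2.6, 2.9, 2.4, 1.6, 0.8, 0.4, 0.2, 0.1, 0.1]
print(lateral_offset_from_peak(readings))
```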
The tag reader 24 (FIGS. 8 and 9) is a communication unit that communicates wirelessly with the RFID tag 15 (FIG. 7) attached to a magnetic marker 100. The tag reader 24 wirelessly transmits the power needed for the RFID tag 15 to operate and receives the tag ID transmitted by the RFID tag 15. The tag reader 24 may also be configured to start operating in response to the detection of a magnetic marker 100.
Next, the operation of the generation system 1 of this example is described with reference to the flow chart of FIG. 10. This processing flow is based on the flow of FIG. 4 referred to in Example 1, with steps S111 to S115 added. When the determination in step S111 is YES, that is, when a magnetic marker 100 is detected, steps S112 to S115 are executed. When the determination in step S111 is NO, that is, when no magnetic marker 100 is detected, steps S112 to S115 are bypassed without being executed, and the processing is exactly the same as the flow of FIG. 4 in Example 1.
Step S111, which determines whether a magnetic marker has been detected, is executed after the three-dimensional point cloud data is acquired (S102) and the vehicle position and vehicle heading are estimated by inertial navigation (S103). In step S111, the presence or absence of a magnetic marker 100 is determined using the detection result of the marker detection processing input from the detection processing circuit 212. As described above, when a magnetic marker 100 has been detected, the detection result includes the fact of detection and the lateral deviation of the cart 10 relative to the detected magnetic marker 100.
When a magnetic marker 100 is detected (S111: YES), the tag ID is read from the RFID tag 15 held by that magnetic marker 100 (S112). The control unit 13 uses the read tag ID to refer to the data stored in the storage unit 135 and acquires the position information of the magnetic marker 100 linked to that tag ID (S113). Then, taking the marker position indicated by the acquired position information as a reference, it computes the position shifted from it by the lateral deviation contained in the detection result of the marker detection processing (S114, positioning information acquisition circuit). The control unit 13 replaces the vehicle position estimated in step S103 with the vehicle position identified in step S114 (S115).
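A condensed sketch of steps S113 to S115: look up the marker position from the tag ID, shift it sideways by the measured lateral deviation, and use the result as the corrected vehicle position. The tag ID, table contents, and sign convention are hypothetical, added for illustration only.

```python
import math

# Hypothetical marker database: tag ID -> marker position (x, y) in the global frame.
MARKER_POSITIONS = {"tag-001": (12.0, 3.5)}

def correct_position(tag_id, lateral_offset, vehicle_yaw):
    """Return the corrected vehicle position when a marker is detected."""
    mx, my = MARKER_POSITIONS[tag_id]                        # S113: marker position
    lx, ly = -math.sin(vehicle_yaw), math.cos(vehicle_yaw)   # vehicle's leftward axis
    # S114: shift the marker position sideways by the measured lateral deviation
    # (sign convention is illustrative).
    return mx + lateral_offset * lx, my + lateral_offset * ly

# S115: the dead-reckoned position would be replaced by this value.
print(correct_position("tag-001", lateral_offset=0.08, vehicle_yaw=0.0))
```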
In this way, the generation system 1 of this example estimates the vehicle position with high accuracy using the magnetic markers 100 arranged along the track. In estimating the vehicle position by inertial navigation, the accumulation of error is unavoidable in principle. If, while the vehicle position is being estimated by inertial navigation, the position is re-estimated with a magnetic marker 100 as the reference whenever a marker is detected, the error accumulated by inertial navigation can be reset. In the inertial navigation that follows a marker-based position estimate, the displacement is preferably estimated relative to that vehicle position.
The acceleration sensor, gyro sensor, and other sensors of an IMU are prone to offset errors caused by temperature changes, ageing, and the like. An offset error is a so-called zero-point shift, in which the sensor output does not become zero in a situation where it should. Zero-point shifts of the acceleration sensor or gyro sensor cause estimation errors in inertial navigation. When a magnetic marker 100 is detected, the positional deviation between the vehicle position obtained by inertial navigation and the vehicle position referenced to the magnetic marker 100 may also be identified. The estimation error of the inertial navigation can be determined from this positional deviation, which makes it possible to calibrate the sensors that acquire the inertial information, such as the acceleration sensor.
A sub magnetic marker for estimating the vehicle heading may be placed near the magnetic marker 100 (the main magnetic marker). The absolute azimuth of the line connecting the main and sub magnetic markers may be linked to the tag ID together with the position information. The distance between the main and sub magnetic markers is preferably set to a short distance, less than the overall length of the cart 10. Over such a short distance the change in the heading of the cart 10 can be neglected, so its heading can be estimated with high accuracy (positioning information acquisition circuit). When the absolute heading (vehicle heading) has been estimated using the magnetic markers 100, the vehicle heading estimated by inertial navigation is preferably replaced with the newly estimated absolute heading.
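One plausible way to derive the absolute heading from such a marker pair is sketched below: the vehicle's skew relative to the marker line follows from how the measured lateral offset changes between the two markers, and is added to the stored azimuth of the line. This formula and sign convention are assumptions; the embodiment does not spell out the computation.

```python
import math

def heading_from_marker_pair(azimuth_markers, offset_main, offset_sub, spacing):
    """Estimate the absolute vehicle heading from a main/sub marker pair.

    azimuth_markers: stored absolute azimuth of the line from main to sub marker
    offset_main, offset_sub: lateral offsets measured over each marker
    spacing: known distance between the two markers (shorter than the vehicle)
    """
    skew = math.atan2(offset_main - offset_sub, spacing)  # assumed simplified model
    return azimuth_markers + skew

print(math.degrees(heading_from_marker_pair(math.radians(90.0), 0.05, -0.02, 1.0)))
```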
A sub magnetic marker may be provided for every magnetic marker 100. In that case, the vehicle position and vehicle heading can be estimated each time a (main) magnetic marker 100 is detected, so a system configuration without the IMU 22 becomes possible. When combined with the IMU 22, it suffices to provide sub magnetic markers for only some of the magnetic markers 100.
In a system configuration that does not include the IMU 22, the three-dimensional point cloud data may be acquired in response to the detection of a magnetic marker 100. The point cloud data is then coordinate-transformed according to the vehicle position and vehicle heading estimated using the magnetic marker 100 and mapped into the three-dimensional space. In this case, a three-dimensional map containing discrete three-dimensional data for each magnetic marker 100 position can be generated. An automated guided vehicle that uses the three-dimensional map may, for example, refer to the map each time it detects a magnetic marker 100.
The other configurations and effects are the same as in Example 1.
(Example 3)
This example is based on the generation system of Example 1 and uses a two-dimensional range sensor 112 (an example of a second two-dimensional range sensor) in place of the IMU to estimate the vehicle position and vehicle heading. The details are described with reference to FIGS. 11 to 13.
The cart 10 constituting the generation system 1 of this example (FIG. 11) includes, in addition to the two-dimensional range sensor 11 that acquires the three-dimensional point cloud data serving as the source of the three-dimensional map data, a two-dimensional range sensor 112 that acquires three-dimensional point cloud data for estimating the vehicle position and vehicle heading. The plane DS on which the two-dimensional range sensor 112 measures distances (an example of a second plane) is a horizontal plane parallel to the longitudinal central axis CL of the cart 10.
The control unit 13 (FIG. 12) is configured to estimate the vehicle position and vehicle heading using the time-series three-dimensional point cloud data acquired by the two-dimensional range sensor 112. The control unit 13 processes this point cloud data to estimate the displacement and heading change of the cart 10. For example, if the temporal changes in the distance and bearing to some object are known, the displacement and heading change of the cart 10 can be estimated. The control unit 13 estimates the latest vehicle position and vehicle heading by adding the estimated displacement and heading change to a reference vehicle position whose absolute position is known and a reference vehicle heading whose absolute azimuth is known.
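A minimal sketch of recovering the planar motion from target points matched between two successive horizontal scans, using a 2D rigid alignment. This is a simplification of scan matching and uses two or more correspondences; the point values and names are illustrative assumptions.

```python
import numpy as np

def estimate_motion_2d(points_prev, points_curr):
    """Estimate the sensor's heading change and displacement between two scans
    from matched stationary target points (N >= 2, given as Nx2 arrays).

    Returns (yaw_change, translation) such that a world point p satisfies
    p_prev ≈ R(yaw_change) @ p_curr + translation, i.e. the pose of the new
    sensor frame expressed in the previous one."""
    P = np.asarray(points_prev, dtype=float)
    Q = np.asarray(points_curr, dtype=float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    H = Qc.T @ Pc                       # cross-covariance of the centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    yaw = np.arctan2(R[1, 0], R[0, 0])
    t = P.mean(axis=0) - R @ Q.mean(axis=0)
    return yaw, t

prev = [[2.0, 1.0], [3.0, -1.0], [4.0, 0.5]]
curr = [[1.9, 0.9], [2.9, -1.1], [3.9, 0.4]]   # the same targets one step later
print(estimate_motion_2d(prev, curr))
```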
The flow of operation of the generation system 1 of this example is described with reference to the flow chart of FIG. 13. While the cart 10, an example of a measurement vehicle, is moving, the three-dimensional point cloud data (A) from the two-dimensional range sensor 11 and the three-dimensional point cloud data (B) from the two-dimensional range sensor 112 are acquired (S121). The control unit 13 processes the point cloud data (B) to estimate the displacement and heading change of the cart 10 (S122), and further estimates the vehicle position and vehicle heading (S123, positioning information acquisition circuit).
Based on the estimated vehicle position and heading, the control unit 13 applies the coordinate transformation process to the three-dimensional point cloud data (A) acquired by the two-dimensional range sensor 11 (S124). The control unit 13 then generates the three-dimensional map data by mapping the coordinate-transformed point cloud data (S125).
In addition to mapping the point cloud data from the two-dimensional range sensor 11, the point cloud data from the two-dimensional range sensor 112 may also be coordinate-transformed and mapped.
In addition to the two-dimensional range sensors 11 and 112, a two-dimensional range sensor whose measurement surface is a vertical plane parallel to the central axis CL of the cart 10 may also be provided. In that case, the angular change of the cart 10 in the pitch direction can be detected using the three-dimensional point cloud data of that sensor.
The system of this example may also be combined with an IMU. In that case, while the vehicle position and vehicle heading are estimated by inertial navigation based on the positioning information acquired by the IMU, the estimation error of the inertial navigation may be identified from the positioning information based on the three-dimensional point cloud data of the two-dimensional range sensor 112. Once the estimation error of the inertial navigation has been identified, corrections that improve the accuracy of the inertial-navigation estimates become possible.
The other configurations and effects are the same as in Example 1.
(Example 4)
This example is based on the generation system of Example 1 and adds a mechanism that enables calibration of the sensors of the IMU. The details are described with reference to FIGS. 14 to 16.
The generation system 1 of this example can calibrate the sensors that measure inertial information (the IMU), such as the acceleration sensor and the gyro sensor, using a calibration post 101 (an example of a rod-shaped member) installed in the driving environment.
The calibration post 101 (FIG. 14) is a cylinder 10 cm in diameter erected on the floor along the vertical direction (an example of a known orientation); the height of its upper end is 1 m. The calibration post 101 is installed, for example, beside the route travelled by the cart 10, or at the far end of a corner, that is, at the point the cart heads toward when viewed from the approach direction. The calibration post 101 is installed where the floor is sufficiently flat, so the change of heading in the pitch direction while the cart 10 observes the calibration post 101 can be neglected.
The calibration post 101 is placed where no similar structures exist nearby, so the cart 10 can identify it easily. A characteristic shape, such as a vertically extending groove of a predetermined width, may be provided on the surface facing the cart 10 as it approaches; measuring the cross-sectional shape of that surface makes the calibration post 101 even easier to identify.
The operation of the generation system 1 of this example is described using the flow chart of FIG. 15 and FIG. 16. In the generation system 1 of this example, a calibration flag representing the calibration period is defined and managed by the control unit. When the calibration post 101 is first observed, the calibration flag is set to 1 and the calibration period begins.
The cart 10 acquires three-dimensional point cloud data while moving (S201). When the calibration flag is zero (S202: YES), the control unit processes the acquired point cloud data to determine whether it contains three-dimensional data of the calibration post 101, that is, whether the calibration post 101 has been observed (S203). While the calibration post 101 has not yet been observed, the calibration flag remains zero, so the acquisition of three-dimensional point cloud data is repeated along the flow S201 → S202: YES → S203: NO.
When the calibration post 101 is observed while the calibration flag is zero (S202: YES → S203: YES), the control unit executes step S204, in which the following three processes are performed. The first process extracts, from the point cloud data acquired in step S201, the three-dimensional data of the observation point P1 of the calibration post 101 (see FIG. 16; an example of one target point). The second process computes, by simulation, the three-dimensional data of the observation point P2 of the calibration post 101 (see FIG. 16; an example of a predicted target point) that is expected when the cart 10 has travelled and been displaced by a predetermined amount dx. The third process sets the calibration flag to 1, which starts the calibration period.
The control unit repeats the acquisition of three-dimensional point cloud data until the displacement estimated by the IMU reaches the predetermined amount dx (S205: NO → S201). During this time the system is in the calibration period and the calibration flag is not zero (S202: NO), so steps S203 and S204 are bypassed. The displacement estimated by the IMU contains an estimation error originating in the measurement errors of the acceleration sensor and gyro sensor. Therefore, the actual displacement dxs at the time step S205 judges that the estimated displacement has reached the predetermined amount dx is very likely to differ from dx.
When it is determined that the estimated displacement of the cart 10 has reached the predetermined amount dx (S205: YES), the control unit extracts the observation point P2s of the calibration post 101 (see FIG. 16; an example of another target point) from the three-dimensional point cloud data actually measured by the two-dimensional range sensor 11 and identifies the three-dimensional data of the observation point P2s (S206). Because the estimated displacement of the cart 10 contains an inertial-navigation estimation error, the actual displacement when the estimate is judged to have reached dx is dxs, as noted above. The three-dimensional data of the observation point P2s therefore corresponds to the moment the displacement reaches dxs, not dx.
The control unit compares the three-dimensional data of the observation point P2 of the calibration post 101 obtained by simulation with the three-dimensional data of the observation point P2s obtained by actual measurement, and identifies the error (estimation error) between them (S207). The simulated data is the three-dimensional data of the observation point P2 obtained in step S204; the measured data is the three-dimensional data of the observation point P2s identified in step S206.
Using the error identified in step S207, the control unit calibrates the acceleration sensor and the gyro sensor (S208). The calibration is, for example, a so-called bias adjustment that adjusts the zero point of each sensor. The acceleration sensor can be calibrated mainly from the vertical positional error between the observation points P2 and P2s, and the gyro sensor mainly from their horizontal positional error. After this calibration is executed, the calibration flag is reset to zero (S208).
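A minimal sketch of how the error found in S207 might feed a first-order bias adjustment: the horizontal (lateral) error over the travelled distance yields a heading error attributed to the gyro, and the vertical error yields an offset attributed to the accelerometer. The error-to-bias mapping, axis convention, and names are assumptions, not rules stated in the embodiment.

```python
import math

def estimate_bias_corrections(predicted_p2, observed_p2s, travel, elapsed_s):
    """Derive first-order bias corrections from the P2 / P2s error (S207 -> S208).

    predicted_p2, observed_p2s: (forward, lateral, up) coordinates of the post's
        observation point (illustrative convention)
    travel: nominal travelled distance dx used for the prediction
    elapsed_s: time taken to travel that distance
    """
    ey = observed_p2s[1] - predicted_p2[1]        # horizontal error -> gyro
    ez = observed_p2s[2] - predicted_p2[2]        # vertical error -> accelerometer
    yaw_error = math.atan2(ey, travel)            # small-angle heading error
    gyro_bias = yaw_error / elapsed_s             # rad/s zero-point offset
    accel_bias_z = 2.0 * ez / (elapsed_s ** 2)    # m/s^2 zero-point offset
    return {"gyro_bias": gyro_bias, "accel_bias_vertical": accel_bias_z}

print(estimate_bias_corrections((3.0, 0.0, 1.0), (3.0, 0.04, 0.99),
                                travel=0.5, elapsed_s=2.0))
```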
The description above assumes that the floor is horizontal and that the heading of the cart 10 (vehicle heading) does not change in the pitching direction. In practice, however, changes in the pitching direction can occur due to acceleration and deceleration of the cart 10, the air pressure of the wheels, and so on. A situation in which such pitching changes occur is not suitable for calibrating the sensors.
The cart 10 may therefore carry two two-dimensional range sensors that can observe the calibration post 101 simultaneously. A marking such as a horizontal bar or reflective tape may be attached to the calibration post at the position observed simultaneously by these two sensors when the heading of the cart 10 has not changed in the pitching direction; alternatively, the two sensors may be mounted so that the upper end of the calibration post is observed simultaneously. When a pitching change occurs in the cart 10, the marking or the upper end is no longer observed simultaneously by the two sensors. In such a case high calibration accuracy is unlikely to be ensured, so it is preferable not to execute the sensor calibration.
For example, two markings detectable by a two-dimensional range sensor may be attached to the calibration post 101 at a predetermined spacing. Since the spacing between the two markings on the calibration post 101 is known, their relative positional relationship in three-dimensional space is also known.
When one of the two markings (an example of a first target point) is observed by the two-dimensional range sensor 11, the change in the position and heading of the cart 10 (predicted change) expected at the time the other marking (an example of a second target point) is observed may be predicted, taking the position and heading of the cart 10 at that moment as the reference. Then, after the first marking has been observed, when the cart 10 has actually moved and the other marking is actually observed, the change in the position and heading of the cart 10 relative to that reference (estimated change) may be estimated based on the positioning information from the IMU.
Comparing the predicted change with the estimated change makes it possible to identify the error in the positioning information acquired by the IMU, and once that error is identified the positioning information acquired by the IMU can be corrected. When comparing the three-dimensional position corresponding to the first three-dimensional data with the three-dimensional position corresponding to the second three-dimensional data, the two positions must be identified in a common coordinate system.
Although this example illustrates a calibration post 101 extending in the vertical direction, the direction in which the calibration post 101 extends need not be vertical; any known orientation will do. The two markings described above also need not be two locations on the calibration post 101; they may be markings provided on separate structures, as long as their relative positional relationship in three-dimensional space is known and the two-dimensional range sensor can detect each of them.
The other configurations and effects are the same as in Example 1.
(Example 5)
This example is based on the configuration of Example 1 and is configured so that the roll of the trolley 10 about the central axis CL can be detected. This is described with reference to Figs. 17 and 18: Fig. 17 is a side view of the trolley 10, and Fig. 18 is an explanatory view looking forward from the rear of the trolley 10.
The trolley 10 of this example includes, in addition to the two-dimensional range sensor 11 arranged at the upper front of the trolley 10, a two-dimensional range sensor 114 (an example of the third two-dimensional range sensor) arranged at the lower front. As described in Example 1, the measurement plane of the two-dimensional range sensor 11 is a forward-descending plane DS parallel to the vehicle width direction. The measurement plane of the two-dimensional range sensor 114, by contrast, is a forward-ascending plane DS (an example of the third plane) parallel to the vehicle width direction.
As shown in Fig. 17, the forward-descending plane DS and the forward-ascending plane DS intersect in front of the trolley 10 and appear as an intersection point in that figure. As shown in Fig. 18, where these two planes DS meet, they form an intersection line Xc running along the vehicle width direction. Since both planes DS are parallel to the vehicle width direction, the intersection line Xc between the two planes DS is also parallel to the vehicle width direction.
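As a small illustrative aside (assumed frame conventions, not taken from the publication), the direction of the intersection line Xc of the two measurement planes can be obtained as the cross product of the plane normals; with both planes parallel to the vehicle width direction, the result is indeed the width axis.

```python
import numpy as np

def plane_intersection_direction(n1, n2):
    """Direction of the line where two (non-parallel) planes meet."""
    d = np.cross(n1, n2)
    return d / np.linalg.norm(d)

# Assumed vehicle frame: x forward, y across the width, z up.
# Forward-descending and forward-ascending scan planes, both parallel to y.
n_descending = np.array([np.sin(np.deg2rad(-20)), 0.0, np.cos(np.deg2rad(-20))])
n_ascending  = np.array([np.sin(np.deg2rad(+20)), 0.0, np.cos(np.deg2rad(+20))])
print(plane_intersection_direction(n_descending, n_ascending))  # ~[0, 1, 0] -> width axis
```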
The intersection line Xc (Fig. 18) can be identified as the line passing through two or more common target points among those observed by the two-dimensional range sensor 11 and those observed by the two-dimensional range sensor 114. As shown in Fig. 18, if the roll of the trolley 10, that is, its inclination in the rotational direction about the central axis CL, is zero, the intersection line Xc is parallel to the floor surface 1S. By examining the inclination of the intersection line Xc with respect to the floor surface 1S, the roll angle of the trolley 10 can be estimated.
The other configurations and operational effects are the same as in Example 1.
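A minimal sketch of this roll estimate, under the assumption that the common target points are already available as (y, z) coordinates in a trolley-fixed frame (y along the vehicle width, z vertical); the function name and data layout are illustrative, not taken from the publication.

```python
import math
import numpy as np

def roll_angle_from_common_points(points_yz: np.ndarray) -> float:
    """Fit the intersection line Xc through >= 2 common target points and
    return its inclination relative to the floor (radians); zero means no roll."""
    if len(points_yz) < 2:
        raise ValueError("need at least two common target points")
    # Least-squares slope dz/dy of the fitted line Xc.
    slope, _ = np.polyfit(points_yz[:, 0], points_yz[:, 1], 1)
    return math.atan(slope)

# Two common points observed by sensors 11 and 114: the line tilts slightly,
# so the trolley is rolled by roughly 1.1 degrees.
pts = np.array([[-0.5, 0.010], [0.5, 0.029]])   # (y [m], z [m])
print(math.degrees(roll_angle_from_common_points(pts)))
```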
Although specific examples of the present invention have been described above in detail through the embodiments, these specific examples merely disclose examples of the technology encompassed by the claims. Needless to say, the claims should not be construed as being limited by the configurations, numerical values, and the like of the specific examples. The claims encompass techniques obtained by variously modifying, altering, or appropriately combining the specific examples described above using known technology and the knowledge of those skilled in the art.
1 3D map data generation system (generation system)
10 Trolley (measurement vehicle)
11 Two-dimensional range sensor (first two-dimensional range sensor)
112 Two-dimensional range sensor (second two-dimensional range sensor)
113, 114 Two-dimensional range sensor (third two-dimensional range sensor)
100 Magnetic marker
101 Calibration post (rod-shaped member)
13 Control unit (positioning information acquisition circuit)
131 Coordinate conversion circuit
133 Mapping circuit
135 Storage unit
15 RFID tag
21 Sensor array
212 Detection processing circuit (marker detection circuit)
22 IMU (positioning information acquisition circuit)
221 Magnetic sensor
222 Acceleration sensor
223 Gyro sensor
24 Tag reader
CL Central axis
Cn (n is an integer from 1 to 15) Magnetic sensor
DS Plane
Claims (15)
- A method for generating three-dimensional map data representing the three-dimensional structure of a vehicle travel environment using a measurement vehicle that moves on a movement plane that is a floor surface or the ground, the method using:
a first two-dimensional range sensor attached to the measurement vehicle so as to acquire three-dimensional point cloud data by measuring distances to one or more target points on a first plane that is perpendicular or oblique to the longitudinal central axis of the measurement vehicle; and
a positioning information acquisition circuit that acquires positioning information capable of specifying the position and orientation of the measurement vehicle,
the method comprising: applying a coordinate conversion process based on the positioning information acquired by the positioning information acquisition circuit to the three-dimensional point cloud data from the first two-dimensional range sensor, thereby converting it into three-dimensional point cloud data whose reference position and orientation are specified; and
generating three-dimensional map data by mapping the coordinate-converted three-dimensional point cloud data into a three-dimensional space.
- The method for generating three-dimensional map data according to claim 1, wherein the positioning information acquired by the positioning information acquisition circuit includes an amount of displacement referenced to a position whose absolute position is known and an amount of orientation change, which is an amount of variation in orientation referenced to an orientation whose absolute orientation is known, and
the three-dimensional point cloud data from the first two-dimensional range sensor allows absolute three-dimensional coordinates to be specified based on the positioning information.
- The method for generating three-dimensional map data according to claim 1, wherein the positioning information acquisition circuit acquires the positioning information by estimating the position and orientation of the measurement vehicle by inertial navigation.
- The method for generating three-dimensional map data according to claim 1, wherein the measurement vehicle includes a marker detection circuit for detecting markers arrayed on the road surface on which the vehicle travels, and
the positioning information acquisition circuit acquires the positioning information by specifying the position of the measurement vehicle with reference to the installed position of a marker detected by the marker detection circuit.
- The method for generating three-dimensional map data according to claim 4, wherein the first two-dimensional range sensor acquires three-dimensional point cloud data in response to detection of a marker by the marker detection circuit.
- The method for generating three-dimensional map data according to claim 4, wherein the positioning information acquisition circuit acquires the positioning information by estimating the position and orientation of the measurement vehicle by inertial navigation during the period from when any marker is detected by the marker detection circuit until a new marker is detected.
- The method for generating three-dimensional map data according to claim 4, wherein the marker is a magnetic marker that exerts magnetism on its surroundings and is magnetically detectable.
- The method for generating three-dimensional map data according to claim 1, wherein a second two-dimensional range sensor that acquires three-dimensional point cloud data by measuring the distance to at least one target point on a second plane, which is a horizontal plane containing the axial direction of the central axis of the measurement vehicle, is attached to the measurement vehicle, and
the positioning information acquisition circuit acquires the positioning information by comparing the three-dimensional point cloud data acquired by the second two-dimensional range sensor before the measurement vehicle moves with the three-dimensional point cloud data acquired by the second two-dimensional range sensor after the measurement vehicle moves.
- The method for generating three-dimensional map data according to claim 1, wherein a second two-dimensional range sensor that acquires three-dimensional point cloud data by measuring the distance to at least one target point on a second plane, which is a horizontal plane containing the axial direction of the central axis of the measurement vehicle, is attached to the measurement vehicle,
the positioning information acquisition circuit is capable of acquiring positioning information by comparing the three-dimensional point cloud data acquired by the second two-dimensional range sensor before the measurement vehicle moves with the three-dimensional point cloud data acquired by the second two-dimensional range sensor after the measurement vehicle moves,
the positioning information acquisition circuit is also capable of acquiring the positioning information by estimating the position and orientation of the measurement vehicle by inertial navigation, and
the positioning information acquisition circuit is configured to correct errors in the positioning information obtained by the inertial navigation using the positioning information obtained by comparing the three-dimensional point cloud data before and after the movement of the measurement vehicle.
- The method for generating three-dimensional map data according to any one of claims 1 to 9, wherein a third two-dimensional range sensor that acquires three-dimensional point cloud data by measuring distances to one or more target points on a third plane, which is perpendicular or oblique to the central axis of the measurement vehicle and intersects the first plane, is attached to the measurement vehicle,
the coordinate conversion process based on the positioning information acquired by the positioning information acquisition circuit is applied to the three-dimensional point cloud data from the first two-dimensional range sensor and the three-dimensional point cloud data from the third two-dimensional range sensor, thereby converting them into three-dimensional point cloud data whose reference position and orientation are specified, and
three-dimensional map data is generated by mapping the three-dimensional point cloud data whose reference position and orientation are specified into a three-dimensional space.
- The method for generating three-dimensional map data according to any one of claims 1 to 9, wherein a third two-dimensional range sensor that acquires three-dimensional point cloud data by measuring distances to one or more target points on a third plane, which is perpendicular or oblique to the central axis of the measurement vehicle and intersects the first plane, is attached to the measurement vehicle,
the inclination of the measurement vehicle about the central axis is detected based on the inclination, with respect to the movement plane, of the intersection line between the first plane on which the first two-dimensional range sensor measures distances and the third plane on which the third two-dimensional range sensor measures distances, and
the position at which the coordinate-converted three-dimensional point cloud data is mapped into the three-dimensional space is corrected based on the inclination of the measurement vehicle about the central axis.
- The method for generating three-dimensional map data according to claim 1, wherein, after one target point is observed by the first two-dimensional range sensor, a predicted target point that can be observed by the first two-dimensional range sensor in a known orientation referenced to the one target point is obtained for the case where the position and orientation of the measurement vehicle change by a predetermined amount,
when the amount of change in the position and orientation of the measurement vehicle based on the positioning information acquired by the positioning information acquisition circuit reaches the predetermined amount, another target point located in the known orientation with respect to the one target point is observed by the first two-dimensional range sensor,
an error in the positioning information acquired by the positioning information acquisition circuit is identified based on the positional deviation between the predicted target point and the other target point, and
the positioning information acquired by the positioning information acquisition circuit is corrected in accordance with the error in the positioning information.
- The method for generating three-dimensional map data according to claim 12, wherein the one target point, the other target point, and the predicted target point are points on the outer surface of a rod-shaped member extending in three-dimensional space along the known orientation.
- The method for generating three-dimensional map data according to claim 1, wherein a first target point and a second target point whose relative positional relationship is known are set in three-dimensional space,
when the first target point is observed by the first two-dimensional range sensor, the amount of change in the position and orientation of the measurement vehicle by the time the second target point is observed by the first two-dimensional range sensor is predicted, with the position and orientation of the measurement vehicle at the time of that observation as a reference,
after the first target point is observed by the first two-dimensional range sensor, when the measurement vehicle has moved and the second target point is actually observed by the first two-dimensional range sensor, the amount of change in the position and orientation of the measurement vehicle with respect to the reference position and orientation is estimated based on the positioning information acquired by the positioning information acquisition circuit,
an error in the positioning information acquired by the positioning information acquisition circuit is identified by comparing the predicted amount of change with the estimated amount of change, and
the positioning information acquired by the positioning information acquisition circuit is corrected in accordance with the error in the positioning information.
- A system for generating three-dimensional map data representing the three-dimensional structure of a vehicle travel environment using a measurement vehicle movable on a movement plane that is a floor surface or the ground, the system comprising:
a first two-dimensional range sensor attached to the measurement vehicle so as to acquire three-dimensional point cloud data by measuring distances to one or more target points on a first plane that is perpendicular or oblique to the longitudinal central axis of the measurement vehicle;
a positioning information acquisition circuit that acquires positioning information capable of specifying the position and orientation of the measurement vehicle;
a coordinate conversion circuit that applies a coordinate conversion process based on the positioning information acquired by the positioning information acquisition circuit to the three-dimensional point cloud data from the first two-dimensional range sensor and converts it into three-dimensional point cloud data whose reference position and orientation are specified; and
a mapping circuit that generates a three-dimensional map by mapping the coordinate-converted three-dimensional point cloud data into a three-dimensional space.
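For orientation only, here is a compact sketch of the kind of pipeline recited in claims 1 and 15: 2-D range scans are lifted into the vehicle frame, transformed with the vehicle pose supplied by the positioning information, and accumulated into a map point cloud. The frame conventions, data layout, and function names (scan_to_vehicle_frame, to_map_frame) are assumptions for illustration, not the claimed implementation.

```python
import numpy as np

def scan_to_vehicle_frame(ranges, angles, sensor_tilt, sensor_offset):
    """Lift one 2-D scan (polar, in the tilted first plane) into 3-D vehicle coordinates."""
    pts_plane = np.stack([ranges * np.cos(angles),
                          ranges * np.sin(angles),
                          np.zeros_like(ranges)], axis=1)
    c, s = np.cos(sensor_tilt), np.sin(sensor_tilt)
    tilt = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])   # pitch of the scan plane
    return pts_plane @ tilt.T + np.asarray(sensor_offset)

def to_map_frame(pts_vehicle, pose):
    """Coordinate conversion using positioning information: pose = (x, y, heading)."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    return pts_vehicle @ rot.T + np.array([x, y, 0.0])

# Minimal use: two scans taken at two poses are mapped into one 3-D point cloud.
map_cloud = []
for ranges, angles, pose in [
    (np.array([2.0, 2.1]), np.array([-0.1, 0.1]), (0.0, 0.0, 0.0)),
    (np.array([1.9, 2.0]), np.array([-0.1, 0.1]), (0.5, 0.0, 0.02)),
]:
    pts = scan_to_vehicle_frame(ranges, angles,
                                sensor_tilt=np.deg2rad(-20),
                                sensor_offset=(0.3, 0.0, 1.0))
    map_cloud.append(to_map_frame(pts, pose))
map_cloud = np.concatenate(map_cloud)
print(map_cloud.shape)   # (4, 3)
```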
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2022152011 | 2022-09-23 | |
JP2022-152011 | 2022-09-23 | |
Publications (1)
Publication Number | Publication Date
---|---
WO2024063078A1 (en) | 2024-03-28
Family
ID=90454560
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/JP2023/034040 WO2024063078A1 (en) | 2022-09-23 | 2023-09-20 | Method and system for generating 3-dimensional map data
Country Status (1)
Country | Link
---|---
WO (1) | WO2024063078A1 (en)
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005070840A (en) * | 2003-08-25 | 2005-03-17 | East Japan Railway Co | Three dimensional model preparing device, three dimensional model preparing method and three dimensional model preparing program |
JP2010191066A (en) * | 2009-02-17 | 2010-09-02 | Mitsubishi Electric Corp | Three-dimensional map correcting device and three-dimensional map correction program |
- 2023-09-20 WO PCT/JP2023/034040 patent/WO2024063078A1/en unknown
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005070840A (en) * | 2003-08-25 | 2005-03-17 | East Japan Railway Co | Three dimensional model preparing device, three dimensional model preparing method and three dimensional model preparing program |
JP2010191066A (en) * | 2009-02-17 | 2010-09-02 | Mitsubishi Electric Corp | Three-dimensional map correcting device and three-dimensional map correction program |
Similar Documents
Publication | Title
---|---
CN110419067B | Marker system
CN110402311B | Construction method and operation system of magnetic marker
Rose et al. | An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS
US10955854B2 | Method and system for determining the position of a vehicle
KR101880013B1 | Magnetic position estimating apparatus and magnetic position estimating method
CN109791408B | Self-position estimation method and self-position estimation device
JP6766527B2 | Vehicle system and course estimation method
EP3343173B1 | Vehicle position estimation device, vehicle position estimation method
WO2017149813A1 | Sensor calibration system
US10955857B2 | Stationary camera localization
CN107422730A | The AGV transportation systems of view-based access control model guiding and its driving control method
CN110709906B | Marker system and application method
WO2015173034A1 | Method and system for determining a position relative to a digital map
WO2009098319A2 | Navigational device for a vehicle
JP2019513996A | Method of determining the attitude of at least a partially autonomous driving vehicle in the vicinity by a ground sign
JP6806891B2 | Information processing equipment, control methods, programs and storage media
CN113260544A | Correction method of gyroscope sensor
CN111108344A | Position capturing system and position capturing method
JP2018169301A | Marker system
JP2005258941A | Device for detecting obstacle
US11933633B2 | Point cloud data acquiring method and point cloud data acquiring system
JP6834401B2 | Self-position estimation method and self-position estimation device
WO2024063078A1 | Method and system for generating 3-dimensional map data
WO2021084731A1 | Information processing device, information processing system, information processing method, and information processing program
WO2021074660A1 | Object recognition method and object recognition device
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23868199; Country of ref document: EP; Kind code of ref document: A1