
WO2018179960A1 - Mobile body and local position estimation device - Google Patents

Mobile body and local position estimation device

Info

Publication number: WO2018179960A1
Authority: WO (WIPO, PCT)
Prior art keywords: time, moving body, positioning device, estimated value, moving
Application number: PCT/JP2018/005266
Other languages: French (fr), Japanese (ja)
Inventors: 伊知朗 宮崎, 信也 安達
Original Assignees: 日本電産株式会社, 日本電産シンポ株式会社
Application filed by: 日本電産株式会社, 日本電産シンポ株式会社
Priority application: JP2019508736A (published as JPWO2018179960A1)
Publication: WO2018179960A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions

Definitions

  • The present disclosure relates to a moving body and a self-position estimation apparatus used in the moving body.
  • Development of position estimation techniques that estimate, with high accuracy, the position of a moving body such as a drone (unmanned aerial vehicle), a self-driving car, or an autonomous mobile robot is underway.
  • A moving body that performs self-position estimation includes an external sensor, such as a laser range sensor, and senses the surrounding space while moving to acquire sensor data. The self-position on an environment map can then be identified by matching local map data, created from the sensor data around the moving body, against wider-range environment map data.
  • Japanese Patent Laying-Open No. 2016-224680 discloses a self-position estimation apparatus that includes a first self-position estimation unit and a second self-position estimation unit and executes an estimation process at each step.
  • The first self-position estimation unit obtains a probability distribution of the latest position of the moving body from the sensor data and the environment map, and estimates the first self-position based on that distribution.
  • The second self-position estimation unit estimates the second self-position by adding the movement distance and direction from the previous step to the current step, acquired by odometry, to the final self-position estimated in the previous step.
  • The weighted average of the first self-position and the second self-position is taken as the final self-position for the current step.
  • The second self-position is thus calculated from the movement of the moving body over the fixed time from the previous step to the current step (one step of movement), and that one-step movement is measured by odometry over the same fixed interval. When the first self-position estimation fails, or its reliability is low, this fixed-step scheme does not yield an accurate estimate if the moving body is traveling fast. A sketch of the fixed-step update appears below.
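As a rough illustration of the fixed-step scheme just described, the dead-reckoning update can be sketched as follows. This is a minimal sketch in Python; the pose tuple (x, y, θ) and the midpoint-heading integration are illustrative assumptions, not details taken from the cited publication.

```python
import math

def fixed_step_update(pose, step_distance, step_dtheta):
    """Prior-art style update: add one fixed step of odometry (the distance
    and heading change measured between the previous and current steps)
    to the final self-position estimated in the previous step."""
    x, y, theta = pose
    mid_heading = theta + step_dtheta / 2.0  # integrate along the average heading
    return (x + step_distance * math.cos(mid_heading),
            y + step_distance * math.sin(mid_heading),
            theta + step_dtheta)
```

Because step_distance and step_dtheta always cover exactly one fixed step, the correction cannot account for processing that spans more, or less, than one step; the embodiment described next removes that restriction.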
  • An embodiment of the present disclosure provides a moving body capable of highly accurate position estimation regardless of moving speed, and a self-position estimation apparatus for use in such a moving body.
  • In an exemplary embodiment, the moving body of the present disclosure includes: a motor; a drive device that controls the motor to move the moving body; an external sensor that senses the surrounding space and periodically outputs sensor data; a storage device that stores map data; a positioning device that estimates the position of the moving body using the sensor data and the map data and sequentially outputs position estimates; and an arithmetic circuit.
  • When the positioning device, having output an estimate of the moving body's position at a first time, is to start estimating its position at a second time, the arithmetic circuit corrects the first-time estimate based on the distance and direction the moving body moves between the first time and the second time, and gives the corrected estimate to the positioning device as the initial position of the moving body at the second time.
  • In an exemplary embodiment, the self-position estimation apparatus of the present disclosure is mounted on and used in a moving body that includes an external sensor that senses the surrounding space and periodically outputs sensor data. The apparatus includes: a storage device that stores map data; a positioning device that estimates the position of the moving body using the sensor data and the map data and sequentially outputs position estimates; and an arithmetic circuit.
  • As in the moving body above, when the positioning device, having output an estimate of the moving body's position at a first time, is to start estimating its position at a second time, the arithmetic circuit corrects the first-time estimate based on the distance and direction the moving body moves between the first time and the second time, and gives the corrected estimate to the positioning device as the initial position at the second time.
  • With this arrangement, when determining the initial position necessary for position estimation by the positioning device, the moving distance and direction of the moving body can be calculated according to the positioning device's processing time. The correction is therefore not limited to the period of one step of the sequential processing, and a more accurate initial position can be given to the positioning device.
  • FIG. 1 is a block diagram illustrating a basic configuration example of a moving object according to the present disclosure.
  • FIG. 2 is a diagram illustrating an example of the timing at which sensor data is periodically output from the external sensor 106 and the timing at which a position estimation value is output from the positioning device 110 in the conventional example.
  • FIG. 3 is a diagram illustrating an example of a timing at which sensor data is periodically output from the external sensor 106 and a timing at which a position estimation value is output from the positioning device 110 for a moving object according to the present disclosure.
  • FIG. 4 is a diagram illustrating an exemplary AGV 10 that travels in a passage 1 in a factory.
  • FIG. 5 is a diagram illustrating an overview of an exemplary management system 1000 that manages the running of the AGV 10.
  • FIG. 6 is a diagram illustrating an example of the target positions (▲) set on the travel route of the AGV 10.
  • FIG. 7A is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
  • FIG. 7B is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
  • FIG. 7C is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
  • FIG. 8 is an external view of an exemplary AGV 10.
  • FIG. 9 is a diagram illustrating a hardware configuration of the AGV 10.
  • FIG. 10 is a diagram illustrating the AGV 10 that scans the surrounding space using the LRF 15 while moving.
  • FIG. 11 is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 12 is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 13 is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 14 is a diagram illustrating the AGV 10 that generates a map while moving.
  • FIG. 15 is a diagram schematically showing the completed map 30.
  • FIG. 16 is a diagram schematically illustrating a general position identification process.
  • FIG. 17 is a diagram schematically illustrating a general position identification process.
  • FIG. 18 is a diagram schematically illustrating a general position identification process.
  • FIG. 19 is a flowchart illustrating an example of position identification processing after losing sight of the self position.
  • FIG. 20 is a diagram illustrating a hardware configuration of the travel management device 20.
  • FIG. 1 shows a basic configuration example of a moving object according to the present disclosure.
  • The moving body 100 in this example includes an electric motor (hereinafter simply "motor") 102, a drive device 104 that moves the moving body 100 by controlling the motor 102, and an external sensor 106 that senses the surrounding space and periodically outputs sensor data.
  • A typical example of the moving body 100 is one that has at least one drive wheel (not shown) mechanically coupled to the motor 102 and can travel on the ground by the traction of that drive wheel.
  • The moving body 100 further includes a self-position estimation device 200.
  • The self-position estimation device 200 includes a storage device 108 that stores map data of the surrounding environment, a positioning device 110 that estimates the position of the moving body 100 using the sensor data and the map data, and an arithmetic circuit 120 that executes various computations.
  • The positioning device 110 sequentially outputs estimates of the position (self-position) of the moving body 100, whether the moving body 100 is moving or stopped.
  • The arithmetic circuit 120 calculates the position information the positioning device 110 needs and supplies it to the positioning device 110.
  • FIG. 2 is a diagram illustrating an example of timing at which sensor data is periodically output from the external sensor 106 and timing at which a position estimation value is output from the positioning device 110. Black dots in FIG. 2 indicate sensor data output timing.
  • The sensor data is output from the external sensor 106 at a period Ts.
  • The position estimates are output from the positioning device 110 at a period Tp.
  • In the example of FIG. 2, the position estimates x1, x2, and x3 were output normally at times t1, t2, and t3, respectively. At time t4, however, the output was information indicating that the position estimate was "undefined", or an estimate with very low reliability. In such a case, the positioning device 110 loses the current position information needed for the next position estimation and can no longer output position estimates continuously at the period Tp. It therefore becomes necessary to give the positioning device 110 the highly reliable position estimate x3, output at time t3, as the current position of the moving body 100.
  • The positioning device 110 performs self-position estimation using, for example, an ICP (Iterative Closest Point) matching algorithm.
  • The positioning device 110 may instead perform probabilistic position estimation using a particle filter based on the Monte Carlo method. On acquiring the position estimate x3 as the initial position, the positioning device 110 starts the position identification process.
  • When the traveling speed (moving speed) of the moving body 100 increases, the distance the moving body 100 covers during one period Tp becomes longer. For example, if the moving speed of the moving body 100 is 5 meters per second and the period Tp is 100 milliseconds, then by time t4 the moving body has already shifted 0.5 meters from the position estimate x3 obtained at time t3. At time t5, one further period Tp after time t4, the moving body 100 has shifted a full meter from x3. Unless the moving body 100 stops, its actual position keeps moving farther from the estimate x3.
  • Meanwhile, sensor data continues to be output from the external sensor 106 at a period Ts shorter than the period Tp.
  • To estimate the current position of the moving body 100, the position identification process should use the latest sensor data.
  • However, if the initial position of the moving body 100 required for the position identification process deviates greatly from the actual position, the process either takes a long time or fails to identify the position at all.
  • The apparatus disclosed in Japanese Patent Laying-Open No. 2016-224680 acquires the moving distance and direction of the moving body 100 by odometry in each step of the sequential processing.
  • The period of one step corresponds to the period Tp. With this apparatus, therefore, when the self-position is lost at time t4, the value (the second self-position) obtained by adding the movement distance and direction from time t3 to time t4 to the position estimate x3 of time t3 would be used as the initial position.
  • In that apparatus, probabilistic self-position estimation is executed using a particle filter.
  • The self-position estimation algorithm, however, need not be probabilistic; it may instead be based on pattern matching.
  • Such a conventional example nevertheless suffers from the problem described above: if the moving body is fast, its position drifts far from the last reliable estimate before a new initial position can be supplied.
  • To solve the above problem, the arithmetic circuit 120 of FIG. 1 executes the following processing, described here with reference to FIG. 3.
  • Suppose the positioning device 110 outputs the position estimate x3 of the moving body 100 at time t3 (the first time) and then loses the self-position at time t4 (the position estimate becomes "undefined"). In this case, the arithmetic circuit 120 starts a position identification process to estimate the position of the moving body 100 at time t6 (the second time). The arithmetic circuit 120 corrects the estimate x3 of time t3 (the first time) based on the distance and direction the moving body 100 moves from time t3 to time t6 (the second time), and gives the corrected estimate (x3 + Δx) to the positioning device 110 as the initial position of the moving body 100 at time t6.
  • In the example of FIG. 3, the corrected estimate (x3 + Δx) is given to the positioning device 110 at time t5.
  • The time from time t5 to time t6 may be zero in an extreme case.
  • The correction amount Δx is the movement distance and direction acquired, for example, by odometry from time t3 to time t5.
  • If the period (t6 - t5) from time t5 to time t6 cannot be ignored, the moving distance and direction of the moving body over that period may also be obtained by calculation and included in the correction amount Δx.
  • Depending on the processing, the period from time t3 to time t6 can be very long. In such a case, it is preferable to determine the initial position as the position of the moving body 100 at time t6, according to the actual processing time.
  • The corrected estimate (x3 + Δx) may be called the "predicted position" of the moving body 100 at time t6 (the second time); a sketch of computing it follows.
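As a rough illustration, the predicted position can be computed as sketched below. This is a minimal sketch in Python under a constant-velocity assumption; the pose and velocity representations and the midpoint-heading approximation are illustrative choices, not details of the publication.

```python
import math

def predict_initial_position(last_pose, last_time, target_time, velocity):
    """Correct the last reliable estimate (x3 at time t3) by the motion
    accumulated up to the time the newest sensor data was taken (t6).

    last_pose:   (x, y, theta) estimated at the first time
    velocity:    (v, omega), linear and angular speed, e.g. from odometry
    """
    dt = target_time - last_time          # t6 - t3: not a fixed step length
    v, omega = velocity
    x, y, theta = last_pose
    theta_new = theta + omega * dt
    # Chord approximation of the travelled arc, fine for small dt.
    mid = (theta + theta_new) / 2.0
    return (x + v * dt * math.cos(mid),
            y + v * dt * math.sin(mid),
            theta_new)                    # the "predicted position" x3 + Δx
```

Because dt is measured rather than fixed, the same function covers both the odometry interval (t3 to t5) and any further computed interval (t5 to t6).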
  • In the embodiment of the present disclosure, the initial position for the self-position identification process that starts after the self-position is lost is thus calculated as a predicted position, based on the moving speed and direction of the moving body 100 and taking into account the time required for processing after the sensor data is acquired. The accuracy of the initial position is therefore improved, and the self-position identification process can start properly. As a result, the moving body can continue traveling smoothly without decelerating or stopping.
  • When the positioning device 110 outputs an estimate of the position of the moving body 100, it may also output information indicating the reliability of that estimate.
  • When the reliability of a position estimate output from the positioning device 110 after the first time falls below a set value, the arithmetic circuit 120 gives the corrected estimate to the positioning device 110 as the initial position of the moving body at the second time.
  • The reliability of the estimate can be an evaluation value of the degree of coincidence in pattern matching, or a probabilistic index. When the reliability is lower than the preset value, the self-position can be regarded as lost.
  • In response to a request from the positioning device 110 between the first time and the second time, the arithmetic circuit 120 may calculate the corrected estimate and give it to the positioning device 110 as the initial position of the moving body 100 at the second time.
  • The positioning device 110 may also instruct the arithmetic circuit 120 to calculate the corrected estimate before an "undefined" or low-reliability position estimate is output. If the arithmetic circuit 120 can start this calculation earlier, the interval from the first time to the second time can be shortened, which shortens the period during which the self-position is lost.
  • Using the initial position at the second time given by the arithmetic circuit 120, together with the sensor data output from the external sensor 106 at or shortly before the second time, the positioning device 110 estimates the position of the moving body 100 at the second time.
  • The positioning device 110 then outputs, at a third time, the estimate of the position of the moving body 100 at the second time.
  • The arithmetic circuit 120 can control the traveling of the moving body 100 based on that position estimate.
  • In an embodiment, the arithmetic circuit 120 determines the distance and direction the moving body 100 moves between the first time and the second time from the operating state of the drive device 104. That distance and direction can be derived from the operating state of the drive device 104 that controls the motors 102. For example, when two motors 102 generate traction, the moving speed and direction of the moving body 100 can be determined from the rotational speeds of the individual motors 102, as sketched below. Since the rotational speed of each motor 102 is defined by the operating state of the drive device 104, the moving speed of the moving body 100 can be determined from that operating state even without a sensor that directly detects the motors' rotation. The operating state of the drive device 104 can in turn be known from the commands the arithmetic circuit 120 gives to the drive device 104.
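For the two-traction-motor case mentioned above, the relationship is the standard differential-drive kinematics. A minimal sketch follows; the function name, the parameters, and the assumption that gear ratios are already folded into the wheel speeds are illustrative, not taken from the publication.

```python
def body_velocity_from_motors(omega_left, omega_right, wheel_radius, track_width):
    """Differential-drive kinematics: recover the body's linear speed v (m/s)
    and yaw rate omega (rad/s) from the two wheels' rotational speeds (rad/s)."""
    v_left = wheel_radius * omega_left        # ground speed of the left wheel
    v_right = wheel_radius * omega_right      # ground speed of the right wheel
    v = (v_left + v_right) / 2.0              # forward speed of the body center
    omega = (v_right - v_left) / track_width  # positive = counterclockwise turn
    return v, omega
```

The (v, omega) pair obtained this way can serve as the velocity input to the prediction sketch shown earlier.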
  • In an embodiment, the moving body 100 further includes an internal sensor that outputs odometry information.
  • Examples of the internal sensor include a rotary encoder that measures the rotational speed of a motor or wheel, and an inertial measurement unit such as a gyro sensor. The rotational speed of a wheel can also be estimated from the operating state of the drive device.
  • The arithmetic circuit 120 in such an embodiment can determine the distance and direction the moving body moves from the first time to the second time using the odometry information output from the internal sensor.
  • The map data may be created from sensor data periodically output from the external sensor 106 while the moving body 100 moves, or it may be map data created by another method.
  • The map data may also be obtained by integrating the map data of a plurality of zones.
  • A typical example of the map data is an occupancy grid map, but the map data is not limited to this.
  • The drive device 104 controls the motor 102 based on a command value for the position of the moving body 100 and the position estimate output from the positioning device 110. The command value may be given to the drive device 104 from a controller not shown in FIG. 1.
  • Hereinafter, an automated guided vehicle is described as an example of a moving body. An automated guided vehicle is also called an AGV (Automated Guided Vehicle) and is referred to as an "AGV" in this specification.
  • FIG. 4 shows an exemplary AGV 10 that travels along a passage 1 in a factory.
  • FIG. 5 shows an outline of a management system 1000 that manages the running of the AGV 10 according to this example.
  • The AGV 10 holds map data and travels while recognizing its current position.
  • The travel route of the AGV 10 follows commands from the travel management device 20 of FIG. 5.
  • The AGV 10 moves by rotating its wheels with a plurality of built-in motors according to the commands.
  • The commands are transmitted wirelessly from the travel management device 20 to the AGV 10.
  • The communication between the AGV 10 and the travel management device 20 can be performed using, for example, the wireless access points 2a and 2b provided near the ceiling of the factory.
  • The communication conforms to, for example, the Wi-Fi (registered trademark) standard.
  • As in FIG. 4, a plurality of AGVs 10 may travel simultaneously. The traveling of each of the AGVs 10 may or may not be managed by the travel management device 20.
  • The outline of the operation of the AGV 10 and the travel management device 20 in the management system 1000 is as follows.
  • The AGV 10 travels from the n-th position toward the (n+1)-th target position (hereinafter, "position Mn+1").
  • The target positions can be determined for each AGV 10 by an administrator, for example.
  • When the AGV 10 reaches the target position Mn+1, it transmits an arrival notification (hereinafter, "notification") to the travel management device 20.
  • The notification is sent to the travel management device 20 via the wireless access point 2a.
  • The AGV 10 identifies its self-position by matching the output of the external sensor, which senses the surroundings, against the map data. It may then determine whether the self-position matches the position Mn+1.
  • When the notification is received, the travel management device 20 generates the next, (n+1)-th command for moving the AGV 10 from the position Mn+1 to the position Mn+2.
  • The (n+1)-th command includes the position coordinates of the position Mn+2, and may further include numerical values such as the acceleration time and the moving speed during constant-speed travel.
  • The travel management device 20 transmits the (n+1)-th command to the AGV 10.
  • The AGV 10 analyzes the (n+1)-th command and performs the preprocessing calculations necessary for moving from the position Mn+1 to the position Mn+2.
  • The preprocessing calculations include, for example, determining the rotation speed, rotation time, and so on of each motor that drives each wheel of the AGV 10.
  • FIG. 6 shows an example of the target positions (▲) set on the travel route of the AGV 10.
  • The interval between two adjacent target positions need not be a fixed value and can be determined by an administrator.
  • The AGV 10 can move in various directions according to commands from the travel management device 20.
  • FIGS. 7A to 7C show examples of movement paths of the AGV 10 as it moves continuously.
  • FIG. 7A shows the movement path of the AGV 10 when traveling straight. After reaching the position Mn+1, the AGV 10 can perform the preprocessing calculation, operate each motor according to the result, and continue moving in a straight line to the next position Mn+2.
  • FIG. 7B shows the movement path of the AGV 10 when it makes a left turn at the position Mn+1 and heads toward the position Mn+2.
  • The AGV 10 performs the preprocessing calculation after reaching the position Mn+1 and, according to the result, rotates at least one motor located on the right side in the traveling direction.
  • After the AGV 10 has rotated counterclockwise by an angle θ on the spot, all the motors rotate at a constant speed and the AGV goes straight toward the position Mn+2.
  • FIG. 7C shows the movement path of the AGV 10 when it moves along a circular arc from the position Mn+1 to the position Mn+2.
  • The AGV 10 performs the preprocessing calculation after reaching the position Mn+1 and, according to the result, makes the rotational speed of the outer motor higher than that of the inner motor. As a result, the AGV 10 can continue to move along an arc-shaped path toward the next position Mn+2.
  • FIG. 8 is an external view of an exemplary AGV 10 according to the present embodiment.
  • FIG. 9 shows the hardware configuration of the AGV 10.
  • the AGV 10 includes four wheels 11a to 11d, a frame 12, a transport table 13, a travel control device 14, and a laser range finder (LRF) 15.
  • the traveling control device 14 is a device that controls the operation of the AGV 10, and also functions as a self-position estimation device.
  • the traveling control device 14 mainly includes a plurality of integrated circuits including a microcomputer (described later), a plurality of electronic components, and a substrate on which the plurality of integrated circuits and the plurality of electronic components are mounted.
  • the travel control device 14 performs data transmission / reception with the travel management device 20 and pre-processing calculation described above.
  • the LRF 15 is a range sensor that measures the distance to the target by, for example, irradiating the target with an infrared laser beam 15a and detecting the reflected light of the laser beam 15a.
  • the LRF 15 corresponds to the external sensor 106 in FIG.
  • Other examples of external sensors for sensing the surrounding space to acquire sensor data include an image sensor and an ultrasonic sensor.
  • the LRF 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction every 0.25 degrees in a space in the range of 135 degrees to the left and right (total 270 degrees) with respect to the front of the AGV 10, for example. Then, the reflected light of each laser beam 15a is detected.
  • distance data from the AGV 10 to the reflection point can be obtained for each of a total of about 1080 different directions obtained by dividing the range of angle 270 degrees every 0.25 degrees.
  • the values of 270 degrees and 0.25 degrees are examples, and the scanning mode varies depending on the type of LRF 15.
  • the time required for one scan by the LRF 15 is several milliseconds to several tens of milliseconds, for example.
  • the LRF 15 outputs sensor data periodically (for example, every several tens of milliseconds) while sensing the surrounding space.
  • From the position and posture of the AGV 10 and the scan results of the LRF 15, it is possible to know the arrangement of objects around the AGV 10.
  • The position and posture of a moving body are together called its pose.
  • The position and orientation of the moving body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis, respectively.
  • The position and orientation of the AGV 10, that is, the pose (x, y, θ), may be referred to simply as the "position" or "position coordinates" below.
  • The positioning device, described later, can identify the self-position (x, y, θ) on the environment map by matching local map data created from the scan results of the LRF 15 against wider-range environment map data.
  • the position of the reflection point viewed from the center position of the radiation of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
  • the polar coordinates are local coordinates that move with the AGV 10.
  • the LRF 15 outputs sensor data expressed in polar coordinates.
  • the LRF 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
  • Since the structure and operating principle of the LRF are known, further detailed description is omitted in this specification. Examples of objects that the LRF 15 can detect are people, luggage, shelves, and walls.
  • The "sensor data" output from the LRF 15 consists of multiple vector data sets, each pairing an angle θ with a distance L.
  • The angle θ changes in steps of 0.25 degrees within a range of, for example, -135 degrees to +135 degrees.
  • The angle may be expressed with the right side as positive and the left side as negative with respect to the front of the AGV 10.
  • The distance L is the distance to the object measured at each angle θ.
  • The distance L is obtained by multiplying half of the difference between the emission time of the laser beam 15a and the reception time of its reflected light (that is, half the round-trip time of the laser beam) by the speed of light; a small sketch of this calculation and of the polar-to-Cartesian conversion follows.
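The time-of-flight relation and the conversion from the LRF's polar output to local Cartesian coordinates can be sketched as follows. This is a minimal sketch assuming angles in radians and ignoring any mounting offset of the sensor; the function names are illustrative.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_emit, t_receive):
    """L = (round-trip time / 2) * c."""
    return (t_receive - t_emit) / 2.0 * C

def scan_to_points(scan):
    """Convert sensor data, a sequence of (theta, L) pairs in the
    sensor-fixed polar frame, into local Cartesian (x, y) points."""
    return [(L * math.cos(theta), L * math.sin(theta)) for theta, L in scan]
```

As noted above, the LRF 15 may itself output either the polar pairs or already-converted orthogonal coordinates.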
  • FIG. 9 also shows a specific configuration of the traveling control device 14 of the AGV 10.
  • the AGV 10 in this example includes a travel control device 14, an LRF 15, four motors 16a to 16d, and a drive device 17.
  • the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
  • the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
  • the LRF 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as measurement results to the microcomputer 14a, the positioning device 14e, and / or the memory 14b.
  • the microcomputer 14a is a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
  • the microcomputer 14a can operate as the arithmetic circuit 120 in FIG.
  • the microcomputer 14a is a semiconductor integrated circuit.
  • The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal to the drive device 17 to control it and adjust the current flowing through each motor, so that each of the motors 16a to 16d rotates at the desired rotational speed; a toy sketch of the duty-cycle idea follows.
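The PWM control mentioned above can be pictured as choosing a duty cycle for the inverter switches. The sketch below uses a deliberately crude open-loop assumption that motor speed scales linearly with the average applied voltage; a real drive would close the loop on measured speed or current.

```python
def duty_cycle_for_speed(target_rpm, max_rpm):
    """Map a desired motor speed to a PWM duty cycle in [0.0, 1.0],
    assuming speed is proportional to average voltage (illustrative)."""
    if max_rpm <= 0:
        raise ValueError("max_rpm must be positive")
    return max(0.0, min(1.0, target_rpm / max_rpm))
```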
  • the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
  • the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
  • the storage device 14c is a non-volatile semiconductor memory device that stores map data.
  • The storage device 14c may instead be a magnetic recording medium, typified by a hard disk, or an optical recording medium, typified by an optical disc, together with a head device for writing and/or reading data on the recording medium and a controller for that head device.
  • Details of the map data will be described later with reference to FIG. 15.
  • the storage device 14c corresponds to the storage device 108 in FIG.
  • the communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Wi-Fi (registered trademark) standard.
  • the positioning device 14e receives the sensor data from the LRF 15, and reads the map data stored in the storage device 14c.
  • the timing at which the positioning device 14e receives sensor data from the LRF 15 does not have to coincide with the timing at which the LRF 15 outputs sensor data.
  • the LRF 15 may output sensor data every 25 milliseconds, and the positioning device 14e may receive the sensor data every 100 milliseconds.
  • the positioning device 14e performs a process of comparing the sensor data and the map data to identify the self position. The specific operation of the positioning device 14e will be described later.
  • In FIG. 9, the microcomputer 14a and the positioning device 14e are separate components, but this is only an example. They may instead be a single chip circuit or semiconductor integrated circuit capable of performing the operations of both.
  • FIG. 9 shows a chip circuit 14g that includes the microcomputer 14a and the positioning device 14e.
  • The microcomputer 14a, the positioning device 14e, and/or the chip circuit 14g may be referred to as a computer, an arithmetic circuit, or a processing circuit.
  • Hereinafter, an example in which the microcomputer 14a and the positioning device 14e are provided separately will be described.
  • the four motors 16a to 16d are attached to the four wheels 11a to 11d, respectively, and rotate each wheel.
  • the number of motors mounted on one AGV 10 is not limited to four. Further, the motor for traction does not have to be attached to all four wheels 11a to 11d, and is typically attached to two wheels.
  • the AGV 10 may be provided with a wheel and a motor for steering, or a motor for other purposes.
  • the drive device 17 has motor drive circuits 17a to 17d for adjusting the current flowing through each of the four motors 16a to 16d.
  • the driving device 17 corresponds to the driving device 104 in FIG.
  • Each of the motor drive circuits 17a to 17d is a so-called inverter circuit, and the current flowing to each motor is turned on or off by the PWM signal transmitted from the microcomputer 14a, thereby adjusting the current flowing to the motor.
  • the map data in the present embodiment can be created by SLAM (Simultaneous Localization and Mapping) technology.
  • the AGV 10 scans the surrounding space by operating the LRF 15 while actually traveling in a factory where the AGV 10 is used, and generates a map while estimating its own position.
  • the AGV 10 may travel on a specific route while being controlled by the administrator, and generate a map from the sensor data acquired by the LRF 15.
  • The map data may be created by "online processing" while the AGV 10 is moving, or by "offline processing" performed by a computer outside the AGV 10 using the large amount of sensor data acquired while the AGV 10 was moving.
  • FIGS. 10 to 14 show the AGV 10 generating a map while moving.
  • FIG. 10 shows an AGV 10 that scans the surrounding space using the LRF 15. A laser beam is emitted at every predetermined angle, and scanning is performed.
  • The positions of the reflection points of the laser beam are indicated by a plurality of points represented by black dots, such as the point 4 in FIG. 10.
  • Together, these points form point cloud data.
  • The positioning device 14e accumulates the positions of the reflection points 4 obtained as the AGV travels, for example in the memory 14b.
  • the map is gradually completed by continuously performing scanning while the AGV 10 travels.
  • In FIG. 11 to FIG. 14, only the scan range is shown for simplicity.
  • The scan range shown is also an example and differs from the earlier example of 270 degrees in total.
  • FIG. 15 schematically shows a part of the completed map 30.
  • the positioning device 14e accumulates the data of the map 30 in the memory 14b or the storage device 14c.
  • the number or density of black spots shown in the figure is an example.
  • FIGS. 16 to 18 schematically show the procedure of a general position identification process.
  • The AGV has acquired in advance a map corresponding to the map 30 of FIG. 15 (hereinafter, the "reference map 30").
  • the AGV acquires sensor data 32 shown in FIG. 16 at predetermined time intervals, and executes a process of identifying its own position on the reference map 30.
  • The AGV sequentially sets various local maps (for example, local maps 34a, 34b, 34c) on the reference map 30, each corresponding to a different assumed position and angle of the AGV, and collates the reflection points contained in each local map against the reflection points contained in the sensor data 32. Such collation can be performed by the ICP matching described above, for example. To perform the matching efficiently, the position and angle of the AGV on the reference map 30, that is, the pose, must be set appropriately. In the present embodiment, the microcomputer 14a of FIG. 9 sets the initial position by the method described with reference to FIG. 3 and gives it to the positioning device 14e.
  • FIG. 17 schematically shows, with a distinct symbol, the points (for example, the point 5) determined to match as a result of the collation.
  • The local map 34d that minimizes the root mean square of the distances (errors) between corresponding points is selected, and the position of the AGV corresponding to the local map 34d is determined as the AGV position estimated by the matching; a sketch of this scoring step follows.
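The selection step just described can be illustrated with a brute-force scorer. This is a minimal sketch; a practical implementation would use ICP with a spatial index (for example, a k-d tree) rather than the O(N·M) nearest-point search below, and the function names are illustrative.

```python
import math

def score_pose(pose, scan_points, map_points):
    """RMS distance from each scan point, transformed into map coordinates
    by the candidate pose (x, y, theta), to its nearest map point."""
    x, y, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    total_sq = 0.0
    for px, py in scan_points:  # points in the sensor frame
        gx, gy = x + c * px - s * py, y + s * px + c * py
        total_sq += min((gx - mx) ** 2 + (gy - my) ** 2
                        for mx, my in map_points)
    return math.sqrt(total_sq / len(scan_points))

def best_pose(candidate_poses, scan_points, map_points):
    """Pick the candidate (cf. local map 34d) with the minimum RMS error."""
    return min(candidate_poses,
               key=lambda p: score_pose(p, scan_points, map_points))
```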
  • In FIG. 18, the identified self-position 36 is represented by the symbol "X".
  • The collation requires the latest sensor data and the position, or more precisely the pose, of the AGV at the time that sensor data was acquired.
  • As long as the AGV does not lose sight of its self-position, the current self-position can be estimated with high accuracy by using the previous estimate (identified position) as the initial position.
  • Such updates of the estimate can be performed, for example, with a period of 100 milliseconds.
  • When the AGV loses its self-position, however, and the initial position needed to start the collation deviates greatly from the actual AGV position, the collation takes a long time; the AGV would then need to stop or decelerate while it completes.
  • In the present embodiment, by contrast, the initial position is appropriately predicted from the moving distance and direction of the AGV and the processing time, so efficient collation is realized from an appropriate initial position.
  • FIG. 19 is a flowchart showing an example of position identification processing after losing sight of the self position. An example of position identification processing will be described with reference to FIGS. 9 and 19.
  • After losing sight of the self-position, the microcomputer 14a of FIG. 9 acquires odometry information in step S10 of FIG. 19. "After losing sight of the self-position" means after time t4 in FIG. 3.
  • the odometry information can be acquired from a rotary encoder (not shown) or the like.
  • In step S20, the microcomputer 14a corrects the previous identified position and calculates the initial position needed for the collation. Specifically, the microcomputer 14a calculates the moving speed of the AGV from the odometry information, determines the elapsed time up to the time at which the latest sensor data is acquired (time t6 in FIG. 3), and calculates the moving distance of the AGV from this time and the moving speed. From the moving distance and direction, the microcomputer 14a then predicts the position of the AGV at the time the latest sensor data is acquired.
  • In step S30, the microcomputer 14a gives the predicted position to the positioning device 14e as the initial position for the collation.
  • In step S40, the positioning device 14e acquires the latest sensor data from the LRF 15.
  • In step S50, the positioning device 14e starts the collation for position identification using the initial position and the latest sensor data. Once the collation completes and the self-position is identified, the positioning device 14e outputs the self-position as the position estimate. The sketch below strings these steps together.
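Reusing the prediction sketch shown earlier, the S10 to S50 flow of FIG. 19 can be strung together roughly as follows. Here lrf, odometry, and matcher stand for hypothetical interfaces to the LRF 15, the internal sensor, and the positioning device 14e; none of these names come from the publication.

```python
def reidentify_position(lrf, odometry, matcher, last_pose, last_time):
    """Illustrative position re-identification after the self-position is lost."""
    v, omega = odometry.velocity()        # S10: acquire odometry information
    scan, t_scan = lrf.latest_scan()      # time the newest sensor data was taken
    initial = predict_initial_position(   # S20: correct the previous position
        last_pose, last_time, t_scan, (v, omega))
    matcher.set_initial_pose(initial)     # S30: hand over the initial position
    return matcher.match(scan)            # S40-S50: collate and output the estimate
```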
  • FIG. 20 shows a hardware configuration of the travel management device 20.
  • the travel management device 20 includes a CPU 21, a memory 22, a storage device 23, a communication circuit 24, and an image processing circuit 25.
  • the CPU 21, the memory 22, the storage device 23, the communication circuit 24, and the image processing circuit 25 are connected by a communication bus 27 and can exchange data with each other.
  • the CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20.
  • the CPU 21 is a semiconductor integrated circuit.
  • the memory 22 is a volatile storage device that stores a computer program executed by the CPU 21.
  • the memory 22 can also be used as a work memory when the CPU 21 performs calculations.
  • the storage device 23 stores map data created by the AGV 10.
  • the storage device 23 may be a nonvolatile semiconductor memory, a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
  • The storage device 23 may also store data necessary for functioning as the travel management device 20, such as position data indicating each position that can be a destination of the AGV 10.
  • the position data can be represented by coordinates virtually set in the factory by an administrator, for example.
  • the location data is determined by the administrator.
  • the communication circuit 24 performs wired communication based on, for example, the Ethernet (registered trademark) standard.
  • the communication circuit 24 is connected to the wireless access points 2a, 2b and the like by wire, and can communicate with the AGV 10 via the wireless access points 2a, 2b and the like.
  • The communication circuit 24 receives, from the CPU 21 via the bus 27, data and commands indicating the position to which the AGV 10 should go, and transmits them to the AGV 10. These data and commands are received by the communication circuit 14d of the AGV 10 shown in FIG. 9.
  • The communication circuit 24 transmits data received from the communication circuit 14d (FIG. 9) of the AGV 10 (for example, notifications and position information) to the CPU 21 and/or the memory 22 via the bus 27.
  • The AGV 10 periodically transmits the self-position information (position and angle) output from the positioning device 14e to the communication circuit 24 of the travel management device 20. This period can be, for example, 100 milliseconds to 1 second.
  • the image processing circuit 25 is a circuit that generates video data to be displayed on the external monitor 29.
  • the image processing circuit 25 operates exclusively when the administrator operates the travel management device 20. In the present embodiment, further detailed explanation is omitted.
  • the monitor 29 may be integrated with the travel management apparatus 20. Further, the CPU 21 may perform the processing of the image processing circuit 25.
  • the technique of the present disclosure can be widely used for a moving body that performs a process of identifying a self-position.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

This mobile body comprises: an external environment sensor (106) which senses the space in the periphery of the mobile body and periodically outputs sensor data; a storage device (108) which stores map data; a positioning device (110) which, using the sensor data and the map data, carries out a process of estimating the position of the mobile body and sequentially outputs position estimation values; and a computation circuit (120). When the positioning device is to commence the process of estimating the position of the mobile body at a second time point after having outputted the position estimation value for the mobile body at a first time point, the computation circuit references the distance and direction of the movement of the mobile body between the first time point and the second time point to correct the position estimation value at the first time point accordingly, and supplies the positioning device with the corrected estimation value which serves as an initial position of the position of the mobile body at the second time point.

Description

Mobile body and self-position estimation device
 The present disclosure relates to a moving body and a self-position estimation apparatus used in the moving body.
 Development of position estimation techniques that estimate, with high accuracy, the position of a moving body (hereinafter simply "moving body") such as a drone (unmanned aerial vehicle), a self-driving car, or an autonomous mobile robot is underway. A moving body that performs self-position estimation includes an external sensor, such as a laser range sensor, and senses the surrounding space while moving to acquire sensor data. The self-position on an environment map can then be identified by matching local map data, created from the sensor data around the moving body, against wider-range environment map data.
 Japanese Patent Laying-Open No. 2016-224680 discloses a self-position estimation apparatus that includes a first self-position estimation unit and a second self-position estimation unit and executes an estimation process at each step. The first self-position estimation unit obtains a probability distribution of the latest position of the moving body from the sensor data and the environment map, and estimates the first self-position based on that distribution. The second self-position estimation unit estimates the second self-position by adding the movement distance and direction from the previous step to the current step, acquired by odometry, to the final self-position estimated in the previous step. In this apparatus, the weighted average of the first and second self-positions is taken as the final self-position for the current step.
Japanese Unexamined Patent Publication No. 2016-224680
 According to the apparatus of Japanese Patent Laying-Open No. 2016-224680, the second self-position is calculated from the movement of the moving body over the fixed time from the previous step to the current step (one step of movement), and that one-step movement is measured by odometry over the same fixed interval.
 In this apparatus, when the first self-position estimation unit fails to estimate the first self-position, or when the reliability of the estimate is low, an accurate estimate cannot be obtained if the moving speed of the moving body is high.
 An embodiment of the present disclosure provides a moving body capable of highly accurate position estimation regardless of moving speed, and a self-position estimation apparatus for use in such a moving body.
 In an exemplary embodiment, the moving body of the present disclosure includes: a motor; a drive device that controls the motor to move the moving body; an external sensor that senses the surrounding space and periodically outputs sensor data; a storage device that stores map data; a positioning device that estimates the position of the moving body using the sensor data and the map data and sequentially outputs position estimates; and an arithmetic circuit. When the positioning device, having output an estimate of the moving body's position at a first time, is to start estimating its position at a second time, the arithmetic circuit corrects the first-time estimate based on the distance and direction the moving body moves between the first time and the second time, and gives the corrected estimate to the positioning device as the initial position of the moving body at the second time.
 In an exemplary embodiment, the self-position estimation apparatus of the present disclosure is mounted on and used in a moving body that includes an external sensor that senses the surrounding space and periodically outputs sensor data. The apparatus includes: a storage device that stores map data; a positioning device that estimates the position of the moving body using the sensor data and the map data and sequentially outputs position estimates; and an arithmetic circuit. When the positioning device, having output an estimate of the moving body's position at a first time, is to start estimating its position at a second time, the arithmetic circuit corrects the first-time estimate based on the distance and direction the moving body moves between the first time and the second time, and gives the corrected estimate to the positioning device as the initial position at the second time.
 According to embodiments of the moving body of the present disclosure, when determining the initial position necessary for position estimation by the positioning device, the moving distance and direction of the moving body can be calculated according to the positioning device's processing time. The correction is therefore not limited to the period of one step of the sequential processing, and a more accurate initial position can be given to the positioning device.
 The brief description of FIGS. 1 to 20 is as given in the list of drawings above.
 本開示による移動体の具体的な実施形態を説明する前に、本開示による移動体の基本構成例を説明する。 Before describing a specific embodiment of a moving object according to the present disclosure, a basic configuration example of a moving object according to the present disclosure will be described.
 図1は、本開示による移動体の基本構成例を示している。この例における移動体100は、電気モータ(以下、単に「モータ」と称する。)102と、モータ102を制御して移動体100を移動させる駆動装置104と、周囲の空間をセンシングしてセンサデータを周期的に出力する外界センサ106とを備えている。移動体10の典型例は、モータ102に対して機械的に結合した少なくとも1個の駆動輪(不図示)を有し、駆動輪のトラクションによって地上を走行することができる移動体である。 FIG. 1 shows a basic configuration example of a moving object according to the present disclosure. The moving body 100 in this example includes an electric motor (hereinafter simply referred to as “motor”) 102, a driving device 104 that moves the moving body 100 by controlling the motor 102, and sensor data by sensing the surrounding space. Is provided with an external sensor 106 that periodically outputs. A typical example of the moving body 10 is a moving body that has at least one drive wheel (not shown) mechanically coupled to the motor 102 and can travel on the ground by the traction of the drive wheel.
 移動体100は、さらに自己位置推定装置200を備えている。自己位置推定装置200は、周囲環境の地図データ(マップデータ)を記憶する記憶装置108と、センサデータおよび地図データを利用して移動体100の位置を推定する測位装置110と、種々の演算を実行する演算回路120とを有している。測位装置110は、移動体100が移動しているとき、または停止しているとき、移動体100の位置(自己位置)の推定値を順次出力する。演算回路120は、測位装置110に必要な位置情報を算出して測位装置110に与える。 The moving body 100 further includes a self-position estimation device 200. The self-position estimation device 200 performs various operations, such as a storage device 108 that stores map data (map data) of the surrounding environment, a positioning device 110 that estimates the position of the moving body 100 using sensor data and map data, and the like. And an arithmetic circuit 120 to be executed. The positioning device 110 sequentially outputs estimated values of the position (self-position) of the moving body 100 when the moving body 100 is moving or stopped. The arithmetic circuit 120 calculates position information necessary for the positioning device 110 and supplies the position information to the positioning device 110.
Hereinafter, the operation of the self-position estimation device 200 will be described in detail.
First, a conventional operation example of the self-position estimation device 200 will be described with reference to FIG. 2. FIG. 2 shows an example of the timing at which sensor data is periodically output from the external sensor 106 and the timing at which position estimates are output from the positioning device 110. The black dots in FIG. 2 indicate the sensor data output timing. Sensor data is output from the external sensor 106 at a period Ts, while position estimates are output from the positioning device 110 at a period Tp.
In the example of FIG. 2, position estimates x1, x2, and x3 are output normally at times t1, t2, and t3, respectively. At time t4, however, the output is either information indicating that the position estimate is "undefined" or an estimate with very low reliability. In such a case, the positioning device 110 loses the current position information needed for the next estimation and can no longer keep outputting position estimates at the period Tp. The highly reliable position estimate x3 output at time t3 therefore has to be given to the positioning device 110 as the current position of the moving body 100. The positioning device 110 performs self-position estimation using, for example, an ICP (Iterative Closest Point) matching algorithm; it may instead perform probabilistic position estimation using a Monte Carlo particle filter. On acquiring the position estimate x3 as the initial position, the positioning device 110 starts the position identification process.
As the traveling speed of the moving body 100 increases, so does the distance it covers while one period Tp elapses. For example, if the moving body 100 travels at 5 meters per second and the period Tp is 100 milliseconds, then by time t4 the body has already shifted 0.5 meters from the position estimate x3 of time t3. By time t5, one further period Tp later, it has shifted a full meter from x3. Unless the moving body 100 stops, its position keeps moving away from the estimate x3 of time t3.
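The magnitude of this drift follows directly from speed times elapsed time. A small worked sketch (the numbers mirror the example above; nothing here is specific to the disclosed device):

```python
speed = 5.0        # moving speed of the moving body, in m/s
period_tp = 0.100  # position-estimate output period Tp, in s

# Distance covered while one estimation period elapses (t3 -> t4)
drift_one_period = speed * period_tp       # 0.5 m
# After two periods (t3 -> t5) the last good estimate x3 is 1 m stale
drift_two_periods = speed * 2 * period_tp  # 1.0 m
print(drift_one_period, drift_two_periods)
```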
Meanwhile, the external sensor 106 keeps outputting sensor data at the period Ts, which is shorter than Tp. To estimate the current position of the moving body 100, the position identification process should use the latest sensor data. If, however, the initial position required for this process deviates greatly from the actual position, the identification either takes a long time or fails altogether.
The device disclosed in Japanese Laid-Open Patent Publication No. 2016-224680 acquires the movement distance and direction of the moving body 100 in each step of its sequential processing by odometry, where one step corresponds to the period Tp. With this device, then, if the self-position is lost at time t4, the value obtained by adding the movement distance and direction from time t3 to time t4 to the position estimate x3 of time t3 (the second self-position) is used as the initial position. In the device of that publication, probabilistic self-position estimation is performed using a particle filter; in the moving body and self-position estimation device according to the present disclosure, however, the self-position estimation algorithm need not be probabilistic and may instead be a pattern-matching algorithm.
Such a conventional approach has the following problem.
Before acquired sensor data can be matched against the map data, local map data must be created from the sensor data obtained by one or more scans. Consequently, even if the position at time t4, when the self-position was lost, is computed by odometry, the moving body 100 keeps moving until time t6, when acquisition of the sensor data used for matching is actually complete. Unless a more accurate initial position is supplied, the next position identification may fail as well. This is a particular problem for a moving body that is required to keep traveling at high speed, without stopping or decelerating, even after losing its position.
To solve this problem, in a moving body according to the present disclosure, the arithmetic circuit 120 of FIG. 1 executes the following processing, described with reference to FIG. 3.
Assume, as shown in FIG. 3, that after the positioning device 110 outputs the estimate x3 of the position of the moving body 100 at time t3 (first time), the self-position is lost at time t4 (the position estimate is "undefined"). In this case the arithmetic circuit 120 starts a position identification process that estimates the position of the moving body 100 at time t6 (second time). The arithmetic circuit 120 corrects the estimate x3 of time t3 (first time) based on the distance and direction in which the moving body 100 moves between time t3 (first time) and time t6 (second time), and gives the corrected estimate (x3 + Δx) to the positioning device 110 as the initial position of the moving body 100 at time t6 (second time).
In the example of FIG. 3, the corrected estimate (x3 + Δx) is given to the positioning device 110 at time t5. The interval from time t5 to time t6 may, in the extreme case, be zero. When t5 = t6, the correction Δx is simply the movement distance and direction acquired between time t3 and time t5, for example by odometry. When the interval (t6 − t5) is not negligible, however, the distance and direction the moving body travels during that interval may be computed and included in Δx. When creating local position data from the successively acquired sensor data takes time, the interval from time t3 to time t6 can become very long; in such a case it is preferable to determine the initial position as the position of the moving body 100 at time t6, taking that processing time into account.
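A minimal sketch of this correction, assuming the displacement from t3 to t5 is available from odometry and the motion over (t6 − t5) is extrapolated from the current velocity; the function and variable names are illustrative, not from the disclosure:

```python
import math

def predicted_pose(x3, y3, theta3,
                   odo_dx, odo_dy, odo_dtheta,  # odometry displacement t3 -> t5
                   v, omega, t5_to_t6):         # current velocity, extrapolation time
    """Return the predicted pose at t6 used as the initial position x3 + Δx."""
    # Part of Δx actually measured by odometry between t3 and t5
    x = x3 + odo_dx
    y = y3 + odo_dy
    theta = theta3 + odo_dtheta
    # Part of Δx extrapolated over the remaining interval (t6 - t5),
    # assuming the body keeps its current speed and heading
    x += v * math.cos(theta) * t5_to_t6
    y += v * math.sin(theta) * t5_to_t6
    theta += omega * t5_to_t6
    return x, y, theta
```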
As is clear from the above, the corrected estimate (x3 + Δx) may be called the "predicted position" of the moving body 100 at time t6 (second time). According to the present disclosure, the initial position for the self-position identification process started after the self-position is lost is thus a predicted position computed from the moving speed and direction of the moving body 100, taking into account the time required for processing after sensor data acquisition. The accuracy of the initial position improves and the self-position identification process can start properly; as a result, smooth travel can continue without decelerating or stopping the moving body.
In one aspect, the positioning device 110 may output information indicating the reliability of the estimation whenever it outputs a position estimate for the moving body 100. In this aspect, when the reliability of a position estimate output from the positioning device 110 falls below a set value between the first time and the second time, the arithmetic circuit 120 gives the corrected estimate to the positioning device 110 as the initial position of the moving body at the second time. The reliability of the estimation may be a pattern-matching score or a probabilistic index; when the reliability is lower than the preset value, the self-position can be regarded as lost.
In another aspect, the arithmetic circuit 120 gives the corrected estimate to the positioning device 110 as the initial position of the moving body 100 at the second time in response to a request from the positioning device 110 made between the first time and the second time. In this aspect the positioning device 110 can, for example, instruct the arithmetic circuit 120 to compute the corrected estimate even before an "undefined" or low-reliability position estimate is output. If the arithmetic circuit 120 can start computing the corrected estimate earlier, the interval from the first time to the second time shrinks, and the period during which the self-position is lost can be shortened.
The positioning device 110 estimates the position of the moving body 100 at the second time using the initial position given by the arithmetic circuit 120 and the sensor data output from the external sensor 106 at and before the second time, and outputs the resulting estimate at a third time. Once the position estimate for the second time is output, the arithmetic circuit 120 can control the travel of the moving body 100 based on it.
In one embodiment, the arithmetic circuit 120 determines the distance and direction in which the moving body 100 moves between the first time and the second time from the operating state of the drive device 104 that controls the motor 102. For example, when two motors 102 generate traction, the moving speed and direction of the moving body 100 can be determined from the rotational speeds of the individual motors 102. Because the rotational speed of each motor 102 is defined by the operating state of the drive device 104, the moving speed of the moving body 100 can be determined from that operating state even without a sensor that directly detects the rotational state of the motors. The operating state of the drive device 104 is, in turn, known from the content of the commands the arithmetic circuit 120 gives to the drive device 104.
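For a two-wheel differential drive, the body velocity follows from the commanded wheel speeds alone, which is why no rotation sensor is strictly needed. A sketch using the standard differential-drive kinematics; the wheel radius and tread parameters are illustrative assumptions:

```python
import math

def body_velocity_from_wheels(rpm_left, rpm_right, wheel_radius, tread):
    """Linear and angular velocity of a two-wheel differential-drive body.

    rpm_left, rpm_right: motor speeds in revolutions per minute
    wheel_radius, tread: wheel radius and wheel-to-wheel distance, in m
    """
    v_left = rpm_left / 60.0 * 2.0 * math.pi * wheel_radius    # m/s
    v_right = rpm_right / 60.0 * 2.0 * math.pi * wheel_radius  # m/s
    v = (v_left + v_right) / 2.0        # forward speed of the body
    omega = (v_right - v_left) / tread  # yaw rate in rad/s
    return v, omega
```

Integrating v and omega over the interval from the first time to the second time yields the distance and direction used for the correction Δx.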
In another embodiment, the moving body 100 further includes an internal sensor that outputs odometry information. Examples of internal sensors include rotary encoders that measure the rotational speed of a motor or wheel, and inertial measurement units such as gyro sensors. (The wheel rotational speed can also be estimated from the operating state of the drive device.) In such an embodiment, the arithmetic circuit 120 can determine the distance and direction in which the moving body moves between the first time and the second time using the odometry information output from the internal sensor.
The map data may be created from sensor data periodically output from the external sensor 106 while the moving body 100 moves, or may be created by another method. It may also be data obtained by integrating the map data of a plurality of zones. A typical example of map data is an occupancy grid map, but map data is not limited to this.
The drive device 104 controls the motor 102 based on a command value for the position of the moving body 100 and the position estimate output from the positioning device 110. The command value may be given to the drive device 104 by a controller not shown in FIG. 1.
[Embodiment]
Hereinafter, embodiments of a moving body according to the present disclosure will be described with reference to the accompanying drawings. In this specification, an automated guided vehicle is taken as an example of a moving body. An automated guided vehicle is also called an AGV (Automated Guided Vehicle), and is referred to as an "AGV" in this specification as well.
FIG. 4 shows an AGV 10 traveling, for example, in a passage 1 in a factory. FIG. 5 shows an overview of a management system 1000 that manages the travel of the AGV 10 in this example. In the illustrated example, the AGV 10 holds map data and travels while recognizing its current position. Its travel route follows commands from the travel management device 20 of FIG. 5. The AGV 10 moves by driving each of a plurality of built-in motors according to a command and thereby rotating its wheels. Commands are sent wirelessly from the travel management device 20 to the AGV 10. Communication between the AGV 10 and the travel management device 20 can use, for example, wireless access points 2a, 2b provided near the ceiling of the factory, and conforms to, for example, the Wi-Fi (registered trademark) standard. Although only one AGV 10 is shown in FIG. 4, a plurality of AGVs 10 may travel, and the travel of each may or may not be managed by the travel management device 20.
The operations of the AGV 10 and the travel management device 20 included in the management system 1000 are outlined as follows.
Suppose the AGV 10 is moving from an n-th position toward the (n+1)-th position, its target position (hereinafter written "position Mn+1"), in accordance with a command from the travel management device 20 (the n-th command, where n is a positive integer). The target position can be determined, for example, by an administrator for each AGV 10.
When the AGV 10 reaches the target position Mn+1, it transmits an arrival notification (hereinafter "notification") to the travel management device 20 via the wireless access point 2a. In this embodiment, the AGV 10 identifies its self-position by matching the output of the external sensor, which senses its surroundings, against the map data, and then determines whether that self-position coincides with the position Mn+1.
On receiving the notification, the travel management device 20 generates the next command (the (n+1)-th command) for moving the AGV 10 from position Mn+1 to position Mn+2. The (n+1)-th command includes the position coordinates of Mn+2 and may further include numerical values such as the acceleration time and the moving speed during constant-speed travel. The travel management device 20 transmits the (n+1)-th command to the AGV 10.
On receiving the (n+1)-th command, the AGV 10 analyzes it and performs the preprocessing computations needed for the movement from position Mn+1 to position Mn+2, for example computations that determine the rotational speed and rotation time of each motor driving each wheel of the AGV 10.
FIG. 6 shows an example of target positions (▲) set on the travel route of the AGV 10. The interval between two adjacent target positions need not be a fixed value and can be determined by the administrator.
The AGV 10 can move in various directions according to commands from the travel management device 20. FIGS. 7A to 7C show examples of movement paths of the continuously moving AGV 10.
FIG. 7A shows the movement path of the AGV 10 when traveling straight. After reaching position Mn+1, the AGV 10 performs the preprocessing computation and, operating each motor according to the result, can continue moving in a straight line to the next position Mn+2.
FIG. 7B shows the movement path of the AGV 10 when it turns left at position Mn+1 and moves toward position Mn+2. After reaching position Mn+1, the AGV 10 performs the preprocessing computation and, according to the result, rotates at least one motor located on the right side in the traveling direction. After rotating counterclockwise in place by an angle θ, the AGV 10 rotates all motors at equal speed and travels straight toward position Mn+2.
FIG. 7C shows the movement path of the AGV 10 when it moves along an arc from position Mn+1 to position Mn+2. After reaching position Mn+1, the AGV 10 performs the preprocessing computation and, according to the result, makes the motors on the outer side of the turn rotate faster than those on the inner side. The AGV 10 can thereby continue moving along an arc-shaped path toward the next position Mn+2.
FIG. 8 is an external view of an exemplary AGV 10 according to this embodiment, and FIG. 9 shows its hardware configuration.
The AGV 10 has four wheels 11a to 11d, a frame 12, a transport table 13, a travel control device 14, and a laser range finder (LRF) 15. Note that while the front wheel 11a and the rear wheels 11b and 11c are shown in FIG. 8, the front wheel 11d is hidden behind the frame 12 and therefore not visible.
The travel control device 14 controls the operation of the AGV 10 and also functions as a self-position estimation device. It mainly comprises a plurality of integrated circuits including a microcomputer (described later), a plurality of electronic components, and a board on which these are mounted. The travel control device 14 performs the data exchange with the travel management device 20 and the preprocessing computations described above.
The LRF 15 is a range sensor that measures the distance to a target by, for example, irradiating the target with an infrared laser beam 15a and detecting the reflected light. The LRF 15 corresponds to the external sensor 106 of FIG. 1; other examples of external sensors that sense the surrounding space to acquire sensor data include image sensors and ultrasonic sensors. In this embodiment, the LRF 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction in 0.25-degree steps over a space spanning, for example, 135 degrees to each side of the front of the AGV 10 (270 degrees in total), and detects the reflected light of each beam 15a. Distance data from the AGV 10 to the reflection point can thereby be obtained for each of about 1080 directions in total, the 270-degree range divided into 0.25-degree steps. The values of 270 degrees and 0.25 degrees are examples; the scanning scheme depends on the type of LRF 15. One scan by the LRF 15 takes, for example, several milliseconds to several tens of milliseconds, and the LRF 15 outputs sensor data periodically (for example, every several tens of milliseconds) while sensing the surrounding space.
From the position and attitude of the AGV 10 together with the scan results of the LRF 15, the arrangement of objects around the AGV 10 can be known. In general, the position and attitude of a moving body are called its pose. In a two-dimensional plane they are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis. The position and attitude of the AGV 10, that is, the pose (x, y, θ), may hereinafter simply be called the "position" or "position coordinates".
The positioning device described later can identify the self-position (x, y, θ) on the environment map by matching local map data created from the scan results of the LRF 15 against wider-range environment map data.
The position of a reflection point as seen from the center of emission of the laser beam 15a can be expressed in polar coordinates determined by an angle and a distance; these are local coordinates that move together with the AGV 10. In this embodiment the LRF 15 outputs sensor data expressed in polar coordinates, although it may instead convert the positions to orthogonal coordinates before output.
Since the structure and operating principle of an LRF are well known, a more detailed description is omitted here. Examples of objects that can be detected by the LRF 15 are people, loads, shelves, and walls.
In this specification, data output from a sensor is called "sensor data". The sensor data output from the LRF 15 is a set of vector data, each element pairing an angle θ with a distance L. As described above, the angle θ changes in 0.25-degree steps over a range of, for example, −135 degrees to +135 degrees; angles may be expressed as positive to the right and negative to the left of the front of the AGV 10. The distance L is the distance to the object measured at each angle θ, obtained by multiplying half the difference between the emission time of the laser beam 15a and the reception time of the reflected light (that is, half the round-trip time of the beam) by the speed of light.
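The two relations in this paragraph, time-of-flight to distance and polar to Cartesian conversion, can be written down directly. A sketch follows; the array layout and names are illustrative assumptions, while the beam count matches the roughly 1080-direction scan described above:

```python
import numpy as np

C = 299_792_458.0  # speed of light, in m/s

def distance_from_round_trip(t_emit, t_receive):
    """Distance L from the round-trip time of one laser pulse."""
    return (t_receive - t_emit) / 2.0 * C

def scan_to_points(angles_deg, distances):
    """Convert one scan of (angle, distance) pairs, expressed in the
    sensor's local polar frame, into local Cartesian coordinates."""
    angles = np.radians(angles_deg)
    return np.column_stack((distances * np.cos(angles),
                            distances * np.sin(angles)))

# One scan: -135 deg to +135 deg in 0.25-degree steps (1081 beams)
angles_deg = np.linspace(-135.0, 135.0, 1081)
```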
Refer now to FIG. 9, which also shows the specific configuration of the travel control device 14 of the AGV 10.
The AGV 10 in this example includes the travel control device 14, the LRF 15, four motors 16a to 16d, and a drive device 17.
The travel control device 14 has a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e. These are connected by a communication bus 14f and can exchange data with one another. The LRF 15 is likewise connected to the communication bus 14f via a communication interface (not shown) and transmits its measurement data to the microcomputer 14a, the positioning device 14e, and/or the memory 14b.
The microcomputer 14a is a control circuit (computer) that performs the computations for controlling the whole AGV 10 including the travel control device 14, and can operate as the arithmetic circuit 120 of FIG. 1. Typically the microcomputer 14a is a semiconductor integrated circuit. It transmits PWM (Pulse Width Modulation) signals to the drive device 17 to control it and have it adjust the current flowing to each motor, so that each of the motors 16a to 16d rotates at the desired speed.
The memory 14b is a volatile storage device that stores the computer program executed by the microcomputer 14a. It can also be used as working memory when the microcomputer 14a and the positioning device 14e perform computations.
The storage device 14c is a nonvolatile semiconductor memory device that stores map data. It may instead be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disc, and may further include a head device for writing and/or reading data on such a medium together with a controller for that head device. Details of the map data are described later. The storage device 14c corresponds to the storage device 108 of FIG. 1.
The communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Wi-Fi (registered trademark) standard.
The positioning device 14e receives sensor data from the LRF 15 and reads the map data stored in the storage device 14c. The timing at which the positioning device 14e receives sensor data need not coincide with the timing at which the LRF 15 outputs it; for example, the LRF 15 may output sensor data every 25 milliseconds while the positioning device 14e receives it every 100 milliseconds. The positioning device 14e identifies the self-position by matching the sensor data against the map data; its specific operation is described later.
In this embodiment the microcomputer 14a and the positioning device 14e are treated as separate components, but this is only an example: they may be a single chip circuit or semiconductor integrated circuit capable of performing both sets of operations independently. FIG. 9 shows a chip circuit 14g encompassing the microcomputer 14a and the positioning device 14e. In this specification, the microcomputer 14a, the positioning device 14e, and/or the chip circuit 14g may be referred to as a computer, an arithmetic circuit, or a processing circuit. The description below assumes that the microcomputer 14a and the positioning device 14e are provided separately.
The four motors 16a to 16d are attached to the four wheels 11a to 11d, respectively, and rotate them. The number of motors mounted on one AGV 10 is not limited to four, and traction motors need not be attached to all four wheels 11a to 11d; typically they are attached to two wheels. The AGV 10 may also be provided with wheels and a motor for steering, or with motors for other purposes.
The drive device 17 has motor drive circuits 17a to 17d for adjusting the current flowing to each of the four motors 16a to 16d, and corresponds to the drive device 104 of FIG. 1. Each of the motor drive circuits 17a to 17d is a so-called inverter circuit that switches the current to its motor on and off according to the PWM signal transmitted from the microcomputer 14a, thereby adjusting the current flowing to the motor.
Next, an example of the process by which the AGV 10 according to the present disclosure generates a map will be described.
The map data in this embodiment can be created by SLAM (Simultaneous Localization and Mapping) technology. For example, the AGV 10 operates the LRF 15 while actually traveling through the factory in which it is used, scanning the surrounding space and generating a map while estimating its own position. Alternatively, the AGV 10 may travel a specific route under the control of an administrator and generate the map from the sensor data acquired by the LRF 15. The map data may be created by "online processing" while the AGV 10 moves, or by "offline processing" on a computer outside the AGV 10 using the large amount of sensor data acquired during the movement.
FIGS. 10 to 14 each show the AGV 10 generating a map while moving. FIG. 10 shows the AGV 10 scanning the surrounding space with the LRF 15; a laser beam is emitted at each predetermined angle to perform the scan.
In each of FIGS. 10 to 14, the positions of the laser-beam reflection points are indicated by points marked "・", such as point 4 in FIG. 10. Together these points form point cloud data. The positioning device 14e accumulates the positions of the points 4 obtained during travel, for example in the memory 14b. As the AGV 10 keeps scanning while traveling, the map is gradually completed. In FIGS. 11 to 14, only the scan range is shown for simplicity; that range is also an example and differs from the 270-degree total described above.
FIG. 15 schematically shows part of the completed map 30. The positioning device 14e stores the data of the map 30 in the memory 14b or the storage device 14c. The number and density of the points shown are examples.
FIGS. 16 to 18 schematically show the procedure of a general position identification process. The AGV holds in advance a map corresponding to the map 30 of FIG. 16 (hereinafter the "reference map 30"), acquired by SLAM technology. While traveling, the AGV acquires the sensor data 32 shown in FIG. 16 at predetermined time intervals and executes a process of identifying its own position on the reference map 30.
First, the AGV sequentially sets various local maps on the reference map 30 with the position and angle of the AGV varied (for example, local maps 34a, 34b, 34c), and matches the reflection points contained in each against the reflection points contained in the sensor data 32. Such matching can be performed, for example, by the ICP matching described above. For the matching to be efficient, the position and angle of the AGV on the reference map 30, that is, the pose, must be set appropriately. In this embodiment, the microcomputer 14a of FIG. 9 sets the initial position by the method described with reference to FIG. 3 and gives it to the positioning device 14e.
FIG. 17 schematically marks the points judged to match as a result of the comparison (for example, point 5) with the symbol "■". For example, when the local map 34d that minimizes the root mean square of the distances (errors) between corresponding features is selected, the position and angle of the AGV estimated by the matching are determined from the position and angle of the AGV corresponding to that local map 34d. In FIG. 18, the identified self-position 36 is represented by the symbol "X".
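As one concrete instance of such matching, a minimal point-to-point ICP iteration is sketched below: it alternates nearest-neighbor correspondence with the closed-form rigid alignment (SVD of the cross-covariance), which minimizes exactly the root-mean-square residual mentioned above. This is a textbook 2-D ICP under simplifying assumptions (brute-force neighbor search, no outlier rejection), not the disclosed device's implementation.

```python
import numpy as np

def icp_2d(scan_pts, map_pts, init_pose, iterations=20):
    """Align scan_pts (N x 2, sensor frame) to map_pts (M x 2, map frame).

    init_pose: (x, y, theta) initial guess, i.e. the corrected estimate
    x3 + Δx supplied by the arithmetic circuit.
    Returns the refined pose and the final RMS residual.
    """
    x, y, theta = init_pose
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = np.array([x, y])
    pts = scan_pts @ R.T + t                     # scan in map frame

    for _ in range(iterations):
        # 1. Correspondence: nearest map point for each scan point
        #    (brute force, O(N*M) memory; fine for a sketch)
        d2 = ((pts[:, None, :] - map_pts[None, :, :]) ** 2).sum(axis=2)
        nn = map_pts[d2.argmin(axis=1)]
        # 2. Closed-form rigid transform minimizing squared residuals
        p_mean, q_mean = pts.mean(axis=0), nn.mean(axis=0)
        H = (pts - p_mean).T @ (nn - q_mean)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:            # keep a proper rotation
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = q_mean - R_step @ p_mean
        pts = pts @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step

    # RMS residual of the moved points against the last correspondences
    rms = np.sqrt(((pts - nn) ** 2).sum(axis=1).mean())
    theta_out = np.arctan2(R[1, 0], R[0, 0])
    return (t[0], t[1], theta_out), rms
```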
To perform matching as shown in FIG. 16, the latest sensor data and the position of the AGV when that sensor data was acquired (more precisely, its pose) are required. As long as the AGV does not lose its self-position, the current self-position can be estimated with high accuracy using the previous estimated (identified) position as the initial position, and such updates of the estimate can be achieved, for example, at a 100-millisecond period. As described with reference to FIG. 2, however, once the AGV loses its self-position, matching takes a long time if the AGV position needed to start it (the initial position) has shifted greatly from the actual position, and the AGV then has to stop or decelerate for the matching. According to the embodiment of the present disclosure, as described with reference to FIG. 3, when the self-position is lost, the initial position is predicted appropriately from the movement distance and direction of the AGV and the processing time, so efficient matching is achieved from an appropriate initial position.
FIG. 19 is a flowchart showing an example of the position identification process after the self-position is lost. This example is described with reference to FIGS. 9 and 19.
First, after losing the self-position, the microcomputer 14a of FIG. 9 acquires odometry information in step S10 of FIG. 19. "After losing the self-position" means at or after time t4 in FIG. 3. The odometry information can be acquired from a rotary encoder (not shown) or the like.
Next, in step S20, the microcomputer 14a corrects the previous identified position and calculates the initial position needed for matching. Specifically, the microcomputer 14a calculates the moving speed of the AGV from the odometry information, determines the time remaining until the latest sensor data will have been acquired (time t6 in FIG. 3), and calculates the movement distance of the AGV from that time and the moving speed. From this movement distance and the AGV's direction of movement, the microcomputer 14a then predicts the position of the AGV at the time the latest sensor data is acquired (time t6 in FIG. 3).
In step S30, the microcomputer 14a gives the predicted position to the positioning device 14e as the initial position for matching.
In step S40, the positioning device 14e acquires the latest sensor data from the LRF 15.
In step S50, the positioning device 14e starts the matching for position identification using the above initial position and the latest sensor data. When the matching is complete and the self-position has been identified, the positioning device 14e outputs that self-position as the position estimate.
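Putting the pieces together, steps S10 to S50 can be read as the following glue code, reusing the predicted_pose and icp_2d sketches above. The encoder and LRF accessor methods are hypothetical names introduced only for this illustration:

```python
def relocalize(encoder, lrf, map_pts, last_pose, v, omega, lookahead):
    """One pass through steps S10-S50 of FIG. 19 (illustrative glue code)."""
    # S10: odometry accumulated since the last good estimate
    odo_dx, odo_dy, odo_dtheta = encoder.displacement_since_last_fix()
    # S20: correct the previous identified position into an initial
    # position, extrapolating over the time left until the scan is ready
    init = predicted_pose(*last_pose, odo_dx, odo_dy, odo_dtheta,
                          v, omega, lookahead)
    # S30 + S40: the initial position and the newest scan go to the matcher
    scan_pts = lrf.latest_scan_points()
    # S50: matching against the reference map from the predicted pose
    pose, rms = icp_2d(scan_pts, map_pts, init)
    return pose, rms
```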
FIG. 20 shows the hardware configuration of the travel management device 20. The travel management device 20 has a CPU 21, a memory 22, a storage device 23, a communication circuit 24, and an image processing circuit 25, connected by a communication bus 27 so that they can exchange data with one another.
The CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20; typically it is a semiconductor integrated circuit.
The memory 22 is a volatile storage device that stores the computer program executed by the CPU 21, and can also be used as working memory when the CPU 21 performs computations.
The storage device 23 stores the map data created by the AGV 10. It may be a nonvolatile semiconductor memory, a magnetic recording medium typified by a hard disk, or an optical recording medium typified by an optical disc.
The storage device 23 can also store position data indicating each position that can serve as a destination of the AGV 10, which is needed for the device to function as the travel management device 20. The position data can be represented, for example, by coordinates virtually set within the factory by the administrator, who determines the position data.
The communication circuit 24 performs wired communication conforming, for example, to the Ethernet (registered trademark) standard. It is wired to the wireless access points 2a, 2b, etc., and can communicate with the AGV 10 through them.
The communication circuit 24 receives the data and commands specifying the positions to which the AGV 10 should go from the CPU 21 via the bus 27 and transmits them to the AGV 10, where they are received by the communication circuit 14d shown in FIG. 9. The communication circuit 24 also transmits the data received from the communication circuit 14d of the AGV 10 (for example, notifications and position information) to the CPU 21 and/or the memory 22 via the bus 27. The AGV 10 periodically transmits the self-position information (position and angle) output by the positioning device 14e to the communication circuit 24 of the travel management device 20; this period can be, for example, 100 milliseconds to 1 second.
The image processing circuit 25 generates the video data displayed on an external monitor 29 and operates only when the administrator operates the travel management device 20; further details are omitted in this embodiment. The monitor 29 may be integrated with the travel management device 20, and the processing of the image processing circuit 25 may instead be performed by the CPU 21.
The technology of the present disclosure can be used widely in moving bodies that perform a process of identifying their self-position.
Reference Signs List: 2a, 2b ... wireless access point; 10 ... automated guided vehicle (AGV); 14 ... travel control device; 14a ... microcomputer; 14b ... memory; 14c ... storage device; 14d ... communication circuit; 14e ... positioning device; 15 ... LRF; 16a to 16d ... motor; 17 ... drive device; 17a to 17d ... motor drive circuit; 20 ... travel management device

Claims (10)

1. A moving body comprising:
a motor;
a drive device that controls the motor to move the moving body;
an external sensor that senses the surrounding space and periodically outputs sensor data;
a storage device that stores map data;
a positioning device that performs a process of estimating a position of the moving body using the sensor data and the map data, and sequentially outputs estimated values of the position; and
an arithmetic circuit,
wherein, when the positioning device starts a process of estimating the position of the moving body at a second time after outputting an estimated value of the position of the moving body at a first time, the arithmetic circuit corrects the estimated value of the position at the first time based on a distance and a direction in which the moving body moves between the first time and the second time, and gives the corrected estimated value to the positioning device as an initial position of the moving body at the second time.

2. The moving body according to claim 1, wherein the positioning device outputs information indicating a reliability of the estimation when it outputs an estimated value of the position of the moving body, and
the arithmetic circuit gives the corrected estimated value to the positioning device as the initial position of the moving body at the second time when, between the first time and the second time, the reliability of an estimated value of the position of the moving body output from the positioning device becomes less than a set value.

3. The moving body according to claim 1, wherein the arithmetic circuit gives the corrected estimated value to the positioning device as the initial position of the moving body at the second time in response to a request from the positioning device made between the first time and the second time.

4. The moving body according to any one of claims 1 to 3, wherein the positioning device estimates the position of the moving body at the second time using the initial position of the moving body at the second time given from the arithmetic circuit and the sensor data output from the external sensor by the second time, and outputs the estimated value of the position of the moving body at the second time at a third time.

5. The moving body according to any one of claims 1 to 3, wherein the positioning device estimates the position of the moving body using the initial position of the moving body at the second time given from the arithmetic circuit and the sensor data output from the external sensor, and outputs the estimated value of the position of the moving body at a third time.

6. The moving body according to any one of claims 1 to 5, wherein the arithmetic circuit determines the distance and the direction in which the moving body moves between the first time and the second time according to an operating state of the drive device.

7. The moving body according to any one of claims 1 to 5, further comprising an internal sensor that outputs odometry information,
wherein the arithmetic circuit determines the distance and the direction in which the moving body moves between the first time and the second time using the odometry information output from the internal sensor.

8. The moving body according to any one of claims 1 to 7, wherein the map data is data created based on the sensor data periodically output from the external sensor while the moving body is moving.

9. The moving body according to any one of claims 1 to 8, wherein the drive device controls the motor based on a command value of the position of the moving body and the estimated value of the position of the moving body output from the positioning device.

10. A self-position estimation device to be mounted on and used in a moving body including an external sensor that senses the surrounding space and periodically outputs sensor data, the self-position estimation device comprising:
a storage device that stores map data;
a positioning device that performs a process of estimating a position of the moving body using the sensor data and the map data, and sequentially outputs estimated values of the position; and
an arithmetic circuit,
wherein, when the positioning device starts a process of estimating the position of the moving body at a second time after outputting an estimated value of the position of the moving body at a first time, the arithmetic circuit corrects the estimated value of the position at the first time based on a distance and a direction in which the moving body moves between the first time and the second time, and gives the corrected estimated value to the positioning device as an initial position of the moving body at the second time.
Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009205465A (en) * 2008-02-28 2009-09-10 Toyota Motor Corp Autonomous mobile body
JP2010117847A (en) * 2008-11-12 2010-05-27 Toyota Motor Corp Moving object, moving object control system, and control method for moving object
JP2016110576A (en) * 2014-12-10 2016-06-20 株式会社豊田中央研究所 Self position estimation device and mobile body with self position estimation device
JP2016224680A (en) * 2015-05-29 2016-12-28 株式会社豊田中央研究所 Self-position estimation device and mobile body having self-position estimation device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009205465A (en) * 2008-02-28 2009-09-10 Toyota Motor Corp Autonomous mobile body
JP2010117847A (en) * 2008-11-12 2010-05-27 Toyota Motor Corp Moving object, moving object control system, and control method for moving object
JP2016110576A (en) * 2014-12-10 2016-06-20 株式会社豊田中央研究所 Self position estimation device and mobile body with self position estimation device
JP2016224680A (en) * 2015-05-29 2016-12-28 株式会社豊田中央研究所 Self-position estimation device and mobile body having self-position estimation device

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2020137312A1 (en) * 2018-12-28 2020-07-02
JPWO2020137311A1 (en) * 2018-12-28 2020-07-02
WO2020137311A1 (en) * 2018-12-28 2020-07-02 パナソニックIpマネジメント株式会社 Positioning device and moving object
WO2020137315A1 (en) * 2018-12-28 2020-07-02 パナソニックIpマネジメント株式会社 Positioning device and mobile body
WO2020137312A1 (en) * 2018-12-28 2020-07-02 パナソニックIpマネジメント株式会社 Positioning device and mobile body
JPWO2020137315A1 (en) * 2018-12-28 2021-11-11 パナソニックIpマネジメント株式会社 Positioning device and mobile
JP7336753B2 (en) 2018-12-28 2023-09-01 パナソニックIpマネジメント株式会社 Positioning device and moving body
JP7336752B2 (en) 2018-12-28 2023-09-01 パナソニックIpマネジメント株式会社 Positioning device and moving object
JP7482453B2 (en) 2018-12-28 2024-05-14 パナソニックIpマネジメント株式会社 Positioning device and mobile object
US20220066466A1 (en) * 2020-09-03 2022-03-03 Honda Motor Co., Ltd. Self-position estimation method

Also Published As

Publication number Publication date
JPWO2018179960A1 (en) 2020-02-13

Similar Documents

Publication Publication Date Title
US11373395B2 (en) Methods and systems for simultaneous localization and calibration
JP6769659B2 (en) Mobile management systems, methods, and computer programs
JP6711138B2 (en) Self-position estimating device and self-position estimating method
WO2018179960A1 (en) Mobile body and local position estimation device
JP6825712B2 (en) Mobiles, position estimators, and computer programs
US11537140B2 (en) Mobile body, location estimation device, and computer program
KR102341712B1 (en) Robot with high accuracy position determination, and method for operating the same
JP2009237851A (en) Mobile object control system
JP2011141663A (en) Automated guided vehicle and travel control method for the same
US20230333568A1 (en) Transport vehicle system, transport vehicle, and control method
JP2020004342A (en) Mobile body controller
JP2000172337A (en) Autonomous mobile robot
JP2019079171A (en) Movable body
JP7396353B2 (en) Map creation system, signal processing circuit, mobile object and map creation method
JP2021056764A (en) Movable body
WO2018180175A1 (en) Mobile body, signal processing device, and computer program
US10990104B2 (en) Systems and methods including motorized apparatus for calibrating sensors
JP6751469B2 (en) Map creation system
EP3605263B1 (en) Moving body management system, moving body, traveling management device, and computer program
US20240142992A1 (en) Map generation device and map generation system
US20240338024A1 (en) Autonomous Driving Control Apparatus, System Including The Same, And Method Thereof
WO2020138115A1 (en) Autonomous moving body
WO2021220331A1 (en) Mobile body system
WO2019059299A1 (en) Operation management device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18774664

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019508736

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18774664

Country of ref document: EP

Kind code of ref document: A1