
WO2024201821A1 - Control device, control method, and program - Google Patents

Control device, control method, and program

Info

Publication number
WO2024201821A1
Authority
WO
WIPO (PCT)
Prior art keywords
pedestrian
moving body
route
leading
pedestrians
Prior art date
Application number
PCT/JP2023/012921
Other languages
French (fr)
Japanese (ja)
Inventor
燦心 松▲崎▼
美紗 小室
あかね 今泉
Original Assignee
本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 本田技研工業株式会社 (Honda Motor Co., Ltd.)
Priority to PCT/JP2023/012921 priority Critical patent/WO2024201821A1/en
Publication of WO2024201821A1 publication Critical patent/WO2024201821A1/en


Landscapes

  • Traffic Control Systems (AREA)

Abstract

A control device controls a moving body that autonomously moves through an area in which other pedestrians walk while at least temporarily leading a person being led. The control device comprises a recognition unit that recognizes objects including the person being led and the other pedestrians, a prediction unit that predicts the future positions of the recognized other pedestrians, a route generation unit that generates a route along which the moving body is to travel in the future, and a drive control unit that controls a drive device attached to the moving body so that the moving body moves along the route. On the basis of the distance and relative speed between the person being led and another pedestrian, an index value representing the degree to which that pedestrian affects the person being led is calculated, and the route is generated so that the moving body is positioned at an intermediate location between the person being led and that pedestrian at the time when the index value is maximized.

Description

Control device, control method, and program

The present invention relates to a control device, a control method, and a program.

In recent years, moving bodies (referred to as robots, micromobility, and the like) that move autonomously while following a user, for purposes such as carrying the user's luggage, have been put into practical use. An invention of a travel control device for micromobility has been disclosed (Patent Document 1). Research is also being conducted on moving bodies that not only follow a user but also move autonomously while leading the user.

[Patent Document 1] Japanese Unexamined Patent Application Publication (Translation of PCT Application) No. 2020-529050

D. Helbing and P. Molnar, "Social Force Model for Pedestrian Dynamics," 20 May 1998.

In the conventional technology, an appropriate route could not always be generated depending on the situation in which the moving body was placed. For example, no consideration was given to how the moving body should behave when leading the leading subject.

The present invention has been made in consideration of these circumstances, and one of its objects is to provide a control device, a control method, and a program capable of generating a route for a moving body such that the moving body behaves appropriately with respect to a leading subject.

The control device, control method, and program according to the present invention employ the following configurations.
(1): A control device according to one aspect of the present invention is a control device that controls a moving body that moves autonomously in an area where other pedestrians walk while at least temporarily leading a leading subject, and includes: a recognition unit that recognizes objects including the leading subject and the other pedestrians; a prediction unit that predicts future positions of the recognized other pedestrians; a route generation unit that generates a route along which the moving body should travel in the future; and a drive control unit that controls a drive device attached to the moving body so that the moving body moves along the route, wherein the route generation unit calculates an index value representing the degree of influence that the other pedestrian has on the leading subject based on the distance and relative speed between the leading subject and the other pedestrian, and generates the route so that the moving body is located at an intermediate point between the leading subject and the other pedestrian at the time when the index value is maximum.

(2): In the aspect of (1) above, the index value is calculated to be larger as the distance becomes shorter and as the component of the relative speed in the direction in which the other pedestrian approaches the leading subject becomes larger.

(3): In the aspect of (1) above, when a plurality of intermediate points respectively corresponding to a plurality of the other pedestrians have a predetermined relationship with one another, the route generation unit generates the route so that the moving body passes through an integrated intermediate point that combines the plurality of intermediate points.

(4): In the aspect of (3) above, the predetermined relationship is a relationship in which the plurality of intermediate points are located within a predetermined distance of one another and in which passing through the plurality of intermediate points would produce a sharp turn of less than a predetermined angle in the route of the moving body.

(5): A control method according to another aspect of the present invention is a control method executed by a processor of a control device that controls a moving body that moves autonomously in an area where other pedestrians walk while at least temporarily leading a leading subject, and includes: recognizing objects including the leading subject and the other pedestrians; predicting future positions of the recognized other pedestrians; generating a route along which the moving body should travel in the future; and controlling a drive device attached to the moving body so that the moving body moves along the route, wherein generating the route includes calculating an index value representing the degree of influence that the other pedestrian has on the leading subject based on the distance and relative speed between the leading subject and the other pedestrian, and generating the route so that the moving body is located at an intermediate point between the leading subject and the other pedestrian at the time when the index value is maximum.

(11): A program according to another aspect of the present invention causes a processor of a control device that controls a moving body that moves autonomously in an area where other pedestrians walk while at least temporarily leading a leading subject to execute: recognizing objects including the leading subject and the other pedestrians; predicting future positions of the recognized other pedestrians; generating a route along which the moving body should travel in the future; and controlling a drive device attached to the moving body so that the moving body moves along the route, wherein generating the route includes calculating an index value representing the degree of influence that the other pedestrian has on the leading subject based on the distance and relative speed between the leading subject and the other pedestrian, and generating the route so that the moving body is located at an intermediate point between the leading subject and the other pedestrian at the time when the index value is maximum.

According to the aspects (1) to (11), a route for a moving body can be generated so that the moving body behaves appropriately with respect to the leading subject.

FIG. 1 is a configuration diagram of a moving body.
FIG. 2 is a configuration diagram of a control device.
FIG. 3 is a flowchart showing an example of processing performed by a prediction unit.
FIG. 4 is a diagram for explaining an example of a method for generating an ideal route.
FIG. 5 is a diagram illustrating how intermediate points are set.
FIG. 6 is a diagram illustrating a situation in which a route cannot be conveniently generated.
FIG. 7 is a diagram showing an example of a route in a case where an integrated intermediate point combining a plurality of intermediate points is set by an integration processing unit.

[Overview]
Hereinafter, embodiments of the control device, control method, and program of the present invention will be described with reference to the drawings. The control device of the present invention controls a drive device of a moving body to move the moving body. The moving body in the present invention moves autonomously through an area where pedestrians walk while leading a leading subject. The area where pedestrians walk is, for example, a sidewalk, a public open space, or a floor inside a building, and may include a roadway. In the following description, it is assumed that no person rides on the moving body, but a person may ride on it. The leading subject is, for example, one of the pedestrians, but may also be a robot or an animal (hereinafter referred to as the user U). The moving body, for example, moves a little ahead of an elderly user U while heading toward a destination point given in advance, thereby operating so that other pedestrians who would hinder the movement of the user U do not come too close to the user U (that is, so as to clear a way for the user U). The user U is not limited to an elderly person and may be a person who tends to have difficulty walking, a child, a person shopping at a supermarket, a patient moving within a hospital, a pet taking a walk, or the like. Such an operation need not be performed constantly and may be performed only temporarily. For example, when the moving body travels alongside or follows the user and detects a predetermined situation in the user's direction of travel (for example, the presence of an obstacle or congested traffic), the moving body may temporarily lead the user by executing the algorithm of the present invention.

[Basic configuration]
FIG. 1 is a configuration diagram of a moving body 1. The moving body 1 is equipped with, for example, an HMI (Human Machine Interface) 10, an object detection device 20, a drive device 30, a sensor 40, and a control device 100. These components are supported or housed by a base body 5.

The HMI 10 presents various kinds of information to the user U and accepts input operations by the user U. The HMI 10 includes various display devices, a speaker, a buzzer, a touch panel, switches, keys, a short-range wireless communication device, and the like. For example, the HMI 10 accepts the setting of a destination point.

The object detection device 20 is a device that generates data for recognizing the user U and other pedestrians present around the moving body 1. The object detection device 20 includes, for example, a camera whose imaging range is the surroundings of the moving body 1. The object detection device 20 may also include sensors such as a radar device, a LIDAR (Light Detection and Ranging) sensor, and an ultrasonic sensor, as well as an object recognition device that identifies objects by performing sensor fusion processing on the outputs of these sensors.

The drive device 30 is a mechanism for moving the moving body 1, including the base body 5, in any direction. The drive device 30 includes, for example, a plurality of wheels, a drive motor attached to one or more of the wheels, and a steering device attached to one or more of the wheels. There are no particular restrictions on the configuration of the drive device 30, and it may have any configuration. In principle, the drive device 30 moves the moving body 1 while keeping the front surface of the base body 5 facing the traveling direction of the moving body 1.

The sensor 40 is a sensor for detecting the behavior of the moving body 1. The sensor 40 includes, for example, a wheel speed sensor that detects the wheel speed, an acceleration sensor that detects the acceleration acting on the moving body 1, a yaw rate sensor attached near the center of gravity of the base body 5 in the horizontal direction, a steering angle sensor that detects the steering angle of the steered wheels, and an orientation sensor that detects the horizontal orientation of the moving body 1.

FIG. 2 is a configuration diagram of the control device 100. The control device 100 includes, for example, a recognition unit 110, a prediction unit 120, a route generation unit 130, and a drive control unit 140. The route generation unit 130 includes, for example, an index value calculation unit 132, an intermediate point setting unit 134, an integration processing unit 136, and a future position determination unit 138. These components are realized, for example, by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration) circuit, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation between software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD (Hard Disk Drive) or a flash memory, or may be stored in a removable storage medium (non-transitory storage medium) such as a DVD or a CD-ROM and installed in the storage device by inserting the storage medium into a drive device.

The recognition unit 110 recognizes objects including other pedestrians (hereinafter simply referred to as pedestrians P) and the user U based on the data output by the object detection device 20. When the object detection device 20 is a camera, the recognition unit 110 recognizes pedestrians by inputting camera images into a trained model for identifying pedestrians. The same applies to objects other than pedestrians. In order to distinguish the pedestrians P from the user U, the recognition unit 110 may hold, in a storage unit (not shown), a plurality of images of the user U captured in advance by the camera as templates, and identify the user U by comparing the templates with the camera images, as in the sketch below. Alternatively, the recognition unit 110 may recognize the position of the user U by having the HMI 10 perform short-range wireless communication with a terminal device carried by the user U and exploiting the directivity of the communication.
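As an illustration of the template comparison mentioned above, the following sketch matches pre-registered grayscale templates of the user U against a camera frame by sliding-window comparison. It is a minimal sketch only; the patent does not specify a matching algorithm, and the function names, the sum-of-squared-differences criterion, and the threshold are assumptions.

```python
# Hypothetical sketch of identifying user U by comparing stored templates with a
# camera image; the SSD criterion and all names/values here are assumptions.
import numpy as np

def locate_user(frame, templates, max_ssd=1e3):
    """Slide each stored template of user U over the frame (2-D grayscale array)
    and return the (row, col) of the best match, or None if nothing is close enough."""
    best_score, best_pos = None, None
    H, W = frame.shape
    for tpl in templates:
        h, w = tpl.shape
        for r in range(H - h + 1):
            for c in range(W - w + 1):
                ssd = float(np.sum((frame[r:r + h, c:c + w] - tpl) ** 2))
                if ssd <= max_ssd and (best_score is None or ssd < best_score):
                    best_score, best_pos = ssd, (r, c)
    return best_pos

# Toy usage with synthetic data standing in for camera images.
rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, (60, 80))
template = frame[20:35, 30:45].copy()    # pretend this patch was registered in advance
print(locate_user(frame, [template]))    # -> (20, 30)
```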

The prediction unit 120 predicts future positions of the pedestrians P. In addition, while the route generation unit 130 repeats its processing, the prediction unit 120 may also predict future positions of the user U based on the previous route Rm(z) generated one cycle earlier.

FIG. 3 is a flowchart showing an example of the processing performed by the prediction unit 120. First, the prediction unit 120 receives the previous route Rm(z) as input (S1) and generates an ideal route Ru#(i) for the case where the user U follows the moving body 1 (S2). The ideal route Ru#(i) is the route along which the user U is expected to move on the assumption that no pedestrian P is present around the moving body 1 and the user U. FIG. 4 is a diagram for explaining an example of a method for generating the ideal route Ru#(i). The route Rm is expressed as a series of route points Om, one for each step. A step is a future point in time that arrives at every predetermined time interval. For example, the prediction unit 120 generates the ideal route Ru# of the user U so that the movement vector Vu-1 of the user U from step 0 (the time of calculation) to step 1 points toward the position Om-1 of the moving body 1 at step 1 of the previous route Rm(z), the movement vector Vu-2 from step 1 to step 2 points toward the position Om-2 of the moving body 1 at step 2 of the previous route Rm(z), and the movement vector Vu-3 from step 2 to step 3 points toward the position Om-3 of the moving body 1 at step 3 of the previous route Rm(z); in other words, so that the user U follows the moving body 1 with a delay of one cycle. This generation method is merely an example, and other generation methods may be adopted. For example, the prediction unit 120 may generate the ideal route Ru# of the user U by a method similar to the prediction of the future positions of the pedestrians P described later.
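The one-cycle-delayed following behavior described above can be sketched as follows. This is only an illustration under assumptions the patent leaves open: the per-step travel distance of the user (`step_len`) and the function and variable names are hypothetical.

```python
# Minimal sketch of generating the ideal route Ru#: at every step the user's
# movement vector points at the moving body's position on the previous route
# Rm(z); the fixed per-step distance `step_len` is an assumed parameter.
import numpy as np

def ideal_route(user_pos, prev_route_rm, step_len=0.5):
    """prev_route_rm: array (q, 2) of positions Om-1..Om-q on the previous route Rm(z).
    Returns the predicted user positions Ru# as an array of shape (q, 2)."""
    ru = []
    pos = np.asarray(user_pos, dtype=float)
    for om in np.asarray(prev_route_rm, dtype=float):
        vec = om - pos                      # movement vector Vu points at Om of the next step
        dist = np.linalg.norm(vec)
        if dist > 1e-9:
            pos = pos + vec / dist * min(step_len, dist)
        ru.append(pos.copy())
    return np.array(ru)

# Toy usage: the previous route heads along the x axis; the user trails it by one cycle.
rm_prev = np.column_stack([np.linspace(1.0, 5.0, 9), np.zeros(9)])
print(ideal_route(user_pos=[0.0, 0.5], prev_route_rm=rm_prev))
```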

Returning to FIG. 3, the prediction unit 120 predicts, for each pedestrian P (k = 1 to m), the future positions of that pedestrian P and outputs a predicted route Rp(k) (S3). For example, the prediction unit 120 fits the past movement trajectory of the pedestrian P to a straight line or a curve expressed by a function and predicts the future positions of the pedestrian P on the assumption that the pedestrian P will continue to move along that line or curve. The prediction unit 120 predicts the future positions of the pedestrian P up to q steps ahead, assuming, for example, that the pedestrian P moves at a constant speed without accelerating or decelerating (if the pedestrian P has been accelerating or decelerating up to the time of prediction, that tendency may be carried over). The prediction unit 120 outputs the ideal route Ru# of the user U and the predicted routes Rp(k) of the pedestrians P to the route generation unit 130.
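A constant-velocity version of the prediction above can be sketched as follows; the straight-line fit with numpy.polyfit is one of several possibilities the text allows, and the names are hypothetical.

```python
# Minimal sketch of predicting the route Rp(k) of one pedestrian P by fitting a
# straight line (constant velocity) to its recent track.
import numpy as np

def predict_pedestrian(track, q_steps):
    """track: observed positions of pedestrian P, shape (n, 2), one per past step.
    Returns the predicted positions for the next q_steps steps, shape (q_steps, 2)."""
    track = np.asarray(track, dtype=float)
    t = np.arange(len(track))
    vx, x0 = np.polyfit(t, track[:, 0], 1)   # x(t) ~ vx*t + x0
    vy, y0 = np.polyfit(t, track[:, 1], 1)   # y(t) ~ vy*t + y0
    future_t = np.arange(len(track), len(track) + q_steps)
    return np.column_stack([x0 + vx * future_t, y0 + vy * future_t])

# Toy usage: a pedestrian walking diagonally at roughly constant speed.
past = [[0.0, 0.0], [0.4, 0.3], [0.8, 0.6], [1.2, 0.9]]
print(predict_pedestrian(past, q_steps=3))
```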

The index value calculation unit 132 of the route generation unit 130 calculates, for each pedestrian P and for each future step, an index value F1 representing the degree of influence that the pedestrian P has on the user U. The index value F1 is calculated to be larger as the distance between the pedestrian P and the user U becomes shorter and as the relative speed between the pedestrian P and the user U becomes higher. The index value F1 is expressed by, for example, Equation (1), where d_up is the distance between the pedestrian P and the user U, and ω_up is calculated by Equation (2), in which v_up is the relative velocity between the pedestrian P and the user U and λ and η are weighting coefficients. As expressed by Equation (3), u_up is the vector obtained by normalizing ω_up, and e_up is the normalized direction vector (the direction of the pedestrian P as seen from the user U) expressed by Equation (4).

$F_1 = -\{\exp(-\eta \cdot d_{up}) \cdot \lVert \omega_{up} \rVert\} \cdot u_{up}$   ...(1)
$\omega_{up} = \lambda \cdot v_{up} + e_{up}$   ...(2)
$u_{up} = \omega_{up} / \lVert \omega_{up} \rVert$   ...(3)
$e_{up} = (P_p - P_u) / \lVert P_p - P_u \rVert$   ...(4)
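For illustration, Equations (1) to (4) can be evaluated as in the following sketch. The values of λ and η and the sign convention of the relative velocity v_up are not fixed by the text and are assumptions here, as are the function and parameter names.

```python
# Minimal sketch of the index value F1 from equations (1)-(4); lambda_ and eta
# stand for the weighting coefficients λ and η, whose values are not specified.
import numpy as np

def index_value_f1(p_pos, p_vel, u_pos, u_vel, lambda_=1.0, eta=0.5):
    """Return the vector F1 expressing the influence of pedestrian P on user U."""
    p_pos, u_pos = np.asarray(p_pos, float), np.asarray(u_pos, float)
    d_up = np.linalg.norm(p_pos - u_pos)                         # distance between P and U
    v_up = np.asarray(p_vel, float) - np.asarray(u_vel, float)   # relative velocity (sign convention assumed)
    e_up = (p_pos - u_pos) / (d_up + 1e-9)                       # eq. (4): direction of P as seen from U
    omega_up = lambda_ * v_up + e_up                              # eq. (2)
    u_up = omega_up / (np.linalg.norm(omega_up) + 1e-9)           # eq. (3)
    return -np.exp(-eta * d_up) * np.linalg.norm(omega_up) * u_up # eq. (1)

# Toy usage: with identical velocities, a pedestrian 1 m from the user yields a
# larger ||F1|| than one 4 m away, matching the distance dependence stated above.
near = index_value_f1([1, 0], [0, 0], [0, 0], [0, 0])
far = index_value_f1([4, 0], [0, 0], [0, 0], [0, 0])
print(np.linalg.norm(near), np.linalg.norm(far))
```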

The intermediate point setting unit 134 extracts, for each pedestrian P, the step at which the index value F1 calculated for each future step becomes maximum. It then obtains the position on the predicted route Rp(k) corresponding to the step at which the index value F1 is maximum, and sets an intermediate point between the pedestrian P and the user U at that step.

FIG. 5 is a diagram illustrating how intermediate points are set. In the figure, there are four pedestrians P, k = 1 to 4. Pedestrian P(1) is moving away from the user U, so its index value F1 is cut off by the threshold and is not taken into consideration for the route Rm. The index value F1 becomes maximum at step 7 for pedestrian P(2), at step 9 for pedestrian P(3), and at step 13 for pedestrian P(4). In the figure, P(2,7), P(3,9), and P(4,13) are the positions on the predicted routes corresponding to the steps (7, 9, and 13) at which the index values F1 of pedestrians P(2), P(3), and P(4) are maximum, respectively. U(7), U(9), and U(13) are the positions that the user U is predicted to pass at steps 7, 9, and 13, respectively. The intermediate point setting unit 134 sets the intermediate point MID(7) between P(2,7) and U(7), the intermediate point MID(9) between P(3,9) and U(9), and the intermediate point MID(13) between P(4,13) and U(13) as the intermediate points corresponding to steps 7, 9, and 13, respectively.
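A minimal sketch of the intermediate point selection just described, for a single pedestrian P: find the step at which ||F1|| peaks, discard the pedestrian if the peak is below a threshold (as for P(1) above), and take the midpoint of the pedestrian's and the user's predicted positions at that step. The names and the threshold value are hypothetical.

```python
# Minimal sketch of setting an intermediate point MID for one pedestrian P.
import numpy as np

def set_intermediate_point(f1_norms, rp, ru, threshold=0.05):
    """f1_norms: ||F1|| per future step for one pedestrian, shape (q,).
    rp: predicted route Rp(k) of that pedestrian, shape (q, 2).
    ru: ideal route Ru# of user U, shape (q, 2).
    Returns (step, MID position) or None if the pedestrian is cut off by the threshold."""
    f1_norms = np.asarray(f1_norms, float)
    step = int(np.argmax(f1_norms))          # step at which the influence peaks
    if f1_norms[step] < threshold:           # e.g. a pedestrian moving away is cut off
        return None
    mid = (np.asarray(rp, float)[step] + np.asarray(ru, float)[step]) / 2.0
    return step, mid

# Toy usage: the influence of this pedestrian peaks at step 2.
print(set_intermediate_point([0.1, 0.3, 0.6, 0.2],
                             rp=[[4, 0], [3, 0], [2, 0], [1, 0]],
                             ru=[[0, 0], [0, 1], [0, 2], [0, 3]]))
```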

The processing performed by the integration processing unit 136 will be described later.

The future position determination unit 138 generates the route Rm of the moving body 1 so that the set intermediate points are passed in step order. In addition, the future position determination unit 138 generates the route Rm of the moving body 1 by adding, by some method, a bias toward the destination point. The future position determination unit 138 adopts, as the route Rm of the moving body 1, a smooth curve that can pass through the intermediate points in step order and that leads toward the destination point as a whole. For example, the future position determination unit 138 generates the route Rm of the moving body 1 by appropriately combining spline curves, circular arcs, straight lines, and the like.
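One way to realize a smooth curve through the intermediate points in step order, with a bias toward the destination, is a spline fit as in the sketch below. SciPy's CubicSpline is an assumed implementation choice; the patent only states that spline curves, circular arcs, and straight lines may be combined, and treating the destination as a final knot at the planning horizon is likewise an assumption.

```python
# Minimal sketch of generating the route Rm as a smooth curve through the
# intermediate points in step order, biased toward the destination point.
import numpy as np
from scipy.interpolate import CubicSpline

def generate_route(start, intermediate_points, steps, destination, horizon=15):
    """intermediate_points: list of MID positions; steps: the step at which each
    must be passed; destination adds the bias toward the destination point.
    Returns the route Rm sampled at every step from 0 to `horizon`."""
    ts = np.array([0] + list(steps) + [horizon], dtype=float)
    pts = np.vstack([start] + list(intermediate_points) + [destination]).astype(float)
    spline = CubicSpline(ts, pts)            # smooth curve through the points in step order
    return spline(np.arange(horizon + 1))

# Toy usage: pass MID(7) and MID(9) on the way toward a destination at (10, 0).
rm = generate_route(start=[0, 0],
                    intermediate_points=[[3.5, 1.0], [4.5, 0.5]],
                    steps=[7, 9],
                    destination=[10, 0])
print(rm.shape)   # (16, 2): one position per step 0..15
```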

In this way, the moving body 1 is controlled so that, by its presence, other pedestrians P are kept from getting too close to the user U. The index value F1 should be maximum in the situation where a pedestrian P comes closest to the user U, and by placing itself at the intermediate point in that situation, the moving body 1 can restrain the pedestrian P from approaching the user U.

The example of FIG. 5 is one in which the intermediate points happen to line up conveniently in step order; in practice, if the intermediate points are set according to this principle and the route Rm is generated, a route Rm in which the moving body 1 goes back and forth may result. FIG. 6 is a diagram illustrating a situation in which the route Rm cannot be conveniently generated. As shown in the figure, MID(9) is closer to the moving body 1 than MID(7), so the route Rm cannot be generated smoothly. As a result, the behavior of the moving body 1 becomes unstable, which may confuse the user U and the pedestrians P.

To deal with this, when a plurality of set intermediate points have a predetermined relationship with one another, the integration processing unit 136 sets an integrated intermediate point that combines the plurality of intermediate points. The future position determination unit 138 then generates the route Rm so that the moving body 1 passes through the integrated intermediate point, without considering the individual intermediate points combined into the integrated intermediate point.

The predetermined relationship is a relationship in which the plurality of intermediate points are located within a predetermined distance of one another and in which passing through the plurality of intermediate points would produce a sharp turn of less than a predetermined angle (for example, about 90 degrees) in the route Rm of the moving body 1. A(7) in FIG. 6 indicates the range of the predetermined distance as seen from MID(7). In this example, MID(9) falls within A(7), and a sharp turn occurs in the route Rm of the moving body 1 when MID(7) and MID(9) are passed, so MID(7) and MID(9) are determined to have the predetermined relationship.
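The predetermined relationship can be checked, for example, as in the following sketch: the two intermediate points must lie within a predetermined distance of each other, and visiting the first and then the second must leave an interior angle smaller than the predetermined angle at the turn. The thresholds and names are assumptions made for illustration.

```python
# Minimal sketch of testing the predetermined relationship between two
# intermediate points: proximity plus a sharp turn of less than a given angle.
import numpy as np

def have_predetermined_relationship(robot_pos, mid_a, mid_b,
                                    dist_threshold=2.0, angle_threshold_deg=90.0):
    """True if the two intermediate points lie within dist_threshold of each other
    and passing both would put a turn sharper than angle_threshold_deg on route Rm."""
    robot_pos, mid_a, mid_b = (np.asarray(p, float) for p in (robot_pos, mid_a, mid_b))
    if np.linalg.norm(mid_a - mid_b) > dist_threshold:
        return False
    v_in = mid_a - robot_pos                 # heading into the first intermediate point
    v_out = mid_b - mid_a                    # heading out toward the second one
    cos_turn = np.dot(v_in, v_out) / (np.linalg.norm(v_in) * np.linalg.norm(v_out) + 1e-9)
    interior_angle = 180.0 - np.degrees(np.arccos(np.clip(cos_turn, -1.0, 1.0)))
    return interior_angle < angle_threshold_deg   # sharp turn of less than the predetermined angle

# Toy usage, mirroring FIG. 6: MID(9) lies behind MID(7) as seen from the moving body.
print(have_predetermined_relationship([0, 0], [4, 1], [2.5, 0.8]))   # -> True
```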

FIG. 7 is a diagram showing an example of the route Rm in a case where an integrated intermediate point combining a plurality of intermediate points is set by the integration processing unit 136. In the figure, MID*(7,9) is the integrated intermediate point combining MID(7) and MID(9). The integration processing unit 136 sets the integrated intermediate point by, for example, taking a weighted sum of both position and time using the ratio of the magnitudes of F1 (Equations (2) and (3) below). In the equations, F1(7) is the magnitude of the index value F1 of pedestrian P(3) at step 7, and F1(9) is the magnitude of the index value F1 of pedestrian P(2) at step 7. t* indicates the timing (step) at which the integrated intermediate point should be passed.

MID*(7,9) = {MID(9)×F1(2) + MID(7)×F1(3)} / {F1(7) + F1(9)}   ...(2)
t* = {9×F1(2) + 7×F1(3)} / {F1(7) + F1(9)}   ...(3)
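The weighted merge can be sketched generically as below, assuming each intermediate point and its passing step are weighted by the magnitude of the index value F1 associated with that point; this is one reading of Equations (2) and (3) above, and the concrete numbers are made up for illustration.

```python
# Minimal sketch of merging intermediate points into an integrated intermediate
# point MID* and a passing step t*, weighted by the magnitudes of F1.
import numpy as np

def integrate_intermediate_points(mids, steps, f1_mags):
    """mids: intermediate point positions, shape (n, 2); steps: their passing steps;
    f1_mags: magnitudes of the corresponding index values F1, used as weights."""
    mids = np.asarray(mids, float)
    steps = np.asarray(steps, float)
    w = np.asarray(f1_mags, float)
    mid_star = (mids * w[:, None]).sum(axis=0) / w.sum()
    t_star = float((steps * w).sum() / w.sum())
    return mid_star, t_star

# Toy usage, mirroring FIG. 7: merging MID(7) and MID(9).
mid_star, t_star = integrate_intermediate_points(
    mids=[[4.0, 1.0], [2.5, 0.8]],   # MID(7), MID(9)
    steps=[7, 9],
    f1_mags=[0.6, 0.4])              # assumed magnitudes of the index values F1
print(mid_star, t_star)
```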

By performing the processing in this way, the behavior of the moving body 1 can be stabilized.

According to the embodiment described above, the route of the moving body 1 can be generated so that the moving body 1 behaves appropriately with respect to the leading subject (the user U).

In the above description, the control device 100 is mounted on the moving body 1; however, the present invention is not limited to this, and the control device 100 may be installed at a location remote from the moving body 1, acquire the output data of the object detection device 20 via communication, and transmit a drive instruction signal to the drive device 30, that is, remotely control the moving body 1.

The embodiment described above can be expressed as follows.
A control device for controlling a moving body that autonomously moves in an area where other pedestrians walk while leading a leading subject, the control device comprising:
one or more storage media storing computer-readable instructions; and
a processor coupled to the one or more storage media,
the processor executing the computer-readable instructions to:
recognize objects including the leading subject and the other pedestrians;
predict future positions of the recognized other pedestrians;
generate a route along which the moving body should travel in the future; and
control a drive device attached to the moving body so that the moving body moves along the route,
wherein generating the route includes calculating an index value representing the degree of influence that the other pedestrian has on the leading subject based on the distance and relative speed between the leading subject and the other pedestrian, and generating the route so that the moving body is located at an intermediate point between the leading subject and the other pedestrian at the time when the index value is maximum.
Control device.

Although the modes for carrying out the present invention have been described above using an embodiment, the present invention is in no way limited to such an embodiment, and various modifications and substitutions can be made without departing from the spirit of the present invention.

REFERENCE SIGNS LIST
1 Moving body
20 Object detection device
30 Drive device
100 Control device
110 Recognition unit
120 Prediction unit
130 Route generation unit
132 Index value calculation unit
134 Intermediate point setting unit
136 Integration processing unit
138 Future position determination unit
140 Drive control unit

Claims (6)

1. A control device for controlling a moving body that autonomously moves in an area where other pedestrians walk while at least temporarily leading a leading subject, the control device comprising:
a recognition unit that recognizes objects including the leading subject and the other pedestrians;
a prediction unit that predicts future positions of the recognized other pedestrians;
a route generation unit that generates a route along which the moving body should travel in the future; and
a drive control unit that controls a drive device attached to the moving body so that the moving body moves along the route,
wherein the route generation unit calculates an index value representing the degree of influence that the other pedestrian has on the leading subject based on the distance and relative speed between the leading subject and the other pedestrian, and generates the route so that the moving body is located at an intermediate point between the leading subject and the other pedestrian at the time when the index value is maximum.
The control device according to claim 1, wherein the index value is calculated to be larger as the distance becomes shorter and as the component of the relative speed in the direction from the other pedestrian toward the leading subject becomes larger.
The control device according to claim 1, wherein, when a plurality of waypoints respectively corresponding to a plurality of the other pedestrians have a predetermined relationship with each other, the route generation unit generates the route such that the moving body passes through an integrated waypoint into which the plurality of waypoints are combined.
The control device according to claim 3, wherein the predetermined relationship is a relationship in which the plurality of waypoints are located within a predetermined distance of each other and in which passing through the plurality of waypoints would produce a sharp turn of less than a predetermined angle in the route of the moving body.
A control method executed by a processor of a control device that controls a moving body that autonomously moves, while at least temporarily leading a leading subject, through an area where other pedestrians walk, the control method comprising:
recognizing objects including the leading subject and the other pedestrians;
predicting future positions of the recognized other pedestrians;
generating a route that the moving body should take in the future; and
controlling a drive device attached to the moving body so that the moving body moves along the route,
wherein generating the route includes calculating an index value representing the degree of influence that the other pedestrians have on the leading subject based on the distance and relative speed between the leading subject and the other pedestrians, and generating the route such that the moving body is located at the midpoint between the leading subject and the other pedestrians at the time when the index value is maximum.
A program for causing a processor of a control device that controls a moving body that autonomously moves, while at least temporarily leading a leading subject, through an area where other pedestrians walk, to execute:
recognizing objects including the leading subject and the other pedestrians;
predicting future positions of the recognized other pedestrians;
generating a route that the moving body should take in the future; and
controlling a drive device attached to the moving body so that the moving body moves along the route,
wherein generating the route includes calculating an index value representing the degree of influence that the other pedestrians have on the leading subject based on the distance and relative speed between the leading subject and the other pedestrians, and generating the route such that the moving body is located at the midpoint between the leading subject and the other pedestrians at the time when the index value is maximum.
PCT/JP2023/012921 2023-03-29 2023-03-29 Control device, control method, and program WO2024201821A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/012921 WO2024201821A1 (en) 2023-03-29 2023-03-29 Control device, control method, and program

Publications (1)

Publication Number Publication Date
WO2024201821A1 true WO2024201821A1 (en) 2024-10-03

Family

ID=92903637

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/012921 WO2024201821A1 (en) 2023-03-29 2023-03-29 Control device, control method, and program

Country Status (1)

Country Link
WO (1) WO2024201821A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008234404A (en) * 2007-03-22 2008-10-02 Toyota Motor Corp Mobile robot and robot movement control method
JP2008307658A (en) * 2007-06-15 2008-12-25 Toyota Motor Corp Autonomous mobile device
JP2021163215A (en) * 2020-03-31 2021-10-11 株式会社エクォス・リサーチ Mobile controller, mobile control program and mobile

Similar Documents

Publication Publication Date Title
CN112840350B (en) Autonomous vehicle planning and prediction
JP7607641B2 (en) Modeling and predicting yielding behavior.
US11787438B2 (en) Collaborative vehicle path generation
US11554790B2 (en) Trajectory classification
WO2019124001A1 (en) Moving body behavior prediction device and moving body behavior prediction method
US9116521B2 (en) Autonomous moving device and control method thereof
JP2022506475A (en) Orbit generation
JP7450481B2 (en) Mobile object control device, mobile object, mobile object control method, and program
JP2006134221A (en) Follow-up movement device
WO2023028208A1 (en) Techniques for detecting road blockages and generating alternative routes
JP2023525054A (en) Trajectory classification
US12005925B1 (en) Collaborative action ambiguity resolution for autonomous vehicles
WO2024201821A1 (en) Control device, control method, and program
JP2022132902A (en) Mobile body control system, mobile body, mobile body control method, and program
US20240092350A1 (en) Vehicle safety system
WO2024201818A1 (en) Control device, control method, and program
JP2023522844A (en) Remote control for collaborative vehicle guidance
WO2025069393A1 (en) Control device, control method, and program
WO2024195078A1 (en) Control device, control method, and program
WO2025120847A1 (en) Control device, control method, and program
JP2022550122A (en) Autonomous Driving Optimization Method and System Based on Reinforcement Learning Based on User Preference
EP4557037A1 (en) Control device, control method, and program
WO2024134901A1 (en) Control device, control method, and program
JP6599817B2 (en) Arithmetic apparatus, arithmetic method and program
WO2025069293A1 (en) Filter creation method, filter creation device, program, and filter device