
WO2019124279A1 - Information processing device (Dispositif de traitement d'informations) - Google Patents

Information processing device

Info

Publication number
WO2019124279A1
WO2019124279A1 (PCT/JP2018/046194)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
detection
detection result
feature
information processing
Prior art date
Application number
PCT/JP2018/046194
Other languages
English (en)
Japanese (ja)
Inventor
加藤 正浩
良樹 轡
淑子 加藤
一聡 田中
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社
Publication of WO2019124279A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/114Yaw movement
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to an information processing apparatus that performs predetermined processing based on a detection result of a detection unit that detects a feature around a moving object.
  • An automatic travel system grasps the surrounding situation by recognizing objects present around the vehicle, generates an optimal target track, and controls the vehicle so that it travels along that track. If the self-position estimation accuracy of the vehicle is poor, the actual traveling track may deviate from the target track, reducing the safety of automatic traveling. Accurate self-position estimation is therefore one of the important factors in ensuring the safety of automatic driving.
  • Self-position estimation in conventional car navigation systems often uses GNSS (Global Navigation Satellite System). Its accuracy therefore deteriorates where signals cannot be received, such as in tunnels, and in multipath-prone environments such as the canyons between buildings.
  • There is also the so-called dead reckoning technique, which estimates the vehicle position from the traveling state of the vehicle (for example, the vehicle speed and the yaw rate). To improve the estimation accuracy of dead reckoning, the traveling state of the vehicle, such as the vehicle speed, must be acquired accurately.
  • As a technique for acquiring the yaw rate with high accuracy, Patent Document 1 can be mentioned. Patent Document 1 describes accurately estimating the yaw rate using the lateral movement amount LM of a fixed object F detected by a front object detection sensor 31.
  • the yaw rate in the present specification indicates the amount of change in yaw angle per unit time.
  • Although the gyro sensor can be corrected using an accurate estimate of the yaw rate, this technique presupposes that the vehicle is turning when the yaw rate is estimated. The yaw rate therefore cannot be calculated while traveling on a straight road.
  • One of the problems to be solved by the present invention is to enable yaw rate calculation even when traveling on a straight road.
  • The invention according to claim 1, made to solve the above problems, comprises: a first acquisition unit that acquires a detection result of a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously; an output unit that outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its traveling direction changes; and a calculation unit that calculates the yaw angle change amount of the autonomous driving vehicle while its moving direction is changing according to the control information, based on the detection result of the detection unit at that time.
  • The invention according to claim 6 comprises: a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously; a first acquisition unit that acquires a detection result of the detection unit; an output unit that outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its moving direction changes; and a calculation unit that calculates the yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit.
  • the invention according to claim 7 is an information processing method executed by an information processing apparatus that performs a predetermined process based on a detection result of a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously.
  • the invention according to claim 8 is characterized in that the information processing method according to claim 7 is executed by a computer.
  • In the apparatus above, the first acquisition unit acquires the detection result of the detection unit that detects a feature around the autonomous driving vehicle traveling autonomously.
  • The output unit outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its moving direction changes.
  • The calculation unit calculates the yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit obtained while the vehicle changes its moving direction according to the control information.
  • With this configuration, even when the traveling path is not curved and the vehicle is traveling on a straight road, a situation in which the yaw angle changes can be created, so the yaw rate can be calculated.
  • The first acquisition unit may continuously acquire, at predetermined time intervals, detection results of the detection unit that detects a feature having continuity along the traveling path around the autonomous driving vehicle; a recognition unit may recognize a reference line of the feature from those detection results according to a predetermined rule, and a first setting unit may set the direction of the reference line.
  • The calculation unit may calculate the yaw angle change amount of the autonomous driving vehicle based on the moving direction of the vehicle with respect to the reference line at a first time and the moving direction of the vehicle with respect to the reference line at a second time. In this way, the yaw rate can be calculated from the directions of the feature's reference line at two times and the moving direction of the moving body. Moreover, since any feature having continuity along the traveling path can be used, rather than the fixed object required in Patent Document 1, the yaw rate can be calculated more frequently.
  • Alternatively, a second setting unit that sets the normal direction of the feature may be provided, and the calculation unit may calculate the yaw angle change amount of the autonomous driving vehicle based on the direction of the feature as seen from the vehicle at the first time, the direction of the feature as seen from the vehicle at the second time, and the change in the angle relative to the normal direction of the feature from the first time. In this way, the yaw rate can be calculated using a feature such as a road sign.
  • An evaluation unit that evaluates the detection accuracy of at least one of the feature and the reference line may further be included, with the calculation unit calculating the yaw angle change amount only when the evaluation result is at or above a predetermined value. In this way, the yaw rate is calculated only when the detection accuracy of the feature or the reference line is high, which improves the calculation accuracy of the yaw rate.
  • A second acquisition unit may acquire the information output from the gyro sensor, and a correction unit may correct that output based on the yaw angle change amount calculated by the calculation unit. This makes it possible to correct the detection value of the gyro sensor with the calculated value, improving the accuracy of the gyro sensor's output.
  • A detection apparatus according to one embodiment of the present invention includes a detection unit that detects features around an autonomous driving vehicle traveling autonomously. The first acquisition unit acquires the detection result of the detection unit, the output unit outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its moving direction changes, and the calculation unit calculates the yaw angle change amount of the vehicle while the moving direction is changing, based on the detection result of the detection unit obtained during that change.
  • With this configuration, even when the traveling path is not curved and the vehicle travels on a straight road, a situation in which the yaw angle changes can be created, so the yaw rate can be calculated.
  • In an information processing method according to one embodiment, an acquisition step acquires a detection result of a detection unit that detects a feature around the autonomous driving vehicle traveling autonomously; an output step outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its moving direction changes; and a calculation step calculates the yaw angle change amount of the vehicle while the moving direction is changing, based on the detection result of the detection unit obtained while the vehicle moves according to the control information.
  • an information processing program that causes a computer to execute the above-described information processing method may be used.
  • Because control information is output so that the moving direction of the autonomous driving vehicle changes, a situation in which the yaw angle changes can be created even when the traveling path is not curved and the vehicle travels on a straight road, so the yaw rate can be calculated.
  • the information processing apparatus according to the present embodiment is included in the detection device 1 and moves together with an automatically driven vehicle as a moving body.
  • The autonomous driving vehicle travels autonomously (automatic driving) to a set destination using the detection results of the sensor group 11 described later and the map DB 10, a database of map information for automatic driving held in the storage unit 12.
  • the detection device 1 includes a sensor group 11, a storage unit 12, a control unit 15, and an output unit 16.
  • the sensor group 11 includes a lidar 21, a vehicle speed sensor 22, an acceleration sensor 23, a gyro sensor 24, an inclination sensor 25, a temperature sensor 26, and a GPS receiver 27.
  • The lidar 21 as a detection unit emits pulsed laser light to discretely measure the distance to objects present in the outside world.
  • the lidar 21 outputs a point cloud of measurement points indicated by a combination of the distance to the object from which the laser light is reflected and the emission angle of the laser light.
  • the lidar 21 is used to detect features present in the periphery of an autonomous driving vehicle.
  • A feature is a concept that includes any natural or artificial object present on the ground. Examples include on-route features located on the route (i.e., the road) on which the autonomous vehicle travels, and peripheral features located along the periphery of the road.
  • Examples of on-route features include road signs, traffic lights, guardrails, and footbridges; the road itself is also included. That is, characters and figures drawn on the road surface, and the shape of the road (its width and curvature), are also on-route features.
  • Examples of peripheral features include buildings (houses, stores) and billboards located along the road.
  • The vehicle speed sensor 22 detects the vehicle speed by measuring the pulse signal (also referred to as the "axle rotation pulse") generated as the wheels of the autonomous driving vehicle rotate.
  • the acceleration sensor 23 detects an acceleration in the traveling direction of the autonomous driving vehicle.
  • the gyro sensor 24 detects the yaw rate of the autonomous driving vehicle at the time of direction change.
  • the tilt sensor 25 detects a tilt angle (also referred to as a “slope angle”) in the pitch direction with respect to the horizontal plane of the autonomous driving vehicle.
  • the temperature sensor 26 detects the temperature around the gyro sensor 24.
  • a GPS (Global Positioning System) receiver 27 detects an absolute position of an autonomous driving vehicle by receiving radio waves including positioning data from a plurality of GPS satellites. The output of each sensor of the sensor group 11 is supplied to the control unit 15.
  • the storage unit 12 stores an information processing program executed by the control unit 15, information required for the control unit 15 to execute a predetermined process, and the like.
  • the storage unit 12 stores a map database (DB) 10 including road data and feature information.
  • The map DB 10 may be updated regularly. In that case, the control unit 15 receives partial map information for the area containing the vehicle position from an external server device that manages map information, via a communication unit (not shown), and reflects it in the map DB 10.
  • a server device that can communicate with the detection device 1 may store the map DB 10.
  • the control unit 15 communicates with an external server device to acquire necessary feature information and the like from the map DB 10.
  • the output unit 16 outputs, for example, information such as the yaw rate calculated by the control unit 15, a control signal such as a lane change of the automatically driven vehicle, and the like to another on-vehicle device such as a control device for automatic driving.
  • the control unit 15 includes a CPU (Central Processing Unit) or the like that executes a program, and controls the entire detection device 1.
  • the control unit 15 includes an acquisition unit 15a, a recognition unit 15b, a setting unit 15c, a calculation unit 15d, and an evaluation unit 15e.
  • The control unit 15 calculates the yaw rate of the vehicle based on the features detected by the lidar 21.
  • the acquisition unit 15a continuously acquires detection results in a window, which will be described later, among detection results of the features detected by the lidar 21 at predetermined time intervals.
  • the recognition unit 15b recognizes a reference line of the feature based on a predetermined rule described later from the detection result.
  • the reference line will be described later.
  • the setting unit 15c internally sets the direction of the reference line.
  • The calculation unit 15d calculates the yaw rate of the autonomous driving vehicle based on the moving direction of the vehicle with respect to the reference line at a first time (a specific time during traveling), the moving direction of the vehicle with respect to the reference line at a second time different from the first time, and the change in direction of the reference line from the first time.
  • The evaluation unit 15e evaluates the detection accuracy of the white line center line (reference line) described later.
  • In the detection device 1 configured as described above, the control unit 15 functions as the information processing apparatus according to the present embodiment.
  • In the present embodiment, the boundary line that divides lanes (the so-called white line) is used as the feature. Since the white line is coated with retroreflective material, its reflection intensity is high and it is easy for the lidar 21 to detect.
  • Although a white line is described as the feature in the present embodiment, the feature is not particularly limited as long as it has continuity along the traveling road; it may be a lane boundary line other than a white line, a guardrail, or the like.
  • the detection of the white line in the present embodiment will be described.
  • In FIG. 2, the autonomous driving vehicle C is assumed to be traveling from left to right in the drawing.
  • The lidar 21L is installed on the front left side of the autonomous driving vehicle C, and the lidar 21R is likewise installed on the front right side.
  • a window W which is a rectangular area is set in the detection range A.
  • the window W is set at a position where the white line D1 and the white line D2 can be easily detected in the detection range A.
  • This window W is a detection area which moves with the automatically driven vehicle C in this embodiment.
  • The present embodiment describes the lidar 21 installed at the front of the automatically driven vehicle C, but the lidar may instead be installed at the rear of the vehicle. Furthermore, only one of the lidars 21L and 21R may be used.
  • The lidar 21 in this embodiment scans an object by emitting pulsed light sequentially from one side to the other in the horizontal direction, so the scan locus is linear when viewed from above, as shown in the upper parts of FIGS. 2 and 3. The acquisition unit 15a therefore acquires information from the lidar 21 at the interval of the scan lines; that is, it continuously acquires the detection result of the feature at predetermined time intervals.
  • As lidars that obtain multiple scan lines, there are types in which a horizontally scanning beam is swept up and down, and types in which several horizontally scanning optical systems are stacked vertically. With such lidars, the scan line interval is known to widen with distance from the autonomous driving vehicle C (see also FIG. 3), because the angle between the lidar 21 and the feature (road surface) becomes shallower farther from the vehicle.
  • The reference line defines the direction of the white line (feature) in the window W, and is compared with the moving direction of the automatically driven vehicle C when calculating the yaw rate of the vehicle as described later.
  • The white line D1 can be recognized by detecting the portion of high reflection intensity in the window W; for example, a portion whose reflection intensity is at or above a predetermined threshold may be regarded as the white line.
  • The center point of the high-reflection-intensity portion on each scan line is determined, the equation of the straight line passing through the center points is obtained by, for example, the least squares method, and the straight line represented by that equation is taken as the white line center line.
  • In FIG. 2, the white line center line L is the least-squares line passing through the center points c1 to c10.
  • Although the center line of the white line is used in the present embodiment, the reference line is not limited to the center line; a line may instead be fitted by least squares to the edge of the white line, that is, to the end points at which the white line is detected on each scan line. In the present embodiment, then, the reference line (white line center line) is recognized with the least squares method as the predetermined rule, and the direction represented by the calculated center line is set in the setting unit 15c, as sketched below.
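  • As a concrete illustration, the following Python sketch fits a straight line through the per-scan-line center points by least squares and takes its direction. This is a minimal sketch for illustration only; the coordinate frame, the use of NumPy, and all function names are assumptions, not the patent's implementation.

    import numpy as np

    def fit_white_line_center_line(center_points):
        """Fit the white line center line L through the per-scan-line
        center points c1..cN by least squares (illustrative sketch;
        vehicle frame assumed: x forward, y lateral)."""
        pts = np.asarray(center_points, dtype=float)   # shape (N, 2)
        # Least squares fit y = a*x + b through the center points.
        a, b = np.polyfit(pts[:, 0], pts[:, 1], deg=1)
        direction = np.arctan2(a, 1.0)  # direction of L w.r.t. the x axis [rad]
        return a, b, direction

    # Example: ten center points lying roughly along a straight white line.
    centers = [(1.0 + 0.5 * i, 1.8 + 0.02 * i) for i in range(10)]
    slope, intercept, theta_line = fit_white_line_center_line(centers)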
  • Since the white line center line L is used to calculate the yaw rate of the automatically driven vehicle C, its accuracy affects the yaw rate accuracy. The detection accuracy of the white line center line L is therefore evaluated with the following indexes.
  • The first evaluation index is whether a predetermined number or more of scan lines can be detected within the window W. For example, as shown in the middle part of FIG. 3, when only part of the edge of the white line D1 can be detected, the number of center points (samples) is small and the white line center line L obtained by least squares may not be accurate.
  • The evaluation index b in this case is expressed by the following equation (1), where N_L is the number of lines measured by the lidar 21L or the lidar 21R and N_M is the expected number of lines.
  • The second evaluation index is whether the line width of the high-reflection-intensity portion is approximately equal to the expected white line width. For example, as shown in the lower part of FIG. 3, when the paint of the white line D1 has deteriorated, the width of the white line changes, so the measured center points often deviate from the original center positions (the error is large) and the white line center line L obtained by least squares may not be accurate.
  • The evaluation index c in this case is expressed by the following equation (2), where W_L is the line width measured by the lidar 21L or the lidar 21R and W_M is the original line width. The original line width may be included, for example, in the map DB 10. Plausible forms of both indexes are sketched below.
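  • Equations (1) and (2) themselves are not reproduced in this text, so the sketch below uses plausible normalized forms: b as the ratio of measured to expected scan lines, and c as one minus the relative width error. Both forms, and the function names, are assumptions for illustration.

    def evaluation_index_b(n_measured, n_expected):
        # Assumed form of Eq. (1): ratio of detected scan lines N_L
        # to the expected number of lines N_M inside the window W.
        return n_measured / n_expected

    def evaluation_index_c(w_measured, w_original):
        # Assumed form of Eq. (2): 1 minus the relative error between
        # the measured width W_L and the original (map) width W_M.
        return 1.0 - abs(w_measured - w_original) / w_original

    # Example: 8 of 10 expected lines detected; width 0.14 m vs map width 0.15 m.
    b = evaluation_index_b(8, 10)        # 0.8
    c = evaluation_index_c(0.14, 0.15)   # ~0.933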
  • Here, the white line D1 is assumed to be a straight line, and the angle of the automatically driven vehicle C with respect to the white line D1 changes due to a lane change, a left turn, or the like.
  • At each time, the white line center line L is determined by the method described above, and its direction is obtained from its equation. The angle difference θ between the white line center line L and the heading of the automatically driven vehicle C is then determined at each time.
  • The difference between θ at the current time and θ at the previous time gives the yaw angle change Δθ (see equation (3)).
  • Since the white line D1 is assumed to be straight, the direction of the white line center line L is the same at both times, so the yaw angle change Δθ is obtained by equation (3).
  • The yaw angle change Δθ obtained by equation (3) is divided by the time difference Δt obtained by equation (4) to calculate the yaw rate θ_dot (see equation (5)); a sketch follows.
  • In other words, the yaw rate (the amount of change of the yaw angle per unit time) of the automatically driven vehicle C is calculated based on the change in the moving direction of the vehicle with respect to the reference line.
  • When the automatically driven vehicle C gradually moves away from the white line D1 due to a lane change or the like, the position of the window W relative to the vehicle is adjusted so that the window W continues to contain the high-reflection-intensity portion.
  • This adjustment can be performed based on, for example, the steering angle of the vehicle C or the value detected by the gyro sensor 24.
  • The yaw rate detected by the gyro sensor 24 is known to have sensitivity and offset errors (see FIG. 5). Assuming that the yaw rate detected by the gyro sensor 24 is ω[t], the sensitivity coefficient is A, and the offset coefficient is B, the true yaw rate θ_dot is expressed by the following equation (6).
  • The sensitivity coefficient A and offset coefficient B can be obtained by the least squares method or the like, and the yaw rate detected by the gyro sensor 24 can then be corrected using the obtained A and B, as sketched below.
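  • A minimal sketch of this correction, assuming equation (6) has the linear form ω_gyro = A·ω_true + B and that equation (7) is its inversion (neither equation is reproduced in this text); the lidar-derived yaw rates serve as the reference for the least squares fit.

    import numpy as np

    def fit_gyro_model(omega_true, omega_gyro):
        """Least squares fit of the assumed model of Eq. (6):
        omega_gyro = A * omega_true + B, with the lidar-derived
        yaw rates omega_true as the reference."""
        A, B = np.polyfit(np.asarray(omega_true), np.asarray(omega_gyro), deg=1)
        return A, B

    def correct_gyro(omega_gyro, A, B):
        # Assumed inversion (Eq. (7)): recover the true yaw rate.
        return (omega_gyro - B) / A

    # Example: gyro with 5% sensitivity error and 0.01 rad/s offset.
    truth = np.linspace(-0.5, 0.5, 50)
    gyro = 1.05 * truth + 0.01
    A, B = fit_gyro_model(truth, gyro)      # ~1.05, ~0.01
    corrected = correct_gyro(gyro, A, B)    # ~truth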
  • In step S101, the acquisition unit 15a reads white line information in the vicinity of the vehicle position from the map DB 10.
  • the white line information includes information on the width of the white line used to obtain the evaluation index of the white line center line L described above.
  • In step S101 it is also determined whether the white line indicated by the read information, or the white line recognized by the recognition unit 15b, lies in a continuously straight section; the following steps are not performed otherwise. For this determination, the white line information may include information indicating straight sections.
  • In step S102, the recognition unit 15b performs white line detection processing in which a portion of high reflection intensity detected by the lidar 21 is treated as a white line.
  • In step S103, the evaluation unit 15e calculates the evaluation indexes b and c by equations (1) and (2). In this flowchart both indexes are calculated and evaluated, but only one of them may be used.
  • In step S104, the evaluation unit 15e determines whether both evaluation indexes b and c are larger than a predetermined value. If either index is smaller than the predetermined value (No), the process returns to step S101; if both are larger (Yes), the evaluation unit 15e determines in step S105 that yaw rate calculation using the white line is possible.
  • In step S106, the calculation unit 15d determines whether a lane change is possible. This determination may be based, for example, on whether the road currently being traveled has multiple lanes (from the map DB 10) and on whether other vehicles or the like are in the surroundings (from information acquired from the sensor group 11, such as the lidar 21).
  • If the lane change is not possible (No), the process returns to step S101; if it is possible (Yes), the lane change operation is performed in step S107. That is, a control signal (control information) for the lane change is output via the output unit 16 to the control device that operates the steering, accelerator, brake, and the like of the automatically driven vehicle C.
  • In step S108, the calculation unit 15d calculates the yaw rate by equations (3) to (5) described above.
  • Details of this yaw rate calculation process are described later with reference to the flowchart of FIG. 7.
  • In step S109, the sensitivity and offset of the gyro sensor 24 are calculated and corrected by the above-described equation (7).
  • That is, in step S108 the yaw rate in the moving direction of the automatically driven vehicle C is calculated based on the detection result of the lidar 21 (detection unit) obtained during the lane change (i.e., while the moving direction is changing) according to the control signal.
  • In this way, the sensitivity coefficient A and the offset coefficient B can be appropriately corrected, and the accuracy of the yaw rate detected by the gyro sensor 24 can be improved.
  • Next, the yaw rate calculation process of step S108 will be described with reference to the flowchart of FIG. 7.
  • In step S201, the recognition unit 15b performs white line detection, treating the portion of high reflection intensity as a white line based on the information the acquisition unit 15a acquires from the lidar 21; here the acquisition unit 15a functions as the first acquisition unit. In step S202, the evaluation unit 15e calculates the evaluation indexes b and c by equations (1) and (2) described above.
  • In step S203, the evaluation unit 15e determines whether both evaluation indexes b and c are larger than a predetermined value. If either index is smaller than the predetermined value (No), the process returns to step S201; if both are larger (Yes), the equation of the white line center line L is determined by the least squares method from the center points of the scan lines.
  • Steps S201 to S203 perform the same operations as steps S102 to S104, which determine whether the white line read in step S101 is suitable for calculating the yaw rate. Then, as the steering angle changes, Δθ is calculated as follows.
  • In step S205, the calculation unit 15d obtains the angle difference θ between the heading of the vehicle (automatically driven vehicle C) and the white line center line L, and in step S206 it takes the difference from θ at the previous time to obtain the yaw angle change Δθ.
  • In step S207, the calculation unit 15d obtains the time difference Δt(k) from the previous scan time by equation (4), and in step S208 it calculates the yaw rate θ_dot(k) by equation (5).
  • As described above, in the control unit 15 the acquisition unit 15a acquires the detection result of the lidar 21, which detects features around the autonomous driving vehicle C, and the output unit 16 outputs a control signal, based on the acquired detection result, so that the automatically driven vehicle C changes lanes.
  • The calculation unit 15d then calculates the yaw rate in the moving direction of the automatically driven vehicle C during the lane change. Because the control signal makes the vehicle change lanes, the yaw angle changes even when the traveling road is not curved and the vehicle travels on a straight road, so the yaw rate can be calculated.
  • The control unit 15 includes the acquisition unit 15a, which continuously acquires, at predetermined time intervals, detection results of the lidar 21 detecting a feature having continuity along the traveling path around the autonomous driving vehicle C; the recognition unit 15b, which recognizes the white line center line L from the detection results according to the predetermined rule; and the setting unit 15c, which sets the direction of the white line center line L.
  • The calculation unit 15d then calculates the yaw rate of the vehicle C based on the moving direction of the vehicle with respect to the white line center line L at the first time (t(k)) and the change in the moving direction of the vehicle with respect to the white line center line L at the second time (t(k+1)). In this way, the yaw rate can be calculated from the moving direction of the vehicle C with respect to the white line center line L of the white line D1 at two times.
  • The feature is a white line, the boundary line dividing the lane in which the vehicle C travels, and the calculation unit 15d calculates the yaw rate when the steering angle of the vehicle C changes. The yaw rate can thus be calculated based on lane boundary lines such as the white lines on the road. Since such boundary lines are plentiful on roads and are rarely occluded by other vehicles or the like, the yaw rate calculation frequency can be increased.
  • the evaluation unit 15e calculates evaluation indices b and c of the white line center line L, and the calculation unit 15d calculates the yaw rate when the evaluation indices b and c are equal to or greater than a predetermined value. By doing this, it is possible to calculate the yaw rate only when the values (detection accuracy) of the evaluation indexes b and c of the white line center line L are high. Therefore, the calculation accuracy of the yaw rate can be improved.
  • The evaluation unit 15e performs evaluation based on the number of lines on which the white line D1 is detected in the window moving together with the vehicle C, and on the difference between the width of each line and the original line width. It can thus be determined that the required detection accuracy cannot be obtained when, for example, the number of lines is insufficient, or when the error between each line's width and the original line width is large.
  • The acquisition unit 15a acquires the yaw rate output from the gyro sensor 24, and the calculation unit 15d corrects that output based on the yaw rate it has calculated. This makes it possible to correct the yaw rate detected by the gyro sensor 24 with the yaw rate calculated by the calculation unit 15d, improving the accuracy of the gyro sensor's detection value (yaw rate).
  • Next, a detection apparatus and an information processing apparatus according to a second embodiment of the present invention will be described with reference to FIGS. 8 to 12.
  • the same parts as those of the first embodiment described above are designated by the same reference numerals and the description thereof will be omitted.
  • The configuration of the present embodiment is the same as that shown in FIG. 1, but instead of a feature having continuity along the traveling road, such as a white line, it targets a feature without such continuity, such as a road sign or a signboard. A method of calculating the yaw rate of the automatically driven vehicle C in the present embodiment will be described with reference to FIG. 8, which shows a flat plate-like landmark LA as the feature; the angle of the automatically driven vehicle C with respect to the landmark LA changes due to a lane change or a turn.
  • At each time, the angle between the laser beam and the normal to the surface of the landmark LA is calculated, and φ, the amount of change in that angle, is determined (see equation (8)).
  • The information on the normal of the landmark LA surface may be included in the landmark LA entry of the map DB 10, for example, and may be set in the setting unit 15c.
  • The yaw angle change Δθ is then obtained from the irradiation angle θ_A1 of the laser beam with which the lidar 21 illuminates the landmark LA at point P1, the irradiation angle θ_A2 of the laser beam at point P2, and φ (see equation (9)).
  • The irradiation angle θ_A1 and the irradiation angle θ_A2 indicate the direction of the feature as seen from the automatically driven vehicle C.
  • The yaw angle change Δθ obtained by equation (9) is divided by the time difference Δt obtained by equation (10) to calculate the yaw rate θ_dot (see equation (11)). A sketch follows.
  • That is, the yaw angle change amount of the automatically driven vehicle C is calculated based on the direction of the landmark LA (feature) from the vehicle at the time of point P1 (the first time), the direction of the landmark from the vehicle at the time of point P2 (the second time), and the change in the angle relative to the normal direction of the landmark LA (feature) from the first time.
  • Landmarks LA such as the signs and signboards shown in FIG. 8 are sometimes partially hidden by other vehicles or the like.
  • Therefore, size information for the landmark LA is included in the map DB 10, and the evaluation index is calculated by comparing this size information with the detection result of the lidar 21.
  • Specifically, the vertical and horizontal lengths of the landmark LA are obtained. With the vertical length of the landmark LA being H_M, the horizontal length of the landmark LA being W_M, the vertical length detected by the lidar 21 being H_L, and the horizontal length detected by the lidar 21 being W_L, the evaluation index a is expressed by the following equation (12).
  • Here, the vertical length H_L of the landmark LA detected by the lidar 21 is the length between the two end points of the longest vertical column of points in the point cloud representing the landmark LA, and the horizontal length W_L is the length between the two end points of the longest horizontal row of points in that point cloud. A sketch of the index follows.
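  • Equation (12) is likewise not reproduced here; one plausible form of the evaluation index a, taking the worse of the vertical and horizontal detected-to-map size ratios, is sketched below (an assumption for illustration).

    def evaluation_index_a(h_detected, w_detected, h_map, w_map):
        # Assumed form of Eq. (12): compare the detected landmark extent
        # (H_L, W_L) with the map extent (H_M, W_M); occlusion by another
        # vehicle shrinks the detected extent and lowers a.
        return min(h_detected / h_map, w_detected / w_map)

    # Example: a 0.6 m x 0.6 m sign detected as 0.6 m tall but only
    # 0.3 m wide (half occluded) yields a = 0.5.
    a = evaluation_index_a(0.6, 0.3, 0.6, 0.6)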
  • In step S301, the acquisition unit 15a reads landmark information in the vicinity of the vehicle position from the map DB 10.
  • The landmark information includes the information on the normal to the landmark surface described above, as well as the vertical and horizontal lengths of the landmark LA.
  • In step S302, the recognition unit 15b performs landmark detection processing on the detection result of the lidar 21.
  • In step S303, the evaluation unit 15e calculates the evaluation index a according to equation (12) described above.
  • In step S304, the evaluation unit 15e determines whether the evaluation index a is larger than a predetermined value. If it is smaller (No), the process returns to step S301; if it is larger (Yes), the evaluation unit 15e determines in step S305 that yaw rate calculation using the landmark is possible.
  • In step S306, the calculation unit 15d determines whether a lane change is possible, in the same way as step S106. If the lane change is not possible (No), the process returns to step S301; if it is possible (Yes), the lane change operation is performed in step S307. That is, a control signal for the lane change is output via the output unit 16 to the control device that operates the steering, accelerator, brake, and the like of the automatically driven vehicle C.
  • In step S308, the calculation unit 15d calculates the yaw rate according to equations (8) to (11) described above.
  • Details of this yaw rate calculation process are described later with reference to the flowchart of FIG. 12.
  • In step S309, the sensitivity and offset of the gyro sensor 24 are calculated and corrected by equation (7) described in the first embodiment.
  • Next, the yaw rate calculation process (step S308) will be described with reference to the flowchart of FIG. 12. This flowchart is mainly executed by the calculation unit 15d.
  • In step S400, the irradiation angle θ of the laser beam emitted toward the landmark LA by the lidar 21 is determined.
  • In step S401, the angle between the laser beam emitted from the lidar 21 and the normal to the surface of the landmark LA is determined based on the normal information acquired in step S301.
  • In step S402, the difference φ from the angle at the previous time is obtained by equation (8), and in step S403 the yaw angle change Δθ is obtained from the difference in the irradiation angle θ from the previous time and the angle difference φ obtained in step S402.
  • In step S404, the calculation unit 15d obtains the time difference Δt(k) from the previous scan time by equation (10), and in step S405 it calculates the yaw rate θ_dot(k) by equation (11).
  • As described above, the control unit 15 includes the setting unit 15c, which sets the normal direction of the feature, and the calculation unit 15d calculates the yaw angle change amount of the automatically driven vehicle C based on the direction of the landmark LA from the vehicle at the time of point P1 (the first time), the direction of the landmark LA (feature) from the vehicle at the time of point P2 (the second time), and φ. In this way, the yaw rate can be calculated using a feature such as a road sign.
  • When a plurality of features are detected, the yaw rate may be calculated from each feature and the results averaged, which can improve accuracy. Weighted averaging may be used for this averaging process, so that each estimate is averaged according to its precision, further improving the yaw rate accuracy; a sketch follows.
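  • A sketch of such a weighted average, using each estimate's evaluation index as its weight (the choice of weight is an assumption; the text specifies only weighted averaging):

    import numpy as np

    def fused_yaw_rate(yaw_rates, weights):
        """Weighted average of yaw rate estimates from several features,
        weighting each estimate by, e.g., its evaluation index."""
        return float(np.average(np.asarray(yaw_rates, dtype=float),
                                weights=np.asarray(weights, dtype=float)))

    # Example: two white-line estimates and one landmark estimate.
    omega = fused_yaw_rate([0.21, 0.19, 0.25], [0.9, 0.8, 0.5])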
  • The present invention is not limited to the above embodiments. Those skilled in the art can make various modifications based on conventionally known findings without departing from the gist of the present invention, and such modifications are of course included in the scope of the present invention as long as they provide the configuration of the information processing apparatus of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention concerns an information processing device with which a yaw rate can be calculated even while traveling on a straight road. In a control unit (15), an acquisition unit (15a) acquires a detection result of a lidar (21) that detects a feature on the ground in the vicinity of an automatically driven vehicle C, and an output unit (16) outputs a control signal, based on the acquired detection result, so that the automatically driven vehicle C performs a lane change. Further, based on the detection result of the lidar (21) obtained when the automatically driven vehicle C performs the lane change in response to the control signal, a calculation unit (15d) calculates a yaw rate with respect to the moving direction of the automatically driven vehicle C performing the lane change.
PCT/JP2018/046194 2017-12-19 2018-12-14 Dispositif de traitement d'informations WO2019124279A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-242746 2017-12-19
JP2017242746 2017-12-19

Publications (1)

Publication Number Publication Date
WO2019124279A1 true WO2019124279A1 (fr) 2019-06-27

Family

ID=66993532

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/046194 WO2019124279A1 (fr) 2017-12-19 2018-12-14 Dispositif de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2019124279A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07179140A (ja) * 1993-12-24 1995-07-18 Nissan Motor Co Ltd Automatic steering device for vehicle (車両用自動操縦装置)
JP2003048564A (ja) * 2001-08-07 2003-02-18 Nissan Motor Co Ltd Steering control device for vehicle (車両用操舵制御装置)
JP2012066777A (ja) * 2010-09-27 2012-04-05 Mazda Motor Corp Yaw rate deviation detection device (ヨーレートのずれ検出装置)
JP2014106683A (ja) * 2012-11-27 2014-06-09 Aisin Aw Co Ltd Road gradient recording system, method, and program, and driving support system, method, and program (道路勾配記録システム、運転支援システム)
JP2016133838A (ja) * 2015-01-15 2016-07-25 トヨタ自動車株式会社 Compound line determination device and compound line determination method (複合線判定装置及び複合線判定方法)



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18890012

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18890012

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP