
WO2019124279A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2019124279A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
detection
detection result
feature
information processing
Prior art date
Application number
PCT/JP2018/046194
Other languages
French (fr)
Japanese (ja)
Inventor
加藤 正浩
良樹 轡
淑子 加藤
一聡 田中
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Application filed by Pioneer Corporation
Publication of WO2019124279A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10: Estimation or calculation of non-directly measurable driving parameters related to vehicle motion
    • B60W40/114: Yaw movement
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments specially adapted for navigation in a road network
    • G01C21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present invention relates to an information processing apparatus that performs predetermined processing based on a detection result of a detection unit that detects a feature around a moving object.
  • For example, an automated driving system, development of which has advanced in recent years, grasps the situation by recognizing objects present around the vehicle, generates an optimal target track, and controls the vehicle so that it travels along that target track. If the vehicle's self-position estimation accuracy is poor, the actual traveling track may deviate from the target track, reducing the safety of automated traveling. Accurate self-position estimation is therefore one of the important factors in ensuring the safety of automated driving.
  • Self-position estimation in conventional car navigation systems often uses GNSS (Global Navigation Satellite System). This suffers from degraded accuracy in places where satellite signals cannot be received, such as inside tunnels, and in multipath-prone environments such as the canyons between buildings.
  • Therefore, a so-called dead reckoning technique is known that estimates the vehicle position based on the traveling state of the vehicle (for example, the vehicle speed and the yaw rate). To improve the accuracy of the vehicle position estimated by dead reckoning, the vehicle's traveling state, such as the vehicle speed and yaw rate, must be acquired accurately.
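
For orientation, dead reckoning itself is simple to state. The following is a minimal sketch, not from the patent; the function name and the planar pose representation are illustrative assumptions:

```python
# A minimal dead-reckoning sketch: integrate speed and yaw rate into a 2-D pose.
import math

def dead_reckoning_step(x, y, heading, speed, yaw_rate, dt):
    """Advance a 2-D pose (x, y, heading) by one time step of length dt."""
    heading += yaw_rate * dt          # yaw rate = change in yaw angle per unit time
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Errors in the yaw rate accumulate in the heading and then in (x, y),
# which is why an accurate yaw rate matters for position estimation.
print(dead_reckoning_step(0.0, 0.0, 0.0, speed=10.0, yaw_rate=0.05, dt=0.1))
```
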
  • As a technique for acquiring the yaw rate with high accuracy, Patent Document 1 (JP 2012-66777 A) can be cited. Patent Document 1 describes accurately estimating the yaw rate by using the lateral movement amount LM of a fixed object F detected by a front object detection sensor 31. In this specification, the yaw rate denotes the amount of change in yaw angle per unit time.
  • In the invention described in Patent Document 1, the gyro sensor can be corrected using an accurate estimate of the yaw rate, but the yaw rate estimation presupposes that the vehicle is turning. Therefore, the yaw rate cannot be calculated while traveling on a straight road.
  • One of the problems the present invention seeks to solve is to make yaw rate calculation possible even while traveling on a straight road.
  • The invention according to claim 1, made to solve the above problems, comprises: a first acquisition unit that acquires a detection result of a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously; an output unit that outputs control information for controlling the autonomous driving vehicle such that the moving direction of the autonomous driving vehicle changes, based on the acquired detection result; and a calculation unit that calculates a yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit while the moving direction of the autonomous driving vehicle is being changed by the control information.
  • The invention according to claim 6 comprises: a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously; a first acquisition unit that acquires a detection result of the detection unit; an output unit that outputs control information for controlling the autonomous driving vehicle such that its moving direction changes, based on the acquired detection result; and a calculation unit that calculates a yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit.
  • The invention according to claim 7 is an information processing method executed by an information processing apparatus that performs predetermined processing based on a detection result of a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously, the method including: an acquisition step of acquiring the detection result of the detection unit; an output step of outputting control information for controlling the autonomous driving vehicle such that its moving direction changes, based on the acquired detection result; and a calculation step of calculating the yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit while the vehicle is changing its moving direction according to the control information.
  • the invention according to claim 8 is characterized in that the information processing method according to claim 7 is executed by a computer.
  • In the information processing apparatus according to an embodiment of the present invention, the first acquisition unit acquires the detection result of a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously, and the output unit outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle such that its moving direction changes. The calculation unit then calculates the yaw angle change amount of the vehicle while the moving direction is changing, based on the detection result of the detection unit while the vehicle is changing its moving direction according to the control information. Because control information is output that makes the vehicle change its moving direction, a situation in which the yaw angle changes can be created even when the traveling path is not curved and the vehicle is traveling on a straight road, so the yaw rate can be calculated.
  • The first acquisition unit may also continuously acquire, at predetermined time intervals, detection results of the detection unit detecting a feature that has continuity along the traveling path around the autonomous driving vehicle, together with a recognition unit that recognizes a reference line of the feature from the detection results based on a predetermined rule and a first setting unit that sets the direction of the reference line. The calculation unit may then calculate the yaw angle change amount of the autonomous driving vehicle based on the moving direction of the vehicle with respect to the reference line at a first time and its moving direction with respect to the reference line at a second time. In this way, the yaw rate can be calculated from the direction of the feature's reference line and the moving direction of the moving body at two times. Moreover, since any feature having continuity along the traveling path will do, it need not be the fixed object described in Patent Document 1, so the yaw rate can be calculated more frequently.
  • A second setting unit that sets the direction of the normal of the feature may also be provided, with the calculation unit calculating the yaw angle change amount of the autonomous driving vehicle based on the direction of the feature as seen from the vehicle at a first time, the direction of the feature as seen from the vehicle at a second time, and the change in the angle relative to the feature's normal direction since the first time. In this way, the yaw rate can be calculated using a feature such as a road sign.
  • An evaluation unit that evaluates the detection accuracy of at least one of the feature and the reference line may further be included, with the calculation unit calculating the yaw angle change amount when the evaluation result of the detection accuracy is equal to or greater than a predetermined value. In this way, the yaw rate is calculated only when the feature or the reference line is detected with high accuracy, so the calculation accuracy of the yaw rate can be improved.
  • A second acquisition unit that acquires information output from a gyro sensor, and a correction unit that corrects the information output from the gyro sensor based on the yaw angle change amount calculated by the calculation unit, may further be provided. In this way, the detection value of the gyro sensor can be corrected using the value calculated by the calculation unit, improving the accuracy of the gyro sensor's detection value.
  • A detection device according to an embodiment of the present invention includes a detection unit that detects features around an autonomous driving vehicle traveling autonomously. The first acquisition unit acquires the detection result of the detection unit, and the output unit outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle such that its moving direction changes. The calculation unit then calculates the yaw angle change amount of the vehicle while the moving direction is changing, based on the detection result of the detection unit while the vehicle is changing its moving direction according to the control information. In the detection device, too, a situation in which the yaw angle changes can thus be created even when the traveling path is not curved and the vehicle is traveling on a straight road, so the yaw rate can be calculated.
  • In an information processing method according to an embodiment of the present invention, an acquisition step acquires the detection result of a detection unit that detects a feature around an autonomous driving vehicle traveling autonomously; an output step outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle such that its moving direction changes; and a calculation step calculates the yaw angle change amount of the vehicle while the moving direction is changing, based on the detection result of the detection unit while the vehicle is moving according to the control information. An information processing program may cause a computer to execute this information processing method. Since control information is output that makes the vehicle change its moving direction, a situation in which the yaw angle changes can be created even when the traveling path is not curved and the vehicle is traveling on a straight road, so the yaw rate can be calculated.
  • The information processing apparatus according to the present embodiment is included in the detection device 1 and moves together with an automatically driven vehicle serving as the moving body.
  • The autonomous driving vehicle travels autonomously (automated driving) to a set destination based on the detection results of the sensor group 11 described later and the map DB 10, held in the storage unit 12, which contains map information for automated driving.
  • the detection device 1 includes a sensor group 11, a storage unit 12, a control unit 15, and an output unit 16.
  • the sensor group 11 includes a lidar 21, a vehicle speed sensor 22, an acceleration sensor 23, a gyro sensor 24, an inclination sensor 25, a temperature sensor 26, and a GPS receiver 27.
  • The lidar 21 as a detection unit emits pulsed laser light to discretely measure distances to objects in the outside world, and outputs a point cloud of measurement points, each given by the combination of the distance to the object that reflected the laser light and the emission angle of that light. In the present embodiment, the lidar 21 is used to detect features present around the autonomous driving vehicle.
  • A feature is a concept that includes any natural or artificial object present on the ground. Examples include on-route features located on the route (i.e., the road) on which the autonomous vehicle travels, and peripheral features located along the road. On-route features include road signs, traffic lights, guardrails, footbridges, and the road itself; characters and figures drawn on the road surface and the shape of the road (road width and curvature) are also on-route features. Examples of peripheral features include buildings (houses, stores) and billboards located along the road.
  • The vehicle speed sensor 22 measures pulses (also called axle rotation pulses) consisting of pulse signals generated as the wheels of the autonomous driving vehicle rotate, and thereby detects the vehicle speed.
  • the acceleration sensor 23 detects an acceleration in the traveling direction of the autonomous driving vehicle.
  • The gyro sensor 24 detects the yaw rate of the autonomous driving vehicle when its direction changes.
  • The inclination sensor 25 detects the tilt angle (also called the slope angle) of the autonomous driving vehicle in the pitch direction with respect to the horizontal plane.
  • the temperature sensor 26 detects the temperature around the gyro sensor 24.
  • a GPS (Global Positioning System) receiver 27 detects an absolute position of an autonomous driving vehicle by receiving radio waves including positioning data from a plurality of GPS satellites. The output of each sensor of the sensor group 11 is supplied to the control unit 15.
  • the storage unit 12 stores an information processing program executed by the control unit 15, information required for the control unit 15 to execute a predetermined process, and the like.
  • the storage unit 12 stores a map database (DB) 10 including road data and feature information.
  • The map DB 10 may be updated periodically. In that case, for example, the control unit 15 receives partial map information for the area containing the vehicle position from an external server device that manages map information via a communication unit (not shown), and reflects it in the map DB 10.
  • Alternatively, a server device capable of communicating with the detection device 1 may store the map DB 10. In that case, the control unit 15 communicates with the external server device and acquires the necessary feature information and the like from the map DB 10.
  • The output unit 16 outputs, for example, information such as the yaw rate calculated by the control unit 15, and control signals such as for a lane change of the automatically driven vehicle, to other on-vehicle devices such as the control device for automated driving.
  • the control unit 15 includes a CPU (Central Processing Unit) or the like that executes a program, and controls the entire detection device 1.
  • the control unit 15 includes an acquisition unit 15a, a recognition unit 15b, a setting unit 15c, a calculation unit 15d, and an evaluation unit 15e.
  • The control unit 15 calculates the yaw rate of the vehicle based on the features detected by the lidar 21.
  • The acquisition unit 15a continuously acquires, at predetermined time intervals, those detection results of the features detected by the lidar 21 that fall within a window described later.
  • The recognition unit 15b recognizes a reference line of the feature from the detection results based on a predetermined rule; both the rule and the reference line are described later.
  • the setting unit 15c internally sets the direction of the reference line.
  • The calculation unit 15d calculates the yaw rate of the autonomous driving vehicle based on the moving direction of the vehicle with respect to the reference line at a first time, which is a specific time during travel, the moving direction of the vehicle with respect to the reference line at a second time different from the first time, and the change in the direction of the reference line since the first time.
  • The evaluation unit 15e evaluates the detection accuracy of the white line center line (reference line) described later.
  • Within the detection device 1 configured as described above, the control unit 15 functions as the information processing apparatus according to the present embodiment.
  • In the present embodiment, the boundary line dividing the lanes (a so-called white line) is used as the feature. Since white lines are painted with a retroreflective material, their reflection intensity is high and they are easy for the lidar 21 to detect. Although a white line is described as the feature in the present embodiment, the feature is not particularly limited as long as it has continuity along the traveling path, such as a lane boundary line other than a white line or a guardrail.
  • the detection of the white line in the present embodiment will be described.
  • In FIG. 2, it is assumed that the autonomous driving vehicle C is traveling from left to right in the drawing.
  • The lidar 21L is installed on the left side of the front portion of the autonomous driving vehicle C, and the lidar 21R is similarly installed on the right side of the front portion.
  • a window W which is a rectangular area is set in the detection range A.
  • the window W is set at a position where the white line D1 and the white line D2 can be easily detected in the detection range A.
  • This window W is a detection area which moves with the automatically driven vehicle C in this embodiment.
  • The description here assumes the lidar 21 is installed at the front of the automatically driven vehicle C, but it may instead be installed at the rear. Furthermore, only one of the lidars 21L and 21R may be used.
  • The lidar 21 in this embodiment scans an object by emitting pulsed light sequentially from one side to the other in the horizontal direction. Therefore, as shown in the upper parts of FIGS. 2 and 3, the scan locus is linear when viewed from above, and the acquisition unit 15a acquires information from the lidar 21 at the interval of each scanned line. That is, the acquisition unit 15a continuously acquires the detection results of the feature at predetermined time intervals.
  • Some lidars of this type obtain a plurality of lines by moving a horizontally scanning beam up and down in the vertical direction, or by arranging a plurality of horizontally scanning optical systems vertically. It is known that the scan interval of such a lidar increases with distance from the autonomous driving vehicle C (see also FIG. 3), because the angle between the lidar 21 and the feature (the road surface) becomes shallower farther from the vehicle.
  • The reference line defines the direction of the white line (feature) within the window W, and is compared with the moving direction of the automatically driven vehicle C when calculating its yaw rate, as described later.
  • The white line D1 can be recognized by detecting a portion of high reflection intensity within the window W. For example, a portion having a reflection intensity equal to or higher than a predetermined threshold may be regarded as a white line.
  • The center point of the high-reflection-intensity portion in each scan line is determined, the equation of the straight line passing through these center points is determined by, for example, the least squares method, and the straight line represented by the resulting equation is taken as the white line center line. In FIG. 3, the white line center line L is obtained as the least-squares line through the center points c1 to c10. Although the center line of the white line is calculated in the present embodiment, the reference line is not limited to the center line; a line may instead be determined by the least squares method from the edge of the white line, that is, from the end points at which the white line is detected in each scan line. In other words, in the present embodiment the reference line (white line center line) is recognized using the least squares method as the predetermined rule, and the direction represented by the calculated white line center line is set in the setting unit 15c. A sketch of this fit is given below.
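
As a concrete illustration of the reference-line fit just described, here is a minimal sketch assuming the scan center points have already been extracted; the coordinate values are hypothetical (vehicle frame, metres):

```python
# Least-squares fit of the white line center line through the per-scan center points.
import numpy as np

# Center points c1..c10 of the high-reflection-intensity portion of each scan line
# inside the window W (hypothetical values).
cx = np.array([2.0, 3.0, 4.1, 5.0, 6.0, 7.1, 8.0, 9.0, 10.1, 11.0])
cy = np.array([1.52, 1.50, 1.49, 1.51, 1.50, 1.48, 1.50, 1.51, 1.49, 1.50])

slope, intercept = np.polyfit(cx, cy, deg=1)   # y = slope * x + intercept
line_direction = np.arctan(slope)              # direction of the white line center line

print(f"center line: y = {slope:.4f} x + {intercept:.4f}, "
      f"direction = {np.degrees(line_direction):.2f} deg")
```
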
  • Since the white line center line L is used to calculate the yaw rate of the automatically driven vehicle C, the accuracy of the white line center line L affects the yaw rate accuracy. Therefore, the detection accuracy of the white line center line L is evaluated by the following evaluation indexes.
  • The first evaluation index concerns whether at least a predetermined number of scan lines can be detected within the window W. For example, as shown in the middle part of FIG. 3, when only part of the end of the white line D1 can be detected, the number of center points (samples) is small, and the accuracy of the white line center line L obtained by the least squares method may not be high. The evaluation index b in this case is expressed by the following equation (1), where N_L is the number of lines measured by the lidar 21L or the lidar 21R and N_M is the expected number of lines.
  • The second evaluation index concerns whether the line width of the high-reflection-intensity portion is approximately equal to the expected white line width. For example, as shown in the lower part of FIG. 3, when the paint of the white line D1 has deteriorated, its width changes, so the positions of the center points (the sample values) often deviate from the original center position (the reference value), and the accuracy of the least-squares white line center line L may not be high because the error is large. The evaluation index c in this case is expressed by the following equation (2), where W_L is the line width measured by the lidar 21L or the lidar 21R and W_M is the original line width. The original line width may be included in, for example, the map DB 10.
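
The concrete forms of equations (1) and (2) are not reproduced in this text. One plausible reading consistent with the definitions above (indexes that are higher for better detection and are compared against a threshold) is the following; both forms are assumptions, not the patent's verbatim formulas:

$$ b = \frac{N_L}{N_M} \quad \text{(1, assumed form)} $$

$$ c = 1 - \frac{\left| W_L - W_M \right|}{W_M} \quad \text{(2, assumed form)} $$
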
  • Next, the yaw rate calculation method in the present embodiment is described with reference to FIG. 4. Here, the white line D1 is a straight line, and the angle of the automatically driven vehicle C to the white line D1 changes due to a lane change, a left turn, or the like.
  • At each time, the white line center line L is determined by the method described above, and its direction is specified from the equation representing it. The angle difference θ between the white line center line L and the direction in which the autonomous driving vehicle C is heading is then determined for each time, and the difference from θ at the previous time is taken to obtain the yaw angle change Δθ (see equation (3)). Since the white line D1 is assumed to be straight, the direction of the white line center line L does not change, so the yaw angle change Δθ can be obtained by equation (3).
  • The yaw angle change Δθ obtained by equation (3) is divided by the time difference Δt obtained by equation (4) to calculate the yaw rate ψ̇ (see equation (5)).
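
Writing θ(k) for the angle difference between the vehicle's heading and the white line center line L at scan time t(k), the relations just described can be written out as follows (a transcription of the verbal description; the patent's equation images are not reproduced in this text):

$$ \Delta\theta(k) = \theta(k) - \theta(k-1) \quad (3) $$

$$ \Delta t(k) = t(k) - t(k-1) \quad (4) $$

$$ \dot{\psi}(k) = \frac{\Delta\theta(k)}{\Delta t(k)} \quad (5) $$
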
  • In this manner, the yaw rate (the amount of change of the yaw angle per unit time) of the automatically driven vehicle C is calculated based on the change in the moving direction of the automatically driven vehicle C with respect to the white line center line L.
  • When the automatically driven vehicle C gradually moves away from the white line D1 due to a lane change or the like, the position of the window W relative to the vehicle is adjusted so that the window W continues to include a portion of high reflection intensity. This adjustment can be performed based on, for example, the steering angle of the vehicle C or the value detected by the gyro sensor 24.
  • The yaw rate detected by the gyro sensor 24 is known to have sensitivity and offset errors (see FIG. 5). Therefore, where ω[t] is the yaw rate detected by the gyro sensor 24, A is the sensitivity coefficient, and B is the offset coefficient, the true yaw rate ψ̇ is expressed by the following equation (6).
  • The sensitivity coefficient A and the offset coefficient B can be obtained by the least squares method or the like, and the yaw rate detected by the gyro sensor 24 can then be corrected using the obtained A and B.
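
A linear sensor model matching this verbal description is the following; the exact arrangement of equation (6), and of equation (7) used for the correction in the flowcharts below, are assumptions consistent with the text:

$$ \omega[t] = A\,\dot{\psi}[t] + B \quad\Longleftrightarrow\quad \dot{\psi}[t] = \frac{\omega[t] - B}{A} \quad \text{(6, assumed arrangement)} $$
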
  • Next, the correction processing of the gyro sensor 24 is described with reference to the flowchart of FIG. 6. First, in step S101, the acquisition unit 15a reads white line information in the vicinity of the vehicle position from the map DB 10. The white line information includes information on the width of the white line, which is used to obtain the evaluation index of the white line center line L described above. Also in step S101, it is determined whether the white line indicated by the read white line information, or the white line recognized by the recognition unit 15b, lies in a straight continuous section; the following steps are not performed if it does not. For this determination, the white line information may include information indicating straight sections.
  • Next, in step S102, the recognition unit 15b performs white line detection processing, in which a portion of high reflection intensity detected by the lidar 21 is taken as a white line, and in step S103 the evaluation unit 15e calculates the evaluation indexes b and c by equations (1) and (2). In this flowchart both evaluation indexes b and c are calculated and evaluated, but only one of them may be used.
  • Next, in step S104, the evaluation unit 15e determines whether both of the evaluation indexes b and c are larger than a predetermined value. If either is smaller than the predetermined value (No), the process returns to step S101. If both are larger (Yes), the evaluation unit 15e determines in step S105 that yaw rate calculation using the white line is possible.
  • Next, in step S106, the calculation unit 15d determines whether a lane change is possible. This determination may be made based on, for example, whether the road currently being traveled has a plurality of lanes according to the map DB 10, and whether there are other vehicles or the like in the surroundings according to information acquired from the sensor group 11, such as the lidar 21. If a lane change is not possible (No), the process returns to step S101; if it is possible (Yes), the lane change operation is performed in step S107. That is, a control signal (control information) for the lane change is output via the output unit 16 to the control device that controls the steering function, accelerator, brake, and the like of the automatically driven vehicle C.
  • Next, in step S108, the calculation unit 15d calculates the yaw rate by equations (3) to (5) described above. This yaw rate calculation process is described with reference to the flowchart of FIG. 7, discussed later. Then, in step S109, the sensitivity and offset of the gyro sensor 24 are calculated and corrected by equation (7) described above.
  • That is, in step S108, the yaw rate with respect to the moving direction of the automatically driven vehicle C is calculated based on the detection result of the lidar 21 (detection unit) during the lane change (that is, while the moving direction is changing) performed according to the control signal. The sensitivity coefficient A and the offset coefficient B can thereby be appropriately corrected, and the accuracy of the yaw rate detected by the gyro sensor 24 can be improved. A sketch of such a correction appears below.
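
Under the linear model given above (sensitivity A, offset B), the correction can be sketched as follows; the sample values and names are illustrative assumptions. Paired samples of the gyro output and the yaw rate computed from the white line are used to estimate A and B by least squares, after which the model is inverted:

```python
import numpy as np

gyro = np.array([0.051, -0.032, 0.120, 0.004, -0.075])      # omega[t], hypothetical rad/s
reference = np.array([0.048, -0.035, 0.115, 0.000, -0.078])  # yaw rate from the white line

A, B = np.polyfit(reference, gyro, deg=1)   # fit gyro = A * reference + B

def corrected_yaw_rate(omega):
    """Invert the linear sensor model to recover the true yaw rate."""
    return (omega - B) / A

print(f"A = {A:.4f}, B = {B:+.5f}, corrected(0.051) = {corrected_yaw_rate(0.051):+.4f}")
```
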
  • Next, the yaw rate calculation process (step S108) is described with reference to the flowchart of FIG. 7.
  • First, in step S201, the recognition unit 15b performs white line detection processing, detecting a portion of high reflection intensity as a white line based on the information the acquisition unit 15a acquires from the lidar 21; in this respect the acquisition unit 15a functions as a first acquisition unit. In step S202, the evaluation unit 15e calculates the evaluation indexes b and c by equations (1) and (2) described above. Then, in step S203, the evaluation unit 15e determines whether both evaluation indexes b and c are larger than a predetermined value. If either is smaller (No), the process returns to step S201; if both are larger (Yes), the equation of the white line center line L is determined from the center points of the scan lines using the least squares method.
  • Steps S201 to S203 perform the same operations as steps S102 to S104; however, while steps S102 to S104 determine whether the white line read in step S101 is suitable for calculating the yaw rate, steps S201 to S203 detect the white line while the steering angle is changing in order to calculate Δθ.
  • Next, in step S205, the calculation unit 15d obtains the angle difference θ between the direction of the vehicle (the automatically driven vehicle C) and the white line center line L, and in step S206 takes the difference from θ at the previous time to obtain the yaw angle change Δθ. In step S207, the calculation unit 15d obtains the time difference Δt(k) from the previous scan time by equation (4), and in step S208 calculates the yaw rate ψ̇(k) by equation (5).
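
Steps S205 to S208 can be sketched as follows, assuming each scan yields the direction of the white line center line L and the vehicle heading (radians) plus a timestamp; names and sample values are illustrative assumptions:

```python
def yaw_rate_from_white_line(samples):
    """samples: list of (t, heading, line_direction); returns yaw rates psi_dot(k)."""
    rates = []
    prev_t = prev_theta = None
    for t, heading, line_direction in samples:
        theta = heading - line_direction       # S205: angle between vehicle and line L
        if prev_theta is not None:
            d_theta = theta - prev_theta       # S206: yaw angle change (eq. 3)
            dt = t - prev_t                    # S207: time difference (eq. 4)
            rates.append(d_theta / dt)         # S208: yaw rate (eq. 5)
        prev_t, prev_theta = t, theta
    return rates

# During a lane change on a straight road the line direction stays fixed,
# so changes in theta come purely from the vehicle's yaw.
print(yaw_rate_from_white_line([(0.0, 0.00, 0.0), (0.1, 0.02, 0.0), (0.2, 0.05, 0.0)]))
```
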
  • As described above, in the control unit 15, the acquisition unit 15a acquires the detection result of the lidar 21, which detects features around the autonomous driving vehicle C, and the output unit 16 outputs a control signal, based on the acquired detection result, so that the automatically driven vehicle C changes lanes. The calculation unit 15d then calculates the yaw rate in the moving direction of the vehicle while it is changing lanes. Because the control signal makes the vehicle change lanes, the yaw angle changes even when the traveling road is not curved and the vehicle is traveling on a straight road, so the yaw rate can be calculated.
  • The control unit 15 also includes the recognition unit 15b, which recognizes the white line center line L based on a predetermined rule from the detection results of the lidar 21, acquired continuously at predetermined time intervals for a feature having continuity along the traveling path around the autonomous driving vehicle C, and the setting unit 15c, which sets the direction of the white line center line L. The calculation unit 15d then calculates the yaw rate of the vehicle C based on the moving direction of the vehicle with respect to the white line center line L at the first time (t(k)) and the moving direction of the vehicle with respect to the white line center line L at the second time (t(k+1)). In this way, the yaw rate can be calculated based on the moving direction of the vehicle C with respect to the white line center line L of the white line D1 at two times.
  • Furthermore, the feature is a white line serving as a boundary dividing the lanes in which the vehicle C travels, and the calculation unit 15d calculates the yaw rate when the steering angle of the vehicle C changes. This makes it possible to calculate the yaw rate based on demarcation lines, such as white lines, that divide lanes on the road. Since many such boundary lines exist on roads and they are unlikely to be occluded by other vehicles or the like, the yaw rate calculation frequency can be increased.
  • The evaluation unit 15e calculates the evaluation indexes b and c of the white line center line L, and the calculation unit 15d calculates the yaw rate when the evaluation indexes b and c are equal to or greater than a predetermined value. In this way, the yaw rate is calculated only when the values (detection accuracy) of the evaluation indexes b and c of the white line center line L are high, so the calculation accuracy of the yaw rate can be improved.
  • The evaluation unit 15e performs the evaluation based on the number of lines in which the white line D1 is detected within the window that moves together with the vehicle C, and on the difference between the width of each line and the original line width. It can thereby be judged that the required detection accuracy cannot be obtained when, for example, the number of lines is insufficient, or when the error between the width of each line and the original line width is large.
  • The acquisition unit 15a also acquires the yaw rate output from the gyro sensor 24, and the calculation unit 15d corrects the yaw rate output from the gyro sensor 24 based on the yaw rate it has itself calculated. This makes it possible to correct the yaw rate detected by the gyro sensor 24 using the yaw rate calculated by the calculation unit 15d, improving the accuracy of the gyro sensor's detected value (yaw rate).
  • Next, a detection device and an information processing apparatus according to a second embodiment of the present invention are described with reference to FIGS. 8 to 12. Parts that are the same as in the first embodiment are given the same reference numerals, and their description is omitted.
  • The configuration of the present embodiment is the same as that shown in FIG. 1, but instead of a feature having continuity along the traveling road, such as a white line, it targets a feature without such continuity, such as a road sign or a signboard. The yaw rate calculation method of the present embodiment is described with reference to FIG. 8, which shows a flat plate-like landmark LA as the feature; the automatically driven vehicle C changes its angle to the landmark LA due to a lane change or a turn.
  • At points P1 and P2, the angle between the laser light of the lidar 21 and the normal to the surface of the landmark LA is calculated, and Δφ, the amount of change in this angle, is determined (see equation (8)). The information on the normal of the landmark LA surface may be included, for example, in the information on the landmark LA in the map DB 10, and may be set in the setting unit 15c. The yaw angle change Δθ is then obtained from the irradiation angle θ_A1 of the laser light with which the lidar 21 irradiates the landmark LA at point P1, the irradiation angle θ_A2 of the laser light irradiating the landmark LA at point P2, and Δφ (see equation (9)). Here, the irradiation angles θ_A1 and θ_A2 indicate the direction of the feature as seen from the automatically driven vehicle C.
  • The yaw angle change Δθ obtained by equation (9) is divided by the time difference Δt obtained by equation (10) to calculate the yaw rate ψ̇ (see equation (11)).
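
The equation images for (8) to (11) are not reproduced in this text. A plausible reconstruction follows from the geometry: the absolute direction of the laser ray is the vehicle heading plus the irradiation angle, and the landmark normal is fixed, so the heading change equals the change in the ray-to-normal angle minus the change in irradiation angle. The sign conventions are assumptions and depend on how the patent defines the angles:

$$ \Delta\phi(k) = \phi(k) - \phi(k-1) \quad (8) $$

$$ \Delta\theta(k) = \Delta\phi(k) - \bigl(\theta_A(k) - \theta_A(k-1)\bigr) \quad \text{(9, sign convention assumed)} $$

$$ \Delta t(k) = t(k) - t(k-1) \quad (10) $$

$$ \dot{\psi}(k) = \frac{\Delta\theta(k)}{\Delta t(k)} \quad (11) $$
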
  • That is, the yaw angle change amount of the automatically driven vehicle C is calculated based on the direction of the landmark LA (feature) as seen from the vehicle at the time of point P1 (the first time), the direction as seen from the vehicle at the time of point P2 (the second time), and the change in the angle relative to the normal direction of the landmark LA since the time of point P1 (the first time).
  • Landmarks LA such as the signs and signboards shown in FIG. 9 may sometimes be hidden (occluded) by other vehicles or the like.
  • Therefore, in the present embodiment, size information of the landmark LA is included in the map DB 10, and an evaluation index is calculated by comparing this size information with the detection result of the lidar 21.
  • Specifically, the vertical and horizontal lengths of the landmark LA are used. Where H_M is the vertical length of the landmark LA, W_M is its horizontal length, and H_L and W_L are the vertical and horizontal lengths of the landmark LA as detected by the lidar 21, the evaluation index a is expressed by the following equation (12).
  • Here, the vertical length H_L of the landmark LA detected by the lidar 21 is the length between the two end points of the longest vertical run among the point cloud representing the landmark LA, and the horizontal length W_L is likewise the length between the two end points of the longest horizontal run among that point cloud.
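
The equation image for (12) is not reproduced in this text. An area-ratio form consistent with the description (a near 1 when the landmark is fully visible, smaller under occlusion) would be, as an assumption:

$$ a = \frac{H_L \times W_L}{H_M \times W_M} \quad \text{(12, assumed form)} $$
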
  • Next, the correction processing of the gyro sensor 24 in the present embodiment is described with reference to the flowchart of FIG. 11. First, in step S301, the acquisition unit 15a reads landmark information in the vicinity of the vehicle position from the map DB 10. The landmark information includes the information on the normal to the landmark surface described above, and information on the vertical and horizontal lengths of the landmark LA.
  • Next, in step S302, the recognition unit 15b performs landmark detection processing on the detection result of the lidar 21, and in step S303 the evaluation unit 15e calculates the evaluation index a according to equation (12) described above.
  • Next, in step S304, the evaluation unit 15e determines whether the evaluation index a is larger than a predetermined value. If it is smaller (No), the process returns to step S301; if it is larger (Yes), the evaluation unit 15e determines in step S305 that yaw rate calculation using the landmark is possible.
  • Next, in step S306, the calculation unit 15d determines whether a lane change is possible; this determination is the same as in step S106. If a lane change is not possible (No), the process returns to step S301; if it is possible (Yes), the lane change operation is performed in step S307. That is, a control signal for the lane change is output via the output unit 16 to the control device that controls the steering function, accelerator, brake, and the like of the automatically driven vehicle C.
  • Next, in step S308, the calculation unit 15d calculates the yaw rate by equations (8) to (11) described above. This yaw rate calculation process is described with reference to the flowchart of FIG. 12, discussed later. Then, in step S309, the sensitivity and offset of the gyro sensor 24 are calculated and corrected by equation (7) described in the first embodiment.
  • Next, the yaw rate calculation process (step S308) is described with reference to the flowchart of FIG. 12. This flowchart is mainly executed by the calculation unit 15d.
  • First, in step S400, the irradiation angle θ of the laser light emitted to the landmark LA by the lidar 21 is determined. Next, in step S401, the angle between the laser light emitted from the lidar 21 and the normal to the surface of the landmark LA is determined based on the information on the normal to the landmark surface acquired in step S301.
  • In step S402, the difference Δφ from the angle at the previous time is obtained by equation (8), and in step S403 the yaw angle change Δθ is obtained from the change in the irradiation angle θ since the previous time and the angle difference Δφ obtained in step S402 (equation (9)). Then, in step S404, the calculation unit 15d obtains the time difference Δt(k) from the previous scan time by equation (10), and in step S405 calculates the yaw rate ψ̇(k) by equation (11).
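
Steps S400 to S405 can be sketched as follows, under the reconstruction of equations (8) to (11) given earlier; the sign conventions and sample values are assumptions. Each observation holds a timestamp, the irradiation angle theta_A toward the landmark, and the angle phi between the laser ray and the landmark surface normal:

```python
def yaw_rate_from_landmark(observations):
    """observations: list of (t, theta_A, phi) in seconds/radians."""
    rates = []
    prev = None
    for t, theta_a, phi in observations:
        if prev is not None:
            pt, ptheta_a, pphi = prev
            d_phi = phi - pphi                      # S402: eq. (8)
            d_theta = d_phi - (theta_a - ptheta_a)  # S403: eq. (9), assumed signs
            dt = t - pt                             # S404: eq. (10)
            rates.append(d_theta / dt)              # S405: eq. (11)
        prev = (t, theta_a, phi)
    return rates

print(yaw_rate_from_landmark([(0.0, 0.30, 0.10), (0.1, 0.28, 0.13), (0.2, 0.26, 0.16)]))
```
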
  • As described above, in the present embodiment the setting unit 15c sets the direction of the normal of the feature, and the calculation unit 15d calculates the yaw angle change amount of the automatically driven vehicle C based on the direction of the landmark LA (feature) as seen from the vehicle at the time of point P1 (the first time), the direction of the landmark LA as seen from the vehicle at the time of point P2 (the second time), and Δφ. In this way, the yaw rate can be calculated using a feature such as a road sign.
  • When a plurality of features can be used, as in FIG. 13, the yaw rate may be calculated from each feature and the results averaged, which can improve accuracy. Weighted averaging may be used for this averaging process; by averaging according to the accuracy of each result, the yaw rate accuracy can be improved further.
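
A standard weighted average of the per-feature yaw rates is the following, where the choice of weights w_i is an assumption (for example, the evaluation indexes could serve as accuracy weights):

$$ \dot{\psi} = \frac{\sum_i w_i\, \dot{\psi}_i}{\sum_i w_i} $$
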
  • The present invention is not limited to the above embodiments. Those skilled in the art can make various modifications in accordance with conventionally known findings without departing from the gist of the present invention, and such modifications are of course included in the scope of the present invention as long as they provide the configuration of the information processing apparatus of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

Provided is an information processing device capable of calculating a yaw rate even while traveling on a straight road. In a control unit (15), an acquisition unit (15a) acquires a detection result of a lidar (21) that detects features in the vicinity of an automatically driven vehicle C, and an output unit (16) outputs a control signal, based on the acquired detection result, such that the vehicle C makes a lane change. Then, based on the detection result of the lidar (21) obtained while the vehicle C is making the lane change in response to the control signal, a calculation unit (15d) calculates the yaw rate with respect to the movement direction of the vehicle C during the lane change.

Description

Information processing device

Patent Document 1: JP 2012-66777 A
Brief description of the drawings:

  • FIG. 1 shows the configuration of a detection device having the information processing apparatus according to the first embodiment of the present invention.
  • FIG. 2 is an explanatory diagram of white line detection by the detection device shown in FIG. 1.
  • FIG. 3 is an explanatory diagram of the white line center line calculation in the control unit shown in FIG. 1.
  • FIG. 4 is an explanatory diagram of the yaw rate calculation method in the control unit shown in FIG. 1.
  • FIG. 5 is a graph showing the relationship between the yaw rate detected by the gyro sensor shown in FIG. 1 and the true yaw rate.
  • FIG. 6 is a flowchart of the gyro sensor correction processing by the control unit shown in FIG. 1.
  • FIG. 7 is a flowchart of the yaw rate calculation processing shown in FIG. 6.
  • FIG. 8 is an explanatory diagram of the yaw rate calculation method in the control unit according to the second embodiment of the present invention.
  • FIG. 9 is an explanatory diagram of occlusion occurring during feature detection.
  • FIG. 10 is an explanatory diagram of the evaluation index calculation according to the second embodiment of the present invention.
  • FIG. 11 is a flowchart of the gyro sensor correction processing of the control unit according to the second embodiment of the present invention.
  • FIG. 12 is a flowchart of the yaw rate calculation processing shown in FIG. 11.
  • FIG. 13 is an explanatory diagram of the case where the yaw rate is calculated using a plurality of features.
 以下、本発明の一実施形態にかかる情報処理装置を説明する。本発明の一実施形態にかかる情報処理装置は、第1取得部が、自律走行する自動運転車両の周辺の地物を検出する検出部の検出結果を取得し、出力部が、取得された検出結果に基づき自動運転車両の移動方向が変化するように当該自動運転車両を制御するための制御情報を出力する。そして、算出部が、自動運転車両が制御情報によって移動方向が変化しているときの検出部の検出結果に基づき、移動方向が変化しているときの自動運転車両のヨー角変化量を算出する。このようにすることにより、自動運転車両の移動方向が変化するように当該自動運転車両を制御するための制御情報を出力するため、走行路がカーブしていなく直線道路を走行している場合であっても、ヨー角が変化する状況を作り出すことができるため、ヨーレート算出が可能となる。 Hereinafter, an information processing apparatus according to an embodiment of the present invention will be described. In the information processing apparatus according to the embodiment of the present invention, the first acquisition unit acquires the detection result of the detection unit that detects a feature around the autonomous driving vehicle traveling autonomously, and the output unit detects the acquired Control information for controlling the autonomous driving vehicle is output based on the result so that the moving direction of the autonomous driving vehicle changes. Then, the calculation unit calculates the yaw angle change amount of the automatically driven vehicle when the movement direction is changed, based on the detection result of the detection unit when the automatically driven vehicle changes the movement direction according to the control information. . By doing this, in order to output control information for controlling the autonomous driving vehicle so that the moving direction of the autonomous driving vehicle changes, the traveling path is not curved but travels on a straight road. Even if there is, the situation in which the yaw angle changes can be created, so that the yaw rate can be calculated.
 また、第1取得部は、自動運転車両の周辺の、走行路上に沿った連続性を有する地物を検出する検出部の検出結果を、所定時間間隔で連続的に取得し、検出結果から所定の規則に基づいて地物の基準線を認識する認識部と、基準線の向きを設定する第1設定部と、を備えている。そして、算出部は、第1時刻における基準線に対する自動運転車両の移動方向と、第2時刻における基準線に対する自動運転車両の移動方向とに基づいて前記自動運転車両のヨー角変化量を算出するようにしてもよい。このようにすることにより、2つの時刻の地物の基準線の向きと移動体の移動方向に基づいてヨーレートを算出することができる。そして、地物には走行路上に沿った連続性を有する地物であれば、特許文献1に記載されている固定物でなくてもよいので、ヨーレートの算出頻度を高めることができる。 In addition, the first acquisition unit continuously acquires, at predetermined time intervals, detection results of the detection unit that detects a feature having continuity along the traveling path around the autonomous driving vehicle, and the predetermined result is determined from the detection results. And a first setting unit configured to set the direction of the reference line. Then, the calculation unit calculates the yaw angle change amount of the autonomous driving vehicle based on the moving direction of the autonomous driving vehicle with respect to the reference line at the first time and the traveling direction of the autonomous driving vehicle with respect to the reference line at the second time. You may do so. By doing this, it is possible to calculate the yaw rate based on the directions of the reference lines of the features at two times and the moving direction of the moving object. And if it is a feature which has continuity along a running path to a feature, since it is not necessary to be a fixed thing indicated in patent documents 1, calculation frequency of yaw rate can be raised.
 The apparatus may also include a second setting unit that sets the direction of the normal of the feature, and the calculation unit may calculate the yaw angle change amount of the autonomous driving vehicle based on the orientation of the feature as seen from the vehicle at a first time, the orientation of the feature as seen from the vehicle at a second time, and the change since the first time in the angle relative to the normal direction of the feature. In this way, the yaw rate can be calculated using a feature such as a road sign.
 The apparatus may further include an evaluation unit that evaluates the detection accuracy of at least one of the feature and the reference line, and the calculation unit may calculate the yaw angle change amount when the evaluation result of the detection accuracy is at or above a predetermined level. The yaw rate is then calculated only when the feature or the reference line is detected with high accuracy, which improves the calculation accuracy of the yaw rate.
 The apparatus may further include a second acquisition unit that acquires information output from a gyro sensor, and a correction unit that corrects the information output from the gyro sensor based on the yaw angle change amount calculated by the calculation unit. The detection value of the gyro sensor can thereby be corrected using the value calculated by the calculation unit, improving the accuracy of the gyro sensor's detection value.
 A detection device according to an embodiment of the present invention includes a detection unit that detects features around an autonomously traveling autonomous driving vehicle. A first acquisition unit acquires the detection result of the detection unit, and an output unit outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its moving direction changes. A calculation unit then calculates the yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit obtained while the moving direction is being changed by the control information. In this detection device as well, because control information is output to make the moving direction of the vehicle change, a situation in which the yaw angle changes can be created even on a straight road, so the yaw rate can be calculated.
 In an information processing method according to an embodiment of the present invention, a first acquisition step acquires the detection result of a detection unit that detects features around an autonomously traveling autonomous driving vehicle, and an output step outputs, based on the acquired detection result, control information for controlling the autonomous driving vehicle so that its moving direction changes. A calculation step then calculates the yaw angle change amount of the autonomous driving vehicle while the moving direction is changing, based on the detection result of the detection unit obtained while the moving direction is being changed by the control information. As above, a situation in which the yaw angle changes can be created even when the vehicle is traveling on a straight road rather than a curve, so the yaw rate can be calculated.
 The information processing method described above may also be provided as an information processing program executed by a computer. In this way, a computer can be used to create a situation in which the yaw angle changes even when the vehicle is traveling on a straight road rather than a curve, making yaw rate calculation possible.
 An information processing apparatus according to a first embodiment of the present invention will now be described with reference to Figs. 1 to 7. The information processing apparatus according to this embodiment is included in a detection device 1 and moves together with an autonomous driving vehicle as a moving body. The autonomous driving vehicle travels autonomously (automatic driving) to a set destination based on the detection results of a sensor group 11, described later, and a map DB 10 containing map information for automatic driving held in a storage unit 12.
 Fig. 1 shows a schematic block configuration of the detection device 1 according to this embodiment. The detection device 1 includes the sensor group 11, the storage unit 12, a control unit 15, and an output unit 16.
 The sensor group 11 includes a lidar 21, a vehicle speed sensor 22, an acceleration sensor 23, a gyro sensor 24, a tilt sensor 25, a temperature sensor 26, and a GPS receiver 27.
 The lidar 21, serving as the detection unit, emits pulsed laser light to discretely measure the distances to objects in the environment, and outputs a point cloud of measurement points, each given by the combination of the distance to the object that reflected the laser light and the emission angle of that light. In this embodiment, the lidar 21 is used to detect features present around the autonomous driving vehicle. A feature is a concept that includes any natural or artificial object present on the ground. Examples of features include on-route features located on the route (i.e., the road) of the autonomous driving vehicle and peripheral features located around the road. Examples of on-route features are road signs, traffic lights, guardrails, and footbridges; the road itself is also included, so characters and figures drawn on the road surface, as well as the shape of the road (width and curvature), count as on-route features. Examples of peripheral features are buildings (houses, stores) and billboards located along the road.
 The vehicle speed sensor 22 measures pulses (also called "axle rotation pulses") generated as the wheels of the autonomous driving vehicle rotate, and detects the vehicle speed. The acceleration sensor 23 detects the acceleration in the traveling direction of the vehicle. The gyro sensor 24 detects the yaw rate of the vehicle when it changes direction. The tilt sensor 25 detects the tilt angle (also called the "slope angle") of the vehicle in the pitch direction relative to the horizontal plane. The temperature sensor 26 detects the temperature around the gyro sensor 24. The GPS (Global Positioning System) receiver 27 detects the absolute position of the vehicle by receiving radio waves containing positioning data from a plurality of GPS satellites. The output of each sensor in the sensor group 11 is supplied to the control unit 15.
 The storage unit 12 stores the information processing program executed by the control unit 15 and the information the control unit 15 needs to execute predetermined processing. In this embodiment, the storage unit 12 stores a map database (DB) 10 containing road data and feature information. The map DB 10 may be updated periodically; in that case, the control unit 15 receives, via a communication unit (not shown), partial map information for the area containing the vehicle position from an external server device that manages map information, and reflects it in the map DB 10. Instead of the storage unit 12 storing the map DB 10, a server device capable of communicating with the detection device 1 may store it; in that case, the control unit 15 acquires the necessary feature information and the like from the map DB 10 by communicating with the external server device.
 The output unit 16 outputs, for example, information such as the yaw rate calculated by the control unit 15 and control signals such as a lane change instruction for the autonomous driving vehicle to other on-vehicle equipment such as a control device for automatic driving.
 The control unit 15 includes a CPU (Central Processing Unit) that executes programs, and controls the detection device 1 as a whole. The control unit 15 includes an acquisition unit 15a, a recognition unit 15b, a setting unit 15c, a calculation unit 15d, and an evaluation unit 15e. In this embodiment, the control unit 15 calculates the yaw rate of the vehicle based on the features detected by the lidar 21.
 The acquisition unit 15a continuously acquires, at predetermined time intervals, the detection results of the lidar 21 within a window described later.
 The recognition unit 15b recognizes a reference line of the feature from the detection results based on a predetermined rule described later. The reference line is described below.
 The setting unit 15c internally sets the direction of the reference line.
 The calculation unit 15d calculates the yaw rate of the autonomous driving vehicle based on the moving direction of the vehicle relative to the reference line at a first time, which is a specific time during travel, the moving direction of the vehicle relative to the reference line at a second time different from the first time, and the change in the direction of the reference line since the first time.
 The evaluation unit 15e evaluates the detection accuracy of a white line center line (reference line) described later.
 In the detection device 1 configured as described above, the control unit 15 functions as the information processing apparatus according to this embodiment.
 Next, the yaw rate detection method used by the control unit 15 (information processing apparatus) of the detection device 1 will be described. In the following description, the boundary lines that divide lanes (so-called white lines) are used as the features. Because white lines are coated with retroreflective material, their reflection intensity is high and they are easy for the lidar 21 to detect. Although this embodiment is described using white lines, any feature having continuity along the traveling path, such as a lane boundary other than a white line or a guardrail, may be used.
 White line detection in this embodiment is as follows. In Fig. 2, the autonomous driving vehicle C is traveling from left to right in the drawing. A lidar 21L is installed on the front left of the vehicle C, and a lidar 21R is installed on the front right.
 Letting A denote the detection range of the lidars 21L and 21R, a rectangular region called a window W is set within the detection range A, at a position where the white lines D1 and D2 are easy to detect. This window W is the detection region that moves together with the autonomous driving vehicle C in this embodiment. Although this embodiment is described using the lidars 21 installed at the front of the vehicle C, lidars installed at the rear may be used instead, and only one of the lidars 21L and 21R may be used.
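 As a minimal sketch of restricting lidar returns to the window W, assuming each return is expressed as (x, y, intensity) in the vehicle frame and that the window bounds are supplied externally (the function name and bounds are illustrative, not taken from the original):

import numpy as np

def points_in_window(points, x_min, x_max, y_min, y_max):
    # points: (N, 3) array of (x, y, intensity) returns in the vehicle frame.
    # Keep only the returns that fall inside the rectangular window W.
    pts = np.asarray(points)
    mask = ((pts[:, 0] >= x_min) & (pts[:, 0] <= x_max) &
            (pts[:, 1] >= y_min) & (pts[:, 1] <= y_max))
    return pts[mask]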
 Next, the scan interval of the lidar 21 is described. The lidar 21 in this embodiment scans an object by sequentially emitting pulsed light from one side to the other in the horizontal direction. As shown in the upper part of Fig. 2 and in Fig. 3, the scan trajectory therefore forms lines when viewed from above, and the acquisition unit 15a acquires information from the lidar 21 at the interval of these scanned lines. That is, the acquisition unit 15a continuously acquires the detection results of the feature at predetermined time intervals.
 Typical lidars include those that obtain multiple lines by moving a horizontally scanning beam up and down in the vertical direction, and those that obtain multiple lines by arranging multiple horizontally scanning optical systems in the vertical direction. The scan interval of such lidars is known to widen with distance from the autonomous driving vehicle C (see also Fig. 3), because the angle between the lidar 21 and the feature (road surface) becomes shallower farther from the vehicle.
 Next, the method of calculating (recognizing) the center line of a white line as its reference line, and the method of calculating the detection accuracy of that center line, are described with reference to Fig. 3. The reference line defines the direction of the white line (feature) within the window W, and is used for comparison with the moving direction of the autonomous driving vehicle C when calculating its yaw rate, as described later.
 The white line D1 can be recognized by detecting the portions of high reflection intensity within the window W; for example, portions whose reflection intensity is at or above a predetermined threshold may be regarded as the white line. The center point of the high-reflection-intensity portion on each scan line is obtained, the equation of the straight line passing through these center points is obtained by, for example, the least squares method, and the straight line represented by that equation is taken as the white line center line. In the case of the upper part of Fig. 3, the white line center line L is the line fitted through the center points c1 to c10 by the least squares method. Although the center line of the white line is calculated in this embodiment, the line is not limited to the center line; for example, a line may be obtained by the least squares method from the edge of the white line, that is, from the outermost point at which the white line is detected on each scan line. In this embodiment, therefore, the reference line (white line center line) is recognized using the least squares method as the predetermined rule, and the direction represented by the calculated white line center line is set in the setting unit 15c.
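 The center-line recognition above can be sketched as follows, assuming each scan line is delivered as an array of (x, y, intensity) points inside the window W; the threshold parameter and function names are illustrative assumptions rather than values from the original:

import numpy as np

def white_line_center_points(scan_lines, intensity_threshold):
    # scan_lines: one (N, 3) array of (x, y, intensity) points per scan line.
    # The high-reflection-intensity run on each line is treated as the white line,
    # and its centroid is taken as that line's center point (c1, c2, ...).
    centers = []
    for pts in scan_lines:
        bright = pts[pts[:, 2] >= intensity_threshold]
        if len(bright) > 0:
            centers.append(bright[:, :2].mean(axis=0))
    return np.array(centers)

def fit_center_line(centers):
    # Least-squares fit of the straight line y = a*x + b through the center points;
    # the fitted line plays the role of the white line center line L.
    a, b = np.polyfit(centers[:, 0], centers[:, 1], 1)
    direction = np.arctan2(a, 1.0)  # direction of L in radians, set in the setting unit
    return a, b, direction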
 Next, the method of calculating the detection accuracy of the white line center line L is described. Because the white line center line L is used to calculate the yaw rate of the autonomous driving vehicle C, low accuracy of the center line can degrade the accuracy of the yaw rate. The detection accuracy of the white line center line L is therefore quantified by the following evaluation indexes.
 The first evaluation index asks whether at least a predetermined number of scan lines have been detected within the window W. For example, as shown in the middle part of Fig. 3, when only a part of the white line D1, such as its end, can be detected, the number of center points (samples) is small, so the white line center line L obtained by the least squares method may not be accurate. The evaluation index b for this case is expressed by the following equation (1), where N_L is the number of lines measured by the lidar 21L or 21R and N_M is the expected number of lines.
[Equation (1): formula given as an image in the original, expressing b in terms of N_L and N_M]
 The second evaluation index asks whether the width of the high-reflection-intensity portion on each line is approximately equal to the width of the target white line. For example, as shown in the lower part of Fig. 3, when the paint of the white line D1 has deteriorated, the width of the white line varies, so the position of the center point of the white line (the value indicated by the sample) often deviates from the true center position (the reference value), i.e., the error is large, and the white line center line L obtained by the least squares method may not be accurate. The evaluation index c for this case is expressed by the following equation (2), where W_L is the line width measured by the lidar 21L or 21R and W_M is the original line width. The original line width may be included in, for example, the map DB 10.
[Equation (2): formula given as an image in the original, expressing c in terms of W_L and W_M]
 Whether the yaw rate can be calculated is then decided based on the two evaluation indexes b and c described above; the details are described in the flowcharts below.
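 Since equations (1) and (2) appear only as images in the source, the sketch below uses assumed ratio-style forms (b as the fraction of expected scan lines actually observed, c as one minus the relative width error) purely for illustration; the published formulas may differ:

def eval_index_b(n_measured, n_expected):
    # Assumed form of eq. (1): fraction of the expected scan lines actually measured.
    return n_measured / n_expected

def eval_index_c(w_measured, w_expected):
    # Assumed form of eq. (2): agreement between measured and original line width.
    return 1.0 - abs(w_measured - w_expected) / w_expected

def yaw_rate_computable(b, c, threshold):
    # Yaw rate calculation is allowed only when both indexes exceed the predetermined value.
    return b > threshold and c > threshold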
 Next, the method of calculating the yaw rate of the autonomous driving vehicle C is described with reference to Fig. 4. In this embodiment, the white line D1 is straight, and the angle between the vehicle C and the white line D1 changes due to a lane change, a right or left turn, or the like. First, the white line center line L is obtained by the method described above, and its direction is identified from the equation representing it. The angle difference between the white line center line L and the direction in which the vehicle C is heading is then obtained at each time, and the difference between successive angle differences gives the yaw angle change ΔΨ.
 In the case of Fig. 4, the angle difference Δθ between the white line center line L and the heading (moving direction) of the autonomous driving vehicle C is obtained at each of the times t(k-1), t(k), and t(k+1). The difference from Δθ at the previous time then gives the yaw angle change ΔΨ (see equation (3)). In this embodiment the white line D1 is assumed to be straight, so the direction of the white line center line L is constant and does not change, and the yaw angle change ΔΨ is therefore obtained by equation (3).
ΔΨ(k) = Δθ(k) − Δθ(k−1)   ... (3)
 Next, the time difference Δt between the two times at which Δθ was obtained is computed (see equation (4)).
Δt(k) = t(k) − t(k−1)   ... (4)
 The yaw angle change ΔΨ obtained by equation (3) is then divided by the time difference Δt obtained by equation (4) to calculate the yaw rate Ψ_dot (see equation (5)).
Ψ_dot(k) = ΔΨ(k) / Δt(k)   ... (5)
 That is, the yaw rate of the autonomous driving vehicle C (the amount of change of the yaw angle per unit time) is calculated based on the moving direction of the vehicle relative to the white line center line L (reference line) at time t(k-1) (first time) and the change in the moving direction of the vehicle relative to the white line center line L at time t(k) (second time).
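 A minimal sketch of equations (3) to (5), assuming the vehicle heading and the direction of the center line L are available at each scan time (the argument names are illustrative):

def yaw_rate_from_white_line(times, headings, line_directions):
    # times[k]: scan time t(k); headings[k]: vehicle heading; line_directions[k]: direction of L.
    dthetas = [h - d for h, d in zip(headings, line_directions)]  # angle difference at each time
    rates = []
    for k in range(1, len(times)):
        dpsi = dthetas[k] - dthetas[k - 1]  # yaw angle change, eq. (3)
        dt = times[k] - times[k - 1]        # time difference, eq. (4)
        rates.append(dpsi / dt)             # yaw rate, eq. (5)
    return rates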
 In Fig. 4, the autonomous driving vehicle C gradually moves away from the white line D1 due to the lane change or the like. In this case, the position of the window W is adjusted relative to the vehicle position so that the portion of high reflection intensity remains inside the window W. This adjustment can be made based on, for example, the steering angle of the vehicle C or the value detected by the gyro sensor 24.
 Next, a method of correcting the value (yaw rate) detected by the gyro sensor 24 using the yaw rate obtained as described above is explained.
 The yaw rate detected by the gyro sensor 24 is known to have a sensitivity and an offset (see Fig. 5). Letting ω[t] be the yaw rate detected by the gyro sensor 24, A the sensitivity coefficient, and B the offset coefficient, the true yaw rate Ψ_dot is expressed by the following equation (6).
Ψ_dot[t] = A·ω[t] + B   ... (6)
 Since the yaw rate calculated by equation (5) can be regarded as the true yaw rate, setting y[t] = Ψ_dot[t] and x[t] = ω[t] turns equation (6) into the linear expression shown in equation (7).
y[t] = A·x[t] + B   ... (7)
 Therefore, given multiple data pairs x[t] and y[t], the sensitivity coefficient A and the offset coefficient B can be obtained by, for example, recursive least squares, and the yaw rate detected by the gyro sensor 24 can then be corrected using the obtained coefficients A and B.
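 A minimal sketch of estimating the sensitivity A and offset B of equation (7); a batch least-squares fit is shown for brevity where the text also allows recursive least squares (the data names are illustrative):

import numpy as np

def fit_gyro_sensitivity_offset(gyro_rates, reference_rates):
    # Fit y = A*x + B, with x the gyro reading and y the yaw rate obtained from the white line.
    A, B = np.polyfit(np.asarray(gyro_rates), np.asarray(reference_rates), 1)
    return A, B

def correct_gyro(omega, A, B):
    # Corrected yaw rate per eq. (6): true rate = A * detected rate + B.
    return A * omega + B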
 The correction process for the gyro sensor 24 described above is explained with reference to the flowchart of Fig. 6; this flowchart is executed by the control unit 15. First, in step S101, the acquisition unit 15a reads white line information near the vehicle position from the map DB 10. The white line information includes the width of the white line, which is used to obtain the evaluation index of the white line center line L described above. In step S101 it is also determined whether the white line indicated by the read white line information, or the white line recognized by the recognition unit 15b, is in a section where the line continues straight; if it is not, the following steps are not executed. Whether the section is straight can be determined by including information indicating straight sections in the white line information.
 Next, in step S102, the recognition unit 15b performs white line detection processing, in which portions of high reflection intensity measured by the lidar 21 are detected as white lines, and in step S103 the evaluation unit 15e calculates the evaluation indexes b and c using equations (1) and (2). In this flowchart both indexes are calculated and evaluated, but only one of them may be used.
 Next, in step S104, the evaluation unit 15e determines whether the evaluation indexes b and c are both larger than predetermined values. If either index is smaller than its predetermined value (No), the process returns to step S101; if both are larger (Yes), the evaluation unit 15e determines in step S105 that yaw rate calculation using the white line is possible.
 Next, in step S106, the calculation unit 15d determines whether a lane change is possible. This determination may be made based on, for example, whether the road currently being traveled has multiple lanes according to the map DB 10, and whether there are other vehicles nearby according to the information acquired from the sensor group 11 such as the lidar 21. If a lane change is not possible (No), the process returns to step S101; if it is possible (Yes), the lane change operation is performed in step S107. That is, a control signal (control information) for the lane change is output via the output unit 16 to the control device that controls the steering function, accelerator, brake, and so on of the autonomous driving vehicle C.
 Next, in step S108, the calculation unit 15d calculates the yaw rate using equations (3) to (5); the yaw rate calculation process is described with the flowchart of Fig. 7 below. Then, in step S109, the sensitivity and offset of the gyro sensor 24 are calculated (corrected) using equation (7).
 That is, in step S108, the yaw rate of the autonomous driving vehicle C with respect to its moving direction is calculated while the vehicle is changing lanes (while the moving direction is changing) in response to the control signal, based on the detection results of the lidar 21 (detection unit) obtained during the lane change.
 By repeating the flowchart of Fig. 6, the sensitivity coefficient A and the offset coefficient B are corrected as appropriate, and the accuracy of the yaw rate detected by the gyro sensor 24 can be improved.
 Next, the yaw rate calculation process (step S108) is described with reference to the flowchart of Fig. 7. This flowchart is mainly executed by the calculation unit 15d.
 First, in step S201, the recognition unit 15b performs white line detection processing, detecting portions of high reflection intensity as white lines based on the information the acquisition unit 15a acquired from the lidar 21, and in step S202 the evaluation unit 15e calculates the evaluation indexes b and c using equations (1) and (2). Here the acquisition unit 15a functions as the first acquisition unit.
 Next, in step S203, the evaluation unit 15e determines whether the evaluation indexes b and c are both larger than predetermined values. If either index is smaller (No), the process returns to step S201; if both are larger (Yes), the equation of the white line center line L is obtained in step S204 from the center points of the scan lines using the least squares method, as described above. Steps S201 to S203 perform the same operations as steps S102 to S104, but whereas steps S102 to S104 determine whether the white line read in step S101 is suitable for yaw rate calculation, steps S201 to S203 determine whether the white line can be used at each of the times at which the steering angle changes and Δθ is calculated (t(k-1), t(k), t(k+1) in Fig. 4). That is, the yaw rate is calculated when the evaluation unit 15e finds the detection accuracy to be at or above the predetermined level.
 Next, in step S205, the calculation unit 15d obtains the angle difference Δθ between the heading of the own vehicle (autonomous driving vehicle C) and the white line center line L, and in step S206 it takes the difference from Δθ at the previous time according to equation (3) to obtain the yaw angle change ΔΨ.
 Next, in step S207, the calculation unit 15d obtains the time difference Δt(k) from the previous scan time using equation (4), and in step S208 it calculates the yaw rate Ψ_dot(k) using equation (5).
 According to this embodiment, in the control unit 15 the acquisition unit 15a acquires the detection results of the lidar 21, which detects features around the autonomous driving vehicle C, and the output unit 16 outputs a control signal so that the vehicle C changes lanes based on the acquired detection results. The calculation unit 15d then calculates the yaw rate with respect to the moving direction of the vehicle C during the lane change, based on the detection results of the lidar 21 obtained while the vehicle is changing lanes in response to the control signal. Because a control signal is output to make the vehicle C change lanes, a situation in which the yaw angle changes can be created even when the traveling road is straight rather than curved, so the yaw rate can be calculated.
 The control unit 15 also continuously acquires, at predetermined time intervals, the detection results of the lidar 21 for features around the vehicle C having continuity along the traveling path, and includes the recognition unit 15b, which recognizes the white line center line L from the detection results based on a predetermined rule, and the setting unit 15c, which sets the direction of the white line center line L. The calculation unit 15d calculates the yaw rate of the vehicle C based on the moving direction of the vehicle relative to the white line center line L at a first time (t(k)) and the change in the moving direction of the vehicle relative to the white line center line L at a second time (t(k+1)). In this way, the yaw rate can be calculated based on the moving direction of the vehicle C relative to the white line center line L of the white line D1 at two times.
 The feature is a white line, i.e., a boundary line dividing the lane in which the vehicle C travels, and the calculation unit 15d calculates the yaw rate when the steering angle of the vehicle C changes. The yaw rate can thus be calculated based on lane boundaries such as the white lines on the road. Because boundaries such as white lines are abundant on roads and are rarely hidden by other vehicles, the yaw rate can be calculated frequently.
 The control unit also has the evaluation unit 15e, which calculates the evaluation indexes b and c of the white line center line L, and the calculation unit 15d calculates the yaw rate when the indexes are at or above predetermined values. The yaw rate is thus calculated only when the values of the evaluation indexes b and c (the detection accuracy) of the white line center line L are high, improving the calculation accuracy of the yaw rate.
 The evaluation unit 15e performs its evaluation based on the number of scan lines on which the white line D1 is detected within the window that moves with the vehicle C, and on the error between the width measured on each line and the original line width. In this way it can be judged, for example, that the required detection accuracy cannot be obtained when the number of lines is insufficient, or when the error between each measured width and the original width is large.
 Furthermore, the acquisition unit 15a acquires the yaw rate output from the gyro sensor 24, and the calculation unit 15d corrects that yaw rate based on the yaw rate it has itself calculated. The yaw rate detected by the gyro sensor 24 can thereby be corrected using the yaw rate calculated by the calculation unit 15d, improving the accuracy of the detection value (yaw rate) of the gyro sensor 24.
 Next, a detection device and an information processing apparatus according to a second embodiment of the present invention are described with reference to Figs. 8 to 12. Parts identical to those of the first embodiment are given the same reference numerals and their description is omitted.
 The configuration of this embodiment is the same as in Fig. 1, but the target features are not features having continuity along the traveling path such as white lines; instead, features without such continuity, such as road signs and billboards, are targeted. The method of calculating the yaw rate of the autonomous driving vehicle C in this embodiment is described with reference to Fig. 8. Fig. 8 shows a flat plate-like landmark LA as the feature, in a case where the angle between the vehicle C and the landmark LA changes due to a lane change, a right or left turn, or the like.
 First, the angle Φ1 between the laser light emitted from the lidar 21 at a point P1 and the normal of the landmark LA surface, and the angle Φ2 between the laser light emitted from the lidar 21 at a point P2 different from P1 and the normal of the landmark LA surface, are obtained, and the change ΔΦ of this angle is computed (see equation (8)). The information on the normal of the landmark LA surface may be included, for example, in the information on the landmark LA in the map DB 10 and set in the setting unit 15c.
ΔΦ = Φ2 − Φ1   ... (8)
 Next, the yaw angle change ΔΨ is calculated using the irradiation angle θ_A1 of the laser light with which the lidar 21 irradiated the landmark LA at the point P1, the irradiation angle θ_A2 at the point P2, and the above ΔΦ (see equation (9)). The irradiation angles θ_A1 and θ_A2 indicate the orientation of the feature as seen from the autonomous driving vehicle C.
[Equation (9): formula given as an image in the original, expressing ΔΨ in terms of θ_A1, θ_A2, and ΔΦ]
 Next, the time difference Δt between the measurements at the two points is computed (see equation (10)).
Δt = t2 − t1   ... (10)
 Then the yaw angle change ΔΨ obtained by equation (9) is divided by the time difference Δt obtained by equation (10) to calculate the yaw rate Ψ_dot (see equation (11)).
Ψ_dot = ΔΨ / Δt   ... (11)
 That is, the yaw angle change amount of the autonomous driving vehicle C is calculated based on the orientation of the landmark LA (feature) as seen from the vehicle at the time of point P1 (first time), the orientation of the landmark LA as seen from the vehicle at the time of point P2 (second time), and the change since the first time in the angle relative to the normal direction of the landmark LA.
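 Equation (9) appears only as an image in the source, so the combination of θ_A1, θ_A2, and ΔΦ below is a reconstruction under one plausible sign convention (counterclockwise-positive angles, with the heading recovered as Ψ = Φ + normal − θ_A); it is an assumption, not the published formula:

def landmark_yaw_rate(t1, theta_a1, phi1, t2, theta_a2, phi2):
    # Assumed geometry: the beam direction in the world frame is heading + irradiation angle,
    # and phi is measured from the fixed landmark normal, so heading = phi + normal - theta_a.
    dphi = phi2 - phi1                   # eq. (8)
    dpsi = dphi - (theta_a2 - theta_a1)  # assumed reconstruction of eq. (9)
    dt = t2 - t1                         # eq. (10)
    return dpsi / dt                     # eq. (11)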
 Next, the method of calculating the detection accuracy of the landmark LA in this embodiment is described with reference to Figs. 9 and 10. Features such as the white lines of the first embodiment are rarely hidden by other vehicles, but landmarks LA such as the signs and billboards of Fig. 8 can be hidden by them. For example, in Fig. 9, when a vehicle Ca detects the landmark LA, part of the landmark may be hidden by a vehicle Cb traveling in the adjacent lane (occlusion occurs). When such occlusion occurs, detection results may not be obtained at two points as shown in Fig. 8. In this embodiment, therefore, size information of the landmark LA is included in the map DB 10, and the evaluation index is calculated by comparing that size information with the detection results of the lidar 21.
 Specifically, as shown in Fig. 10, the vertical and horizontal lengths of the landmark LA are obtained. Letting H_M be the vertical length of the landmark LA, W_M its horizontal length, H_L the vertical length of the landmark LA detected by the lidar 21, and W_L the detected horizontal length, the evaluation index a is expressed by the following equation (12).
[Equation (12): formula given as an image in the original, expressing a in terms of H_M, W_M, H_L, and W_L]
 Here, as shown in Fig. 10, the vertical length H_L of the landmark LA detected by the lidar 21 is taken as the length between the two end points of the vertically longest column in the point cloud representing the detected landmark LA, and the horizontal length W_L as the length between the two end points of the horizontally longest row in that point cloud.
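 Equation (12) is likewise an image in the source; the sketch below assumes an area-ratio form a = (H_L · W_L) / (H_M · W_M), which is near 1 when the landmark is fully visible and drops under occlusion, and approximates H_L and W_L by the extents of the detected point cloud:

import numpy as np

def landmark_extents(points):
    # points: (N, 2) array of (horizontal, vertical) coordinates of the detected landmark points.
    # The overall extents of the cloud approximate the longest detected row and column spans.
    pts = np.asarray(points)
    w_l = pts[:, 0].max() - pts[:, 0].min()
    h_l = pts[:, 1].max() - pts[:, 1].min()
    return h_l, w_l

def eval_index_a(h_l, w_l, h_m, w_m):
    # Assumed area-ratio form of eq. (12), comparing detected size with the map DB size.
    return (h_l * w_l) / (h_m * w_m)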
 The correction process for the gyro sensor 24 in this embodiment is described with reference to the flowchart of Fig. 11; this flowchart is executed by the control unit 15. First, in step S301, the acquisition unit 15a reads landmark information near the vehicle position from the map DB 10. The landmark information includes the information on the normal of the landmark surface described above and the vertical and horizontal lengths of the landmark LA.
 Next, in step S302, the recognition unit 15b performs landmark detection processing on the detection results of the lidar 21, and in step S303 the evaluation unit 15e calculates the evaluation index a using equation (12).
 Next, in step S304, the evaluation unit 15e determines whether the evaluation index a is larger than a predetermined value. If it is smaller (No), the process returns to step S301; if it is larger (Yes), the evaluation unit 15e determines in step S305 that yaw rate calculation using the landmark is possible.
 Next, in step S306, the calculation unit 15d determines whether a lane change is possible, in the same way as in step S106. If a lane change is not possible (No), the process returns to step S301; if it is possible (Yes), the lane change operation is performed in step S307. That is, a control signal for the lane change is output via the output unit 16 to the control device that controls the steering function, accelerator, brake, and so on of the autonomous driving vehicle C.
 Next, in step S308, the calculation unit 15d calculates the yaw rate using equations (8) to (11); the yaw rate calculation process is described with the flowchart of Fig. 12 below. Then, in step S309, the sensitivity and offset of the gyro sensor 24 are calculated (corrected) using equation (7) described in the first embodiment.
 Next, the yaw rate calculation process (step S308) is described with reference to the flowchart of Fig. 12. This flowchart is mainly executed by the calculation unit 15d.
 First, in step S400, the irradiation angle θ of the laser light with which the lidar 21 irradiated the landmark LA is obtained.
 Next, in step S401, the angle between the laser light emitted from the lidar 21 and the normal of the landmark LA surface is obtained based on the landmark surface normal information acquired in step S301.
 Next, in step S402, the difference from the angle at the previous time is obtained using equation (8), and in step S403 the yaw angle change ΔΨ is obtained using equation (9) from the difference in the irradiation angle θ from the previous time and the angle difference ΔΦ obtained in step S402.
 Next, in step S404, the calculation unit 15d obtains the time difference Δt(k) from the previous scan time using equation (10), and in step S405 it calculates the yaw rate Ψ_dot(k) using equation (11).
 According to this embodiment, the apparatus includes the setting unit 15c, which sets the direction of the normal of the feature, and the calculation unit 15d calculates the yaw angle change amount of the autonomous driving vehicle C based on the orientation of the landmark LA (feature) as seen from the vehicle at the time of point P1 (first time), the orientation of the landmark LA as seen from the vehicle at the time of point P2 (second time), and ΔΦ. In this way, the yaw rate can be calculated using a feature such as a road sign.
 In the embodiments described above, when a plurality of features such as a white line and a landmark such as a road sign are detected, as shown for example in Fig. 13, the yaw rate may be calculated from each feature and the results averaged to improve accuracy. Weighted averaging may be used for this; averaging that reflects the accuracy of each estimate then takes place, enabling still higher yaw rate accuracy. For example, when M yaw rates have been calculated, the combined value can be calculated by the following equation (13), where w denotes the weighting value. In equation (13), w = a is used when the target is a landmark and w = b × c when the target is a white line; alternatively, the average of b and c may be used.
Ψ_dot = ( Σ_{i=1..M} w_i · Ψ_dot_i ) / ( Σ_{i=1..M} w_i )   ... (13)
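 A minimal sketch of the weighted averaging of equation (13), with the weight chosen per feature type as described above (w = a for a landmark, w = b × c for a white line); the record structure is an illustrative assumption:

def fuse_yaw_rates(estimates):
    # estimates: list of (yaw_rate, feature_type, indexes) tuples, where indexes holds (a,)
    # for a landmark and (b, c) for a white line.
    num = den = 0.0
    for yaw_rate, feature_type, indexes in estimates:
        w = indexes[0] if feature_type == "landmark" else indexes[0] * indexes[1]
        num += w * yaw_rate
        den += w
    return num / den  # weighted average per eq. (13)

For example, fuse_yaw_rates([(0.10, "white_line", (0.9, 0.8)), (0.12, "landmark", (0.95,))]) combines a white-line estimate weighted by b × c = 0.72 with a landmark estimate weighted by a = 0.95.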
 The present invention is not limited to the embodiments described above. Those skilled in the art can implement various modifications in accordance with conventionally known findings without departing from the gist of the present invention. As long as such a modification still has the configuration of the information processing apparatus of the present invention, it is of course included within the scope of the present invention.
  1  detection device
  15  control unit (information processing apparatus)
  15a  acquisition unit (first acquisition unit, second acquisition unit)
  15b  recognition unit
  15c  setting unit
  15d  calculation unit (correction unit)
  15e  evaluation unit
  21  lidar (detection unit)
  S302  landmark detection processing by the lidar (acquisition step)
  S307  lane change operation (output step)
  S308  yaw rate calculation processing (calculation step)

Claims (8)

  1.  An information processing apparatus comprising:
     a first acquisition unit that acquires a detection result of a detection unit that detects a feature around a moving body that travels autonomously;
     an output unit that outputs, based on the acquired detection result, control information for controlling the moving body so that a moving direction of the moving body changes; and
     a calculation unit that calculates a yaw angle change amount of the moving body while the moving direction is changing, based on a detection result of the detection unit obtained while the moving direction of the moving body is being changed by the control information.
  2.  The information processing apparatus according to claim 1, wherein the first acquisition unit continuously acquires, at predetermined time intervals, detection results of a detection unit that detects a feature around the moving body having continuity along a traveling path, the apparatus further comprising:
     a recognition unit that recognizes a reference line of the feature from the detection results based on a predetermined rule; and
     a first setting unit that sets a direction of the reference line,
     wherein the calculation unit calculates the yaw angle change amount of the moving body based on a moving direction of the moving body relative to the reference line at a first time and a moving direction of the moving body relative to the reference line at a second time.
  3.  The information processing apparatus according to claim 1, further comprising a second setting unit that sets an orientation of a normal direction of the feature, wherein
      the calculation unit calculates the yaw angle change amount of the moving body based on an orientation of the feature as seen from the moving body at a first time, an orientation of the feature as seen from the moving body at a second time, and a change in angle of the normal direction of the feature from the first time.
  4.  The information processing apparatus according to claim 2 or 3, further comprising an evaluation unit that evaluates detection accuracy of at least one of the feature and the reference line, wherein
      the calculation unit calculates the yaw angle change amount when an evaluation result of the detection accuracy is equal to or higher than a predetermined level.
  5.  The information processing apparatus according to any one of claims 1 to 4, further comprising:
      a second acquisition unit that acquires information output from a gyro sensor; and
      a correction unit that corrects the information output from the gyro sensor based on the yaw angle change amount calculated by the calculation unit.
  6.  A detection device comprising:
      a detection unit that detects features around a moving body that travels autonomously;
      a first acquisition unit that acquires a detection result of the detection unit;
      an output unit that outputs, based on the acquired detection result, control information for controlling the moving body such that a moving direction of the moving body changes; and
      a calculation unit that calculates a yaw angle change amount of the moving body while the moving direction is changing, based on a detection result of the detection unit obtained while the moving direction of the moving body is being changed by the control information.
  7.  An information processing method executed by an information processing apparatus that performs predetermined processing based on a detection result of a detection unit that detects features around a moving body that travels autonomously, the method comprising:
      an acquisition step of acquiring a detection result of the detection unit;
      an output step of outputting, based on the acquired detection result, control information for controlling the moving body such that a moving direction of the moving body changes; and
      a calculation step of calculating a yaw angle change amount of the moving body while the moving direction is changing, based on a detection result of the detection unit obtained while the moving direction of the moving body is being changed by the control information.
      (A minimal illustrative sketch of these steps follows the claims.)
  8.  An information processing program that causes a computer to execute the information processing method according to claim 7.
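As a rough, non-authoritative sketch of the method steps recited in claim 7 above: the names below are hypothetical, the lane-change control is reduced to a stub, and the yaw angle change is computed in the claim-2 style, from the moving direction relative to a feature reference line at two times.

```python
import math
from typing import Protocol

class DetectionUnit(Protocol):
    """Abstraction of the feature-detecting unit (e.g., a lidar)."""
    def detect(self) -> dict: ...

def acquisition_step(detector: DetectionUnit) -> dict:
    """Acquisition step: obtain the detection result of the detection unit."""
    return detector.detect()

def output_step(detection: dict) -> dict:
    """Output step: emit control information that changes the moving
    direction of the moving body (here reduced to a lane-change stub)."""
    return {"command": "lane_change", "target_lane": detection.get("adjacent_lane")}

def calculation_step(heading_vs_line_t1: float, heading_vs_line_t2: float) -> float:
    """Calculation step: yaw angle change [rad] while the direction is
    changing, taken as the difference between the moving direction relative
    to the reference line at a first time and at a second time."""
    delta = heading_vs_line_t2 - heading_vs_line_t1
    # Wrap to (-pi, pi] so a change across the +/-pi seam is handled.
    return math.atan2(math.sin(delta), math.cos(delta))

# Example: a lane change that rotates the heading by about 0.05 rad.
print(calculation_step(0.00, 0.05))  # -> 0.05
```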
PCT/JP2018/046194 2017-12-19 2018-12-14 Information processing device WO2019124279A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017242746 2017-12-19
JP2017-242746 2017-12-19

Publications (1)

Publication Number Publication Date
WO2019124279A1 (en)

Family

ID=66993532

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/046194 WO2019124279A1 (en) 2017-12-19 2018-12-14 Information processing device

Country Status (1)

Country Link
WO (1) WO2019124279A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07179140A (en) * 1993-12-24 1995-07-18 Nissan Motor Co Ltd Automatic maneuver device for vehicle
JP2003048564A (en) * 2001-08-07 2003-02-18 Nissan Motor Co Ltd Vehicular steering control system
JP2012066777A (en) * 2010-09-27 2012-04-05 Mazda Motor Corp Yaw rate deviation detection apparatus
JP2014106683A (en) * 2012-11-27 2014-06-09 Aisin Aw Co Ltd Road gradient record system, road gradient record method, road gradient record program, driving support system, driving support method, and driving support program
JP2016133838A (en) * 2015-01-15 2016-07-25 トヨタ自動車株式会社 Composite line determination device and composite line determination method


Legal Events

Date Code Title Description
121   Ep: the epo has been informed by wipo that ep was designated in this application
      Ref document number: 18890012; Country of ref document: EP; Kind code of ref document: A1
NENP  Non-entry into the national phase
      Ref country code: DE
122   Ep: pct application non-entry in european phase
      Ref document number: 18890012; Country of ref document: EP; Kind code of ref document: A1
NENP  Non-entry into the national phase
      Ref country code: JP