WO2018070021A1 - Parking assistance method and parking assistance device - Google Patents
Parking assistance method and parking assistance device
- Publication number
- WO2018070021A1 (application PCT/JP2016/080386, JP2016080386W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- parking
- target position
- surrounding situation
- map
- parking target
- Prior art date
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/06—Automatic manoeuvring for parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/10—Interpretation of driver requests or demands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/027—Parking aids, e.g. instruction means
- B62D15/0285—Parking performed automatically
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/168—Driving aids for parking, e.g. acoustic or visual feedback on parking space
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/14—Yaw
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/20—Static objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/402—Type
- B60W2554/4026—Cycles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2300/00—Purposes or special features of road vehicle drive control systems
- B60Y2300/06—Automatic manoeuvring for parking
Definitions
- the present invention relates to a parking support method and a parking support device for executing automatic parking.
- an object of the present invention is to provide a parking support method and a parking support apparatus that can store a surrounding situation of a parking target position suitable for reference during automatic parking.
- the surrounding situation of the parking target position for reference when executing automatic parking is stored.
- the surrounding situation of the parking target position is detected, and the detected surrounding situation is presented to the occupant.
- a suitability input by the occupant with respect to the presented surrounding situation is received, and the surrounding situation is stored when an input indicating that the occupant finds it suitable is obtained.
- because the surrounding situation is stored only when the occupant has confirmed it as suitable, a surrounding situation of the parking target position suitable for reference during automatic parking can be stored; a parking support method and a parking support device achieving this are thereby provided.
- FIG. 2A is a schematic view showing a state around a parking target position according to the first embodiment of the present invention
- FIG. 2B is a schematic view showing a state of parking at the parking target position.
- FIG. 4A is a schematic diagram illustrating a state around a parking target position according to a comparative example
- FIG. 4B is a schematic diagram illustrating a state in which an environment map cannot be correctly generated.
- first and second embodiments of the present invention will be described with reference to the drawings.
- the same or similar parts are affixed with the same or similar reference numerals.
- the drawings are schematic, and the relationship between the thickness and the planar dimensions, the ratio of the thickness, and the like are different from the actual ones.
- portions having different dimensional relationships and ratios are included between the drawings.
- the first and second embodiments shown below exemplify apparatuses and methods for embodying the technical idea of the present invention; the technical idea of the present invention does not limit the materials, shapes, structures, arrangements, and the like of the components to those described below.
- the technical idea of the present invention can be variously modified within the technical scope defined by the claims.
- the parking assistance apparatus according to the first embodiment of the present invention can be mounted on a vehicle (hereinafter, the vehicle equipped with the parking assistance apparatus according to the first embodiment of the present invention is referred to as “own vehicle”).
- the parking assist apparatus according to the first embodiment of the present invention includes a control device (controller) 1, an ambient condition sensor 2, a vehicle condition sensor 3, an input device (interface) 4, a storage device 5, and a presentation device 6.
- the ambient situation sensor 2 detects the ambient situation of the host vehicle including the front situation, the rear situation, and the side situation of the own vehicle.
- the surroundings of the host vehicle that can be detected by the surrounding situation sensor 2 include obstacles such as buildings, walls, pillars, signs, other vehicles, bicycles, and other stationary three-dimensional objects, as well as landmarks such as white lines on the road and parking frames.
- although a case where the surrounding situation sensor 2 includes a camera 2a and a laser radar 2b is illustrated, the type and number of surrounding situation sensors 2 are not particularly limited.
- the vehicle status sensor 3 includes a wheel speed sensor 3a and a yaw rate sensor 3b.
- the wheel speed sensor 3a detects the wheel speed of the host vehicle, and outputs the detected wheel speed to the control device 1 as odometry information.
- the yaw rate sensor 3b detects the change rate (yaw rate) of the yaw angle of the host vehicle, and outputs the detected yaw rate to the control device 1 as odometry information.
- the input device 4 accepts various instruction information of the occupant.
- the input device 4 may be a touch panel, a switch, or a button of the presentation device 6, or may be a switch or button installed on the center console or the like separately from the presentation device 6. Further, the input device 4 may be a voice input device.
- the storage device 5 can be configured by a semiconductor storage device, a magnetic storage device, an optical storage device, or the like, and may include a register, a cache memory, or the like.
- the storage device 5 may be built in the control device 1.
- as the presentation device 6, for example, a navigation device provided with a display such as a liquid crystal display can be used.
- the presentation device 6 is installed at a position where it can be easily seen by a passenger such as an instrument panel in the vehicle interior.
- the control device 1 is a controller such as an electronic control unit (ECU), for example, and can be composed of a computer including a central processing unit (CPU) and a storage device, a processor equivalent to the computer, and the like.
- the control device 1 may instead be a programmable logic device (PLD) such as a field-programmable gate array (FPGA), or a functional logic circuit configured in a general-purpose semiconductor integrated circuit.
- the parking assist device detects the surrounding situation of the parking target position with the surrounding situation sensor 2 during one or more parking operations, and stores the detected surrounding situation of the parking target position in the storage device 5.
- "one-time parking" includes, for example, an operation in which the host vehicle travels until it reaches the parking target position. Even if the host vehicle does not reach the parking target position, an operation such as traveling around the parking target position is also regarded as "one-time parking" as long as the surrounding situation of the parking target position can be detected by the surrounding situation sensor 2.
- parking a plurality of times means repeating such single parking operations; the timing of the repetition is not particularly limited.
- the second parking may be performed immediately after the first parking, or on the day after the first parking.
- the multiple parking operations may be the same parking operation or different parking operations.
- for example, the first of two parking operations may be an operation in which the host vehicle travels until it reaches the parking target position, and the second may be an operation of traveling around the parking target position. In the first embodiment of the present invention, a case where a single parking operation is executed will be described.
- the surrounding situation of the parking target position stored in the storage device 5 as a learning result by the parking assist device according to the first embodiment of the present invention can be referred to in the next and subsequent parking.
- the timing of “parking after the next time” is not particularly limited, and includes, for example, immediately after learning the surrounding situation of the parking target position or after the day after learning the surrounding situation of the parking target position.
- the parking operation of “parking after the next time” includes the operation of the vehicle targeting the parking target position such as the operation of the host vehicle reaching the parking target position and the operation of traveling around the parking target position.
- with reference to FIG. 2A and FIG. 2B, an example of a method of learning the surrounding situation of a parking target position will be described. It is assumed that the vehicle is parked at the parking target position P2 with the position P1 of the host vehicle 21 shown in FIG. 2A as the parking start position. In the vicinity of the parking target position P2, there are an obstacle 31 that is a building, obstacles 32 and 33 that are garage pillars, and another vehicle 22 that is parked in the garage. The occupant inputs information on the parking start position P1 via the input device 4. Thereafter, parking is started, and the vehicle moves from the parking start position P1 to the parking target position P2, as indicated by the dashed arrow in FIG. 2B.
- the surrounding state sensor 2 detects the surrounding state including obstacles 31, 32, and 33 around the parking target position P2.
- the occupant inputs information on the parking target position P2 to the control device 1 via the input device 4.
- the occupant may park manually, or the parking assistance device may control the host vehicle 21 to perform the parking.
- the map generation unit 11 generates an environment map (environment map data) including the surrounding situation of the parking target position based on the surrounding situation detected by the surrounding situation sensor 2, and stores the generated environment map in the storage device 5.
- the map generation unit 11 generates an environmental map based on the ambient situation detected by the ambient situation sensor 2 and odometry information such as wheel speed and yaw rate detected by the vehicle situation sensor 3.
- FIG. 3 shows an example of the environmental map M11 generated by the map generation unit 11 for the peripheral area of the parking target position P2 shown in FIGS. 2A and 2B.
- the environment map M11 includes obstacles 31a, 32a, and 33a that exist around the parking target position P2.
- the obstacles 31a, 32a, 33a on the environment map M11 correspond to the obstacles 31, 32, 33 shown in FIGS. 2A and 2B.
- a spatial map composed of a plurality of dots is illustrated as the environment map M11.
- however, the environment map is not particularly limited to this, and may be, for example, an orthoimage obtained by applying ortho-correction to stitched bird's-eye-view images.
- the map generation unit 11 sets the parking start position P1 on the environment map M11 based on the information on the parking start position P1 input via the input device 4, and stores the set parking start position P1 in the storage device 5. Similarly, the map generation unit 11 sets the parking target position P2 on the environment map M11 based on the information on the parking target position P2 input via the input device 4, and stores the set parking target position P2 in the storage device 5.
- as a comparative example, consider a parking lot of a condominium or a commercial facility as shown in FIG. 4A, in which parking frames 41 having the same interval and arrangement are lined up, and an environment map is generated when parking at the parking target position 41a.
- because the parking frames 41 have the same interval and arrangement, even if the interval and arrangement of the parking frames 41 are stored, an environment map containing parking frames 42, 43, and the like that deviate from the actual parking frames 41 may be generated due to convergence to a local solution, and a parking target position deviated from the actual parking target position 41a may be set.
- the same problem may occur when an occupant makes a mistake and parks in a parking frame different from the parking target position 41a shown in FIG. 4A.
- the presentation control unit 12 illustrated in FIG. 1 causes the presentation device 6 to present the surrounding situation of the parking target position detected by the surrounding situation sensor 2, and requests the occupant to input whether the presented surrounding situation is suitable.
- for example, the presentation control unit 12 causes the presentation device 6 to present the environment map M11 including the parking start position P1 and the parking target position P2 illustrated in FIG. 3 as the surrounding situation of the parking target position detected by the surrounding situation sensor 2, and requests the occupant to input the suitability of the environment map M11 including the parking start position P1 and the parking target position P2.
- when causing the presentation device 6 to present the surrounding situation of the parking target position, the presentation control unit 12 may also prompt the occupant to input suitability by presenting text information such as "Please input whether the surrounding situation of the parking target position is appropriate." or corresponding voice information.
- the input device 4 accepts an appropriateness input by the occupant for the surrounding situation of the presented parking target position when the surrounding situation of the parking target position is presented to the presentation device 6.
- the occupant visually checks the environment map M11 including the parking start position P1 and the parking target position P2 presented on the presentation device 6, and judges the appropriateness of the environment map M11 itself (that is, the presence or absence and the positions of the obstacles 31a, 32a, 33a and the other vehicle 22a), the appropriateness of the parking start position P1, and the appropriateness of the parking target position P2.
- when the occupant determines that at least one of them is not appropriate, the occupant inputs, via the input device 4, a suitability input indicating that it is inappropriate.
- when the occupant determines that all of the environment map M11, the parking start position P1, and the parking target position P2 are appropriate, the occupant inputs, via the input device 4, a suitability input indicating that they are appropriate.
- the determination unit 13 determines the suitability of the surrounding situation of the parking target position detected by the surrounding situation sensor 2 according to the occupant's suitability input received by the input device 4. When a suitability input indicating that the occupant finds it appropriate is obtained, the determination unit 13 determines that the surrounding situation of the parking target position detected by the surrounding situation sensor 2 is appropriate for reference during automatic parking, adopts it as data to be referred to during automatic parking, and stores it in the storage device 5. For example, the determination unit 13 may store the data of the surrounding situation of the parking target position detected by the surrounding situation sensor 2 in the storage device 5 as it is, or may store the data after processing it.
- on the other hand, when a suitability input indicating that the occupant finds it unsuitable is obtained, the determination unit 13 determines that the surrounding situation of the parking target position detected by the surrounding situation sensor 2 is inappropriate for reference during automatic parking, does not adopt it as data to be referred to during automatic parking, and does not store it in the storage device 5. At this time, the determination unit 13 may cause the presentation device 6 to present that it is necessary to re-learn the surrounding situation of the parking target position, to edit the environment map M11 including the parking start position P1 and the parking target position P2, or the like.
- when parking at the parking target position from the next time onward, the parking support unit 14 reads the data of the surrounding situation of the parking target position stored in the storage device 5 and executes automatic parking using that data. For example, the parking support unit 14 estimates (initializes) the position of the host vehicle 21 on the environment map M11 in accordance with instruction information from the occupant via the input device 4, and then outputs control signals for parking the host vehicle 21 at the parking target position P2 based on the estimated position, thereby executing automatic parking.
- in step S11, the map generation unit 11 determines whether or not the parking start position has been reached based on instruction information from the occupant via the input device 4; when it is determined that the parking start position has been reached, the process proceeds to step S12.
- alternatively, position information of the parking target position may be registered in advance in a navigation device such as the presentation device 6, and whether or not the parking start position has been reached may be determined by referring to vehicle position information obtained from a GPS signal or the like, with the determination result presented to the occupant; for example, it is determined that the parking start position has been reached when the host vehicle is within a fixed distance of the parking target position.
- in step S12, manual parking from the parking start position toward the parking target position is started.
- the surrounding situation sensor 2 detects the surrounding situation of the parking target position including obstacles and feature points that exist around the parking target position.
- the vehicle status sensor 3 detects odometry information including wheel speed pulses and yaw rate.
- in step S13, the map generation unit 11 determines whether the parking target position has been reached and parking has been completed based on the occupant's shift lever operation, parking brake operation, or the like. When it is determined that parking is completed, the process proceeds to step S14.
- in step S14, the map generation unit 11 converts the position information of the obstacles and feature points in the surrounding situation detected by the surrounding situation sensor 2 based on the odometry information detected by the vehicle status sensor 3, generates an environment map including the targets around the parking target position, and stores it in the storage device 5 (a minimal dead-reckoning sketch of this kind of conversion appears at the end of this section).
- the presentation control unit 12 causes the presentation device 6 to present the surrounding situation of the parking target position such as the environmental map generated by the map generation unit 11.
- the input device 4 accepts an appropriateness input for the surrounding situation of the presented parking target position when the surrounding situation of the parking target position is presented to the presentation device 6. The occupant inputs the suitability input for the surrounding situation of the parking target position presented on the presentation device 6 via the input device 4.
- in step S15, the determination unit 13 determines whether or not the surrounding situation of the parking target position detected by the surrounding situation sensor 2 is appropriate as data to be referred to during automatic parking, according to the occupant's suitability input received by the input device 4.
- when a suitability input indicating that the occupant finds it appropriate is obtained, the determination unit 13 determines that the surrounding situation of the parking target position detected by the surrounding situation sensor 2 is appropriate for reference during automatic parking, and the process proceeds to step S16.
- in step S16, the determination unit 13 adopts the surrounding situation of the parking target position as data (learning result) to be referred to during automatic parking and stores it in the storage device 5.
- on the other hand, when a suitability input indicating that the occupant finds it unsuitable is obtained in step S15, the determination unit 13 determines that the surrounding situation of the parking target position detected by the surrounding situation sensor 2 is inappropriate for reference during automatic parking, and the process proceeds to step S17.
- in step S17, the determination unit 13 does not adopt the surrounding situation of the parking target position as data (learning result) to be referred to during automatic parking, and does not store it in the storage device 5.
- the parking support program according to the first embodiment of the present invention causes the computer constituting the control device 1 to execute the procedure of the parking support method shown in FIG.
- the parking assistance program according to the first embodiment of the present invention can be stored in, for example, the storage device 5.
- as described above, according to the first embodiment, when the surrounding situation of the parking target position is stored and automatic parking is executed using the stored surrounding situation, the presentation device 6 presents an environment map or the like representing the surrounding situation of the parking target position detected by the surrounding situation sensor 2, and the surrounding situation is stored when a suitability input indicating that the occupant finds it suitable is obtained; therefore, a surrounding situation of the parking target position suitable for reference during automatic parking can be stored.
- furthermore, since the presented surrounding situation includes the parking target position, the occupant can easily understand the positional relationship between the parking target position and the obstacles or the like existing around it when inputting the suitability of the surrounding situation.
- as a first modification of the first embodiment, a case will be described in which the presentation control unit 12 presents, in the environment map generated by the map generation unit 11 as the surrounding situation of the parking target position detected by the surrounding situation sensor 2, a parking target position candidate at a position different from the parking target position, in addition to the parking target position.
- for example, the presentation control unit 12 presents the parking target position 51a and the parking target position candidates 51b and 51c, which differ from the parking target position 51a, in the environment map including the parking frame 51.
- the parking target position candidates 51b and 51c are positions where there is a possibility of a correct parking target position when the parking target position 51a is a position erroneously detected by the surrounding state sensor 2.
- the parking target position 51a and the parking target position candidates 51b and 51c are presented in different colors so that they can be distinguished from each other.
- the occupant inputs the suitability input of the environmental map including the parking frame 51 and the parking target position 51a via the input device 4 in consideration of the possibility that the parking target position candidates 51b and 51c are correct parking target positions.
- by presenting the surrounding situation including the parking target position candidates 51b and 51c different from the parking target position 51a in this way, the occupant can confirm the suitability of the parking target position 51a using the parking target position candidates 51b and 51c as reference information.
- as a second modification of the first embodiment, when automatic parking is started, the presentation control unit 12 causes the presentation device 6 to present an environment map that is the surrounding situation of the parking target position stored in the storage device 5, as illustrated in FIG. 7.
- the environmental map includes parking frames 61, 62, and 63, and a parking target position 62a is presented so as to partially overlap the parking frame 62.
- the other vehicles 22 and 23 are parked in the parking frames 61 and 63. Of these, the other vehicle 23 protrudes from the parking frame 63 and is parked near the parking target position 62a. For this reason, it is in the situation where the own vehicle 21 is difficult to park at the parking target position 62a.
- the presentation control unit 12 presents a parking alternative position 62b in which the host vehicle 21 is easily parked at a position that is free from obstacles and is shifted from the parking target position 62a.
- the input device 4 accepts, from the occupant, information for selecting either the parking target position 62a or the parking alternative position 62b.
- the parking support unit 14 outputs control signals for parking at the parking target position 62a or the parking alternative position 62b selected by the occupant.
- in this way, the parking alternative position 62b is presented at a position different from and near the parking target position 62a, and the occupant is requested to choose whether to park at the parking target position 62a or at the parking alternative position 62b.
- the parking target position 62a can be changed according to the parking environment. For example, when there is an obstacle such as a bicycle at the parking target position at home, the parking target position can be changed so as to avoid the obstacle, and automatic parking can be performed.
- the configuration of the parking support apparatus according to the second embodiment of the present invention is the same as that of the first embodiment: it includes the control device 1, the surrounding situation sensor 2, the vehicle status sensor 3, the input device (interface) 4, the storage device 5, and the presentation device 6.
- in the second embodiment, the surrounding situation sensor 2 detects the surrounding situation of the parking target position, including obstacles and feature points existing around the parking target position, at each of a plurality of parking operations.
- the vehicle status sensor 3 detects odometry information including wheel speed pulses, yaw rate, and the like at each of the plurality of parking operations.
- the map generation unit 11 generates an environment map based on the surrounding situation detected by the surrounding situation sensor 2 and the odometry information such as wheel speed and yaw rate detected by the vehicle status sensor 3 at each of the plurality of parking operations.
- each generated environment map is stored in the storage device 5. Further, the map generation unit 11 sets the parking start position P1 on the environment map based on the information on the parking start position P1 input via the input device 4 and stores it in the storage device 5, and likewise sets the parking target position P2 on the environment map based on the information on the parking target position P2 input via the input device 4 and stores it in the storage device 5.
- the map generation unit 11 further generates data of an environment map (hereinafter sometimes referred to as an "integrated map") obtained by integrating the plurality of environment maps generated for the respective parking operations.
- as a method for integrating a plurality of environment maps, for example, a least-squares method that minimizes the error between corresponding feature points can be employed.
- alternatively, an ICP (iterative closest point) algorithm can be employed when the surrounding situation is obtained as point cloud information using a laser range finder (LRF) as the surrounding situation sensor 2.
- the map generation unit 11 may generate an integrated map by collectively integrating the plurality of environment maps obtained for the respective parking operations after all of them are completed. For example, as shown in FIG. 8, after the third parking, the map generation unit 11 may integrate the environment map M11 obtained by the first parking, the environment map M12 obtained by the second parking, and the environment map M13 obtained by the third parking to generate an integrated map M21. Although FIG. 8 illustrates a case where the environment maps M11, M12, M13 and the integrated map M21 have the same contents, the data of the environment maps M11, M12, M13 and the integrated map M21 may differ from one another (the same applies to FIGS. 9 and 10).
- alternatively, the map generation unit 11 may generate (update) an integrated map each time the vehicle is parked from the second parking onward. For example, as shown in FIG. 9, after the second parking, the map generation unit 11 integrates the environment map M11 obtained by the first parking and the environment map M12 obtained by the second parking to generate a first integrated map M21; then, after the third parking, it may integrate the environment map M13 obtained by the third parking with the first integrated map M21 to generate a second integrated map M22.
- alternatively, the map generation unit 11 may divide the plurality of environment maps into several groups and integrate them group by group. For example, as shown in FIG. 10, the map generation unit 11 integrates the environment map M11 obtained by the first parking and the environment map M12 obtained by the second parking as a first group to generate a first integrated map M21, and integrates the environment map M13 obtained by the third parking and the environment map M14 obtained by the fourth parking as a second group to generate a second integrated map M22. The map generation unit 11 then further integrates the first integrated map M21 and the second integrated map M22 to generate a third integrated map M23.
- FIG. 11 shows an example of the integrated map M21 generated by the map generation unit 11.
- the integrated map M21 includes the obstacles 31a, 32a, 33a around the parking target position P2, and the parking start position P1 and the parking target position P2 are set on it. Note that, although the integrated map M21 illustrated in FIG. 11 shows the same case as the environment map M11 illustrated in FIG. 3, the integrated map M21 may be more accurate map data because a plurality of learning results complement one another.
- the presentation control unit 12 causes the presentation device 6 to present an integrated map including the parking start position and the parking target position stored in the storage device 5, and requests the occupant to input whether the integrated map including the parking start position and the parking target position is suitable. For example, the presentation control unit 12 causes the presentation device 6 to present the integrated map M21 including the parking start position P1 and the parking target position P2 illustrated in FIG. 11 and requests the occupant to input its suitability.
- when the integrated map M21 is generated by integrating the plurality of environment maps M11, M12, and M13 after all of the parking operations are completed, the presentation control unit 12 presents the integrated map M21 and requests the occupant's suitability input for the integrated map M21.
- when the first integrated map M21 and the second integrated map M22 are sequentially generated at each parking from the second parking onward, the first integrated map M21 and the second integrated map M22 may be presented in turn and the occupant's suitability input may be requested for each of them; alternatively, only the finally generated second integrated map M22 may be presented without presenting the first integrated map M21, and the occupant's suitability input may be requested only for the second integrated map M22.
- similarly, when the first integrated map M21 and the second integrated map M22 are generated for the respective groups and then integrated to generate the third integrated map M23, the first integrated map M21, the second integrated map M22, and the third integrated map M23 may be presented in sequence and the occupant's suitability input may be requested for each of them; alternatively, only the finally generated third integrated map M23 may be presented without presenting the first integrated map M21 and the second integrated map M22, and the occupant's suitability input may be requested only for the third integrated map M23.
- the input device 4 accepts the occupant's suitability input for the presented surrounding situation of the parking target position when it is presented on the presentation device 6. For example, when the integrated map M21 shown in FIG. 11 is presented on the presentation device 6, the occupant checks the presented integrated map M21 itself (that is, the obstacles 31a, 32a, 33a and the other vehicle 22a in the integrated map M21), the parking start position P1 on the integrated map M21, and the parking target position P2 on the integrated map M21; if the occupant determines that at least one of them is not appropriate, a suitability input indicating that it is inappropriate is input via the input device 4.
- on the other hand, when the occupant determines that all of the presented integrated map M21 itself, the parking start position P1 on the integrated map M21, and the parking target position P2 on the integrated map M21 are appropriate, the occupant inputs a suitability input indicating that they are appropriate via the input device 4.
- the determination unit 13 determines the suitability of the integrated map M21 according to the suitability input of the occupant received by the input device 4.
- when a suitability input indicating that the occupant finds it appropriate is obtained, the determination unit 13 determines that the integrated map M21 is appropriate, adopts the integrated map M21 as map data to be referred to during automatic parking, and stores it in the storage device 5.
- on the other hand, when a suitability input indicating that the occupant finds it unsuitable is obtained, the determination unit 13 determines that the integrated map M21 is inappropriate, does not adopt it as map data to be referred to during automatic parking, and does not store it in the storage device 5.
- when parking at the parking target position from the next time onward, the parking support unit 14 reads the data of the surrounding situation of the parking target position stored in the storage device 5 and executes automatic parking using that data.
- note that the presentation control unit 12 may request a suitability input for the environment map by causing the presentation device 6 to present the environment map generated by the map generation unit 11 at each of the plurality of parking operations.
- likewise, the input device 4 may accept the occupant's suitability input for the environment map whenever the environment map is presented on the presentation device 6 at each parking.
- the determination unit 13 may then determine the suitability of the environment map at each parking according to the occupant's suitability input received by the input device 4; that is, when a suitability input indicating that the occupant finds it appropriate is obtained, the determination unit 13 determines that the environment map is appropriate and stores it in the storage device 5, and when a suitability input indicating that the occupant finds it unsuitable is obtained, the determination unit 13 determines that the environment map is inappropriate and does not store it in the storage device 5.
- in step S21, the map generation unit 11 determines whether or not the parking start position has been reached based on instruction information from the occupant via the input device 4; when it is determined that the parking start position has been reached, the process proceeds to step S22.
- in step S22, manual parking from the parking start position toward the parking target position is started.
- the ambient condition sensor 2 detects obstacles and feature points around the parking target position.
- the vehicle status sensor 3 detects odometry information including wheel speed pulses and yaw rate.
- in step S23, the map generation unit 11 determines whether or not the parking target position has been reached and parking is completed based on the occupant's shift lever operation, parking brake operation, or the like. When it is determined that parking is completed, the process proceeds to step S24.
- in step S24, the map generation unit 11 generates an environment map (display image) including the parking start position and the parking target position, based on the surrounding situation detected by the surrounding situation sensor 2 and the odometry information detected by the vehicle status sensor 3.
- in step S25, the map generation unit 11 determines whether or not the parking executed in steps S21 to S23 is the first parking for the same parking target position. If it is determined to be the first parking, only the environment map generated this time exists and an integrated map cannot be generated, so the processing ends. On the other hand, if it is determined in step S25 that it is not the first parking, the process proceeds to step S26.
- in step S26, the map generation unit 11 determines whether or not the detection (learning) of the surrounding situation of the parking target position is sufficient. For example, using the number of parking operations (learning operations) for the same parking target position as a threshold, the surrounding situation is determined to be sufficiently detected when the number of parking operations is three or more, and insufficiently detected when it is less than three. Alternatively, using the obstacle existence probability in the surrounding situation of the parking target position as a threshold, the surrounding situation may be determined to be sufficiently detected when the obstacle existence probability is equal to or greater than the threshold, and insufficiently detected when it is less than the threshold. Note that the determination in step S26 of whether the surrounding situation has been sufficiently detected may be omitted, and the process may simply proceed to step S27.
- in step S27, the presentation control unit 12 causes the presentation device 6 to present the environment map (display image) including the parking start position and the parking target position generated by the map generation unit 11, and requests a suitability input from the occupant.
- the occupant visually recognizes the display image presented on the presentation device 6 and inputs suitability input for the display image via the input device 4.
- when the occupant determines that the presence or position of an obstacle in the display image, the parking start position, or the parking target position is not correct, the occupant inputs a suitability input indicating that the display image is inappropriate via the input device 4.
- when the occupant determines that the presence or absence and the positions of the obstacles in the display image, the parking start position, and the parking target position are all correct, the occupant inputs a suitability input indicating that the display image is appropriate via the input device 4.
- in step S28, the determination unit 13 determines the suitability of the environment map (display image) generated by the map generation unit 11 in accordance with the occupant's suitability input for the display image received by the input device 4.
- when a suitability input indicating that the occupant finds it appropriate is obtained, the determination unit 13 determines that the environment map generated by the map generation unit 11 is appropriate, and the process proceeds to step S29.
- in step S29, the map generation unit 11 uses a least-squares method, an ICP algorithm, or the like to integrate a plurality of environment maps, including the environment map generated for the current parking, into an integrated map. For example, if the current parking is the third one, the integrated map is generated by integrating the environment map obtained by the first parking and the environment map obtained by the second parking, both stored in the storage device 5, with the environment map obtained by the current (third) parking. Alternatively, if the environment map obtained by the first parking and the environment map obtained by the second parking have already been integrated and stored in the storage device 5 as the first integrated map, the second integrated map is generated by integrating the first integrated map with the environment map obtained by the current (third) parking. The integrated map generated by the map generation unit 11 is then adopted as map data to be referred to during automatic parking and stored in the storage device 5.
- on the other hand, when a suitability input indicating that the occupant finds it unsuitable is obtained in step S28, the determination unit 13 determines that the display image generated by the map generation unit 11 is inappropriate, and the process proceeds to step S30. In step S30, the determination unit 13 does not adopt the environment map generated by the map generation unit 11 and does not generate an integrated map incorporating that environment map.
- the parking support program according to the second embodiment of the present invention causes the computer constituting the control device 1 to execute the procedure of the parking support method shown in FIG.
- the parking assistance program according to the second embodiment of the present invention can be stored in the storage device 5, for example.
- in the above description, the case where the presentation control unit 12 causes the presentation device 6 to present the environment map (display image) generated by the map generation unit 11 in step S27 has been described, but the presentation is not limited to this.
- for example, the map generation unit 11 may generate an integrated map (display image) obtained by integrating a plurality of environment maps, and the presentation control unit 12 may cause the presentation device 6 to present that integrated map (display image).
- in this case, when a suitability input indicating that the occupant finds it appropriate is obtained, the determination unit 13 adopts the integrated map as map data to be referred to during automatic parking and stores it in the storage device 5.
- on the other hand, when a suitability input indicating that the occupant finds it unsuitable is obtained in step S28, the determination unit 13 determines that the integrated map is inappropriate and the process proceeds to step S30; in step S30, the determination unit 13 does not adopt the integrated map as map data to be referred to during automatic parking and does not store it in the storage device 5.
- as described above, according to the second embodiment as well, when the surrounding situation of the parking target position is stored and automatic parking is executed using the stored surrounding situation, the presentation device 6 presents an environment map or the like representing the surrounding situation of the parking target position detected by the surrounding situation sensor 2, and the surrounding situation is stored when a suitability input indicating that the occupant finds it suitable is obtained; therefore, a surrounding situation of the parking target position suitable for reference during automatic parking can be stored.
- furthermore, since the surrounding situation of the parking target position is detected a plurality of times and the detected surrounding situations are stored, a more accurate environment map can be generated by integrating the plurality of surrounding situations so that they complement one another.
- in addition, the occupant can select appropriate data from among the surrounding situations of the parking target position obtained over the plurality of parking operations.
- moreover, by storing the surrounding situation of the parking target position each time parking is performed at the parking target position, a surrounding situation such as an environment map suitable for reference during automatic parking can be selected for each parking operation.
- further, when the presentation device 6 presents an environment map or the like representing the surrounding situation of the parking target position detected by the surrounding situation sensor 2, the surrounding situation including the parking target position is presented, so the occupant can easily understand the positional relationship between the parking target position and the obstacles or the like existing around it when inputting the suitability of the surrounding situation.
- in addition, the surrounding situation is presented for each parking at the parking target position, the occupant's suitability input for the surrounding situation detected by the surrounding situation sensor 2 is accepted for each parking, and the surrounding situation the occupant has judged suitable is stored; as a result, a surrounding situation of the parking target position suitable for reference during automatic parking can be selectively stored for each parking at the parking target position.
- as a modification of the second embodiment, the presentation control unit 12 presents, side by side or in sequence, a plurality of environment maps obtained by the plurality of parking operations on the presentation device 6, and requests the occupant to select the environment maps (learning results) to be integrated from among them.
- the occupant visually checks the plurality of environment maps presented on the presentation device 6 and inputs, via the input device 4, instruction information for selecting the environment maps (learning results) suitable for integration.
- based on the occupant's selection result, the map generation unit 11 extracts the environment maps to be integrated from among the plurality of environment maps stored in the storage device 5.
- the map generation unit 11 generates an integrated map by integrating the extracted environmental maps.
- in this way, the plurality of environment maps obtained by repeating parking a plurality of times are presented on the presentation device 6 and the occupant is requested to select the environment maps (learning results) to be integrated; this allows the occupant to exclude inappropriate environment maps and selectively integrate only the environment maps suitable for integration.
- in the first and second embodiments, the case where the presentation device 6 is mainly a display has been described, but the presentation device 6 may be a device other than a display.
- for example, by outputting voice explaining the surrounding situation of the parking target position detected by the surrounding situation sensor 2, such as "the recognition rate of the surrounding situation of the parking target position is 80%" or "the parking target position is recognized ahead", the presentation device 6 can present information on the surrounding situation of the parking target position detected by the surrounding situation sensor 2 to the occupant.
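The odometry-based conversion mentioned in step S14 above can be pictured with a short dead-reckoning calculation: the ego pose is accumulated from wheel speed and yaw rate, and each detection is transformed from the vehicle frame into the common map frame. This is a minimal illustrative sketch, not the claimed implementation; the class name, the simple Euler integration, and the choice of a 2D point list as the map representation are assumptions made for illustration.

```python
import math

class DeadReckoningMapper:
    """Accumulate an ego pose from wheel speed and yaw rate (odometry) and
    transform sensor detections from the vehicle frame into a map frame.
    Names and structure are illustrative assumptions."""

    def __init__(self):
        self.x = 0.0      # m, map frame
        self.y = 0.0      # m, map frame
        self.yaw = 0.0    # rad
        self.points = []  # accumulated obstacle/feature points in the map frame

    def update_pose(self, wheel_speed, yaw_rate, dt):
        # Integrate odometry: advance along the current heading, then rotate.
        self.x += wheel_speed * dt * math.cos(self.yaw)
        self.y += wheel_speed * dt * math.sin(self.yaw)
        self.yaw += yaw_rate * dt

    def add_detections(self, detections_vehicle_frame):
        # Each detection is (x_forward, y_left) relative to the vehicle.
        for dx, dy in detections_vehicle_frame:
            mx = self.x + dx * math.cos(self.yaw) - dy * math.sin(self.yaw)
            my = self.y + dx * math.sin(self.yaw) + dy * math.cos(self.yaw)
            self.points.append((mx, my))

# Usage: one odometry/detection step during a parking manoeuvre.
mapper = DeadReckoningMapper()
mapper.update_pose(wheel_speed=1.5, yaw_rate=0.1, dt=0.1)
mapper.add_detections([(4.0, 1.2), (4.1, -0.8)])
print(len(mapper.points), mapper.points[0])
```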
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Navigation (AREA)
Abstract
Description
<Configuration of the parking assistance device>
The parking assistance device according to the first embodiment of the present invention can be mounted on a vehicle (hereinafter, a vehicle equipped with the parking assistance device according to the first embodiment of the present invention is referred to as the "host vehicle"). As shown in FIG. 1, the parking assistance device according to the first embodiment of the present invention includes a control device (controller) 1, a surrounding situation sensor 2, a vehicle status sensor 3, an input device (interface) 4, a storage device 5, and a presentation device 6.
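How these components might be wired together can be sketched as follows. This is only an illustrative arrangement, not the patented implementation; the class names, the stub return values, and the use of a plain dictionary for the storage device are assumptions.

```python
class SurroundingSituationSensor:      # surrounding situation sensor 2 (camera 2a, laser radar 2b)
    def detect(self):
        # Would return detected obstacles / landmarks; empty here.
        return []

class VehicleStatusSensor:             # vehicle status sensor 3 (wheel speed 3a, yaw rate 3b)
    def read_odometry(self):
        return {"wheel_speed": 0.0, "yaw_rate": 0.0}

class ParkingAssistDevice:
    """Illustrative wiring of the units named in the first embodiment."""
    def __init__(self):
        self.surround_sensor = SurroundingSituationSensor()
        self.vehicle_sensor = VehicleStatusSensor()
        self.storage = {}          # stands in for storage device 5
        self.presenter = print     # stands in for presentation device 6
        # Control device 1 would host map generation unit 11, presentation
        # control unit 12, determination unit 13, and parking support unit 14.

device = ParkingAssistDevice()
device.presenter(device.surround_sensor.detect(), device.vehicle_sensor.read_odometry())
```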
Next, an example of the parking assistance method according to the first embodiment of the present invention will be described with reference to the flowchart of FIG. 5.
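Read together with steps S11 to S17 described above, the first-embodiment learning flow can be condensed into procedural form. The following is a hedged sketch under assumed helper objects (the `sensors`, `occupant`, and `presenter` interfaces and the `build_environment_map` callable are not from the publication).

```python
def learn_parking_surroundings(sensors, occupant, storage, presenter, build_environment_map):
    """Sketch of the FIG. 5 flow (steps S11-S17); helper callables are assumed."""
    # S11: wait until the occupant indicates the parking start position.
    if not occupant.confirm_at_start_position():
        return None

    # S12-S13: manual parking; collect detections and odometry until parking completes.
    detections, odometry = [], []
    while not occupant.parking_completed():          # e.g. shift lever / parking brake
        detections.append(sensors.surroundings.detect())
        odometry.append(sensors.vehicle.read_odometry())

    # S14: build an environment map around the parking target position.
    env_map = build_environment_map(detections, odometry)
    presenter.show(env_map)                          # present the map to the occupant

    # S15: ask the occupant whether the presented surrounding situation is suitable.
    if occupant.ask_suitability():
        # S16: adopt as data to be referred to during automatic parking.
        storage["learned_map"] = env_map
        return env_map
    # S17: do not adopt; re-learning may be requested instead.
    return None
```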
As a first modification of the first embodiment of the present invention, a case will be described in which the presentation control unit 12 presents, in the environment map generated by the map generation unit 11 as the surrounding situation of the parking target position detected by the surrounding situation sensor 2, a parking target position candidate at a position different from the parking target position, in addition to the parking target position.
As a second modification of the first embodiment of the present invention, a modification in which automatic parking is executed by referring to the surrounding situation of the parking target position stored in the storage device 5 as a learning result will be described. When automatic parking is started, the presentation control unit 12 causes the presentation device 6 to present the environment map that is the surrounding situation of the parking target position stored in the storage device 5, as shown in FIG. 7. The environment map includes parking frames 61, 62, and 63, and the parking target position 62a is presented so as to partially overlap the parking frame 62. Other vehicles 22 and 23 are parked in the parking frames 61 and 63, and of these, the other vehicle 23 protrudes from the parking frame 63 and is parked close to the parking target position 62a. For this reason, it is difficult for the host vehicle 21 to park at the parking target position 62a.
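The choice between the learned parking target position and a nearby alternative position can be expressed as a small decision step. A minimal sketch, assuming a simple free-space check and a fixed lateral offset for the alternative position; the function name, the clearance value, and the shift value are illustrative assumptions, not values from the publication.

```python
def choose_parking_position(target, obstacles, clearance=0.5, shift=0.8):
    """Return (position, needs_confirmation): if an obstacle lies within
    `clearance` metres of the learned target, propose a laterally shifted
    alternative and flag it so the occupant is asked to choose (cf. FIG. 7)."""
    tx, ty = target
    blocked = any(abs(ox - tx) < clearance and abs(oy - ty) < clearance
                  for ox, oy in obstacles)
    if not blocked:
        return target, False
    alternative = (tx + shift, ty)        # parking alternative position 62b
    return alternative, True              # occupant selects 62a or 62b

# Usage: another vehicle protrudes near the learned target position.
pos, ask = choose_parking_position(target=(0.0, 0.0), obstacles=[(0.3, 0.1)])
print(pos, ask)   # (0.8, 0.0) True -> present both positions to the occupant
```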
<Configuration of the parking assistance device>
In the second embodiment of the present invention, a case will be described in which the surrounding situation of the parking target position is detected (learned) a plurality of times by repeating parking at the parking target position a plurality of times. By integrating the detection results of the plurality of parking operations, obstacles, feature points, and the like that could not be detected in a single parking operation can be complemented. This makes it possible to improve the accuracy of the map data and the like generated based on the surrounding situation of the parking target position.
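This complementary integration can be pictured as merging point sets: points that correspond across maps are averaged (a least-squares-flavoured refinement), and points seen in only one map are kept so that missed obstacles are filled in. The following is a simplified stand-in for the least-squares / ICP integration mentioned in the description, and it assumes the maps are already expressed in a common coordinate frame (no alignment step is performed).

```python
def integrate_maps(maps, match_radius=0.3):
    """Merge several environment maps given as lists of (x, y) points.
    Corresponding points (within match_radius) are averaged; unmatched
    points are carried over, so the maps complement one another."""
    merged = list(maps[0]) if maps else []
    for other in maps[1:]:
        for px, py in other:
            # Find the nearest already-merged point.
            best, best_d2 = None, match_radius ** 2
            for i, (mx, my) in enumerate(merged):
                d2 = (mx - px) ** 2 + (my - py) ** 2
                if d2 < best_d2:
                    best, best_d2 = i, d2
            if best is None:
                merged.append((px, py))                        # new observation
            else:
                mx, my = merged[best]
                merged[best] = ((mx + px) / 2, (my + py) / 2)  # refine estimate
    return merged

m11 = [(1.0, 2.0), (3.0, 4.0)]           # first parking
m12 = [(1.05, 2.02), (5.0, 1.0)]         # second parking sees one new obstacle
print(integrate_maps([m11, m12]))
```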
Next, an example of the parking assistance method according to the second embodiment of the present invention will be described with reference to the flowchart of FIG. 12.
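Steps S24 to S30 of FIG. 12 can be condensed into a short per-parking routine: record the new environment map, check whether enough learning has accumulated (the description gives, as examples, three or more parking operations or an obstacle-existence-probability threshold), present the result, and integrate it only when the occupant approves. A hedged sketch; the threshold value, the dictionary storage layout, and the `ask` callable are assumptions for illustration.

```python
MIN_LEARNED_PARKINGS = 3   # illustrative threshold from the "three or more" example

def learn_and_integrate(storage, new_map, ask, integrate=None):
    """One pass of the FIG. 12 flow after parking is completed (steps S24-S30).
    `storage` is a dict with a list under 'maps'; `ask(display)` returns the
    occupant's suitability input; `integrate` merges a list of maps."""
    maps = storage.setdefault("maps", [])
    maps.append(new_map)                       # S24: environment map for this parking

    if len(maps) < 2:                          # S25: first parking -> nothing to integrate
        return None
    if len(maps) < MIN_LEARNED_PARKINGS:       # S26: learning not yet sufficient
        return None

    if not ask(new_map):                       # S27-S28: occupant rejects the display image
        maps.pop()                             # S30: do not adopt this environment map
        return None

    merge = integrate or (lambda ms: [p for m in ms for p in m])  # trivial fallback
    storage["integrated_map"] = merge(maps)    # S29: adopt and store the integrated map
    return storage["integrated_map"]

# Usage with an occupant who approves the presented map.
store = {"maps": [[(1.0, 2.0)], [(1.02, 2.01)]]}
result = learn_and_integrate(store, [(0.98, 1.99), (5.0, 1.0)], ask=lambda m: True)
print(result)
```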
In the second embodiment of the present invention, the case where all of the plurality of environment maps obtained by the plurality of parking operations are integrated has been illustrated. In contrast, in a modification of the second embodiment of the present invention, a case will be described in which only some of the plurality of environment maps are selectively integrated.
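This selective variant can be sketched as filtering the stored per-parking maps by the occupant's selection before integration. Minimal and illustrative; modelling the selection as a set of indices supplied from the input device is an assumption.

```python
def integrate_selected(maps, selected_indices, integrate):
    """Integrate only the environment maps (learning results) the occupant
    selected; `selected_indices` is assumed to come from the input device."""
    chosen = [m for i, m in enumerate(maps) if i in selected_indices]
    if not chosen:
        return None          # nothing suitable was selected
    return integrate(chosen)

# Usage: the occupant excludes the second (inappropriate) learning result.
maps = [[(1.0, 2.0)], [(9.9, 9.9)], [(1.01, 2.02)]]
flatten = lambda ms: [p for m in ms for p in m]
print(integrate_selected(maps, {0, 2}, flatten))
```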
As described above, the present invention has been described by way of the first and second embodiments, but the description and drawings forming part of this disclosure should not be understood as limiting the present invention. Various alternative embodiments, examples, and operational techniques will become apparent to those skilled in the art from this disclosure.
2… surrounding situation sensor
2a… camera
2b… laser radar
3… vehicle status sensor
3a… wheel speed sensor
3b… yaw rate sensor
4… input device
5… storage device
6… presentation device
11… map generation unit
12… presentation control unit
13… determination unit
14… parking support unit
Claims (8)
1. A parking assistance method for, when parking at a parking target position, storing a surrounding situation of the parking target position and executing automatic parking using the stored surrounding situation, the method comprising:
detecting the surrounding situation;
presenting the detected surrounding situation;
accepting a suitability input by an occupant with respect to the presented surrounding situation; and
storing the surrounding situation when the suitability input indicating that the occupant finds it suitable is obtained.
2. The parking assistance method according to claim 1, wherein the detecting of the surrounding situation detects the surrounding situation a plurality of times by repeating parking at the parking target position a plurality of times, and the storing of the surrounding situation stores the surrounding situations detected the plurality of times.
3. The parking assistance method according to claim 2, wherein the presenting of the surrounding situation presents the stored surrounding situations of the plurality of times.
4. The parking assistance method according to claim 2 or 3, wherein the storing of the surrounding situation stores the surrounding situation each time parking is performed at the parking target position.
5. The parking assistance method according to any one of claims 1 to 4, wherein the presenting of the surrounding situation presents the surrounding situation including the parking target position.
6. The parking assistance method according to any one of claims 1 to 5, wherein the presenting of the surrounding situation presents the surrounding situation including a parking target position candidate at a position different from the parking target position.
7. The parking assistance method according to claim 2 or 3, wherein the presenting of the surrounding situation presents the surrounding situation for each parking at the parking target position, the accepting of the suitability input by the occupant accepts the occupant's suitability input for the surrounding situation for each parking at the parking target position, and the storing of the surrounding situation stores the surrounding situation that the occupant has input as suitable.
8. A parking assistance device that, when parking at a parking target position, stores a surrounding situation of the parking target position and executes automatic parking using the stored surrounding situation, the device comprising:
a surrounding situation sensor that detects the surrounding situation;
a presentation device that presents the detected surrounding situation;
an interface that accepts a suitability input by an occupant with respect to the presented surrounding situation; and
a controller that stores the surrounding situation when the suitability input indicating that the occupant finds it suitable is obtained.
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112019007398-1A BR112019007398B1 (pt) | 2016-10-13 | 2016-10-13 | Método de assistência ao estacionamento e dispositivo de assistência ao estacionamento |
PCT/JP2016/080386 WO2018070021A1 (ja) | 2016-10-13 | 2016-10-13 | 駐車支援方法及び駐車支援装置 |
KR1020197013035A KR102154510B1 (ko) | 2016-10-13 | 2016-10-13 | 주차 지원 방법 및 주차 지원 장치 |
EP16918946.1A EP3527448B1 (en) | 2016-10-13 | 2016-10-13 | Parking assist method and parking assist device |
MX2019004331A MX2019004331A (es) | 2016-10-13 | 2016-10-13 | Metodo de asistencia al estacionamiento y dispositivo de asistencia al estacionamiento. |
RU2019113789A RU2721437C1 (ru) | 2016-10-13 | 2016-10-13 | Устройство помощи при парковке и способ помощи при парковке |
CA3041176A CA3041176C (en) | 2016-10-13 | 2016-10-13 | Parking assistance method and parking assistance device |
US16/341,282 US11273821B2 (en) | 2016-10-13 | 2016-10-13 | Parking assistance method and parking assistance device |
MYPI2019002016A MY183262A (en) | 2016-10-13 | 2016-10-13 | Parking assistance method and parking assistance device |
JP2018544646A JP7122968B2 (ja) | 2016-10-13 | 2016-10-13 | 駐車支援方法及び駐車支援装置 |
CN201680090073.1A CN109843677B (zh) | 2016-10-13 | 2016-10-13 | 停车辅助方法及停车辅助装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/080386 WO2018070021A1 (ja) | 2016-10-13 | 2016-10-13 | 駐車支援方法及び駐車支援装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018070021A1 true WO2018070021A1 (ja) | 2018-04-19 |
Family
ID=61905373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/080386 WO2018070021A1 (ja) | 2016-10-13 | 2016-10-13 | 駐車支援方法及び駐車支援装置 |
Country Status (11)
Country | Link |
---|---|
US (1) | US11273821B2 (ja) |
EP (1) | EP3527448B1 (ja) |
JP (1) | JP7122968B2 (ja) |
KR (1) | KR102154510B1 (ja) |
CN (1) | CN109843677B (ja) |
BR (1) | BR112019007398B1 (ja) |
CA (1) | CA3041176C (ja) |
MX (1) | MX2019004331A (ja) |
MY (1) | MY183262A (ja) |
RU (1) | RU2721437C1 (ja) |
WO (1) | WO2018070021A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019196091A (ja) * | 2018-05-09 | 2019-11-14 | トヨタ自動車株式会社 | 停止位置設定装置 |
JP2020069958A (ja) * | 2018-11-01 | 2020-05-07 | トヨタ自動車株式会社 | 駐車支援装置 |
JP2020075550A (ja) * | 2018-11-06 | 2020-05-21 | トヨタ自動車株式会社 | 駐車支援装置 |
CN111369779A (zh) * | 2018-12-26 | 2020-07-03 | 北京图森智途科技有限公司 | 一种岸吊区卡车精准停车方法、设备及系统 |
JP2021526473A (ja) * | 2018-05-25 | 2021-10-07 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh | 運転者支援方法のためのトレーニング方法、運転者支援方法、制御機器、および制御機器を備えた車両 |
JP2022517371A (ja) * | 2019-01-16 | 2022-03-08 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | 車両の軌道を学習する方法、および電子車両誘導システム |
EP3845424A4 (en) * | 2018-08-29 | 2022-06-15 | Faurecia Clarion Electronics Co., Ltd. | ON-BOARD PROCESSING DEVICE |
WO2023100229A1 (ja) * | 2021-11-30 | 2023-06-08 | 日産自動車株式会社 | 駐車支援方法及び駐車支援装置 |
US11804135B2 (en) | 2019-11-28 | 2023-10-31 | Mitsubishi Electric Corporation | Object recognition apparatus, object recognition method, and computer readable medium |
WO2024157459A1 (ja) * | 2023-01-27 | 2024-08-02 | 日産自動車株式会社 | 駐車支援方法及び駐車支援装置 |
WO2024184974A1 (ja) * | 2023-03-03 | 2024-09-12 | 日産自動車株式会社 | 駐車支援方法及び駐車支援装置 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7046552B2 (ja) * | 2017-10-05 | 2022-04-04 | アルパイン株式会社 | ナビゲーション装置、目的地案内システム、プログラム |
JP6897597B2 (ja) * | 2018-02-16 | 2021-06-30 | トヨタ自動車株式会社 | 駐車支援装置 |
JP7218172B2 (ja) * | 2018-12-25 | 2023-02-06 | フォルシアクラリオン・エレクトロニクス株式会社 | 車載処理装置、及び車載処理装置の制御方法 |
JP7319590B2 (ja) * | 2019-10-11 | 2023-08-02 | トヨタ自動車株式会社 | 車両駐車支援装置 |
JP7468254B2 (ja) * | 2020-08-28 | 2024-04-16 | 富士通株式会社 | 位置姿勢算出方法および位置姿勢算出プログラム |
CN116508083B (zh) * | 2020-11-02 | 2024-04-26 | 日产自动车株式会社 | 停车辅助方法及停车辅助装置 |
US11769409B2 (en) * | 2020-12-15 | 2023-09-26 | Charter Communications Operating, Llc | Automated parking lot digital map generation and use thereof |
CN113012464B (zh) * | 2021-02-20 | 2022-03-22 | 腾讯科技(深圳)有限公司 | 一种寻车指引方法、装置、设备及计算机可读存储介质 |
DE102021214767A1 (de) | 2021-12-21 | 2023-06-22 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Erstellen einer digitalen Karte |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11157404A (ja) * | 1997-11-26 | 1999-06-15 | Toyota Motor Corp | 駐車支援装置 |
JP2002240661A (ja) * | 2001-02-19 | 2002-08-28 | Nissan Motor Co Ltd | 駐車支援装置 |
JP2005326944A (ja) | 2004-05-12 | 2005-11-24 | Hitachi Ltd | レーザー計測により地図画像を生成する装置及び方法 |
JP2007055378A (ja) * | 2005-08-23 | 2007-03-08 | Nissan Motor Co Ltd | 駐車支援装置及び駐車支援方法 |
JP2013244852A (ja) * | 2012-05-25 | 2013-12-09 | Sharp Corp | 駐車支援装置、駐車支援方法およびそのプログラム |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1157404A (ja) * | 1997-08-13 | 1999-03-02 | Junko Harashima | フロン破壊処理方法 |
JP3632563B2 (ja) * | 1999-10-19 | 2005-03-23 | 株式会社豊田自動織機 | 映像位置関係補正装置、該映像位置関係補正装置を備えた操舵支援装置、及び映像位置関係補正方法 |
JPWO2006064544A1 (ja) | 2004-12-14 | 2008-06-12 | 株式会社日立製作所 | 自動車庫入れ装置 |
JP2006298115A (ja) * | 2005-04-19 | 2006-11-02 | Aisin Aw Co Ltd | 運転支援方法及び運転支援装置 |
JP2007315956A (ja) * | 2006-05-26 | 2007-12-06 | Aisin Aw Co Ltd | 駐車場マップ作成方法、駐車場案内方法及びナビゲーション装置 |
KR100854766B1 (ko) | 2007-04-27 | 2008-08-27 | 주식회사 만도 | 거리 센서를 이용한 주차 공간 검출 방법 |
KR20120054879A (ko) * | 2010-11-22 | 2012-05-31 | 고려대학교 산학협력단 | 차량형 이동 로봇의 충돌 회피 경로 생성 방법 및 이를 이용한 주차 보조 시스템 |
JP5516992B2 (ja) * | 2010-11-30 | 2014-06-11 | アイシン精機株式会社 | 駐車位置調整装置 |
DE102011109492A1 (de) * | 2011-08-04 | 2013-02-07 | GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) | Fahrunterstützungsvorrichtung zur Unterstützung der Befahrung enger Fahrwege |
DE102012008858A1 (de) * | 2012-04-28 | 2012-11-08 | Daimler Ag | Verfahren zum autonomen Parken eines Kraftfahrzeugs, Fahrerassistenzvorrichtung zur Durchführung des Verfahrens, sowie Kraftfahrzeug mit der Fahrerassistenzvorrichtung |
DE102013209764B4 (de) * | 2013-05-27 | 2023-10-05 | Robert Bosch Gmbh | Unterstützung eines Fahrers eines Kraftfahrzeugs |
DE102013015349A1 (de) * | 2013-09-17 | 2014-04-10 | Daimler Ag | Verfahren und Vorrichtung zum Betrieb eines Fahrzeugs |
DE102013015348A1 (de) * | 2013-09-17 | 2014-04-10 | Daimler Ag | Verfahren und Vorrichtung zum Betrieb eines Fahrzeugs |
JP5949840B2 (ja) * | 2014-06-19 | 2016-07-13 | トヨタ自動車株式会社 | 駐車支援装置 |
US10293816B2 (en) * | 2014-09-10 | 2019-05-21 | Ford Global Technologies, Llc | Automatic park and reminder system and method of use |
DE102014223363B4 (de) * | 2014-11-17 | 2021-04-29 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Lokalisation eines Kraftfahrzeugs in einer ortsfesten Referenzkarte |
JP6507862B2 (ja) * | 2015-06-02 | 2019-05-08 | トヨタ自動車株式会社 | 周辺監視装置及び運転支援装置 |
RU165235U1 (ru) * | 2015-11-17 | 2016-10-10 | Общество с ограниченной ответственностью "Когнитивные технологии" | Система распознавания и анализа дорожной сцены |
CN107444264A (zh) * | 2016-05-31 | 2017-12-08 | 法拉第未来公司 | 使用相机检测车辆附近的物体 |
EP3498554B1 (en) * | 2016-08-09 | 2020-06-24 | JVC KENWOOD Corporation | Display control device, display device, display control method, and program |
US10252714B2 (en) * | 2016-08-11 | 2019-04-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Parking assistance control for vehicle with autonomous operation capability |
KR101965834B1 (ko) * | 2016-10-12 | 2019-08-13 | 엘지전자 주식회사 | 자동주차 보조장치 및 이를 포함하는 차량 |
- 2016
- 2016-10-13 JP JP2018544646A patent/JP7122968B2/ja active Active
- 2016-10-13 RU RU2019113789A patent/RU2721437C1/ru active
- 2016-10-13 KR KR1020197013035A patent/KR102154510B1/ko active IP Right Grant
- 2016-10-13 EP EP16918946.1A patent/EP3527448B1/en active Active
- 2016-10-13 CA CA3041176A patent/CA3041176C/en active Active
- 2016-10-13 BR BR112019007398-1A patent/BR112019007398B1/pt active IP Right Grant
- 2016-10-13 CN CN201680090073.1A patent/CN109843677B/zh active Active
- 2016-10-13 MX MX2019004331A patent/MX2019004331A/es unknown
- 2016-10-13 MY MYPI2019002016A patent/MY183262A/en unknown
- 2016-10-13 US US16/341,282 patent/US11273821B2/en active Active
- 2016-10-13 WO PCT/JP2016/080386 patent/WO2018070021A1/ja unknown
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7272756B2 (ja) | 2018-05-09 | 2023-05-12 | トヨタ自動車株式会社 | 停止位置設定装置 |
JP2019196091A (ja) * | 2018-05-09 | 2019-11-14 | トヨタ自動車株式会社 | 停止位置設定装置 |
US11780432B2 (en) | 2018-05-25 | 2023-10-10 | Robert Bosch Gmbh | Training method for a driver assistance method, driver assistance method, control device and vehicle comprising the control device |
JP2021526473A (ja) * | 2018-05-25 | 2021-10-07 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh | 運転者支援方法のためのトレーニング方法、運転者支援方法、制御機器、および制御機器を備えた車両 |
JP2023024995A (ja) * | 2018-05-25 | 2023-02-21 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツング | 運転者支援方法のためのトレーニング方法、運転者支援方法、制御機器、および制御機器を備えた車両 |
EP3845424A4 (en) * | 2018-08-29 | 2022-06-15 | Faurecia Clarion Electronics Co., Ltd. | ON-BOARD PROCESSING DEVICE |
JP7047709B2 (ja) | 2018-11-01 | 2022-04-05 | トヨタ自動車株式会社 | 駐車支援装置 |
JP2020069958A (ja) * | 2018-11-01 | 2020-05-07 | トヨタ自動車株式会社 | 駐車支援装置 |
JP7047715B2 (ja) | 2018-11-06 | 2022-04-05 | トヨタ自動車株式会社 | 駐車支援装置 |
JP2020075550A (ja) * | 2018-11-06 | 2020-05-21 | トヨタ自動車株式会社 | 駐車支援装置 |
CN111369779A (zh) * | 2018-12-26 | 2020-07-03 | 北京图森智途科技有限公司 | 一种岸吊区卡车精准停车方法、设备及系统 |
JP2022517371A (ja) * | 2019-01-16 | 2022-03-08 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | 車両の軌道を学習する方法、および電子車両誘導システム |
JP7565928B2 (ja) | 2019-01-16 | 2024-10-11 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | 車両の軌道を学習する方法、および電子車両誘導システム |
US11804135B2 (en) | 2019-11-28 | 2023-10-31 | Mitsubishi Electric Corporation | Object recognition apparatus, object recognition method, and computer readable medium |
WO2023100229A1 (ja) * | 2021-11-30 | 2023-06-08 | 日産自動車株式会社 | 駐車支援方法及び駐車支援装置 |
WO2024157459A1 (ja) * | 2023-01-27 | 2024-08-02 | 日産自動車株式会社 | 駐車支援方法及び駐車支援装置 |
WO2024184974A1 (ja) * | 2023-03-03 | 2024-09-12 | 日産自動車株式会社 | 駐車支援方法及び駐車支援装置 |
Also Published As
Publication number | Publication date |
---|---|
KR20190060825A (ko) | 2019-06-03 |
CN109843677A (zh) | 2019-06-04 |
CA3041176A1 (en) | 2018-04-19 |
EP3527448B1 (en) | 2021-05-26 |
CN109843677B (zh) | 2020-07-17 |
KR102154510B1 (ko) | 2020-09-10 |
EP3527448A4 (en) | 2019-11-06 |
US11273821B2 (en) | 2022-03-15 |
BR112019007398A2 (pt) | 2019-07-02 |
EP3527448A1 (en) | 2019-08-21 |
MX2019004331A (es) | 2019-08-05 |
JP7122968B2 (ja) | 2022-08-22 |
RU2721437C1 (ru) | 2020-05-19 |
MY183262A (en) | 2021-02-18 |
JPWO2018070021A1 (ja) | 2019-08-15 |
US20200017099A1 (en) | 2020-01-16 |
BR112019007398B1 (pt) | 2022-12-20 |
CA3041176C (en) | 2020-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018070021A1 (ja) | 駐車支援方法及び駐車支援装置 | |
JP4341649B2 (ja) | ナビゲーション装置、位置検出方法 | |
JP6926976B2 (ja) | 駐車支援装置及びコンピュータプログラム | |
KR102513745B1 (ko) | 운전 지원 장치 | |
JP6436054B2 (ja) | 運転支援装置 | |
JP4793227B2 (ja) | カーナビゲーション装置 | |
JP2018034541A (ja) | 駐車支援方法及び駐車支援装置 | |
US20170076608A1 (en) | Driving assistance apparatus | |
KR102145747B1 (ko) | 주차 지원 방법 및 주차 지원 장치 | |
WO2018070022A1 (ja) | 自己位置推定方法及び自己位置推定装置 | |
JP2007333502A (ja) | 合流支援装置及び合流支援方法 | |
JP4416021B2 (ja) | 車両ナビゲーション装置 | |
JP5573266B2 (ja) | 車両用対象物画像認識装置、車両用対象物画像認識方法及びコンピュータプログラム | |
JP2009220592A (ja) | 車載用縦列駐車支援装置および車載用縦列駐車支援装置のプログラム | |
JP2008202968A (ja) | 車両用ナビゲーション装置 | |
JP2023021775A (ja) | 駐車支援方法及び駐車支援装置 | |
US11935307B2 (en) | Vehicle control apparatus | |
WO2023166738A1 (ja) | 駐車支援方法及び駐車支援装置 | |
JP2006343480A (ja) | 地図表示装置 | |
JP2019066963A (ja) | 運転支援装置および運転支援方法 | |
JP2024113792A (ja) | 車両挙動分析装置、車道挙動分析方法、プログラム及び記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16918946 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2018544646 Country of ref document: JP Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 3041176 Country of ref document: CA |
NENP | Non-entry into the national phase |
Ref country code: DE |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112019007398 Country of ref document: BR |
ENP | Entry into the national phase |
Ref document number: 20197013035 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2016918946 Country of ref document: EP Effective date: 20190513 |
ENP | Entry into the national phase |
Ref document number: 112019007398 Country of ref document: BR Kind code of ref document: A2 Effective date: 20190411 |