JP7367183B2 - Occupancy prediction neural networks - Google Patents
Occupancy prediction neural networks
- Publication number
- JP7367183B2 (application JP2022510922A)
- Authority
- JP
- Japan
- Prior art keywords
- neural network
- environment
- vehicle
- occupancy prediction
- sensor data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000013528 artificial neural network Methods 0.000 title claims description 131
- 238000000034 method Methods 0.000 claims description 66
- 230000008569 process Effects 0.000 claims description 38
- 238000012545 processing Methods 0.000 claims description 36
- 238000013527 convolutional neural network Methods 0.000 claims description 6
- 230000003068 static effect Effects 0.000 claims description 5
- 239000003795 chemical substances by application Substances 0.000 description 67
- 238000012549 training Methods 0.000 description 49
- 230000009471 action Effects 0.000 description 14
- 238000004590 computer program Methods 0.000 description 12
- 238000010801 machine learning Methods 0.000 description 9
- 230000006870 function Effects 0.000 description 8
- 210000004027 cell Anatomy 0.000 description 6
- 230000001133 acceleration Effects 0.000 description 5
- 238000004891 communication Methods 0.000 description 5
- 230000004044 response Effects 0.000 description 5
- 230000000295 complement effect Effects 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 230000007246 mechanism Effects 0.000 description 4
- 230000008901 benefit Effects 0.000 description 3
- 230000003993 interaction Effects 0.000 description 3
- 230000004807 localization Effects 0.000 description 3
- 230000004927 fusion Effects 0.000 description 2
- 230000007774 longterm Effects 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000013515 script Methods 0.000 description 2
- 238000000926 separation method Methods 0.000 description 2
- 230000026676 system process Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 230000004931 aggregating effect Effects 0.000 description 1
- 230000004888 barrier function Effects 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 238000004422 calculation algorithm Methods 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000001149 cognitive effect Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000012217 deletion Methods 0.000 description 1
- 230000037430 deletion Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 230000010365 information processing Effects 0.000 description 1
- 238000002372 labelling Methods 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000007726 management method Methods 0.000 description 1
- 230000001537 neural effect Effects 0.000 description 1
- 238000003062 neural network model Methods 0.000 description 1
- 210000002569 neuron Anatomy 0.000 description 1
- 230000000644 propagated effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000001953 sensory effect Effects 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 239000000758 substrate Substances 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18159—Traversing an intersection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/20—Conjoint control of vehicle sub-units of different type or different function including control of steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0027—Planning or execution of driving tasks using trajectory prediction for other traffic participants
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/417—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/2155—Generating training patterns; Bootstrap methods, e.g. bagging or boosting characterised by the incorporation of unlabelled data, e.g. multiple instance learning [MIL], semi-supervised techniques using expectation-maximisation [EM] or naïve labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
- G06F18/256—Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- G06V10/7753—Incorporation of unlabelled data, e.g. multiple instance learning [MIL]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/809—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
- G06V10/811—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data the classifiers operating on different input data, e.g. multi-modal recognition
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20132—Image cropping
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Multimedia (AREA)
- Automation & Control Theory (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Aviation & Aerospace Engineering (AREA)
- Electromagnetism (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Human Computer Interaction (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Game Theory and Decision Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
Description
where t0 is the current time point, i indexes each future time interval of fixed duration, and N is the total number of future time intervals. In another example, the set of future time intervals can be future time intervals of different durations that each start from the current time point. Referring to the example shown in FIG. 2, the occupancy prediction output 214 can specify a respective occupancy prediction heat map 216 corresponding to each future time interval [t, t+1], [t, t+2], [t, t+3], where t is the current time point.
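The two interval schemes described above can be sketched in Python. This is an illustrative reading only (the patent's formula itself is not reproduced in this excerpt), and the names `t0`, `delta`, and `num_intervals` are assumptions, not terms from the patent:

```python
def fixed_intervals(t0, delta, num_intervals):
    """Fixed-duration scheme (one plausible reading): the i-th future
    interval covers [t0 + (i - 1) * delta, t0 + i * delta] for i = 1..N."""
    return [(t0 + (i - 1) * delta, t0 + i * delta)
            for i in range(1, num_intervals + 1)]


def cumulative_intervals(t0, delta, num_intervals):
    """Alternative scheme from the text: intervals of different durations
    that each start at the current time point, e.g. [t, t+1], [t, t+2], [t, t+3]."""
    return [(t0, t0 + i * delta) for i in range(1, num_intervals + 1)]
```

In the FIG. 2 example, the occupancy prediction output would then hold one heat map per interval of the cumulative scheme.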
Claims (19)
- A method performed by one or more data processing apparatus, the method comprising:
receiving sensor data generated by a sensor system of a vehicle, the sensor data characterizing an environment in a vicinity of the vehicle as of a current time point, wherein the sensor data comprises a plurality of sensor samples that each characterize the environment as captured at a different time point;
processing a network input comprising the sensor data using a neural network to generate an occupancy prediction output for a region of the environment, wherein:
the occupancy prediction output characterizes, for each of one or more future time intervals after the current time point, a respective likelihood that the region of the environment will be occupied by an agent in the environment during the future time interval, and
the network input is provided to an input layer of the neural network, and the occupancy prediction output for the region of the environment is output by an output layer of the neural network; and
providing the occupancy prediction output to a planning system of the vehicle to generate planning decisions that plan a future trajectory of the vehicle.
- The method of claim 1, wherein the sensor samples comprise images generated by one or more camera sensors of the vehicle.
- The method of claim 1 or 2, wherein the sensor samples comprise point cloud data generated by one or more LIDAR sensors of the vehicle, sensor data generated by one or more radar sensors of the vehicle, or both.
- The method of any one of claims 1 to 3, wherein the future time intervals after the current time point include a time interval that begins at the current time point.
- The method of any one of claims 1 to 4, wherein, for each of a plurality of given future time intervals after the current time point, the occupancy prediction output characterizes a respective likelihood that the region of the environment will be occupied by an agent in the environment during the given future time interval.
- The method of any one of claims 1 to 5, wherein, for each of a plurality of given regions of the environment, the occupancy prediction output characterizes a respective likelihood that the given region of the environment will be occupied by an agent in the environment during each future time interval after the current time point.
- The method of any one of claims 1 to 6, wherein the region of the environment is an intersection of a roadway.
- The method of claim 7, further comprising generating, by the planning system, a planning decision that causes the vehicle to traverse the intersection of the roadway based on the occupancy prediction output for the intersection of the roadway.
- The method of any one of claims 1 to 6, wherein the region of the environment is a portion of a roadway that is occupied by a stationary agent.
- The method of claim 9, further comprising generating, by the planning system, a planning decision that causes the vehicle to decelerate based on the occupancy prediction output for the portion of the roadway occupied by the stationary agent.
- The method of any one of claims 1 to 10, wherein the neural network is a convolutional neural network comprising a plurality of convolutional neural network layers.
- The method of any one of claims 1 to 11, wherein processing the sensor data using the neural network to generate the occupancy prediction output for the region of the environment comprises:
processing a first subset of the sensor data using a first set of one or more neural network layers to generate a first intermediate output;
processing a second subset of the sensor data using a second set of one or more neural network layers to generate a second intermediate output; and
processing the first intermediate output and the second intermediate output using a third set of one or more neural network layers to generate the occupancy prediction output.
- The method of claim 12, wherein the first subset of the sensor data comprises sensor data of a first modality, and the second subset of the sensor data comprises sensor data of a second, different modality.
- The method of any one of claims 1 to 13, wherein the neural network generates the occupancy prediction output without explicitly localizing current or future positions of other agents in the environment in the vicinity of the vehicle.
- The method of any one of claims 1 to 14, wherein the network input further comprises road graph data characterizing static features of the environment in the vicinity of the vehicle.
- The method of any one of claims 1 to 15, wherein the sensor data comprises images captured at respective time points, the method further comprising applying one or more predefined cropping operations to each image before processing the images using the neural network.
- The method of any one of claims 1 to 15, wherein the sensor data comprises images captured at respective time points, the neural network comprises a cropping subnetwork configured to process the images to generate data specifying an image cropping operation, and processing the sensor data using the neural network comprises applying the image cropping operation specified by the cropping subnetwork to the images.
- A system comprising:
one or more computers; and
one or more storage devices communicatively coupled to the one or more computers, wherein the one or more storage devices store instructions that, when executed by the one or more computers, cause the one or more computers to perform the operations of the respective method of any one of claims 1 to 17.
- One or more non-transitory computer storage media storing instructions that, when executed by one or more computers, cause the one or more computers to perform the operations of the respective method of any one of claims 1 to 17.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/557,246 US11403853B2 (en) | 2019-08-30 | 2019-08-30 | Occupancy prediction neural networks |
US16/557,246 | 2019-08-30 | ||
PCT/US2020/042841 WO2021040910A1 (en) | 2019-08-30 | 2020-07-21 | Occupancy prediction neural networks |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2022546283A (ja) | 2022-11-04 |
JP7367183B2 (ja) | 2023-10-23 |
Family
ID=74679811
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2022510922A (granted as JP7367183B2, active) | Occupancy prediction neural networks | 2019-08-30 | 2020-07-21 |
Country Status (6)
Country | Link |
---|---|
US (2) | US11403853B2 (ja) |
EP (1) | EP4000015A4 (ja) |
JP (1) | JP7367183B2 (ja) |
KR (1) | KR20220054358A (ja) |
CN (1) | CN114341950A (ja) |
WO (1) | WO2021040910A1 (ja) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10643084B2 (en) | 2017-04-18 | 2020-05-05 | nuTonomy Inc. | Automatically perceiving travel signals |
US10960886B2 (en) | 2019-01-29 | 2021-03-30 | Motional Ad Llc | Traffic light estimation |
DE102019209736A1 (de) * | 2019-07-03 | 2021-01-07 | Robert Bosch Gmbh | Method for evaluating possible trajectories |
US11403853B2 (en) * | 2019-08-30 | 2022-08-02 | Waymo Llc | Occupancy prediction neural networks |
US11726492B2 (en) * | 2019-10-02 | 2023-08-15 | Zoox, Inc. | Collision avoidance perception system |
US11994866B2 (en) * | 2019-10-02 | 2024-05-28 | Zoox, Inc. | Collision avoidance perception system |
US11754408B2 (en) * | 2019-10-09 | 2023-09-12 | Argo AI, LLC | Methods and systems for topological planning in autonomous driving |
EP3832525A1 (en) * | 2019-12-03 | 2021-06-09 | Aptiv Technologies Limited | Vehicles, systems, and methods for determining an entry of an occupancy map of a vicinity of a vehicle |
US11193683B2 (en) * | 2019-12-31 | 2021-12-07 | Lennox Industries Inc. | Error correction for predictive schedules for a thermostat |
US11385642B2 (en) | 2020-02-27 | 2022-07-12 | Zoox, Inc. | Perpendicular cut-in training |
US11878682B2 (en) * | 2020-06-08 | 2024-01-23 | Nvidia Corporation | Path planning and control to account for position uncertainty for autonomous machine applications |
US20200324794A1 (en) * | 2020-06-25 | 2020-10-15 | Intel Corporation | Technology to apply driving norms for automated vehicle behavior prediction |
US20200326721A1 (en) * | 2020-06-26 | 2020-10-15 | Intel Corporation | Occupancy verification device and method |
CN112987765B (zh) * | 2021-03-05 | 2022-03-15 | Beihang University | A raptor-attention-allocation-imitating method for precise autonomous takeoff and landing of UAVs/unmanned boats |
US20220301182A1 (en) * | 2021-03-18 | 2022-09-22 | Waymo Llc | Predicting the future movement of agents in an environment using occupancy flow fields |
CN113112061B (zh) * | 2021-04-06 | 2024-05-28 | Shenzhen Hande Network Technology Co., Ltd. | A method and device for predicting vehicle fuel consumption |
US11675362B1 (en) * | 2021-12-17 | 2023-06-13 | Motional Ad Llc | Methods and systems for agent prioritization |
EP4310535A1 (en) * | 2022-07-20 | 2024-01-24 | GM Cruise Holdings LLC | Generating point clouds based upon radar tensors |
EP4312054A1 (en) * | 2022-07-29 | 2024-01-31 | GM Cruise Holdings LLC | Radar point cloud multipath reflection compensation |
WO2024103052A1 (en) * | 2022-11-11 | 2024-05-16 | The Regents Of The University Of Michigan | Vehicle repositioning determination for vehicle pool |
US20240217548A1 (en) * | 2023-01-04 | 2024-07-04 | Zoox, Inc. | Trajectory prediction for autonomous vehicles using attention mechanism |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016024318A1 (ja) | 2014-08-11 | 2016-02-18 | Nissan Motor Co., Ltd. | Vehicle travel control device and method |
JP2016177729A (ja) | 2015-03-23 | 2016-10-06 | Honda Motor Co., Ltd. | Vehicle collision avoidance support device |
US20190152490A1 (en) | 2017-11-22 | 2019-05-23 | Uber Technologies, Inc. | Object Interaction Prediction Systems and Methods for Autonomous Vehicles |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7647180B2 (en) * | 1997-10-22 | 2010-01-12 | Intelligent Technologies International, Inc. | Vehicular intersection management techniques |
US8970701B2 (en) | 2011-10-21 | 2015-03-03 | Mesa Engineering, Inc. | System and method for predicting vehicle location |
US9983591B2 (en) * | 2015-11-05 | 2018-05-29 | Ford Global Technologies, Llc | Autonomous driving at intersections based on perception data |
US10611379B2 (en) * | 2016-08-16 | 2020-04-07 | Toyota Jidosha Kabushiki Kaisha | Integrative cognition of driver behavior |
JP7160251B2 (ja) | 2017-01-12 | 2022-10-25 | Mobileye Vision Technologies Ltd. | Navigation system, method, and program |
US20180225585A1 (en) * | 2017-02-08 | 2018-08-09 | Board Of Regents, The University Of Texas System | Systems and methods for prediction of occupancy in buildings |
US11537134B1 (en) * | 2017-05-25 | 2022-12-27 | Apple Inc. | Generating environmental input encoding for training neural networks |
WO2019023628A1 (en) * | 2017-07-27 | 2019-01-31 | Waymo Llc | NEURAL NETWORKS FOR VEHICLE TRACK PLANNING |
US20200174490A1 (en) * | 2017-07-27 | 2020-06-04 | Waymo Llc | Neural networks for vehicle trajectory planning |
US10611371B2 (en) * | 2017-09-14 | 2020-04-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for vehicle lane change prediction using structural recurrent neural networks |
US10739775B2 (en) * | 2017-10-28 | 2020-08-11 | Tusimple, Inc. | System and method for real world autonomous vehicle trajectory simulation |
KR101936629B1 (ko) | 2018-01-19 | 2019-01-09 | LG Electronics Inc. | Vehicle and control method thereof |
US11625036B2 (en) * | 2018-04-09 | 2023-04-11 | SafeAl, Inc. | User interface for presenting decisions |
US11511745B2 (en) * | 2018-04-27 | 2022-11-29 | Huawei Technologies Co., Ltd. | Method and system for adaptively controlling object spacing |
US11370423B2 (en) * | 2018-06-15 | 2022-06-28 | Uatc, Llc | Multi-task machine-learned models for object intention determination in autonomous driving |
US20200017124A1 (en) * | 2018-07-12 | 2020-01-16 | Sf Motors, Inc. | Adaptive driver monitoring for advanced driver-assistance systems |
US11724691B2 (en) * | 2018-09-15 | 2023-08-15 | Toyota Research Institute, Inc. | Systems and methods for estimating the risk associated with a vehicular maneuver |
US11169531B2 (en) * | 2018-10-04 | 2021-11-09 | Zoox, Inc. | Trajectory prediction on top-down scenes |
US11465633B2 (en) * | 2018-11-14 | 2022-10-11 | Huawei Technologies Co., Ltd. | Method and system for generating predicted occupancy grid maps |
US11755018B2 (en) * | 2018-11-16 | 2023-09-12 | Uatc, Llc | End-to-end interpretable motion planner for autonomous vehicles |
US11034348B2 (en) * | 2018-11-20 | 2021-06-15 | Waymo Llc | Agent prioritization for autonomous vehicles |
US10739777B2 (en) * | 2018-11-20 | 2020-08-11 | Waymo Llc | Trajectory representation in behavior prediction systems |
US11126180B1 (en) * | 2019-04-30 | 2021-09-21 | Zoox, Inc. | Predicting an occupancy associated with occluded region |
US11131993B2 (en) * | 2019-05-29 | 2021-09-28 | Argo AI, LLC | Methods and systems for trajectory forecasting with recurrent neural networks using inertial behavioral rollout |
US10990855B2 (en) * | 2019-06-13 | 2021-04-27 | Baidu Usa Llc | Detecting adversarial samples by a vision based perception system |
US11403853B2 (en) * | 2019-08-30 | 2022-08-02 | Waymo Llc | Occupancy prediction neural networks |
US11586931B2 (en) * | 2019-10-31 | 2023-02-21 | Waymo Llc | Training trajectory scoring neural networks to accurately assign scores |
2019
- 2019-08-30: US application US16/557,246 filed (US11403853B2, active)

2020
- 2020-07-21: JP application JP2022510922A filed (JP7367183B2, active)
- 2020-07-21: WO application PCT/US2020/042841 filed (WO2021040910A1, status unknown)
- 2020-07-21: EP application EP20856700.8 filed (EP4000015A4, withdrawn)
- 2020-07-21: KR application KR1020227010052 filed (KR20220054358A, status unknown)
- 2020-07-21: CN application CN202080060326.7 filed (CN114341950A, pending)

2022
- 2022-07-12: US application US17/862,499 filed (US11772654B2, active)
Also Published As
Publication number | Publication date |
---|---|
CN114341950A (zh) | 2022-04-12 |
US20210064890A1 (en) | 2021-03-04 |
WO2021040910A1 (en) | 2021-03-04 |
US20220343657A1 (en) | 2022-10-27 |
KR20220054358A (ko) | 2022-05-02 |
US11772654B2 (en) | 2023-10-03 |
EP4000015A1 (en) | 2022-05-25 |
JP2022546283A (ja) | 2022-11-04 |
US11403853B2 (en) | 2022-08-02 |
EP4000015A4 (en) | 2023-07-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7367183B2 (ja) | Occupancy prediction neural networks | |
US11797407B2 (en) | Systems and methods for generating synthetic sensor data via machine learning | |
JP7459224B2 (ja) | Agent trajectory prediction using anchor trajectories | |
CN113272830B (zh) | Trajectory representation in behavior prediction systems | |
US11537127B2 (en) | Systems and methods for vehicle motion planning based on uncertainty | |
US20210200212A1 (en) | Jointly Learnable Behavior and Trajectory Planning for Autonomous Vehicles | |
US11727690B2 (en) | Behavior prediction of surrounding agents | |
US11967103B2 (en) | Multi-modal 3-D pose estimation | |
CN114830138A (zh) | Training trajectory scoring neural networks to accurately assign scores | |
US11858536B1 (en) | Systems and methods for interactive prediction and planning | |
CN114514524A (zh) | Multi-agent simulation | |
US20210364637A1 (en) | Object localization using machine learning | |
US11657268B1 (en) | Training neural networks to assign scores | |
US20230406360A1 (en) | Trajectory prediction using efficient attention neural networks | |
US20230406361A1 (en) | Structured multi-agent interactive trajectory forecasting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2022-04-14 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621 |
2023-04-17 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131 |
2023-06-28 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523 |
| TRDD | Decision of grant or rejection written | |
2023-09-20 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01 |
2023-10-11 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61 |
| R150 | Certificate of patent or registration of utility model | Ref document number: 7367183; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150 |