CN118938892A - Mobile object, server, and method for manufacturing mobile object - Google Patents
- Publication number
- CN118938892A (application number CN202410542261.XA)
- Authority
- CN
- China
- Prior art keywords
- control
- vehicle
- unit
- mobile body
- completion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
Provided is a technique capable of appropriately executing driving control of a mobile body by remote control or autonomous control for each process. A mobile body manufactured in a factory includes: a driving control unit that performs driving control of the unmanned mobile body during manufacture of the mobile body in the factory; a process completion detection unit that detects completion of processing based on at least one step included in the manufacturing process; and a control content changing unit that changes the content of the control of the mobile body when completion of the processing is detected.
Description
Cross Reference to Related Applications
The present application claims priority based on Japanese Patent Application No. 2023-078687 filed 5/11/2023 and Japanese Patent Application No. 2023-188214 filed 11/2023, the disclosures of which are incorporated herein by reference in their entirety.
Technical Field
The present disclosure relates to a mobile body, a server, and a method of manufacturing a mobile body.
Background
For example, Japanese Patent Laid-Open No. 2017-538619 discloses a vehicle traveling method in which, in a manufacturing system for manufacturing vehicles, a vehicle is caused to travel autonomously or by remote control from the end of an assembly line of the manufacturing system to a parking lot of the manufacturing system.
Driving control of a vehicle by remote control or autonomous control may be performed before the vehicle is completed, provided that the vehicle is already able to run by remote control or autonomous control. However, as the vehicle approaches completion, various kinds of processing, such as the assembly of parts and the modification of control parameters, may be performed on the vehicle in each process. Accordingly, a technique capable of appropriately performing driving control of a vehicle by remote control or autonomous control for each process is desired. This problem is not limited to vehicles and is common to moving bodies in general.
Disclosure of Invention
Means for solving the problems
The present disclosure can be implemented as follows.
(1) According to a first aspect of the present disclosure, there is provided a mobile body manufactured in a factory. The mobile body includes: a driving control unit that performs driving control of the mobile body by unmanned driving during manufacture of the mobile body in the factory; a process completion detection unit that detects completion of processing based on at least one step included in the manufacturing process; and a control content changing unit that changes the content of the control of the mobile body when completion of the processing is detected.
According to the mobile body of this aspect, the content of the control of the mobile body can be changed each time processing by a step is completed, so that driving control of the mobile body by remote control or autonomous control can be appropriately performed for each step.
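As a rough, non-limiting illustration of how a process-completion event might trigger a control-content change in this aspect, the following Python sketch models a change table keyed by process name. All class, function, and process names here are hypothetical; the patent does not prescribe any implementation.

```python
# Hypothetical sketch only: a completed process looks up the control
# content that becomes active afterwards. Names are illustrative.

class ControlContentChanger:
    """Activates new control content when a process completes."""

    def __init__(self, change_table):
        # change_table maps a completed process name to the control
        # content enabled by that process (assumed structure).
        self.change_table = change_table
        self.active_controls = set()

    def on_process_completed(self, process_name):
        new_control = self.change_table.get(process_name)
        if new_control is not None:
            self.active_controls.add(new_control)
        return new_control


changer = ControlContentChanger({
    "object_detection_device_assembly": "collision_prevention_control",
    "speed_detection_device_assembly": "speed_feedback_control",
})
print(changer.on_process_completed("object_detection_device_assembly"))
# → collision_prevention_control
```

A process with no entry in the table simply leaves the control content unchanged, which matches the idea that only certain steps add or change an element used for control.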
(2) The mobile body according to the above aspect may further include a communication unit that receives a control command for remote control, and the driving control unit may execute driving control of the mobile body in accordance with the received control command during the manufacturing process.
According to the mobile body of this aspect, the content of the control of the mobile body can be changed each time processing is completed, and driving control of the mobile body by remote control can be appropriately performed for each step.
(3) In the mobile body according to the above aspect, the process completion detection unit may detect completion of processing that adds an element to the mobile body or changes an element provided in the mobile body, based on the at least one step. When the completion of the processing is detected, the control content changing unit may change the content of the control of the mobile body so as to use the element added to or changed in the mobile body by the completed processing.
According to the mobile body of this aspect, the element added to or changed in the mobile body can be utilized according to the progress of the manufacturing process of the mobile body, and the performance of the mobile body can be suitably exhibited while the mobile body is moved by remote control.
(4) In the mobile body according to the above aspect, the step may include an object detection device assembly step in which an object detection device capable of detecting an object around the mobile body, and including at least one of a radar device and a camera, is added to the mobile body as the element. When completion of the object detection device assembly step is detected, the control content changing unit may change the content of the control of the mobile body so as to perform collision prevention control using the added object detection device.
According to the mobile body of this aspect, collision prevention during movement of the mobile body can be performed as soon as the object detection device assembly step is completed.
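The gating described in aspect (4) can be sketched as follows: collision prevention consults the newly assembled sensor only once that assembly step is complete. The function name, the braking distance, and the command strings are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: collision prevention control is only meaningful
# after the object detection device has been assembled. The 3.0 m
# threshold is an assumed illustrative value.

BRAKE_DISTANCE_M = 3.0  # assumed minimum clearance before braking

def collision_prevention(sensor_assembled, obstacle_distance_m):
    """Issue a braking command only when the sensor exists and an
    obstacle is within the assumed safety distance."""
    if not sensor_assembled:
        # Sensor not yet mounted: the vehicle relies on remote control.
        return "no_action"
    if obstacle_distance_m < BRAKE_DISTANCE_M:
        return "brake"
    return "continue"

print(collision_prevention(False, 1.0))  # → no_action
print(collision_prevention(True, 1.0))   # → brake
```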
(5) In the mobile body according to the above aspect, the control content changing unit may further change the content of the control of the mobile body so that the mobile body travels by driving control of the mobile body itself using the collision prevention control, instead of driving control by remote control.
According to the mobile body of this aspect, it is possible to switch from driving control by remote control to movement by driving control of the mobile body itself as soon as the object detection device assembly step is completed.
(6) In the mobile body according to the above aspect, the step may include a speed detection device assembly step in which a speed detection device capable of acquiring speed information related to the speed of a vehicle as the mobile body, and including at least one of a vehicle speed sensor, a wheel speed sensor, an acceleration sensor, and a yaw rate sensor, is added to the mobile body as the element. When completion of the speed detection device assembly step is detected, the control content changing unit may change the content of the control of the mobile body so as to perform driving control using the speed information detected by the added speed detection device.
According to the mobile body of this aspect, self-position estimation using the detected speed information and feedback control of the vehicle speed can be performed as soon as the speed detection device assembly step is completed.
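As a minimal sketch of the vehicle-speed feedback control that becomes possible in aspect (6), a proportional correction of the acceleration command can be computed from the difference between target and measured speed. The gain value and function name are illustrative assumptions only.

```python
# Hypothetical proportional speed-feedback sketch, enabled after the
# speed detection device assembly step. KP is an assumed gain, not a
# value from the patent.

KP = 0.5  # assumed proportional gain

def speed_feedback(target_kmh, measured_kmh):
    """Return an acceleration correction proportional to the speed
    error; positive means accelerate, negative means decelerate."""
    return KP * (target_kmh - measured_kmh)

print(speed_feedback(10.0, 6.0))  # → 2.0 (accelerate)
```

Before the sensor is assembled, no measured speed is available and the vehicle would instead follow the remotely commanded travel control signal directly.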
(7) In the mobile body according to the above aspect, the step may include an adjustment step including at least one of a wheel alignment adjustment step in which the wheel alignment of a vehicle as the mobile body is changed as the element, and a suspension adjustment step in which the suspension of the vehicle is changed as the element. When completion of the adjustment step is detected, the control content changing unit may change the content of the control of the mobile body so that the upper limit value of the travel speed of the mobile body increases.
According to the mobile body of this aspect, the moving speed of the mobile body can be increased as soon as the adjustment step is completed, improving the productivity of the mobile body.
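The speed-limit change in aspect (7) amounts to clamping any commanded speed against a limit that is raised once the adjustment step completes. Both limit values below are illustrative assumptions.

```python
# Hypothetical sketch: the travel-speed upper limit rises after the
# wheel alignment / suspension adjustment step. The 5 and 15 km/h
# values are assumed for illustration.

LIMIT_BEFORE_KMH = 5.0
LIMIT_AFTER_KMH = 15.0

def speed_limit(adjustment_done):
    return LIMIT_AFTER_KMH if adjustment_done else LIMIT_BEFORE_KMH

def clamp_command(requested_kmh, adjustment_done):
    """Clamp a remotely commanded speed to the currently allowed limit."""
    return min(requested_kmh, speed_limit(adjustment_done))

print(clamp_command(12.0, False))  # → 5.0
print(clamp_command(12.0, True))   # → 12.0
```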
(8) The mobile body according to the above aspect may further include: an operation unit for manually driving the mobile body; and a storage device that stores a threshold value, set in advance using an operation amount of the operation unit, for determining whether to prioritize driving control by the operation unit over driving control by remote control when both are performed simultaneously. The process completion detection unit may detect completion of processing based on a step, among the at least one step, that precedes a step in which a worker is likely to touch the operation unit. When completion of the processing by the preceding step is detected, the control content changing unit may change the content of the control of the mobile body so as to relax the threshold value to a value at which driving control by the operation unit is less likely to take priority.
According to the mobile body of this aspect, it is possible to suppress or prevent the movement of the mobile body from being unintentionally stopped by a worker accidentally touching the operation unit while the mobile body is moving by remote control.
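The threshold relaxation of aspect (8) can be sketched as a comparison of the operation amount against one of two thresholds: a light accidental touch no longer overrides remote control once the threshold is relaxed. Both threshold values are hypothetical; the patent only states that the threshold is relaxed.

```python
# Hypothetical sketch: operation-amount threshold for letting manual
# input override remote control. Values are illustrative assumptions
# (normalized 0.0-1.0 operation amount).

NORMAL_THRESHOLD = 0.05   # a small deliberate input overrides remote control
RELAXED_THRESHOLD = 0.30  # only a strong input overrides remote control

def manual_override(operation_amount, relaxed):
    """True if driving control by the operation unit takes priority."""
    threshold = RELAXED_THRESHOLD if relaxed else NORMAL_THRESHOLD
    return operation_amount > threshold

print(manual_override(0.10, relaxed=False))  # → True
print(manual_override(0.10, relaxed=True))   # → False
```

Aspect (13) then treats an override that still occurs under the relaxed threshold as an abnormality warranting stopping manufacture or issuing a notification.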
(9) In the mobile body according to the above aspect, the driving control unit may generate a control signal for moving the mobile body by unmanned driving and perform driving control of the mobile body in accordance with the control signal.
(10) According to a second aspect of the present disclosure, a server is provided. The server includes: a remote control unit that causes a mobile body manufactured in a factory to travel by remote control, the mobile body including a communication unit that receives a control command for the remote control and a driving control unit that executes driving control of the mobile body in accordance with the received control command during manufacture in the factory; a manufacturing information acquisition unit that acquires manufacturing information including the progress of processing that adds an element to the mobile body or changes an element provided in the mobile body, based on at least one step included in the manufacturing process; and a control content change instruction unit that, when completion of the processing is detected, instructs the mobile body to change the content of the control of the mobile body so as to use the element added or changed by the completed processing.
According to the server of this aspect, the content of the control of the mobile body can be appropriately changed according to the progress of the manufacturing process of the mobile body managed by the server, and driving control of the mobile body by remote control can be appropriately executed for each step.
(11) In the server according to the above aspect, the manufacturing information acquisition unit may acquire completion of processing that adds an element to the mobile body or changes an element provided in the mobile body, based on the at least one step. When the completion of the processing is acquired, the control content change instruction unit may instruct the mobile body to change the content of the control of the mobile body so as to use the element added or changed by the completed processing.
According to the server of this aspect, the element added to or changed in the mobile body can be appropriately utilized according to the progress of the manufacturing process of the mobile body managed by the server, and the performance of the mobile body can be appropriately exhibited while the mobile body moves by remote control.
(12) In the server according to the above aspect, the manufacturing information acquisition unit may acquire completion of processing based on a step, among the at least one step, that precedes a step in which a worker may touch the operation unit. When completion of the processing by the preceding step is acquired, the control content change instruction unit may instruct the mobile body to change the content of the control of the mobile body so as to relax a threshold value for determining whether to prioritize driving control by the operation unit over driving control by remote control when both are performed simultaneously.
According to the server of this aspect, it is possible to suppress or prevent the movement of the mobile body from being unintentionally stopped by a worker accidentally touching the operation unit while the mobile body is moving by remote control.
(13) The server according to the above aspect may further include an abnormality measure unit that, when driving control by the operation unit takes priority after the threshold value has been relaxed, performs at least one of stopping the manufacture of the mobile body and issuing a notification as an abnormality measure.
According to the server of this aspect, executing the abnormality measure can suppress or prevent the risk accompanying the relaxation of the threshold value.
(14) According to a third aspect of the present disclosure, a method of manufacturing a mobile body is provided. In the manufacturing method, during a manufacturing process in a factory for manufacturing the mobile body, the mobile body is driven by unmanned driving, manufacturing information including the progress of processing based on at least one step included in the manufacturing process is acquired, and when completion of the processing is detected, the mobile body is instructed to change the content of its control.
According to the method of manufacturing a mobile body of this aspect, the content of the control of the mobile body can be changed each time processing by a step is completed, and driving control of the mobile body by remote control or autonomous control can be appropriately performed for each step.
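The flow of the manufacturing method above can be sketched as a simple polling loop over manufacturing information that issues a change instruction whenever a processing is reported complete. The function and event names below are hypothetical illustrations, not part of the disclosure.

```python
# Hypothetical sketch of the third aspect: iterate over manufacturing
# information events and instruct a control-content change for each
# completed processing. All names are illustrative.

def run_manufacturing_loop(events, instruct):
    """events: iterable of (process_name, completed) tuples;
    instruct: callback invoked with each completed process name."""
    instructed = []
    for process_name, completed in events:
        if completed:
            instruct(process_name)
            instructed.append(process_name)
    return instructed

done = run_manufacturing_loop(
    [("pre_process_50", True), ("post_process_60", False)],
    instruct=lambda name: None)
print(done)  # → ['pre_process_50']
```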
The present disclosure can also be implemented in various modes other than the mobile body, the server, and the method of manufacturing a mobile body: for example, as a system, a method of transporting a mobile body, a method of changing or adding control in a mobile body, a method of controlling a mobile body, a computer program for realizing such a control method, or a non-transitory recording medium on which the computer program is recorded.
Drawings
Fig. 1 is an explanatory diagram showing a schematic configuration of a vehicle and a system according to a first embodiment.
Fig. 2 is a block diagram showing an internal functional configuration of the server.
Fig. 3 is a block diagram showing the internal functional configuration of the ECU.
Fig. 4A is a flowchart showing a processing sequence of travel control of the vehicle in the first embodiment.
Fig. 4B is an explanatory diagram showing automatic driving control of the vehicle based on remote control by the remote control unit.
Fig. 5 is an explanatory diagram conceptually showing the control content change table.
Fig. 6 is a flowchart showing a method of manufacturing a vehicle according to the first embodiment.
Fig. 7 is an explanatory diagram schematically showing a method of manufacturing a vehicle according to the first embodiment.
Fig. 8 is a block diagram showing an internal functional configuration of a server according to the second embodiment.
Fig. 9 is a block diagram showing an internal functional configuration of an ECU included in the vehicle according to the second embodiment.
Fig. 10 is a block diagram showing an internal functional configuration of an ECU of the vehicle according to the third embodiment.
Fig. 11 is a block diagram showing an internal functional configuration of a server according to the third embodiment.
Fig. 12 is a flowchart showing a method of manufacturing a vehicle according to a third embodiment.
Fig. 13 is a block diagram showing an internal functional configuration of an ECU of the vehicle according to the fourth embodiment.
Fig. 14 is a block diagram showing an internal functional configuration of a server according to the fourth embodiment.
Fig. 15 is an explanatory diagram showing a schematic configuration of a system according to the fifth embodiment.
Fig. 16 is an explanatory diagram showing an internal functional configuration of an ECU of the vehicle according to the fifth embodiment.
Fig. 17 is a flowchart showing a processing sequence of travel control of the vehicle according to the fifth embodiment.
Detailed Description
A. First embodiment:
Fig. 1 is an explanatory diagram showing a schematic configuration of a vehicle 100 and a system 500 according to the first embodiment. The system 500 in the present embodiment is configured as a remote automatic driving system. The system 500 causes the vehicle 100 to travel automatically by remote control during the manufacturing process in a factory FC that manufactures the vehicle 100 as a mobile body. In this specification, a vehicle completed as a product and a semi-finished vehicle in the middle of production are both referred to as a "vehicle".
In the present disclosure, a "mobile body" means an object that can move, for example, a vehicle or an electric vertical take-off and landing aircraft (a so-called flying car). The vehicle may be a vehicle that travels on wheels or on tracks, for example, a passenger car, a truck, a bus, a two-wheeled vehicle, a four-wheeled vehicle, a tank, or a construction vehicle. Vehicles include battery electric vehicles (BEV: Battery Electric Vehicle), gasoline vehicles, hybrid vehicles, and fuel cell vehicles. When the mobile body is other than a vehicle, the expression "vehicle" in the present disclosure may be appropriately replaced with "mobile body", and "traveling" may be appropriately replaced with "moving".
The vehicle 100 is configured to be capable of traveling by unmanned driving. "Unmanned driving" means driving that does not depend on a traveling operation by an occupant. A traveling operation means an operation related to at least one of "running", "steering", and "stopping" of the vehicle 100. Unmanned driving is realized by automatic or manual remote control using a device located outside the vehicle 100, or by autonomous control of the vehicle 100. An occupant who does not perform a traveling operation may ride in the vehicle 100 while it travels by unmanned driving. Such occupants include, for example, a person who simply sits in a seat of the vehicle 100 and a person who, while riding in the vehicle 100, performs work other than a traveling operation, such as assembly, inspection, or operation of switches. Driving based on a traveling operation by an occupant is sometimes referred to as "manned driving".
In this specification, "remote control" includes "full remote control", in which all actions of the vehicle 100 are determined entirely from outside the vehicle 100, and "partial remote control", in which some actions of the vehicle 100 are determined from outside the vehicle 100. Similarly, "autonomous control" includes "full autonomous control", in which the vehicle 100 autonomously controls its own actions without receiving any information from a device outside the vehicle 100, and "partial autonomous control", in which the vehicle 100 autonomously controls its own actions using information received from a device outside the vehicle 100.
As shown in Fig. 1, the factory FC includes a pre-process 50, a post-process 60, and a travel road RT for the vehicle 100. The travel road RT is a transport section in the factory FC connecting the pre-process 50 and the post-process 60. The steps of the manufacturing process in the factory FC are not limited to a single building or a single site; the factory FC and the individual steps may be spread across a plurality of buildings, sites, and premises. "The vehicle 100 travels in the factory FC" therefore includes not only the case where the vehicle 100 travels on a travel road in a factory located at one site, but also the case where it travels in transport sections between factories and steps located at a plurality of sites. It also includes, for example, the case where the vehicle 100 travels on a public road, not only a private road, in order to move between factories and steps located at a plurality of sites.
The pre-process 50 and the post-process 60 are among the various steps belonging to the manufacturing process of the vehicle 100. In the manufacturing process of the vehicle 100, processing that adds an element to the vehicle 100 and processing that changes an element included in the vehicle 100 may be performed. "Processing" means performing a specific operation on the vehicle 100, either manually by a worker or automatically by a device or the like. "Elements of the vehicle 100" include, for example, elements related to components and devices mounted on the vehicle 100, such as their types, parameters, positions, and states, and elements related to control of the vehicle 100, such as control parameters and control programs for the various controls that operate parts of the vehicle 100, including its driving control. When processing in a step is completed, a new element may be added to the vehicle 100, or an element that the vehicle 100 already has may be changed.
The pre-process 50 includes, for example, an assembly step of mounting components of the vehicle 100, such as the detection device group 180, on the vehicle 100, and an adjustment step of adjusting components mounted on the vehicle 100. The post-process 60 is, for example, an inspection step for the vehicle 100. The vehicle 100 delivered from the pre-process 50 is an object to be processed in the post-process 60, and travels on the travel road RT by remote control toward the post-process 60 as its travel destination. After the inspection step as the post-process 60 is completed, the vehicle 100 is completed as a product and travels to a standby place in the factory FC to wait for shipment. Thereafter, each vehicle 100 is shipped to its respective shipment destination. The pre-process 50 and the post-process 60 are not limited to the assembly, adjustment, and inspection steps; various steps may be employed, provided that the vehicle 100 can travel by unmanned driving after the processing in the pre-process 50 and the post-process 60.
Each step in the factory FC, including the pre-process 50 and the post-process 60, includes a process management device that manages manufacturing information of the vehicle 100. The "manufacturing information" includes, for example, the progress of processing in each step, the number of vehicles in production, the number of vehicles being processed, the manufacturing time of each step, the start and end times of processing in each step, the vehicle identification information of the vehicles 100 present in each step, the planned number of vehicles to be manufactured per day, the target manufacturing time of a step for manufacturing one vehicle 100, and the like. The target manufacturing time is sometimes referred to as the "takt time". "Vehicle identification information" means various information by which each vehicle 100 can be individually identified. The vehicle identification information includes, for example, ID information assigned to each vehicle 100 such as a vehicle identification number (VIN: Vehicle Identification Number), specification information of the vehicle 100 such as model, color, and shape, and production management information of the vehicle 100 such as the name of the step in the manufacturing process. The vehicle identification information may be acquired, for example, from an RFID (Radio Frequency Identification) tag or the like attached to the vehicle 100 via short-range wireless communication. The process management device of each step acquires the manufacturing status of the vehicle 100 in that step from a camera, a sensor, or the like (not shown) provided in the step, and transmits the acquired manufacturing status to the server 300 and the vehicle 100.
The manufacturing status of each step may also be transmitted to a production management device that collectively manages the manufacturing status of all steps of the factory FC.
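As a non-limiting illustration, the manufacturing information record that a process management device sends to the server 300 might carry fields like those listed above. The field names and values below are hypothetical assumptions, not a format defined by the disclosure.

```python
# Hypothetical sketch of a manufacturing-information record; field
# names are illustrative assumptions based on the items listed above.

from dataclasses import dataclass

@dataclass
class ManufacturingInfo:
    vin: str           # vehicle identification number (VIN)
    process_name: str  # e.g. "pre_process_50"
    progress: str      # e.g. "not_started" / "in_progress" / "completed"

info = ManufacturingInfo(vin="VIN0001", process_name="pre_process_50",
                         progress="completed")
print(info.progress == "completed")  # → True
```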
The system 500 includes a vehicle detector and a server 300. The vehicle detector detects vehicle information including at least one of an image of the vehicle 100 and the position of the vehicle 100. The detected vehicle information is used for remote control by the system 500. The "vehicle information" may also include the traveling direction or the orientation of the vehicle 100, which can be obtained, for example, by detecting the shape of the vehicle 100 or parts of the vehicle 100. Alternatively, the vehicle detector may acquire only the position of the vehicle 100 and estimate the traveling direction and orientation of the vehicle 100 from its change over time.
In the present embodiment, a camera 80 is used as the vehicle detector. The camera 80 is communicably connected to the server 300 by wireless or wired communication. The camera 80 includes an imaging unit, such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and an optical system. The camera 80 is fixed at a position where it can capture the travel road RT and the vehicle 100 traveling on it, and acquires an image of the vehicle 100 as vehicle information. By analyzing the image acquired by the camera 80, various pieces of vehicle information usable for remote control can be obtained, such as the relative position of the vehicle 100 with respect to the travel road RT and the orientation of the vehicle 100. By using images from the camera 80 installed in the factory FC, automatic travel of the vehicle 100 by remote control is possible without using detectors mounted on the vehicle 100, such as a camera, a millimeter wave radar, or a LiDAR (Light Detection and Ranging) sensor. The vehicle detector need not acquire an image of the vehicle 100 as long as it can acquire the position of the vehicle 100; in that case, various detectors capable of detecting the position of the vehicle 100 instead of its image, such as a LiDAR, an infrared sensor, a laser sensor, an ultrasonic sensor, or a millimeter wave radar, may be used.
Fig. 2 is a block diagram showing an internal functional configuration of the server 300. The server 300 includes a CPU 310 as a central processing unit, a storage device 320, and a remote communication unit 390, which are connected to each other via an internal bus, an interface circuit, and the like. The remote communication unit 390 is a circuit for communicating with the vehicle 100 and the like via the network 72.
The storage device 320 is, for example, a RAM, a ROM, an HDD (hard disk drive), an SSD (solid state drive), or the like. Various programs for realizing the functions provided in the present embodiment are stored in the storage device 320. When the CPU 310 executes the computer programs stored in the storage device 320, the CPU 310 functions as a remote control unit 312, a manufacturing information acquisition unit 314, and the like. Some or all of these functions may instead be implemented by hardware circuits.
The manufacturing information acquisition unit 314 acquires manufacturing information 322 from a process management device provided in each step, a production management device that collectively manages the manufacturing conditions of each step, or the like. The manufacturing information acquisition unit 314 may acquire the manufacturing information 322 from each vehicle 100. The acquired manufacturing information 322 is stored in the storage device 320. As a result, the server 300 can individually acquire, for each vehicle 100, the progress of the processing in each step of the manufacturing process.
The remote control unit 312 causes the vehicle 100 to travel automatically in the factory FC by remote control. More specifically, the remote control unit 312 transmits a control signal requesting remote control to the vehicle 100 via the remote communication unit 390. In the present embodiment, the control signal is a travel control signal described later. When the vehicle 100 receives the request for remote control, the ECU 200 performs driving control in accordance with the control signal, and as a result, the vehicle 100 travels automatically. By transporting the vehicle 100 by such unmanned driving, human-caused accidents during travel of the vehicle 100 can be suppressed or prevented.
In other embodiments, when the vehicle 100 is driven by remote control, the remote control unit 312 may transmit a control command to the vehicle 100 instead of a travel control signal. The control command includes at least one of a travel control signal and generation information for generating the travel control signal. As the generation information, for example, the vehicle position information, the route, and the target position described later can be used.
As shown in fig. 1, the vehicle 100 includes an operation unit 170, a vehicle communication unit 190, a power receiving device 150, a battery 120, a PCU 130, a motor 140, a detection device group 180, and an ECU (Electronic Control Unit) 200. The operation unit 170 is, for example, an accelerator, a steering wheel, a brake, or the like. The operation unit 170 receives manual operations for performing the "running", "steering", and "stopping" functions of the vehicle 100. The manual operation corresponds to the above-described travel operation by the driver.
The vehicle communication unit 190 is a wireless communication device mounted on the vehicle 100, such as a dongle. The vehicle communication unit 190 supports CAN (Controller Area Network) communication, which can be used for control of the vehicle 100 and the like, and diagnostic communication, which can be used for fault diagnosis and the like. CAN communication is a communication standard capable of transmitting and receiving in a plurality of directions. Diagnostic communication is a communication standard in which requests and responses correspond one to one. The vehicle communication unit 190 is configured to be capable of wireless communication, via the access point 70 in the factory FC, with devices external to the vehicle 100, such as the server 300 connected to the network 72 and a production management device, not shown, that collectively manages production information of the vehicle 100. Hereinafter, the vehicle communication unit 190 is also simply referred to as the communication unit.
The power receiving device 150 converts ac power supplied from an external power supply device or the like into dc power by a rectifier, and supplies the dc power to the battery 120 as a load. The battery 120 is a chargeable secondary battery such as a lithium ion battery or a nickel metal hydride battery. The battery 120 is, for example, a high-voltage battery of several hundred V, and stores electric power used for running the vehicle 100. When electric power supplied from an external power feeding device to the power receiving device 150 and regenerative electric power generated by the motor 140 are supplied to the battery 120, the battery 120 is charged.
The motor 140 is, for example, an ac synchronous motor, and functions as both a motor and a generator. When functioning as a motor, the motor 140 is driven with the electric power stored in the battery 120 as its power source. The output of the motor 140 is transmitted to the wheels via a speed reducer and an axle. When the vehicle 100 decelerates, the motor 140 functions as a generator driven by the rotation of the wheels, and generates regenerative power. A PCU (Power Control Unit) 130 is electrically connected between the motor 140 and the battery 120.
PCU 130 has an inverter, a boost converter, and a DC/DC converter. The inverter converts dc power supplied from the battery 120 into ac power, and supplies the converted ac power to the motor 140. The inverter converts regenerative power supplied from the motor 140 into dc power and supplies the dc power to the battery 120. When the electric power stored in the battery 120 is supplied to the motor 140, the boost converter boosts the voltage of the battery 120. When the electric power stored in battery 120 is supplied to an auxiliary machine or the like, the DC/DC converter steps down the voltage of battery 120.
The detection device group 180 is a sensor group provided in the vehicle 100. The detection device group 180 is mounted on the vehicle 100 in, for example, the steps included in the preceding step 50. The detection device group 180 includes, for example, an object detection device, a speed detection device, and the like. The object detection device is a radar device, an in-vehicle camera, or the like. Radar devices include devices such as LiDAR and millimeter wave radar that detect the presence or absence of an object around the vehicle 100, the distance to the object, and the position of the object. The in-vehicle cameras include various cameras, such as a stereo camera and a monocular camera, capable of capturing an object around the vehicle 100. The speed detection device is a vehicle speed sensor, a wheel speed sensor, an acceleration sensor, a yaw rate sensor, or the like. The detection device group 180 is not limited to the object detection device and the speed detection device, and may include various other sensors such as a steering angle sensor.
Fig. 3 is a block diagram showing the internal functional configuration of ECU 200. ECU 200 is mounted on vehicle 100 and executes various controls of vehicle 100. ECU 200 includes a storage device 220 such as an HDD (hard disk drive), an SSD (solid state drive), an optical recording medium, and a semiconductor memory, a CPU 210 as a central processing unit, and an interface circuit 280. The interface circuit 280 is connected to the detection device group 180, the vehicle communication unit 190, and the like. The control program 222 and the control content change table 224 are stored in the storage device 220.
The control program 222 is a computer program for causing the CPU 210 to function as the driving control unit 212. The control program 222 includes control parameters. The control content change table 224 shows the correspondence between each step and the content added to or changed in the control program 222 after that step is completed.
Various programs for realizing the functions provided in the present embodiment are stored in the storage device 220. The CPU 210 executes various computer programs stored in the storage device 220 to realize various functions such as the driving control unit 212, the processing completion detection unit 214, and the control content change unit 216. ECU 200 controls PCU 130 to control transmission and reception of electric power between battery 120 and motor 140.
The process completion detection unit 214 detects completion of the processing based on a step included in the manufacturing process. In the present embodiment, the process completion detection unit 214 obtains the completion of the processing for its own vehicle in each step from a sensor, a camera, or the like provided in that step. The process completion detection unit 214 may instead obtain the completion of the processing for its own vehicle in each step from a process management device provided in each step, from a production management device that collectively manages the manufacturing conditions of each step, or from the server 300, which acquires information on these steps.
When the process completion detection unit 214 detects the completion of the processing in a step, the control content changing unit 216 changes the control content of the vehicle 100. Specifically, the control content changing unit 216 changes the content of the control of the vehicle 100 so as to use an element added to or changed in the vehicle 100 by the processing of the detected step. In the present embodiment, the control content changing unit 216 refers to the control content change table 224 and rewrites the control program 222 every time the processing based on a step is completed. As a result, the control program 222 is changed to control content that uses the element added to or changed in the vehicle 100. The "rewriting of the control program 222" includes rewriting of the control parameters. When a step is completed without adding or changing an element of the vehicle 100, the control content changing unit 216 need not change the control content of the vehicle 100.
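The table-driven rewriting performed by the control content changing unit 216 can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the step names, parameter names, and values are all assumptions, and the table stands in for the control content change table 224.

```python
# Hypothetical sketch of the update by the control content changing unit 216:
# each completed step is looked up in a table modeled on the control content
# change table 224, and matching entries are merged into the active control
# parameters. All names and values are illustrative assumptions.

CONTROL_CONTENT_CHANGE_TABLE = {
    "radar_assembly":       {"collision_prevention": True},
    "wheel_speed_assembly": {"use_wheel_speed": True, "self_localization": True},
    "assembly_final":       {"max_speed_kmh": 15, "max_steering_deg": 35},
}

def apply_completed_step(control_params: dict, step: str) -> dict:
    """Rewrite the control parameters when a step's processing completes.

    Steps with no table entry leave the control content unchanged,
    mirroring the case where a step adds or changes no element.
    """
    changes = CONTROL_CONTENT_CHANGE_TABLE.get(step)
    if changes is None:
        return control_params           # no addition or change for this step
    updated = dict(control_params)      # rewrite a copy rather than mutate
    updated.update(changes)
    return updated

params = {"max_speed_kmh": 5, "collision_prevention": False}
params = apply_completed_step(params, "radar_assembly")
```

The lookup-then-merge shape keeps parameters from earlier steps intact while layering on each newly enabled capability, which matches the incremental rewriting described for the control program 222.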
The driving control unit 212 executes driving control of the vehicle 100. "Driving control" refers to, for example, adjustment of the acceleration, speed, steering angle, and the like. In driving control based on remote control, the driving control unit 212 controls each actuator mounted on the vehicle 100 in accordance with the request for remote control received from the server 300 via the vehicle communication unit 190.
Fig. 4A is a flowchart showing a processing sequence of travel control of the vehicle 100 in the first embodiment. In step S1, the server 300 obtains vehicle position information using a detection result output from an external sensor that is a sensor located outside the vehicle 100. The vehicle position information is position information that becomes a basis for generating the travel control signal. In the present embodiment, the vehicle position information includes the position and orientation of the vehicle 100 in the reference coordinate system of the factory FC. In the present embodiment, the reference coordinate system of the factory FC is a global coordinate system, and any position within the factory FC can be expressed by the coordinates X, Y, Z in the global coordinate system. In the present embodiment, the external sensor is a camera 80 provided in the factory FC, and a captured image is output from the external sensor as a detection result. That is, in step S1, the server 300 acquires the vehicle position information using the captured image acquired from the camera 80 as the external sensor.
Specifically, in step S1, the server 300 detects the outline of the vehicle 100 from the captured image, calculates the coordinates of an anchor point of the vehicle 100 in the local coordinate system, which is the coordinate system of the captured image, and converts the calculated coordinates into coordinates in the global coordinate system, thereby obtaining the position of the vehicle 100. The outline of the vehicle 100 included in the captured image can be detected by, for example, inputting the captured image into a detection model using artificial intelligence. The detection model is prepared, for example, inside or outside the system 500, and is stored in advance in the memory of the server 300. An example of the detection model is a trained machine learning model that has been trained to perform either semantic segmentation or instance segmentation. As the machine learning model, for example, a convolutional neural network (hereinafter, CNN) trained by supervised learning using a learning data set can be used. The learning data set includes, for example, a plurality of training images including the vehicle 100 and labels indicating, for each region in the training images, whether the region represents the vehicle 100 or something other than the vehicle 100. In training the CNN, it is preferable to update the parameters of the CNN by back propagation (the error back propagation method) so as to reduce the error between the output of the detection model and the label. Further, the server 300 can obtain the orientation of the vehicle 100 by estimating, for example by an optical flow method, the direction of the movement vector of the vehicle 100 calculated from the change in position of feature points of the vehicle 100 between frames of the captured image.
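The local-to-global conversion and the motion-vector orientation estimate in step S1 can be sketched as below. The patent does not specify the mapping, so a planar homography for a fixed overhead camera is assumed here, and the matrix values are purely illustrative.

```python
import math

# Minimal sketch of the coordinate conversion in step S1: the anchor point
# of the detected vehicle outline, given in the local (image) coordinate
# system, is mapped into the factory's global coordinate system. A 3x3
# planar homography is assumed for a fixed overhead camera; the matrix
# entries below are illustrative, not taken from the patent.

H = [[0.02, 0.0, -5.0],
     [0.0, 0.02, -3.0],
     [0.0,  0.0,  1.0]]

def local_to_global(u: float, v: float) -> tuple:
    """Map an image point (u, v) to global (X, Y) via the homography H."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w

def heading_from_motion(p_prev, p_curr) -> float:
    """Estimate the vehicle orientation (radians) from the movement vector
    between two frames, as with the optical-flow-based estimate."""
    return math.atan2(p_curr[1] - p_prev[1], p_curr[0] - p_prev[0])
```

In practice the homography would be obtained by calibrating each fixed camera 80 against known reference points on the travel road RT.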
In step S2, the server 300 determines a target position to which the vehicle 100 should go next. In the present embodiment, the target position is represented by the coordinates X, Y, and Z in the global coordinate system. The memory of the server 300 stores in advance a reference path, which is the path that the vehicle 100 should travel. The path is represented by nodes showing a departure point, passing points, and a destination, and by links connecting the nodes. The server 300 uses the vehicle position information and the reference path to determine the target position to which the vehicle 100 should go next. The server 300 determines the target position on the reference path ahead of the current position of the vehicle 100.
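A simple way to pick a target position ahead on the node-and-link reference path of step S2 is a lookahead search, sketched below. The lookahead distance and node data are assumptions for illustration; the patent only states that the target lies on the path ahead of the current position.

```python
import math

# Illustrative sketch of step S2: the reference path is stored as nodes
# (departure point, passing points, destination) joined by links, and the
# target position is chosen on the path ahead of the vehicle's current
# position. Lookahead logic and coordinates are assumptions.

REFERENCE_PATH = [(0.0, 0.0), (10.0, 0.0), (10.0, 8.0), (20.0, 8.0)]  # nodes

def next_target(position, path, lookahead=2.0):
    """Return the first node beyond `lookahead` of the node nearest the
    current position; fall back to the final node (the destination)."""
    nearest = min(range(len(path)),
                  key=lambda i: math.dist(position, path[i]))
    for i in range(nearest, len(path)):
        if math.dist(position, path[i]) > lookahead:
            return path[i]
    return path[-1]
```

Searching forward from the nearest node ensures the vehicle never targets a node it has already passed, even when the position estimate drifts slightly off the path.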
In step S3, the server 300 generates a travel control signal for causing the vehicle 100 to travel toward the determined target position. In the present embodiment, the travel control signal includes the acceleration and the steering angle of the vehicle 100 as parameters. The server 300 calculates the running speed of the vehicle 100 from the change in the position of the vehicle 100 and compares the calculated running speed with a target speed. In general, the server 300 determines the acceleration so as to accelerate the vehicle 100 when the running speed is lower than the target speed, and determines the acceleration so as to decelerate the vehicle 100 when the running speed is higher than the target speed. When the vehicle 100 is located on the reference path, the server 300 determines the steering angle and the acceleration so that the vehicle 100 does not depart from the reference path; when the vehicle 100 is not located on the reference path, in other words, when the vehicle 100 has deviated from the reference path, the server 300 determines the steering angle and the acceleration so that the vehicle 100 returns to the reference path. In other embodiments, the travel control signal may include the speed of the vehicle 100 as a parameter instead of or in addition to the acceleration of the vehicle 100.
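The comparisons in step S3 amount to feedback control, which can be sketched as a proportional controller. The gains are illustrative assumptions; the patent specifies only the sign of the response (accelerate below target speed, decelerate above it, steer back toward the path).

```python
# Hedged sketch of step S3: acceleration is chosen by comparing the
# measured running speed with the target speed, and the steering angle
# pulls the vehicle back toward the reference path. Gains k_speed and
# k_steer are illustrative assumptions.

def generate_travel_control_signal(running_speed, target_speed,
                                   lateral_offset,
                                   k_speed=0.5, k_steer=2.0):
    """Return (acceleration, steering_angle) for the travel control signal.

    acceleration > 0 accelerates the vehicle when below the target speed,
    < 0 decelerates it when above; steering is proportional to the
    lateral offset from the reference path.
    """
    acceleration = k_speed * (target_speed - running_speed)
    steering_angle = -k_steer * lateral_offset  # steer back toward the path
    return acceleration, steering_angle
```

Because the signal is regenerated at a predetermined cycle (step S4), even this simple proportional law converges the vehicle onto the path; a production controller would typically add damping and saturation limits on both parameters.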
In step S4, the server 300 transmits the generated travel control signal to the vehicle 100. The server 300 repeatedly acquires vehicle position information, determines a target position, generates a travel control signal, transmits the travel control signal, and the like at a predetermined cycle.
In step S5, vehicle 100 receives the travel control signal transmitted from server 300. In step S6, the vehicle 100 controls the actuator of the vehicle 100 using the received travel control signal, so that the vehicle 100 travels at the acceleration and steering angle indicated by the travel control signal. The vehicle 100 repeatedly receives a travel control signal and controls an actuator at a predetermined cycle. According to the system 500 of the present embodiment, the vehicle 100 can be moved without using a conveying device such as a crane or a conveyor by moving the vehicle 100 by remote control.
Fig. 4B is an explanatory diagram showing automatic driving control of the vehicle 100 based on remote control by the remote control unit 312. In the example of fig. 4B, the travel road RT includes a first travel road RT1, a second travel road RT2, a third travel road RT3, and a fourth travel road RT4 that are continuous with each other. The first travel road RT1 and the second travel road RT2 are connected to each other via a right-angled curve. A parking lot PA is connected between the third travel road RT3 and the fourth travel road RT4. The remote control unit 312 normally causes the vehicle 100 to travel along the travel road RT to the loading position PG of the following step 60.
As shown in fig. 4B, the cameras 80 as vehicle detectors acquire images of the vehicle 100 overlooking the travel road RT and the parking lot PA from above. The number of cameras 80 is set so that the entire travel road RT and the parking lot PA can be captured, in consideration of the angle of view and the like of each camera 80. In the example of fig. 4B, the cameras 80 include: a camera 801 capable of capturing a range RG1 including the entire first travel road RT1, a camera 802 capable of capturing a range RG2 including the entire second travel road RT2, a camera 803 capable of capturing a range RG3 including the entire third travel road RT3 and fourth travel road RT4, and a camera 804 capable of capturing a range RG4 including the entire parking lot PA. The cameras 80 are not limited to images from above the vehicle 100, and may acquire images from the front, rear, sides, and the like of the vehicle 100. The cameras for acquiring these images may be combined arbitrarily.
A virtual target route that the vehicle 100 should travel under remote control is preset on the travel road RT. The target route in the present embodiment corresponds to the reference path. The remote control unit 312 causes the ECU 200 to execute driving control of the vehicle 100 while analyzing the images of the travel road RT and the vehicle 100 acquired by the cameras 80 at predetermined time intervals. By the remote control unit 312 requesting remote control of the vehicle 100 so as to successively adjust the relative position of the vehicle 100 with respect to the target route, the vehicle 100 can be made to travel along the target route. In the remote control, an image of the entire vehicle 100 may be used, or an image of a part of the vehicle 100, such as an alignment mark provided on the vehicle 100, may be used.
As shown in fig. 4B, at the connection position of each travel road, the angles of view of the cameras 80 corresponding to the connected travel roads overlap. In the example of the position P1, the angle of view of the camera 801 corresponding to the first travel road RT1 and the angle of view of the camera 802 corresponding to the second travel road RT2 overlap. The vehicle 100 delivered from the preceding step 50 travels to the position P1 by remote control using the captured image of the camera 801. When the position P1 is reached, the control is switched to remote control using the captured image acquired by the camera 802 instead of the camera 801, and the vehicle 100 travels on the second travel road RT2. Similarly, the captured image of the camera 803 is used during travel on the third travel road RT3 and the fourth travel road RT4, and the captured image of the camera 804 is used during travel in the parking lot PA. In this way, the remote control unit 312 performs remote control of the vehicle 100 while switching, for each range of the travel road RT, the captured image to be analyzed. By remote control, the remote control unit 312 can cause the vehicle 100 to travel from the travel road RT into the parking lot PA and back to the travel road RT, and can further cause the vehicle 100 to park at the parking position P2 of the parking lot PA.
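The camera handoff at the overlapping connection positions can be sketched as a simple range lookup with hysteresis. The ranges and the one-dimensional position model are illustrative assumptions based on cameras 801 to 804; only the fact that adjacent view ranges overlap is taken from the description above.

```python
# Sketch of the camera handoff: each camera covers a range of the travel
# road, and adjacent ranges overlap at the connection positions (e.g.
# position P1) so that control can switch from one captured image to the
# next without losing sight of the vehicle. Ranges are illustrative.

CAMERA_RANGES = {            # (x_min, x_max) along the travel road
    "camera801": (0.0, 10.5),
    "camera802": (9.5, 20.5),   # overlaps camera801 around position P1
    "camera803": (19.5, 30.0),
}

def select_camera(x: float, current: str) -> str:
    """Keep the current camera while the vehicle stays in its range;
    switch only when the vehicle leaves it (hysteresis at overlaps)."""
    lo, hi = CAMERA_RANGES[current]
    if lo <= x <= hi:
        return current
    for name, (lo, hi) in CAMERA_RANGES.items():
        if lo <= x <= hi:
            return name
    raise ValueError("vehicle outside all camera ranges")
```

Preferring the current camera inside the overlap avoids rapid toggling between two captured images when the vehicle lingers near a connection position.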
Fig. 5 is an explanatory diagram conceptually showing the control content change table 224. The "large process", "medium process", and "small process" shown in fig. 5 are classifications set for convenience. In the example of fig. 5, the large step includes a step of assembling components such as the detection device group 180 to the vehicle 100 and an adjustment step of adjusting the components assembled to the vehicle 100.
As shown in fig. 5, the control content change table 224 shows the correspondence between the process and the content of the control program 222 added or changed after the process is completed. The content of the addition or modification of the control program 222 is set in advance so as to use the elements added or modified to the vehicle 100 by the completion of the processing in each step.
In the example of fig. 5, the assembling step includes an object detecting device assembling step of assembling the object detecting device to the vehicle 100, a speed detecting device assembling step of assembling the speed detecting device to the vehicle 100, and a final step of the assembling step. The object detection device mounting step is a step of mounting an object detection device as an element, and includes, for example, a radar device mounting step and an in-vehicle camera mounting step. The radar device mounting process includes, for example, a LiDAR mounting process and a millimeter wave radar mounting process.
When the radar device and the in-vehicle camera are mounted on the vehicle 100, the control program 222 is rewritten to execute collision prevention control using the mounted radar device and in-vehicle camera. The control program 222 is also rewritten to implement automatic travel by collision prevention control without using remote control. That is, the vehicle 100 is switched to automatic travel based on the driving control by the driving control unit 212, instead of the remote control by the remote control unit 312 of the server 300. However, after the object detection device assembling process is completed, automatic travel by remote control of the server 300 may still be performed. In this case, for example, the object detection device mounted on the vehicle 100 may be used in an auxiliary manner for collision prevention during self-propelled transportation under the remote control unit 312 of the server 300. After the in-vehicle camera mounting process is completed, control is performed to acquire vehicle speed data using the captured images of the in-vehicle camera.
The speed detection device assembling step is a step of assembling the speed detection device as an element to the vehicle 100. The speed detection device is a sensor capable of acquiring speed information related to the speed of the vehicle 100. The "speed information" includes not only the vehicle speed but also various information related to the speed of the vehicle 100, such as the wheel speed, the acceleration, the angular velocity, and the angular acceleration. In the example of fig. 5, the speed detection device assembling process includes a wheel speed sensor assembling process and an acceleration sensor assembling process. The speed detection device is not limited to the wheel speed sensor and the acceleration sensor, and may be at least one of a wheel speed sensor, an acceleration sensor, a vehicle speed sensor, and a yaw rate sensor.
After the speed detection device assembling process is completed, the vehicle 100 can detect the wheel speed data and the acceleration data as the speed information by assembling the wheel speed sensor and the acceleration sensor. By using the obtained wheel speed data and acceleration data, the vehicle speed data as speed information can be obtained, and the server 300 or the ECU 200 can execute automatic running using the vehicle speed data. Further, by using the acquired wheel speed data and acceleration data, the self-position estimation of the vehicle 100 can be performed. Therefore, the automatic travel of the vehicle 100 using the own position estimation can be performed. In this case, the automatic running may be any one of automatic running of the vehicle 100 based on remote control and automatic running based on driving control by the driving control section 212 without using remote control.
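The self-position estimation that becomes possible once wheel speed data are available can be sketched as simple dead-reckoning odometry. The patent does not specify the estimation method, so this integration model is an assumption for illustration.

```python
import math

# Minimal dead-reckoning sketch of the self-position estimation enabled by
# the wheel speed sensor: the wheel speed gives the distance traveled per
# time step, and the pose is integrated along the heading. This simple
# odometry model is an assumption; the patent does not specify the method.

def dead_reckon(pose, wheel_speed, yaw_rate, dt):
    """Advance (x, y, heading) by one step of wheel-speed odometry."""
    x, y, heading = pose
    x += wheel_speed * math.cos(heading) * dt
    y += wheel_speed * math.sin(heading) * dt
    heading += yaw_rate * dt
    return x, y, heading

pose = (0.0, 0.0, 0.0)
for _ in range(10):                      # 1 m/s straight ahead for 1 s
    pose = dead_reckon(pose, 1.0, 0.0, 0.1)
```

Because pure odometry drifts over time, such an estimate would in practice be fused with the external camera-based vehicle position information, or corrected using the acceleration data mentioned above.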
When the final process of the assembly process is completed, that is, when the assembly of all the detecting devices including the object detecting device and the speed detecting device is completed, the safety performance at the time of automatic running of the vehicle 100 is improved by using all the detecting devices. Therefore, as an example of the content of the control added or changed to the control program 222, it is possible to increase the upper limit value of the vehicle speed allowed at the time of automatic traveling and to expand the allowable range of the steering angle.
As shown in the lower left of fig. 5, the adjustment step includes, for example, a wheel alignment adjustment step, a drive backlash adjustment step, a suspension adjustment step, and a step of adjusting the mounting position of the detection device. In the wheel alignment adjustment step, a process of adjusting the mounting position of the wheels with respect to the vehicle body is performed as an element. After the wheel alignment adjustment process is completed, the vehicle 100 can travel straight in a stable manner. Therefore, as an example of the content of the control added to or changed in the control program 222, the upper limit value of the vehicle speed allowed during automatic travel can be increased.
The drive backlash adjustment step is a step of eliminating, as an element, backlash in the transmission system or the like from the motor to the wheels. The suspension adjustment step is a step of adjusting the suspension as an element. It may also include a process of assembling a suspension bushing as an element. When the drive backlash adjustment process and the suspension adjustment process are completed, the running of the vehicle 100 is stabilized. The step of adjusting the mounting position of the detection device includes a step of attaching a cover to a sensor mounted in the assembling step, a step of fixing the sensor at its final position, and the like. When the step of adjusting the mounting position of the detection device is completed, the detection accuracy of the sensor improves, and thus the stability and safety performance of the running of the vehicle 100 improve. Therefore, when the suspension adjustment step, the drive backlash adjustment step, and the step of adjusting the mounting position of the detection device are completed, an increase in the upper limit value of the vehicle speed allowed during automatic travel and an expansion of the allowable range of the steering angle can be achieved as examples of the content of the control added to or changed in the control program 222.
Fig. 6 is a flowchart showing a method of manufacturing the vehicle 100 according to the first embodiment. The present process is started, for example, when the vehicle 100 reaches a predetermined process.
In step S10, processing for the vehicle 100 is performed based on the step. In step S18, the processing for the vehicle 100 based on the step is completed. Completion of this processing is detected by the process completion detection unit 214 or the manufacturing information acquisition unit 314. In step S20, the control content changing unit 216 confirms whether there is an addition to or a change in the content of the control corresponding to the detected step. More specifically, the control content changing unit 216 refers to the control content change table 224 and checks for a change or addition of control corresponding to the detected step. If there is no change or addition of control of the vehicle 100 (S20: no), the process proceeds to step S40. If there is a change or addition of control of the vehicle 100 (S20: yes), the control content changing unit 216 moves the process to step S30.
In step S30, the control content changing unit 216 rewrites the control program 222 according to the control content changing table 224. In step S40, the process completion detection unit 214 confirms whether all the steps in the manufacturing process of the vehicle 100 have been completed. When all the steps have been completed (S40: yes), the present flow is completed. If not all the steps have been completed (S40: no), the process completion detection unit 214 moves the process to step S50. In step S50, the remote control unit 312 starts traveling of the vehicle 100, and causes the vehicle 100 to travel toward the next step. In this case, the driving control of the vehicle 100 is executed based on the rewritten control program 222. In step S60, the vehicle 100 goes to the next step. When the vehicle 100 reaches the next step, the process returns to step S10.
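The overall flow of fig. 6 (steps S10 to S60) can be condensed into a loop, sketched below. Step names, table entries, and the returned travel log are illustrative assumptions; only the control flow (detect completion, consult the table, rewrite, travel to the next step until all steps are done) follows the flowchart.

```python
# Hypothetical end-to-end sketch of fig. 6: after each step's processing
# completes (S10/S18), the change table is consulted (S20), the control
# content is rewritten if needed (S30), and the vehicle travels to the
# next step under the rewritten control (S50/S60) until all steps have
# been completed (S40). Names and values are illustrative.

CHANGE_TABLE = {
    "radar_assembly": {"self_driving": True},
    "wheel_speed_assembly": {"self_localization": True},
}

def run_manufacturing(steps, control):
    history = []
    for i, step in enumerate(steps):
        # S10/S18: processing performed and completion detected
        # S20/S30: rewrite the control content if the table has an entry
        control.update(CHANGE_TABLE.get(step, {}))
        if i < len(steps) - 1:           # S40: steps remain
            history.append(f"travel to {steps[i + 1]}")  # S50/S60
    return control, history
```

Note that each transit in the history occurs after the update for the step just completed, so the vehicle always travels under the newest control content, as described for step S50.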
Fig. 7 is an explanatory diagram schematically showing a method of manufacturing the vehicle 100 according to the first embodiment. Fig. 7 schematically illustrates each step included in the preceding step 50 and vehicles 100p, 100q, and 100r that automatically travel in the conveyance sections C1, C2, and C3 between each step. The steps shown in fig. 7 are, for example, a radar device assembling step 50p and an in-vehicle camera assembling step 50q in the object detection device assembling step shown in fig. 5, and a wheel speed sensor assembling step 50r in the speed detection device assembling step.
After the completion of the processing in the radar device assembling step 50p, as shown in fig. 5, the control content changing unit 216, which refers to the control content changing table 224, rewrites the control program 222 to execute the automatic travel of the vehicle 100 using the radar device. As a result, the vehicle 100p automatically travels in the conveyance section C1 by the driving control of the driving control unit 212 of the vehicle 100p in place of the remote control by the remote control unit 312 of the server 300.
When the processing in the in-vehicle camera mounting step 50q is completed, the control content changing unit 216, which refers to the control content changing table 224, rewrites the control program 222 to execute the automatic travel using the in-vehicle camera. As a result, the vehicle 100q automatically travels in the conveyance section C2 by the driving control of the driving control unit 212 while performing collision prevention control using the radar device and the in-vehicle camera.
When the process in the wheel speed sensor mounting step 50r is completed, the control content changing unit 216 rewrites the control program 222 to execute automatic running by acquiring vehicle speed data using the acquired wheel speed data and estimating the position of the vehicle using the wheel speed data. As a result, the vehicle 100r automatically travels in the conveyance section C3 under the driving control of the driving control unit 212 while acquiring vehicle speed data using the wheel speed data and estimating its own position using the wheel speed data.
As described above, the vehicle 100 according to the present embodiment includes: a process completion detection unit 214 that detects completion of the processing based on at least one step included in the manufacturing process; and a control content changing unit 216 that changes the control content of the vehicle 100 when the completion of the processing is detected. The content of the control of the vehicle 100 can thus be changed every time the processing based on a step is completed, and driving control of the vehicle 100 appropriate to each step can be executed.
According to the vehicle 100 of the present embodiment, the process completion detection unit 214 detects completion of a process of adding an element to the vehicle 100 or a process of changing an element included in the vehicle 100 based on a process included in the manufacturing process. When completion of the process based on this step is detected, the control content changing unit 216 changes the content of the control of the vehicle 100 so as to be a control using an element added or changed to the vehicle 100. Therefore, the elements added or changed to the vehicle 100 can be appropriately utilized according to the progress of the manufacturing process of the vehicle 100, and the performance of the vehicle 100 during the automatic running can be appropriately exhibited. For example, by appropriately exhibiting the performance of the vehicle 100 at the time of traveling, such as an increase in the traveling speed of the vehicle 100 and an increase in the allowable range of the steering angle, the production efficiency of the vehicle 100 can be improved.
According to the vehicle 100 of the present embodiment, the manufacturing process of the vehicle 100 includes an object detection device assembly step in which an object detection device capable of detecting an object around the vehicle 100 and including at least one of a radar device and an in-vehicle camera is mounted. Upon detecting completion of the object detection device assembly step, the control content changing unit 216 changes the content of control of the vehicle 100 to execute collision prevention control using the assembled object detection device. Therefore, collision prevention during automatic travel of the vehicle 100 can be started simultaneously with completion of the assembly step of the object detection device.
According to the vehicle 100 of the present embodiment, the control content changing unit 216 further changes the content of the control of the vehicle 100 so that, instead of driving control based on remote control, the vehicle 100 travels by its own driving control using the collision prevention control. Accordingly, the entity controlling the vehicle 100 can be switched from remote control by the server 300 to automatic running by the driving control of the vehicle 100 simultaneously with the completion of the assembly step of the object detection device.
According to the vehicle 100 of the present embodiment, the manufacturing process of the vehicle 100 includes a speed detection device assembly step in which a speed detection device capable of acquiring speed information related to the speed of the vehicle 100 and including at least one of a vehicle speed sensor, a wheel speed sensor, an acceleration sensor, and a yaw rate sensor is mounted. When completion of the assembly step of the speed detection device is detected, the control content changing unit 216 changes the content of control of the vehicle 100 to perform driving control using the speed information detected by the assembled speed detection device. Accordingly, self-position estimation using the detected speed information and feedback control of the vehicle speed can be started simultaneously with the completion of the assembly step of the speed detection device.
According to the vehicle 100 of the present embodiment, the manufacturing process of the vehicle 100 includes an adjustment step including at least one of a wheel alignment adjustment step of adjusting the wheel alignment and a suspension adjustment step of adjusting the suspension. When completion of the adjustment step is detected, the control content changing unit 216 changes the content of control of the vehicle 100 so that the upper limit value of the running speed of the vehicle 100 is increased. Therefore, the running speed of the vehicle 100 can be increased simultaneously with completion of the adjustment step, and the productivity of the vehicle 100 can be improved.
B. Second embodiment:
Fig. 8 is a block diagram showing an internal functional configuration of a server 300b according to the second embodiment. Fig. 9 is a block diagram showing an internal functional configuration of ECU 200b included in vehicle 100 according to the second embodiment. As shown in fig. 8 and 9, the present embodiment differs from the first embodiment in that the control content change table 224 is not stored in the storage device 220 of the ECU 200b, and the control content change table 324 is stored in the storage device 320 of the server 300 b. The control content change table 324 has the same configuration as the control content change table 224 described in the first embodiment.
The present embodiment also differs from the first embodiment in that the CPU 210 of the ECU 200b does not function as the processing completion detection unit 214, and the CPU 310 of the server 300b additionally functions as a control content change instruction unit 316. When the manufacturing information acquisition unit 314 detects completion of the processing based on each step, the control content change instruction unit 316 instructs the control content changing unit 216 of the vehicle 100 to change the control content of the vehicle 100 so as to use the element added or changed by the completed processing. In the present embodiment, the control content change instruction unit 316 refers to the control content change table 324 and instructs the control content changing unit 216 to rewrite the control program 222 to the control content corresponding to the completed step. As a result, the control program 222 is changed to control content that uses the element added to or changed in the vehicle 100 by the completed processing.
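The table-driven rewrite described here can be sketched as a lookup from a completed step to the control content that becomes usable afterwards. The step names and control-content identifiers below are hypothetical placeholders for entries of the control content change table 324, and `VehicleStub` stands in for the vehicle-side control content changing unit.

```python
# Hypothetical entries of a control content change table: each completed
# step maps to the control content that becomes usable afterwards.
CONTROL_CHANGE_TABLE = {
    "object_detection_assembly": "collision_prevention_control",
    "wheel_speed_sensor_mounting": "speed_feedback_control",
    "alignment_adjustment": "raised_speed_limit",
}

class VehicleStub:
    """Stands in for the vehicle-side control content changing unit."""
    def __init__(self):
        self.enabled = []

    def rewrite_control_program(self, content):
        self.enabled.append(content)

def instruct_control_change(completed_step, vehicle):
    """Server-side sketch: on completion of a step, look up the
    corresponding control content and instruct the vehicle to rewrite
    its control program. Returns the content, or None if no entry."""
    content = CONTROL_CHANGE_TABLE.get(completed_step)
    if content is not None:
        vehicle.rewrite_control_program(content)
    return content
```

A design note: keeping the table on the server (as in this embodiment) centralizes the mapping for every vehicle under management, at the cost of a round trip per instruction.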
As described above, the server 300b according to the present embodiment includes: a manufacturing information acquisition unit 314 that acquires manufacturing information 322 including the progress of processing that, based on a step included in the manufacturing process, adds an element to the vehicle 100 or changes an element provided in the vehicle 100; and a control content change instruction unit 316 that, when completion of the processing is detected, instructs the vehicle 100 to change the control content of the vehicle 100 so as to use the element added or changed by the completed processing. Therefore, the elements added to or changed in each vehicle 100 managed by the server 300b can be appropriately utilized according to the progress of its manufacturing process, and the performance of each vehicle 100 during automatic travel can be appropriately exhibited.
C. Third embodiment:
Fig. 10 is a block diagram showing an internal functional configuration of ECU 200c included in vehicle 100 according to the third embodiment. As shown in fig. 10, in the present embodiment, vehicle 100 includes ECU 200c including CPU 210c and storage device 220 c. The CPU 210c is similar to the CPU 210 shown in the first embodiment except that it further includes a manual operation detecting unit 217 and an abnormality detecting unit 218. The storage device 220c is different from the storage device 220 shown in the first embodiment in that the threshold 226 is also stored, and otherwise has the same configuration.
Fig. 11 is a block diagram showing an internal functional configuration of a server 300c according to the third embodiment. As shown in fig. 11, the server 300c is similar to the server 300 of the first embodiment except that it includes a CPU 310c including an abnormality measure unit 318 instead of the CPU 310.
The manual operation detection unit 217 is a sensor group for detecting an amount of manual operation of the operation unit 170. The "operation amount of the operation unit 170" refers to, for example, the accelerator opening degree, the steering angle, the depression amount of the foot brake, and the like. The operation amount of the operation unit 170 may instead be an output value of the vehicle 100, such as speed, acceleration, deceleration, actual steering angle, or braking force, resulting from driving control by manual operation of the operation unit 170.
The abnormality detection unit 218 detects an unscheduled operation of the operation unit 170 as an abnormality. More specifically, when the amount of manual operation of the operation unit 170 exceeds a threshold 226 that has been relaxed by the control content changing unit 216 as described later, the abnormality detection unit 218 detects that an unscheduled operation of the operation unit 170 has occurred. The detection result of the abnormality detection unit 218 is output to the server 300c.
The threshold 226 is used to determine whether or not to perform a so-called override. In the present specification, "override" means processing that, when driving control of the vehicle 100 by unmanned operation and manual driving control via the operation unit 170 are executed simultaneously, causes the driving control via the operation unit 170 to be executed in preference to the driving control by unmanned operation.
The threshold 226 is preset using the operation amount of the operation unit 170. In the present embodiment, with the operation amount of the operation unit 170 scheduled to be performed in remote control taken as a reference value, the threshold 226 includes a lower limit value smaller than the reference value by a predetermined amount and an upper limit value larger than the reference value by a predetermined amount. When the detected operation amount of the operation unit 170 is smaller than the lower limit value or larger than the upper limit value, driving control based on the operation of the operation unit 170 is preferentially performed by the override. For example, when a manual steering wheel operation with an operation amount greater than the upper limit value or less than the lower limit value of the steering wheel operation amount is performed simultaneously with a steering wheel operation of the vehicle 100 by remote control, driving control based on the manual steering wheel operation is preferentially performed by the override. When the operation amount of the operation unit 170 is equal to or greater than the lower limit value and equal to or less than the upper limit value, the driving control by remote control is prioritized over the driving control by manual operation. The threshold 226 is set individually for each type of the operation unit 170, such as the steering wheel, the accelerator, and the brake. The threshold 226 may also be included in the control content change table 224.
In the manufacturing process of the vehicle 100, self-propelled transport of the vehicle 100 by remote control may be performed in an unmanned state in which there is no driver in the vehicle 100. In the unmanned state, the operation unit 170 is normally not manually operated, and no override occurs. However, even during the self-propelled transport of the vehicle 100, an operator or the like may, for example, get into the vehicle 100 to perform an inspection inside the vehicle 100, installation of parts into the vehicle 100, or the like. In this case, if the operator or the like touches the operation unit 170 by mistake, the operation amount of the operation unit 170 may exceed the threshold 226, and the self-propelled transport of the vehicle 100 may be stopped by the override.
In the present embodiment, in a step in which the operator may touch the operation unit 170 during the self-propelled transport, the threshold 226 is relaxed to a value at which driving control via the operation unit 170 is unlikely to be prioritized, that is, a value at which the override is unlikely to occur, so that unintended overrides by the operator or the like during the self-propelled transport are suppressed or prevented. The relaxed threshold 226 may thus also be regarded as a threshold for detecting an unscheduled operation of the operation unit 170. The correspondence between the steps in which the operator may touch the operation unit 170 during the self-propelled transport and the relaxed threshold 226 is set in advance in the control content change table 224. The relaxed threshold 226 is preset to an appropriate value for each type of the operation unit 170.
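The override decision and its relaxation can be expressed as a range check around the scheduled operation amount. This is a minimal sketch; the names, the example values, and the symmetric relaxation margin are illustrative assumptions rather than values from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    lower: float  # reference value minus a predetermined amount
    upper: float  # reference value plus a predetermined amount

def override_triggered(operation_amount, th):
    """Manual driving control is prioritized (override) only when the
    detected operation amount falls outside [lower, upper]."""
    return operation_amount < th.lower or operation_amount > th.upper

def relax(th, margin):
    """Widen the range so that an accidental touch is unlikely to
    trigger the override (the relaxed threshold 226)."""
    return Threshold(th.lower - margin, th.upper + margin)
```

For example, with a steering-amount threshold of (10, 20), an accidental input of 25 would trigger the override; after `relax(th, 10)` the same input would not.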
When an unscheduled operation of the operation unit 170 is detected by the abnormality detection unit 218, the abnormality measure unit 318 executes a predetermined abnormality measure. Examples of the abnormality measure include notifying the manager of the step in which the abnormality occurred or an operator of the occurrence of the abnormality, and stopping production of the vehicle 100 by an emergency stop of the production facility or the production line.
Fig. 12 is a flowchart showing a method of manufacturing the vehicle 100 according to the third embodiment. This flow differs from the method of manufacturing the vehicle 100 according to the first embodiment shown in Fig. 6 in that steps S12, S14, and S16 are provided after step S10, and steps S22 and S32 are provided instead of steps S20 and S30.
In step S12, the manual operation detection unit 217 monitors the operation amount of the operation unit 170. In step S14, the manual operation detection unit 217 determines whether or not the operation amount of the operation unit 170 exceeds the relaxed threshold 226. If the operation amount of the operation unit 170 is within the relaxed threshold 226 (S14: No), the process proceeds to step S18. When the threshold 226 has not been relaxed at the time of step S12, steps S12, S14, and S16 may be omitted and the process may proceed directly to step S18.
When the operation amount of the operation unit 170 exceeds the relaxed threshold 226 in step S14 (S14: Yes), the manual operation detection unit 217 moves the process to step S16. In step S16, the abnormality measure by the abnormality measure unit 318 is performed. More specifically, the abnormality detection unit 218 outputs the detection result indicating that an unscheduled operation of the operation unit 170 has occurred to the server 300c. When the abnormality measure unit 318 of the server 300c obtains the detection result, it executes the abnormality measure and ends the flow. Specifically, the abnormality measure unit 318 notifies the manager or an operator of the step in which the abnormality occurred, and makes an emergency stop of the manufacturing line to stop the manufacture of the vehicle 100.
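The monitoring in steps S12 to S16 can be sketched as a loop over sampled operation amounts that invokes the abnormality measure on the first out-of-range sample. The callback-based shape is an assumption for illustration, not the embodiment's actual interface.

```python
def monitor_operations(samples, lower, upper, on_abnormality):
    """Check each sampled operation amount against the relaxed threshold
    (S12/S14); on the first sample outside [lower, upper], invoke the
    abnormality measure (S16) and stop, returning False. Returns True
    when all samples are within range."""
    for amount in samples:
        if amount < lower or amount > upper:
            on_abnormality(amount)
            return False
    return True
```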
In step S22, the control content changing unit 216 confirms whether or not the operator may touch the operation unit 170 in the next step. Specifically, the control content changing unit 216 refers to the control content change table 224 and confirms whether or not the next step is a step in which the operator may touch the operation unit 170. If the next step is not a step in which the operator may touch the operation unit 170 (S22: No), the control content changing unit 216 moves the process to step S40. Steps S22 and S32 may instead be executed after it is determined in step S40 that not all steps are completed, and before step S50.
If the operator may touch the operation unit 170 in the next step (S22: Yes), the control content changing unit 216 relaxes the threshold 226. More specifically, the control content changing unit 216 changes the threshold 226 to the relaxed value according to the control content change table 224.
As described above, according to the vehicle 100 of the present embodiment, the storage device 220c of the ECU 200c also stores the threshold 226 for determining whether or not, when driving control by remote control and driving control via the operation unit 170 are executed simultaneously, the driving control via the operation unit 170 is to be prioritized over the driving control by remote control. The process completion detection unit 214 detects completion of the processing based on the step preceding a step in which the operator may touch the operation unit 170. When the completion of the processing based on the preceding step is detected by the process completion detection unit 214, the control content changing unit 216 changes the content of the control of the vehicle 100 so as to relax the threshold 226 to a value at which driving control via the operation unit 170 is unlikely to be prioritized. Accordingly, it is possible to suppress or prevent the problem of the self-propelled transport of the vehicle 100 being stopped unexpectedly because an operator or the like touches the operation unit 170 by mistake during the self-propelled transport of the vehicle 100 by remote control.
D. Fourth embodiment:
Fig. 13 is a block diagram showing an internal functional configuration of ECU 200d included in vehicle 100 according to the fourth embodiment. Fig. 14 is a block diagram showing an internal functional configuration of a server 300d according to the fourth embodiment. As shown in fig. 13 and 14, in the present embodiment, control content change table 224 is not stored in storage device 220d of ECU 200d, and control content change table 324 is stored in storage device 320 of server 300d, which is different from the configuration of ECU 200c and server 300c shown in the third embodiment. The control content change table 324 has the same configuration as the control content change table 224 described in the third embodiment.
The CPU 210d of the ECU 200d differs from the CPU 210c of the ECU 200c shown in the third embodiment in that it does not function as the processing completion detection unit 214 or the abnormality detection unit 218. The CPU 310d of the server 300d differs from the CPU 310c of the third embodiment in that it additionally functions as the control content change instruction unit 316 and an abnormality detection unit 317.
In the present embodiment, the manufacturing information acquisition unit 314 acquires the manufacturing information and refers to it to detect completion of the processing based on the step preceding a step in which the operator may touch the operation unit 170. The acquisition result of the manufacturing information acquisition unit 314 is output to the control content change instruction unit 316.
The control content change instruction unit 316 refers to the control content change table 324 and uses the manufacturing information acquired by the manufacturing information acquisition unit 314 to confirm whether or not the next step is a step in which the operator may touch the operation unit 170. If the next step is such a step, the control content change instruction unit 316 instructs the control content changing unit 216 to change the threshold 226 to the relaxed value according to the control content change table 324. As a result, the threshold 226 stored in the storage device 220d of the ECU 200d is changed to the relaxed value.
The abnormality detection unit 317 sequentially acquires the operation amount of the operation unit 170 detected by the manual operation detection unit 217. Like the abnormality detection unit 218 described in the third embodiment, the abnormality detection unit 317 detects that an unscheduled operation of the operation unit 170 has occurred when the acquired operation amount of the operation unit 170 exceeds the relaxed threshold 226. The detection result of the abnormality detection unit 317 is output to the abnormality measure unit 318. When an unscheduled operation of the operation unit 170 is detected, the abnormality measure unit 318 executes a predetermined abnormality measure, as in the third embodiment.
As described above, according to the server 300d of the present embodiment, the manufacturing information acquisition unit 314 acquires completion of the processing based on the step preceding a step predetermined as one in which the operator may touch the operation unit 170. When the completion of the processing based on the preceding step is acquired, the control content change instruction unit 316 instructs the vehicle 100 to change the content of the control of the vehicle 100 so as to relax the threshold 226 to a value at which driving control via the operation unit 170 is unlikely to be prioritized. Therefore, in the present embodiment as well, as in the third embodiment, it is possible to suppress or prevent the problem of the self-propelled transport of the vehicle 100 being stopped unexpectedly because an operator or the like touches the operation unit 170 by mistake during the self-propelled transport of the vehicle 100 by remote control.
According to the server 300d of the present embodiment, when an operation of the operation unit 170 that would warrant priority is performed after the threshold 226 has been relaxed by the control content changing unit 216, an abnormality measure of stopping the manufacture of the vehicle 100 and issuing a notification is performed. By performing the abnormality measure instead of the override, the risk accompanying the relaxation of the threshold 226 can be suppressed or prevented.
E. Fifth embodiment:
Fig. 15 is an explanatory diagram showing a schematic configuration of a system 500e in the fifth embodiment. In the present embodiment, the system 500e is different from the first embodiment in that the server 300 is not provided. Other components of the system 500e are the same as those of the first embodiment unless otherwise specified.
Fig. 16 is an explanatory diagram showing an internal functional configuration of ECU 200e of vehicle 100e in the fifth embodiment. As shown in Fig. 16, ECU 200e includes a CPU 210e as a central processing unit, a storage device 220e such as a ROM and a RAM, and an interface circuit (not shown) to which the vehicle communication unit 190 is connected. These are connected via an internal bus so as to be capable of bidirectional communication. In the present embodiment, the various functions of the driving control unit 212e, the processing completion detection unit 214, the control content changing unit 216, and the like are realized by the CPU 210e executing various computer programs stored in the storage device 220e. As will be described later, the driving control unit 212e in the present embodiment can cause the vehicle 100e to travel by autonomous control of the vehicle 100e. Specifically, the driving control unit 212e acquires the detection results of the sensors, generates a travel control signal using the detection results, and outputs the generated travel control signal to operate the various actuators of the vehicle 100e, so that the vehicle 100e can travel by autonomous control.
Fig. 17 is a flowchart showing a processing sequence of travel control of the vehicle 100e in the fifth embodiment. In step S101, the vehicle 100e acquires vehicle position information using the detection result output from the camera 80 as an external sensor. In step S102, the vehicle 100e determines a target position to which the vehicle 100e should go next. In step S103, the vehicle 100e generates a travel control signal for causing the vehicle 100e to travel toward the determined target position. In step S104, the vehicle 100e controls the actuator of the vehicle 100e using the generated travel control signal, thereby causing the vehicle 100e to travel in accordance with the parameter indicated by the travel control signal. The vehicle 100e repeatedly acquires vehicle position information, determines a target position, generates a travel control signal, and controls an actuator at predetermined cycles. According to the system 500e in the present embodiment, the vehicle 100e can be driven by the autonomous control of the vehicle 100e even without remotely controlling the vehicle 100e by the server 300.
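The cycle of steps S101 to S104 can be sketched as follows. The proportional speed command and the fixed speed cap are illustrative assumptions; actual generation of the travel control signal would depend on the vehicle's actuators and control program.

```python
import math

def travel_control_cycle(position, target, gain=0.5, speed_cap=2.0):
    """One cycle: given the acquired vehicle position (S101) and the
    determined target position (S102), generate a travel control signal
    (S103) as a heading plus a capped speed command; step S104 would
    apply this signal to the actuators."""
    dx = target[0] - position[0]
    dy = target[1] - position[1]
    return {
        "heading": math.atan2(dy, dx),
        "speed": min(gain * math.hypot(dx, dy), speed_cap),
    }
```

Repeating this at a fixed period reproduces the loop described above: acquire position, determine target, generate the signal, actuate.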
In the present embodiment, a manufacturing method substantially similar to that of Fig. 6 is performed. In the present embodiment, the completion of the processing in step S18 is detected by the processing completion detection unit 214. The processing completion detection unit 214 may acquire the completion of the processing for the own vehicle based on each step from a sensor, a camera, or the like provided at each step, or may acquire it using manufacturing information. The processing completion detection unit 214 may acquire the manufacturing information from, for example, a process management device provided for each step or a production management device that collectively manages the manufacturing status of each step. In step S50 in the present embodiment, the driving control unit 212e of the vehicle 100e starts the travel of the vehicle 100e and causes the vehicle 100e to travel toward the next step. In this case, the driving control of the vehicle 100e is executed based on the rewritten control program 222.
As described above, according to the system 500e of the present embodiment, the content of the control of the vehicle 100e can be changed each time the processing based on a step is completed, and the driving control of the vehicle 100e by autonomous control can be appropriately executed for each step.
In another embodiment in which the vehicle 100e travels by autonomous control as in the present embodiment, for example, the CPU 210e may be provided with the manual operation detection unit 217, the abnormality detection unit 218, and the abnormality measure unit 318, and the threshold 226 may be stored in the storage device 220e, thereby executing the manufacturing method shown in Fig. 12. In this case, step S16 is executed by the abnormality measure unit 318 of the vehicle 100e substantially in the same manner as in the third embodiment. Specifically, the abnormality detection unit 218 outputs the detection result indicating that an unscheduled operation of the operation unit 170 has occurred, and the abnormality measure unit 318 obtains the detection result and performs the abnormality measure. In this way, it is possible to suppress or prevent the problem of the self-propelled transport of the vehicle 100e being stopped unexpectedly because an operator or the like touches the operation unit 170 by mistake during the self-propelled transport of the vehicle 100e by autonomous control.
In other embodiments in which the vehicle 100e travels by autonomous control, the system 500 may include the server 300, for example. In this case, the CPU 310 of the server 300 may function as the manufacturing information acquisition unit 314, the control content change instruction unit 316, and the abnormality measure unit 318, for example, in the same manner as in the above embodiments. In this case, for example, the manufacturing information 322 and the control content change table 324 may be stored in the storage 320 of the server 300 in the same manner as in the above embodiments.
F. Other embodiments:
(F1) In the second embodiment, the control content change instruction unit 316 is provided in the server 300b, and the control content change unit 216 is provided in the ECU 200b of the vehicle 100. In contrast, the server 300b may include a control content changing unit instead of the control content changing instruction unit 316. With this configuration, the server 300b can directly change the control content of the vehicle 100. In this case, control content changing unit 216 of ECU 200b can be omitted, and the processing load of ECU 200b can be reduced.
(F2) In the above embodiments, the vehicle 100 need only be configured to be movable by unmanned driving, and may, for example, be configured as a platform as described below. Specifically, in order to perform the three functions of "running", "steering", and "stopping" by unmanned driving, the vehicle 100 may include at least a control device that controls the running of the vehicle 100 and actuators of the vehicle 100. When the vehicle 100 acquires information from the outside for unmanned driving, the vehicle 100 may further include a communication device. That is, the vehicle 100 capable of moving by unmanned driving need not be equipped with at least a part of the interior components such as the driver's seat and the instrument panel, need not be equipped with at least a part of the exterior components such as the bumper and the fenders, and need not be equipped with the vehicle body shell. In this case, the remaining components such as the body shell may be mounted on the vehicle 100 before the vehicle 100 is shipped from the factory, or may be mounted on the vehicle 100 after the vehicle 100 has been shipped from the factory without them. The components may be assembled from any direction, such as from above, below, the front, the rear, the right, or the left of the vehicle 100, and may be assembled from the same direction or from different directions. In addition, the platform may be configured such that its position is determined in the same manner as for the vehicle 100 in the first embodiment.
(F3) In the third embodiment, the threshold 226 includes a lower limit value smaller by a predetermined amount than the reference value, that is, the operation amount of the operation unit 170 scheduled to be performed in remote control, and an upper limit value larger than the reference value by a predetermined amount. In contrast, the threshold 226 may be set as only one of the upper limit value and the lower limit value. The threshold 226 may also be set using, for example, the difference between the operation amount of the operation unit 170 by unmanned operation and the operation amount of the operation unit 170 by manual operation, or using the absolute value of the operation amount of the operation unit 170 by manual operation. The threshold 226 may further be set using the total of the operation amount of the operation unit 170 by unmanned operation and the operation amount added by manual operation.
(F4) In the third embodiment described above, an example was shown in which the threshold 226 is relaxed to a value at which the override is unlikely to occur. In contrast, the relaxation of the threshold 226 may also include disabling the override function altogether.
(F5) In the third embodiment, an example was shown in which, when the operation amount of the operation unit 170 exceeds the relaxed threshold 226 in step S14, the abnormality measure by the abnormality measure unit 318 is executed in step S16. In contrast, in step S16, instead of or in addition to the abnormality measure, the driving control of the vehicle 100 via the operation unit 170 may be prioritized over the driving control of the vehicle 100 by remote control through the override. With this configuration, manual operation of the operation unit 170 can be given effect by the override, rather than the self-propelled transport of the vehicle 100 simply being stopped.
(F6) In the above embodiments, the external sensor is not limited to the camera 80, and may be, for example, a distance measuring device such as a LiDAR (Light Detection And Ranging) device. In this case, the detection result output by the external sensor may be three-dimensional point cloud data representing the vehicle 100, and the server 300 and the vehicle 100 may acquire the vehicle position information by template matching using the three-dimensional point cloud data as the detection result and reference point cloud data.
(F7) In the first to fourth embodiments described above, the processing from the acquisition of the vehicle position information to the generation of the travel control signal is executed by the server 300. In contrast, at least a part of the processing from the acquisition of the vehicle position information to the generation of the travel control signal may be executed by the vehicle 100. For example, the following modes (1) to (3) may be adopted.
(1) The server 300 may acquire vehicle position information, determine a target position to which the vehicle 100 should go next, and generate a route from the current position of the vehicle 100 indicated by the acquired vehicle position information to the target position. The server 300 may generate a route to the target location between the current location and the destination, or may generate a route to the destination. The server 300 may transmit the generated path to the vehicle 100. The vehicle 100 may generate a travel control signal such that the vehicle 100 travels on a path received from the server 300, and control an actuator of the vehicle 100 using the generated travel control signal.
(2) The server 300 may acquire the vehicle position information and transmit it to the vehicle 100. The vehicle 100 may determine a target position to which the vehicle 100 should go next, generate a route from the current position of the vehicle 100 indicated by the received vehicle position information to the target position, generate a travel control signal such that the vehicle 100 travels on the generated route, and control an actuator of the vehicle 100 using the generated travel control signal.
(3) In the modes (1) and (2), the vehicle 100 may be equipped with an internal sensor, and the detection result output from the internal sensor may be used for at least one of the generation of the route and the generation of the travel control signal. An internal sensor is a sensor mounted on the vehicle 100, and may include, for example, a camera, a LiDAR, a millimeter wave radar, an ultrasonic sensor, a GPS sensor, an acceleration sensor, and a gyro sensor. For example, in the mode (1), the server 300 may acquire the detection result of the internal sensor and reflect it in the route when generating the route, and the vehicle 100 may acquire the detection result of the internal sensor and reflect it in the travel control signal when generating the travel control signal. Likewise, in the mode (2), the vehicle 100 may acquire the detection result of the internal sensor and reflect it in the route when generating the route and/or in the travel control signal when generating the travel control signal.
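The division of labor in mode (1) above can be sketched as follows. The data shapes, the straight-line route, and the fixed-speed control signal are illustrative assumptions, not the actual embodiment: they only show which side produces the route and which side produces the travel control signal.

```python
# Sketch of mode (1): the server generates the route from the current
# position to the target position; the vehicle turns the received route
# into a travel control signal for its actuators.
from dataclasses import dataclass

@dataclass
class TravelControlSignal:
    target_x: float
    target_y: float
    target_speed: float

def server_generate_route(current, target, steps=4):
    """Server side: a simple straight-line route as (x, y) waypoints."""
    (cx, cy), (tx, ty) = current, target
    return [(cx + (tx - cx) * i / steps, cy + (ty - cy) * i / steps)
            for i in range(steps + 1)]

def vehicle_generate_signal(route, cruise_speed=5.0):
    """Vehicle side: steer toward the next waypoint at a fixed speed."""
    next_wp = route[1] if len(route) > 1 else route[0]
    return TravelControlSignal(next_wp[0], next_wp[1], cruise_speed)
```

In mode (2), both functions would run on the vehicle, with the server supplying only the vehicle position information.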
(F8) In the fifth embodiment, an internal sensor may be mounted on the vehicle 100e, and the detection result output from the internal sensor may be used for at least one of the generation of the route and the generation of the travel control signal. For example, the vehicle 100e may acquire the detection result of the internal sensor and reflect it in the route when generating the route, or reflect it in the travel control signal when generating the travel control signal.
(F9) In the fifth embodiment described above, the vehicle 100e acquires the vehicle position information using the detection result of the external sensor. In contrast, an internal sensor may be mounted on the vehicle 100e, and the vehicle 100e may acquire the vehicle position information using the detection result of the internal sensor. The vehicle 100e may then determine a target position to which it should go next, generate a route from the current position indicated by the acquired vehicle position information to the target position, generate a travel control signal for traveling on the generated route, and control an actuator of the vehicle 100e using the generated travel control signal. In this case, the vehicle 100e can travel without using the detection result of the external sensor at all. The vehicle 100e may also acquire a target arrival time and/or congestion information from outside the vehicle 100e and reflect them in at least one of the route and the travel control signal. Furthermore, the entire functional configuration of the system 500 may be provided in the vehicle 100; that is, the processing performed by the system 500 in the present disclosure may be performed by the vehicle 100 alone.
(F10) In the first to fourth embodiments described above, the server 300 automatically generates the travel control signal transmitted to the vehicle 100. In contrast, the server 300 may generate the travel control signal to be transmitted to the vehicle 100 in response to an operation by an external operator located outside the vehicle 100. For example, the external operator may operate an operating device that includes a display for displaying the captured image output from the external sensor, a steering wheel, an accelerator pedal, and a brake pedal for remotely operating the vehicle 100, and a communication device for communicating with the server 300 by wired or wireless communication, and the server 300 may generate the travel control signal according to the operation applied to the operating device. Hereinafter, driving of the vehicle 100 based on such control is also referred to as "remote manual driving". In this mode, for example, when the completion of the process by at least one step included in the manufacturing process is detected by the process completion detection unit 214 or the manufacturing information acquisition unit 314, the control content changing unit 216 can change the content of the control of the vehicle 100. Specifically, when the completion of the processing is detected, the control content changing unit 216 may change the content of the control of the remote manual driving so as to use an element added or changed to the vehicle 100 by the completion of the processing, or may terminate the remote manual driving and start unmanned driving using the added or changed element. When the completion of the processing is detected, the control content changing unit 216 may also change the content of the remote manual driving control so as to relax the threshold 226 to a value at which the driving control by the operation unit 170 is less likely to take priority.
When completion of the processing is detected, the control content change instruction unit 316 may instruct the vehicle 100 to change the content of each control described above, for example.
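The reactions to process-completion events described in (F10) can be sketched as a small event handler. The step names and threshold values are assumptions made for this illustration; the real control content changing unit 216 is not specified at this level of detail.

```python
# Illustrative sketch of a control content changing unit reacting to
# process-completion events: enabling control that uses a newly added
# element, or relaxing the override threshold for the operation unit.

class ControlContentChanger:
    def __init__(self):
        self.override_threshold = 0.3  # strict initial threshold
        self.enabled_controls = set()

    def on_process_complete(self, step):
        if step == "object_detection_device_assembly":
            # Collision prevention control can now use the added device.
            self.enabled_controls.add("collision_prevention")
        elif step == "speed_detection_device_assembly":
            # Driving control can now use detected speed information.
            self.enabled_controls.add("speed_feedback")
        elif step == "pre_operation_unit_step":
            # Relax the threshold so driving control by the operation
            # unit is less likely to take priority over remote control.
            self.override_threshold = 0.8
```

The same handler shape could serve the server-side control content change instruction unit, with each branch replaced by an instruction transmitted to the mobile body.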
(F11) The vehicle 100 may be manufactured by combining a plurality of modules. A module is a unit composed of one or more components that are integrated according to the structure and function of the vehicle 100. For example, the body of the vehicle 100 may be manufactured by combining a front module constituting a front portion of the body, a center module constituting a center portion of the body, and a rear module constituting a rear portion of the body. The number of modules constituting the body is not limited to three, and may be two or fewer, or four or more. In addition to or instead of the body, a portion of the vehicle 100 other than the body may be modularized. The various modules may include any exterior parts such as a bumper and a grille, and any interior parts such as a seat and a console. Furthermore, mobile objects of any type, not limited to the vehicle 100, may be manufactured by combining a plurality of modules. Such a module may be manufactured by joining a plurality of parts by welding, a fixing member, or the like, or at least a part of the module may be integrally molded into one part by casting. A molding process that integrally molds at least a portion of a module as one piece is also referred to as Giga-casting or Mega-casting. By using Giga-casting, each part of the moving body that was conventionally formed by joining a plurality of parts can be formed as a single part. For example, the front module, the center module, and the rear module described above may also be manufactured using Giga-casting.
(F12) Conveying the vehicle 100 by causing the unmanned vehicle 100 to travel is also referred to as "self-propelled conveyance". A configuration for realizing self-propelled conveyance is also referred to as a "vehicle remote control autonomous travel conveyance system", and a production method for the vehicle 100 that uses self-propelled conveyance is also referred to as "self-propelled production". In self-propelled production, for example, in a factory where the vehicle 100 is manufactured, at least a part of the conveyance of the vehicle 100 is realized by self-propelled conveyance.
The control unit and the method described in the present disclosure may be implemented by a special-purpose computer provided by configuring a processor and a memory programmed to execute one or more functions embodied by a computer program. Alternatively, the control unit and the method described in the present disclosure may be implemented by a special-purpose computer provided by configuring a processor with one or more dedicated hardware logic circuits, or by one or more special-purpose computers configured by a combination of a processor and a memory programmed to execute one or more functions and a processor configured by one or more hardware logic circuits. The computer program may also be stored in a computer-readable non-transitory tangible recording medium as instructions to be executed by a computer.
The present disclosure is not limited to the above-described embodiments and can be implemented in various configurations without departing from the gist thereof. For example, the technical features in the embodiments corresponding to the technical features in each aspect described in the summary may be replaced or combined as appropriate in order to solve some or all of the above-described problems, or to achieve some or all of the above-described effects. Unless a technical feature is described as essential in the present specification, it may be deleted as appropriate.
Claims (14)
1. A mobile body that is manufactured in a factory, the mobile body comprising:
a driving control unit that performs driving control of the mobile body by unmanned driving during manufacturing of the mobile body in the factory;
a process completion detection unit that detects completion of a process based on at least one step included in the manufacturing process; and
a control content changing unit that changes the content of the control of the mobile body when the completion of the process is detected.
2. The mobile body according to claim 1, further comprising
a communication unit that receives a control command for remote control,
wherein the driving control unit executes driving control of the mobile body in accordance with the received control command during the manufacturing process.
3. The mobile body according to claim 2, wherein
the process completion detection unit detects completion of a process of adding an element to the mobile body or a process of changing an element provided in the mobile body based on the at least one step, and
when the completion of the process is detected, the control content changing unit changes the content of the control of the mobile body so as to perform control using the element added to or changed in the mobile body by the completion of the process.
4. The mobile body according to claim 3, wherein
the process includes an object detection device assembling process of adding, to the mobile body as the element, an object detection device that includes at least one of a radar device and a camera and is capable of detecting an object around the mobile body, and
when completion of the object detection device assembling process is detected, the control content changing unit changes the content of the control of the mobile body so as to execute collision prevention control using the added object detection device.
5. The mobile body according to claim 4, wherein
the control content changing unit further changes the content of the control of the mobile body so as to cause the mobile body to travel by driving control of the mobile body using the collision prevention control instead of the driving control by the remote control.
6. The mobile body according to claim 3, wherein
the process includes a speed detection device assembling process of adding, to the moving body as the element, a speed detection device that includes at least one of a vehicle speed sensor, a wheel speed sensor, an acceleration sensor, and a yaw rate sensor and is capable of acquiring speed information related to a speed of a vehicle as the moving body, and
when completion of the speed detection device assembling process is detected, the control content changing unit changes the content of the control of the moving body so as to execute driving control using the speed information detected by the added speed detection device.
7. The mobile body according to claim 3, wherein
the process includes an adjustment process including at least one of a wheel alignment adjustment process of changing, as the element, a wheel alignment of a vehicle as the moving body and a suspension adjustment process of changing, as the element, a suspension of the vehicle, and
when completion of the adjustment process is detected, the control content changing unit changes the content of the control of the vehicle so that an upper limit value of a running speed of the vehicle increases.
8. The mobile body according to claim 2, further comprising:
an operation unit for performing manual driving of the mobile body; and
a storage device that stores a preset threshold value for an operation amount of the operation unit, the threshold value being used to determine whether or not to prioritize driving control by the operation unit over driving control by the remote control when driving control by the remote control and driving control by the operation unit are performed simultaneously,
wherein the process completion detection unit detects completion of a process based on a step preceding a predetermined step, among the at least one step, in which an operator is likely to contact the operation unit, and
when completion of the process based on the preceding step is detected, the control content changing unit changes the content of the control of the mobile body so as to relax the threshold value to a value at which it is difficult for the driving control by the operation unit to take priority.
9. The mobile body according to claim 1, wherein
the driving control unit generates a control signal for moving the mobile body by the unmanned driving during the manufacturing process, and executes driving control of the mobile body in accordance with the control signal.
10. A server comprising:
a remote control unit that causes a mobile body manufactured in a factory to travel by remote control, the mobile body including a communication unit that receives a control command for the remote control and a driving control unit that executes driving control of the mobile body in accordance with the received control command during the manufacturing process in the factory;
a manufacturing information acquisition unit that acquires manufacturing information including a progress state of a process based on at least one step included in the manufacturing process; and
a control content change instruction unit that instructs the mobile body to change the content of the control of the mobile body when completion of the process is detected.
11. The server according to claim 10, wherein
the manufacturing information acquisition unit acquires completion of a process of adding an element to the mobile body or a process of changing an element provided in the mobile body based on the at least one step, and
when the completion of the process is acquired, the control content change instruction unit instructs the mobile body to change the content of the control of the mobile body so as to perform control using the element added to or changed in the mobile body by the completion of the process.
12. The server according to claim 10, wherein
the manufacturing information acquisition unit acquires completion of a process based on a step preceding a step, among the at least one step, in which an operator may contact an operation unit for performing manual driving of the mobile body, and
when the completion of the process based on the preceding step is acquired, the control content change instruction unit instructs the mobile body to change the content of the control of the mobile body so as to relax a threshold value, which is used to determine whether or not to prioritize driving control by the operation unit over driving control by the remote control when the driving control by the operation unit and the driving control by the remote control are performed simultaneously, to a value at which it is difficult for the driving control by the operation unit to take priority.
13. The server according to claim 12, further comprising
an abnormality measure unit that, when the threshold value has been relaxed and the driving control by the operation unit is nevertheless preferentially performed, executes, as an abnormality measure, at least one of stopping the manufacture of the mobile body and issuing a notification.
14. A method for manufacturing a mobile body, comprising:
causing the mobile body to travel by unmanned driving in a manufacturing process in a factory that manufactures the mobile body;
acquiring manufacturing information including a progress state of a process based on at least one step included in the manufacturing process; and
instructing the mobile body to change the content of the control of the mobile body when completion of the process is detected.
Applications Claiming Priority (2)
Application Number | Priority Date
---|---
JP2023-078687 | 2023-05-11
JP2023-188214 | 2023-11-02
Publications (1)
Publication Number | Publication Date
---|---
CN118938892A | 2024-11-12