WO2023238724A1 - Route generation system and route generation method for automatic travel of agricultural machinery - Google Patents
- Publication number
- WO2023238724A1 (PCT/JP2023/019921)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- route
- trajectory
- travel
- vehicle
- work vehicle
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/22—Command input arrangements
- G05D1/229—Command input data, e.g. waypoints
- G05D1/2297—Command input data, e.g. waypoints positional data taught by the user, e.g. paths
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
- A01B69/001—Steering by means of optical assistance, e.g. television cameras
- A01B69/007—Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
- A01B69/008—Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow automatic
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G05D1/24—Arrangements for determining position or orientation
- G05D1/243—Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
- G05D2105/00—Specific applications of the controlled vehicles
- G05D2105/15—Specific applications of the controlled vehicles for harvesting, sowing or mowing in agriculture or forestry
- G05D2107/00—Specific environments of the controlled vehicles
- G05D2107/20—Land use
- G05D2107/21—Farming, e.g. fields, pastures or barns
- G05D2109/00—Types of controlled vehicles
- G05D2109/10—Land vehicles
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/10—Optical signals
Definitions
- the present disclosure relates to a route generation system and route generation method for automatic travel of agricultural machinery.
- Patent Documents 1 and 2 disclose examples of systems for automatically driving an unmanned work vehicle between two fields separated from each other across a road.
- the present disclosure provides a technique for appropriately generating an automatic driving route for agricultural machinery.
- a route generation system is a route generation system for automatic travel of agricultural machinery, and includes a processing device that generates an automatic travel route for the agricultural machinery.
- The processing device acquires data indicating the travel trajectory from a vehicle that manually travels, while recording its travel trajectory, along the route on which the agricultural machine is scheduled to travel automatically; removes from the travel trajectory a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and generates an automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
- A route generation method according to another aspect is a route generation method for automatic travel of an agricultural machine, comprising: acquiring data indicating a travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along the route on which the agricultural machine is scheduled to travel automatically; removing from the travel trajectory a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and generating an automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
- Computer-readable storage media may include volatile storage media or non-volatile storage media.
- The device may be composed of multiple devices. When a device is composed of two or more devices, the two or more devices may be placed within a single apparatus, or may be divided between two or more separate apparatuses.
- an automatic travel route for agricultural machinery can be appropriately generated.
- FIG. 1 is a block diagram showing an example of a route generation system.
- FIG. 2 is a block diagram showing an example of a more detailed configuration of the route generation system.
- FIG. 3 is a diagram schematically showing how a vehicle travels on a road outside a field while collecting data.
- FIG. 4 is a flowchart showing an example of an operation for generating an automatic travel route.
- FIG. 5A is a diagram showing an example of an operation in which a vehicle avoids an oncoming vehicle.
- FIG. 5B is a diagram illustrating an example of a travel trajectory from which a trajectory related to an avoidance operation has been removed.
- FIG. 5C is a diagram illustrating a process of supplementing a portion removed from a travel trajectory with a linear complementary route.
- FIG. 6 is a diagram showing another example of an operation in which a vehicle avoids an oncoming vehicle.
- FIG. 7A is a diagram showing a display example of a display device.
- FIG. 7B is a diagram illustrating an example of the display screen when the user touches one of the dotted-frame areas.
- FIG. 7C is a diagram illustrating an example of the display screen with one of the removed portions supplemented.
- FIG. 7D is a diagram illustrating an example of the display screen in a state where all removed portions have been supplemented and the automatic travel route has been completed.
- FIG. 1 is a diagram for explaining an overview of an agricultural management system according to an exemplary embodiment of the present disclosure.
- FIG. 1 is a side view schematically showing an example of a work vehicle and an implement connected to the work vehicle.
- FIG. 2 is a block diagram showing an example of the configuration of a work vehicle and an implement.
- FIG. 2 is a conceptual diagram showing an example of a work vehicle that performs positioning using RTK-GNSS.
- FIG. 3 is a diagram showing an example of an operation terminal and a group of operation switches provided inside the cabin.
- FIG. 2 is a block diagram illustrating a schematic hardware configuration of a management device and a terminal device.
- FIG. 2 is a diagram schematically showing an example of a work vehicle that automatically travels along a target route in a field.
- A flowchart showing an example of the steering control operation performed by the control device during automatic travel.
- FIG. 3 is a diagram showing an example of a work vehicle traveling along a target route.
- FIG. 3 is a diagram illustrating an example of a work vehicle in a position shifted to the right from a target route.
- FIG. 3 is a diagram illustrating an example of a work vehicle in a position shifted to the left from a target route.
- FIG. 3 is a diagram illustrating an example of a work vehicle facing in a direction inclined with respect to a target route.
- agricultural machinery refers to machinery used in agricultural applications.
- Examples of agricultural machinery include tractors, harvesters, rice transplanters, riding management machines, vegetable transplanters, mowers, seeders, fertilizer spreaders, and agricultural mobile robots.
- Not only does a work vehicle such as a tractor function as an "agricultural machine" on its own; an implement attached to or towed by the work vehicle, together with the work vehicle, also functions as a single "agricultural machine."
- Agricultural machines perform agricultural work such as plowing, sowing, pest control, fertilization, planting crops, or harvesting on the ground within a field. These agricultural works are sometimes referred to as “ground work” or simply “work.”
- the movement of a vehicle-type agricultural machine while performing agricultural work is sometimes referred to as "work driving.”
- Automatic operation means that the movement of agricultural machinery is controlled by the function of a control device, without manual operation by a driver.
- Agricultural machinery that operates automatically is sometimes called “self-driving agricultural machinery” or “robotic agricultural machinery.”
- The control device can control at least one of steering, adjustment of travel speed, and starting and stopping of travel of the agricultural machine.
- the control device may control operations such as raising and lowering the work implement, starting and stopping the operation of the work implement, and the like.
- Movement by automatic driving may include not only movement of the agricultural machine toward a destination along a predetermined route, but also movement of the agricultural machine to follow a tracking target.
- a self-driving agricultural machine may move partially based on user instructions.
- agricultural machinery that performs automatic driving may operate in a manual driving mode in which the agricultural machine moves by manual operation by a driver.
- the act of steering agricultural machinery by means of a control device, rather than manually, is called "automatic steering.”
- Part or all of the control device may be external to the agricultural machine. Communication such as control signals, commands, or data may occur between a control device external to the agricultural machine and the agricultural machine.
- Agricultural machines that operate automatically may move autonomously while sensing the surrounding environment, without humans being involved in controlling the movement of the agricultural machines.
- Agricultural machinery capable of autonomous movement can run unmanned within a field or outside the field (for example, on a road). Obstacle detection and obstacle avoidance operations may be performed during autonomous movement.
- Environmental map is data that expresses the positions or areas of objects in the environment in which agricultural machinery moves using a predetermined coordinate system.
- Environmental maps are sometimes simply referred to as "maps" or “map data.”
- the coordinate system defining the environmental map may be, for example, a world coordinate system, such as a geographic coordinate system fixed relative to the earth.
- the environmental map may include information other than location (for example, attribute information and other information) about objects existing in the environment.
- Environmental maps include maps in various formats, such as point cloud maps or grid maps. Local map or partial map data generated or processed in the process of constructing an environmental map is also referred to as a "map" or "map data.”
- Automatic driving route means data on the route that connects the starting point to the destination point when agricultural machinery runs automatically.
- the automated driving route is also referred to as a "global route” or a "target route.”
- the automatic driving route can be defined, for example, by the coordinate values of a plurality of points on a map that the agricultural machine should pass through.
- a point through which agricultural machinery should pass is called a "waypoint,” and a line segment connecting adjacent waypoints is called a "link.”
- Waypoint data may include position and velocity information.
- generating data indicating an automatic travel route (for example, data on a plurality of waypoints) is expressed as "generating an automatic travel route.”
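The waypoint-and-link representation described above lends itself to a compact data structure. The following is an illustrative sketch only; the names `Waypoint` and `links` are hypothetical and not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Waypoint:
    # Position in a geographic coordinate system plus a target speed,
    # as suggested by the description of waypoint data above.
    lat: float
    lon: float
    speed_m_s: float

def links(route: List[Waypoint]) -> List[Tuple[Waypoint, Waypoint]]:
    """An automatic travel route is an ordered list of waypoints;
    each pair of adjacent waypoints defines one "link"."""
    return list(zip(route, route[1:]))

route = [Waypoint(35.0, 139.0, 1.5),
         Waypoint(35.001, 139.0, 2.0),
         Waypoint(35.001, 139.001, 1.0)]
assert len(links(route)) == 2  # three waypoints form two links
```

Generating data indicating an automatic travel route then amounts to producing such a list of waypoints.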
- FIG. 1 is a block diagram showing an example of a route generation system for automatic travel of agricultural machinery.
- the route generation system 10 shown in FIG. 1 is used in combination with a vehicle 20 that collects data necessary for generating an automatic travel route and an agricultural machine 30 that is capable of automatic travel.
- the route generation system 10 is a computer system including a processing device 15.
- the processing device 15 generates an automatic travel route for the agricultural machine 30 based on the data collected by the vehicle 20.
- the vehicle 20 is a vehicle that collects data necessary to generate an automatic travel route for the agricultural machine 30.
- Vehicle 20 can be, for example, a regular car, a truck (lorry), a van, or an agricultural work vehicle.
- the agricultural machine 30 is a self-driving agricultural machine that automatically travels according to the automatic travel route generated by the processing device 15.
- the agricultural machine 30 is, for example, an agricultural work vehicle such as a tractor.
- the agricultural machine 30 can automatically travel not only in the field but also on roads outside the field (for example, farm roads or general roads).
- the vehicle 20 is a different vehicle from the agricultural machine 30, but the agricultural machine 30 may also have the function of the vehicle 20. That is, one agricultural work vehicle capable of both automatic operation and manual operation may be used as the agricultural machine 30 and the vehicle 20.
- the route generation system 10 may be a system (for example, a cloud computing system) independent of the vehicle 20 and the agricultural machine 30, or may be mounted on the vehicle 20 or the agricultural machine 30.
- FIG. 2 is a block diagram showing an example of a more detailed configuration of the system shown in FIG. 1.
- the route generation system 10 includes a processing device 15, an input interface (I/F) 11, an output interface 12, and a storage device 13.
- the vehicle 20 includes a positioning device 21, a camera 22, and a storage device 23.
- the agricultural machine 30 includes a self-position estimating device 31, a travel control device 32, and a storage device 33.
- FIG. 2 also illustrates a display device 45 that displays the automatic travel route generated by the processing device 15, and an input device 40 that is used by the user to edit the automatic travel route.
- FIG. 3 is a diagram schematically showing how the vehicle 20 travels on a road 75 (for example, a farm road) outside the field 70 while collecting data.
- FIG. 3 illustrates a plurality of farm fields 70, a road 75 around them, and a storage 78 for agricultural machinery 30.
- the user drives the vehicle 20 to travel along a route on which the agricultural machine 30 is scheduled to travel automatically later.
- the vehicle 20 travels while recording its own travel trajectory.
- the vehicle 20 records position data sequentially output from a positioning device 21 such as a GNSS receiver in the storage device 23 as data indicating a running trajectory.
- Location data may include, for example, latitude and longitude information in a geographic coordinate system.
- the data indicating the travel trajectory may include position data of the vehicle 20 and corresponding time information. That is, the data indicating the travel trajectory can indicate changes in the position of the vehicle 20 over time.
- the data indicating the traveling trajectory may include information on the traveling speed of the vehicle 20 at each time in addition to information on the position of the vehicle 20 at each time. Information about the location and running speed of the vehicle 20 may be recorded at relatively short intervals (eg, from a few milliseconds to a few seconds).
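The recording scheme described above (position, time, and speed sampled at short intervals) can be sketched as follows. The `TrajectorySample` and `TrajectoryRecorder` names are hypothetical illustrations, not part of the disclosure:

```python
import time
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrajectorySample:
    t: float          # timestamp [s]
    lat: float
    lon: float
    speed_m_s: float  # travel speed at that time

class TrajectoryRecorder:
    """Accumulates samples, mimicking the role of storage device 23."""
    def __init__(self) -> None:
        self.samples: List[TrajectorySample] = []

    def record(self, lat: float, lon: float, speed_m_s: float,
               t: Optional[float] = None) -> None:
        stamp = time.time() if t is None else t
        self.samples.append(TrajectorySample(stamp, lat, lon, speed_m_s))

rec = TrajectoryRecorder()
rec.record(35.0, 139.0, 1.2, t=0.0)
rec.record(35.00001, 139.0, 1.3, t=0.5)  # e.g. a 0.5 s recording interval
assert len(rec.samples) == 2
```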
- In FIG. 3, an example of the travel trajectory of the vehicle 20 is shown by a broken-line arrow.
- the vehicle 20 travels from the storage 78 on a road 75 around a plurality of fields 70 where agricultural machinery 30 is scheduled to perform agricultural work, and returns to the storage 78.
- the route that the vehicle 20 travels to collect data is determined according to the route that the agricultural machine 30 is scheduled to travel.
- The vehicle 20 manually travels, while recording its travel trajectory, along the route on which the agricultural machine 30 is scheduled to travel automatically. In this specification, travel by manual operation of a driver is referred to as "manual travel."
- the vehicle 20 may travel while photographing the surroundings of the vehicle 20 with the camera 22. In that case, the vehicle 20 runs while recording the moving images photographed by the camera 22 in the storage device 23.
- the data indicating the travel trajectory is sent to the processing device 15.
- the data indicating the travel trajectory may be transmitted via a wired or wireless communication line, or may be provided to the processing device 15 via any recording medium.
- the processing device 15 directly or indirectly acquires data indicating the travel trajectory from the vehicle 20.
- the processing device 15 generates an automatic travel route for the agricultural machine 30 based on the data indicating the acquired travel trajectory. For example, the processing device 15 can approximate the travel trajectory of the vehicle 20 as a combination of a plurality of line segments on a map prepared in advance, and generate the combination of these line segments as an automatic travel route.
- the automatic travel route can be appropriately generated by approximating the travel trajectory of the vehicle 20 using a plurality of line segments.
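One way to approximate a travel trajectory with a combination of line segments, as described above, is Ramer-Douglas-Peucker simplification. The disclosure does not name a specific algorithm, so this is only a sketch of one possible approach:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def _point_line_dist(p: Point, a: Point, b: Point) -> float:
    # Perpendicular distance from p to the line through a and b
    # (distance to a when a == b).
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def simplify(points: List[Point], tol: float) -> List[Point]:
    """Ramer-Douglas-Peucker: approximate a trajectory by few segments."""
    if len(points) < 3:
        return list(points)
    # Find the sample farthest from the chord between the endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:
        return [points[0], points[-1]]  # chord is good enough
    left = simplify(points[: idx + 1], tol)
    right = simplify(points[idx:], tol)
    return left[:-1] + right

# A nearly straight run collapses to a single segment:
track = [(0.0, 0.0), (1.0, 0.02), (2.0, -0.01), (3.0, 0.0)]
assert simplify(track, tol=0.1) == [(0.0, 0.0), (3.0, 0.0)]
```

The tolerance controls how closely the segment chain follows the recorded trajectory.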
- While the vehicle 20 is traveling, an oncoming vehicle may appear ahead. In that case, if the road 75 is narrow, the vehicle 20 performs an operation to avoid the oncoming vehicle. For example, the vehicle 20 may perform an avoidance operation to avoid contact with the oncoming vehicle by decelerating and moving toward the edge of the road 75, backing up, or stopping temporarily.
- Because the trajectory related to the avoidance operation is also recorded, simply generating an automatic travel route based on the data indicating the travel trajectory would produce an inappropriate automatic travel route that reflects the avoidance operation.
- the processing device 15 in this embodiment generates an automatic driving route after performing a process of removing a trajectory related to an avoidance operation from the driving trajectory of the vehicle 20.
- An example of this processing will be described below with reference to FIG.
- FIG. 4 is a flowchart showing an example of the operation of the processing device 15 to generate an automatic travel route.
- the processing device 15 first obtains travel trajectory data recorded by the vehicle 20 (step S11). Next, the processing device 15 removes a trajectory related to an avoidance operation performed to avoid an oncoming vehicle from the travel trajectory indicated by the travel trajectory data (step S12). An example of a method for identifying a trajectory related to an avoidance operation from a travel trajectory will be described later.
- the processing device 15 generates an automatic travel route for the agricultural machine 30 based on the travel trajectory from which the trajectory related to the avoidance operation has been removed (S13). For example, an automatic travel route can be generated by performing complementary processing such as approximating the removed portion with a line segment. Thereafter, the processing device 15 transmits data indicating the automatic travel route to the agricultural machine 30 (step S14). Note that if the processing device 15 is mounted on the agricultural machine 30, the operation of step S14 may be omitted.
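The four flowchart steps (S11 to S14) can be sketched as a small pipeline. All function names here are hypothetical placeholders for the operations described above:

```python
from typing import Callable, List, Tuple

Point = Tuple[float, float]

def generate_route(
    acquire: Callable[[], List[Point]],                      # step S11
    remove_avoidance: Callable[[List[Point]], List[Point]],  # step S12
    complement: Callable[[List[Point]], List[Point]],        # step S13
    transmit: Callable[[List[Point]], None],                 # step S14
) -> List[Point]:
    trajectory = acquire()
    cleaned = remove_avoidance(trajectory)
    route = complement(cleaned)
    transmit(route)
    return route

sent: List[List[Point]] = []
route = generate_route(
    acquire=lambda: [(0, 0), (1, 0), (1.5, 0.4), (2, 0), (3, 0)],
    remove_avoidance=lambda t: [p for p in t if p[1] == 0],  # toy swerve filter
    complement=lambda t: t,   # already contiguous in this toy example
    transmit=sent.append,
)
assert route == [(0, 0), (1, 0), (2, 0), (3, 0)] and len(sent) == 1
```

When the processing device is mounted on the agricultural machine itself, the `transmit` step can simply be a no-op, matching the note about omitting step S14.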
- step S12 and step S13 will be described with reference to FIGS. 5A to 5C.
- FIG. 5A shows an example of an operation in which the vehicle 20 avoids an oncoming vehicle 90.
- To avoid contact with the oncoming vehicle 90 approaching from the front, the driver of the vehicle 20 first steers the vehicle 20 to the left edge of the road 75, passes the oncoming vehicle 90, and then returns the vehicle 20 to the center of the road 75. As a result, the travel trajectory recorded by the vehicle 20 connects two straight routes 91 and 93 with a non-straight route 92, caused by the avoidance operation, between them, as shown by the broken-line arrow in FIG. 5A.
- the route related to the avoidance operation (hereinafter sometimes referred to as the "avoidance route") is not limited to the route 92 as shown in FIG. 5A.
- the avoidance route may include a backward route 95 and a subsequent forward route 96.
- In this example, the width of the road 75 is so narrow that the vehicle 20 and the oncoming vehicle 90 cannot pass each other. In such a case, the vehicle 20 backs up to a place wide enough for the two vehicles to pass, stops temporarily, lets the oncoming vehicle 90 pass, and then moves forward and returns to its original route.
- FIG. 5B shows an example of a travel trajectory from which a trajectory related to an avoidance motion has been removed.
- The processing device 15 may extract the trajectory related to the avoidance operation based on data of a moving image captured by the camera 22 while the vehicle 20 is traveling. In that case, in step S11 shown in FIG. 4, the processing device 15 acquires the moving-image data in addition to the travel trajectory data. The processing device 15 detects the avoidance operation based on the moving image, and determines and removes the trajectory related to the avoidance operation from the travel trajectory.
- The processing device 15 may perform image recognition processing on the moving image and determine the trajectory related to the avoidance operation based on the result of recognizing, in the moving image, the oncoming vehicle 90 approaching the vehicle 20. For example, the processing device 15 may remove, as the trajectory related to the avoidance operation, the portion of the travel trajectory corresponding to at least part of the period from when the oncoming vehicle 90 is first recognized in the moving image until the oncoming vehicle 90 is no longer recognized. Alternatively, the processing device 15 may remove, as the trajectory related to the avoidance operation, the portion of the travel trajectory corresponding to a predetermined time period (for example, 10 seconds, 20 seconds, or 30 seconds) that includes the period from when the oncoming vehicle 90 is recognized as having approached within a predetermined distance of the vehicle 20 (for example, 5 m, 10 m, or 20 m) until the oncoming vehicle 90 is no longer recognized.
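Removing the portion of the trajectory that corresponds to a recognition time window, as described above, might look like the following sketch. The sample format, window bounds, and margin are illustrative assumptions:

```python
from typing import List, Tuple

# (time [s], x, y) samples; the window is the period during which the
# oncoming vehicle is recognized in the camera images.
Sample = Tuple[float, float, float]

def remove_window(traj: List[Sample], t_start: float, t_end: float,
                  margin_s: float = 2.0) -> List[Sample]:
    """Drop samples within the recognition window, widened by a margin."""
    lo, hi = t_start - margin_s, t_end + margin_s
    return [s for s in traj if not (lo <= s[0] <= hi)]

traj = [(float(t), float(t), 0.0) for t in range(0, 30, 5)]  # t = 0..25
kept = remove_window(traj, t_start=9.0, t_end=16.0)
assert [s[0] for s in kept] == [0.0, 5.0, 20.0, 25.0]
```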
- The processing device 15 may also detect the avoidance operation based on a temporal change in the position of the vehicle 20 indicated by the travel trajectory. For example, the processing device 15 may detect, as an avoidance operation, at least one of reversing, turning, accelerating, and decelerating performed by the vehicle 20 to avoid the oncoming vehicle 90. As an example, the processing device 15 may extract, as the trajectory related to the avoidance operation, a portion of the travel trajectory that is non-linear even though it lies on a straight section of the road 75. A record of steering and/or acceleration and deceleration operations of the vehicle 20 may also be used to extract the trajectory related to the avoidance operation.
- the processing device 15 may extract a portion of the travel trajectory in which a large direction change is made at a position other than an intersection on the road 75 as a trajectory related to the avoidance operation.
- The processing device 15 may also extract from the travel trajectory, as a trajectory related to the avoidance operation, a portion in which the vehicle 20 decelerates or stops while traveling along the road 75, or moves backward and then forward again.
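A minimal sketch of flagging an avoidance operation from the recorded motion alone, e.g. from a reversal or a prolonged stop, might look as follows. The signed-speed encoding and the thresholds are illustrative assumptions, not taken from the disclosure:

```python
from typing import List, Tuple

# (time, signed speed) pairs; negative speed means reversing.
def avoidance_suspected(speeds: List[Tuple[float, float]],
                        stop_s: float = 3.0) -> bool:
    """Flag a trajectory portion as a possible avoidance operation when
    the vehicle reverses, or stands still longer than `stop_s` seconds."""
    stop_start = None
    for t, v in speeds:
        if v < 0:
            return True           # backed up: avoidance suspected
        if abs(v) < 0.05:         # effectively stopped
            if stop_start is None:
                stop_start = t
            elif t - stop_start >= stop_s:
                return True       # prolonged stop on the road
        else:
            stop_start = None
    return False

assert avoidance_suspected([(0, 2.0), (1, 1.0), (2, -0.5)]) is True
assert avoidance_suspected([(0, 2.0), (1, 2.0), (2, 2.1)]) is False
```

A real implementation would combine such motion cues with map context (e.g. excluding intersections) as described above.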
- a machine learning algorithm such as deep learning may be used to detect the avoidance motion.
- the processing device 15 may extract a trajectory related to the avoidance operation from the travel trajectory based on travel trajectory data acquired from the vehicle 20 and a learned model trained in advance.
- the processing device 15 generates an automatic travel route by removing the trajectory related to the avoidance operation and then performing a process to complement the removed portion. For example, as shown in FIG. 5C, the processing device 15 may generate an automatic travel route by complementing the portion removed from the travel trajectory with a linear complementary route 94.
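Supplementing a removed portion with a linear complementary route reduces to interpolating points on the straight line between the trajectory points on either side of the gap. A sketch under that assumption (the `complement_gap` name is hypothetical):

```python
from typing import List, Tuple

Point = Tuple[float, float]

def complement_gap(before: Point, after: Point, n_points: int = 3) -> List[Point]:
    """Fill a removed portion with equally spaced points on the straight
    line joining the trajectory points on either side of the gap."""
    (x0, y0), (x1, y1) = before, after
    return [
        (x0 + (x1 - x0) * k / (n_points + 1),
         y0 + (y1 - y0) * k / (n_points + 1))
        for k in range(1, n_points + 1)
    ]

# The gap between (0, 0) and (4, 0) is bridged by equally spaced points:
assert complement_gap((0.0, 0.0), (4.0, 0.0), n_points=3) == [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
```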
- Such complementary processing may be performed automatically by the processing device 15, or may be performed in response to an operation from the user.
- For example, the processing device 15 may cause the display device 45 to display the travel trajectory from which the trajectory related to the avoidance operation has been removed, and may supplement the removed portions in response to an operation performed by the user with the input device 40 to determine a complementary route.
- FIG. 7A is a diagram showing a display example of the display device 45.
- the display device 45 in this example is a computer with a built-in display, such as a tablet computer or a smartphone.
- the illustrated display device 45 includes a touch screen and also functions as the input device 40.
- The display device 45 displays an environmental map of the area around the fields 70. A route obtained by removing the avoidance routes from the travel trajectory of the vehicle 20 is displayed on the map. In FIG. 7A, the portions corresponding to the removed avoidance routes are surrounded by dotted frames. The user can perform a route-complementing operation by, for example, touching a portion surrounded by a dotted frame.
- FIG. 7B shows an example of the display screen when the user touches one of the parts surrounded by the dotted line.
- a popup is displayed asking "Do you want to complement the route?", and the user can select "Yes" or "No". If the user selects "Yes", the processing device 15 generates a complementary route that fills in the removed portion. For example, the processing device 15 complements the removed portion with a linear complementary route. Alternatively, the user may be able to specify the complementary route.
- FIG. 7C illustrates a state in which one of the removed parts is complemented.
- the interpolated portion is indicated by a dashed arrow.
- the user can also fill in other removed portions with similar operations.
- FIG. 7D shows an example of a state in which all removed parts have been supplemented and the automatic travel route has been completed.
- An automatic travel route may be defined by a plurality of waypoints, for example. Each waypoint may include position and speed information, for example.
- waypoints are represented by dots, and links between waypoints are represented by arrows.
- waypoints are set at locations where the agricultural machine 30 can change direction (such as intersections, near farm entrances and exits, and storage entrances and exits).
- the waypoint setting method is not limited to the illustrated example, and the length of the link between waypoints can be set arbitrarily.
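A waypoint carrying position and speed information, as described above, might be represented as follows (the field names and values are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float    # latitude in degrees
    lon: float    # longitude in degrees
    speed: float  # target speed in m/s when passing this waypoint

# Illustrative route: links between consecutive waypoints may be any length.
route = [
    Waypoint(35.0001, 135.0001, 1.5),
    Waypoint(35.0005, 135.0002, 2.0),  # e.g. set near an intersection or entrance
]
```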
- the above operations can prevent the avoidance operation performed to avoid the oncoming vehicle 90 from being reflected in the automatic travel route. Thereby, a more appropriate automatic travel route for the agricultural machine 30 can be generated.
- the data indicating the generated automatic travel route is sent to the agricultural machine 30 and recorded in the storage device 33.
- the travel control device 32 of the agricultural machine 30 controls the travel speed and steering of the agricultural machine 30 so that it travels along the automatic travel route. For example, if the automatic travel route is defined by a plurality of waypoints and each waypoint includes position and speed information, the travel control device 32 controls the travel speed and steering so that each waypoint is passed at the specified speed.
- the travel control device 32 can estimate how far the agricultural machine 30 deviates from the automatic travel route based on the position and orientation of the agricultural machine 30 estimated by the self-position estimating device 31.
- the self-position estimating device 31 is a device that performs self-position estimation using sensors such as GNSS, an IMU (Inertial Measurement Unit), LiDAR (Light Detection and Ranging), and/or a camera (including an image sensor).
- the travel control device 32 can realize travel along the automatic travel route by performing steering control so as to reduce deviation in position and/or direction of the agricultural machine 30 from the automatic travel route.
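Steering control that reduces positional and directional deviation from the route can be sketched as a simple proportional law. The gains and the steering-angle limit below are illustrative assumptions, not values from the embodiment:

```python
import math

def steering_command(cross_track_error, heading_error,
                     k_ct=0.5, k_h=1.0, max_angle=math.radians(35)):
    """Proportional steering sketch: steer toward the route to shrink both the
    lateral (cross-track) and heading deviations. Gains are assumptions."""
    angle = -k_ct * cross_track_error - k_h * heading_error
    # clamp to a plausible mechanical steering limit
    return max(-max_angle, min(max_angle, angle))
```

With the sign convention assumed here, a positive cross-track error (vehicle displaced to the left of the route) yields a negative (rightward) steering command.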
- the processing device 15 executes the above processing when generating a route for the agricultural machine 30 to automatically travel outside the field.
- the processing device 15 may perform similar processing when generating a route for the agricultural machine 30 to automatically travel within the field. Even within a field, other agricultural work vehicles may exist as oncoming vehicles, so the route generation method according to the present embodiment is effective.
- FIG. 8 is a diagram for explaining an overview of an agricultural management system according to an exemplary embodiment of the present disclosure.
- the system shown in FIG. 8 includes a work vehicle 100, a terminal device 400, and a management device 600.
- Work vehicle 100 is an agricultural machine that can run automatically.
- Terminal device 400 is a computer used by a user who remotely monitors work vehicle 100.
- the management device 600 is a computer managed by a business operator that operates the system. Work vehicle 100, terminal device 400, and management device 600 can communicate with each other via network 80.
- the system may include multiple work vehicles or other agricultural machinery.
- the work vehicle 100 functions as both the vehicle 20 and the agricultural machine 30 shown in FIG. 1.
- the management device 600 has the functions of the processing device 15 shown in FIG.
- the work vehicle 100 in this embodiment is a tractor.
- Work vehicle 100 can be equipped with an implement on one or both of the rear and front parts.
- the work vehicle 100 can travel within a field while performing agricultural work depending on the type of implement.
- the work vehicle 100 may run inside or outside the field without any implements attached thereto.
- the work vehicle 100 is equipped with an automatic driving function. In other words, the work vehicle 100 can travel not by manual operation but by the action of a control device.
- the control device in this embodiment is provided inside the work vehicle 100 and can control both the speed and steering of the work vehicle 100.
- the work vehicle 100 can automatically travel not only inside the field but also outside the field (for example, on a road).
- the work vehicle 100 is equipped with devices used for positioning or self-position estimation, such as a GNSS receiver and a LiDAR sensor.
- the control device of the work vehicle 100 automatically causes the work vehicle 100 to travel based on the position of the work vehicle 100 and information on the target route.
- the control device also controls the operation of the implement.
- the work vehicle 100 can perform agricultural work using the implement while automatically traveling within the field. Further, the work vehicle 100 can automatically travel along a target route on a road outside the field (for example, a farm road or a general road).
- When the work vehicle 100 automatically travels along a road outside the field, it travels while generating, based on data output from a sensing device such as a camera or a LiDAR sensor, a local route that avoids obstacles along the target route. Inside the field, the work vehicle 100 may travel while generating a local route in the same way, or it may travel along the target route without generating a local route and simply stop when an obstacle is detected.
- the management device 600 is a computer that manages agricultural work performed by the work vehicle 100.
- the management device 600 may be, for example, a server computer that centrally manages information regarding fields on the cloud and supports agriculture by utilizing data on the cloud.
- the management device 600 has the same functions as the processing device 15 shown in FIG. That is, the management device 600 generates an automatic travel route (ie, a target route) for the work vehicle 100.
- the management device 600 acquires data indicating a travel trajectory when the work vehicle 100 travels in manual operation, and generates an automatic travel route for the work vehicle 100 based on the data. More specifically, before the work vehicle 100 starts automatic travel, the work vehicle 100 manually travels along the route on which the automatic travel is planned while recording the travel trajectory.
- the work vehicle 100 uses a positioning device such as a GNSS unit to record a travel trajectory by sequentially recording its own position.
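Sequentially recording the vehicle's own position can be sketched as a simple polling loop. `get_position` stands in for the GNSS unit; all names and the default interval are assumptions:

```python
import time

def record_trajectory(get_position, stop, interval=1.0):
    """Append (timestamp, lat, lon) tuples at a fixed interval until the
    `stop` callable returns True. `get_position` stands in for the GNSS unit."""
    log = []
    while not stop():
        lat, lon = get_position()
        log.append((time.time(), lat, lon))
        time.sleep(interval)
    return log
```

The resulting list of timestamped positions corresponds to the travel trajectory data that the management device acquires from the vehicle.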
- the management device 600 acquires data indicating the travel trajectory from the work vehicle 100.
- the travel trajectory may include a trajectory related to an avoidance operation performed to avoid an oncoming vehicle on the road.
- the management device 600 removes, from the trajectory indicated by the acquired data, the trajectory related to an avoidance operation performed to avoid an oncoming vehicle, and generates an automatic travel route for the work vehicle 100 based on the trajectory after the removal. Through such processing, an appropriate automatic travel route can be generated without reflecting the trajectory associated with the avoidance operation.
- the management device 600 may further create a work plan for the work vehicle 100 and give instructions to the work vehicle 100 to start and end automatic travel according to the work plan.
- the management device 600 may also generate an environmental map based on data collected by the work vehicle 100 or other vehicles using a sensing device such as a LiDAR sensor.
- the management device 600 transmits data such as the generated automatic driving route, work plan, and environmental map to the work vehicle 100.
- the work vehicle 100 automatically performs traveling and agricultural work based on the data.
- generation of the automatic travel route is not limited to the management device 600, and may be performed by another device.
- the control device of work vehicle 100 may generate the automatic travel route.
- the control device of work vehicle 100 functions as a processing device that generates an automatic travel route.
- the terminal device 400 is a computer used by a user located away from the work vehicle 100. Although the terminal device 400 shown in FIG. 8 is a laptop computer, it is not limited to this.
- the terminal device 400 may be a stationary computer such as a desktop PC (personal computer), or may be a mobile terminal such as a smartphone or a tablet computer.
- Terminal device 400 can be used to remotely monitor work vehicle 100 or remotely control work vehicle 100.
- the terminal device 400 can display images captured by one or more cameras (imaging devices) included in the work vehicle 100 on a display. The user can view the video, check the situation around the work vehicle 100, and send an instruction to the work vehicle 100 to stop or start.
- Terminal device 400 may further include the functions of input device 40 and display device 45 shown in FIG. That is, the terminal device 400 may be used for editing the automatic travel route generated by the management device 600.
- FIG. 9 is a side view schematically showing an example of the work vehicle 100 and the implement 300 connected to the work vehicle 100.
- Work vehicle 100 in this embodiment can operate in both manual driving mode and automatic driving mode. In the automatic driving mode, the work vehicle 100 can run unmanned.
- the work vehicle 100 is capable of automatic operation both inside and outside the field.
- the work vehicle 100 includes a vehicle body 101, a prime mover (engine) 102, and a transmission 103.
- the vehicle body 101 is provided with wheels 104 with tires and a cabin 105.
- the wheels 104 include a pair of front wheels 104F and a pair of rear wheels 104R.
- a driver's seat 107, a steering device 106, an operation terminal 200, and a group of switches for operation are provided inside the cabin 105.
- One or both of the front wheels 104F and the rear wheels 104R may be replaced with crawlers equipped with tracks instead of wheels with tires.
- the work vehicle 100 includes a plurality of sensing devices that sense the surroundings of the work vehicle 100.
- the sensing device includes multiple cameras 120, LiDAR sensors 140, and multiple obstacle sensors 130.
- the cameras 120 may be provided, for example, on the front, rear, left, and right sides of the work vehicle 100. The cameras 120 photograph the environment around the work vehicle 100 and generate image data. Images acquired by the cameras 120 may be transmitted to the terminal device 400 for remote monitoring, and may be used to monitor the work vehicle 100 during unmanned operation. The cameras 120 can also be used to generate images for recognizing surrounding features, obstacles, white lines, signs, and the like when the work vehicle 100 travels on a road outside the field (a farm road or a general road). For example, a camera 120 may be used to detect an oncoming vehicle while the work vehicle 100 is manually driven while recording the travel trajectory.
- the LiDAR sensor 140 in the example of FIG. 9 is arranged at the lower front of the vehicle body 101. The LiDAR sensor 140 may be provided at other locations. While the work vehicle 100 travels mainly outside the field, the LiDAR sensor 140 repeatedly outputs sensor data indicating the distance and direction to each measurement point on objects in the surrounding environment, or the two-dimensional or three-dimensional coordinate values of each measurement point. The sensor data output from the LiDAR sensor 140 is processed by the control device of the work vehicle 100, which can estimate the self-position of the work vehicle 100 by matching the sensor data with the environmental map. The control device further detects objects such as obstacles existing around the work vehicle 100 based on the sensor data, and generates a local route that the work vehicle 100 should actually follow along the target route.
- the control device may be configured to generate or edit the environmental map using an algorithm such as SLAM (Simultaneous Localization and Mapping).
- Work vehicle 100 may include a plurality of LiDAR sensors arranged at different positions and in different orientations.
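A distance-and-direction measurement from the LiDAR sensor, as described above, can be converted to planar coordinates in the vehicle frame as follows. The axis convention (x forward, y left, bearing measured counterclockwise from the forward axis, in degrees) is an assumption for illustration:

```python
import math

def polar_to_xy(distance, bearing_deg):
    """Convert a range/bearing measurement to (x, y) in the vehicle frame:
    x points forward, y points left; bearing is counterclockwise in degrees.
    This convention is an assumption for illustration."""
    theta = math.radians(bearing_deg)
    return (distance * math.cos(theta), distance * math.sin(theta))
```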
- a plurality of obstacle sensors 130 shown in FIG. 9 are provided at the front and rear of the cabin 105. Obstacle sensor 130 may also be placed at other locations. For example, one or more obstacle sensors 130 may be provided at arbitrary positions on the side, front, and rear of the vehicle body 101. Obstacle sensor 130 may include, for example, a laser scanner or an ultrasonic sonar. Obstacle sensor 130 is used to detect surrounding obstacles during automatic driving and to stop work vehicle 100 or take a detour. LiDAR sensor 140 may be used as one of the obstacle sensors 130.
- the work vehicle 100 further includes a GNSS unit 110.
- GNSS unit 110 includes a GNSS receiver.
- the GNSS receiver may include an antenna that receives signals from GNSS satellites and a processor that calculates the position of work vehicle 100 based on the signals received by the antenna.
- the GNSS unit 110 receives satellite signals transmitted from a plurality of GNSS satellites, and performs positioning based on the satellite signals.
- GNSS is a general term for satellite positioning systems such as GPS (Global Positioning System), QZSS (Quasi-Zenith Satellite System, such as Michibiki), GLONASS, Galileo, and BeiDou.
- the GNSS unit 110 may include an inertial measurement unit (IMU). Signals from the IMU can be used to supplement the position data.
- the IMU can measure the tilt and minute movements of the work vehicle 100. By using data acquired by the IMU to supplement position data based on satellite signals, positioning performance can be improved.
- the control device of the work vehicle 100 may use sensing data acquired by a sensing device such as the camera 120 or the LiDAR sensor 140 for positioning.
- by matching the data acquired by the camera 120 or the LiDAR sensor 140 against a previously stored environmental map, the position of the work vehicle 100 can be estimated, and thus specified, with higher accuracy.
- the prime mover 102 may be, for example, a diesel engine.
- An electric motor may be used instead of a diesel engine.
- Transmission device 103 can change the propulsive force and moving speed of work vehicle 100 by shifting. The transmission 103 can also switch the work vehicle 100 between forward movement and reverse movement.
- the steering device 106 includes a steering wheel, a steering shaft connected to the steering wheel, and a power steering device that assists steering with the steering wheel.
- the front wheel 104F is a steered wheel, and by changing its turning angle (also referred to as a "steering angle"), the traveling direction of the work vehicle 100 can be changed.
- the steering angle of the front wheels 104F can be changed by operating the steering wheel.
- the power steering device includes a hydraulic device or an electric motor that supplies an auxiliary force to change the steering angle of the front wheels 104F. When automatic steering is performed, the steering angle is automatically adjusted by the power of a hydraulic system or an electric motor under control from a control device disposed within work vehicle 100.
- a coupling device 108 is provided at the rear of the vehicle body 101.
- the coupling device 108 includes, for example, a three-point support device (also referred to as a "three-point link" or "three-point hitch"), a PTO (Power Take Off) shaft, a universal joint, and a communication cable.
- the implement 300 can be attached to and detached from the work vehicle 100 by the coupling device 108.
- the coupling device 108 can change the position or posture of the implement 300 by raising and lowering the three-point link using, for example, a hydraulic device. Further, power can be sent from the work vehicle 100 to the implement 300 via the universal joint.
- the work vehicle 100 can cause the implement 300 to perform a predetermined work while pulling the implement 300.
- the coupling device may be provided at the front of the vehicle body 101. In that case, an implement can be connected to the front of the work vehicle 100.
- the implement 300 shown in FIG. 9 is a rotary tiller
- the implement 300 is not limited to a rotary tiller.
- any implement, such as a seeder, spreader, transplanter, mower, rake, baler, harvester, sprayer, or harrow, can be used by connecting it to the work vehicle 100.
- Although the work vehicle 100 shown in FIG. 9 is capable of manned operation, it may also support only unmanned operation. In that case, components necessary only for manned operation, such as the cabin 105, the steering device 106, and the driver's seat 107, need not be provided in the work vehicle 100.
- the unmanned work vehicle 100 can run autonomously or by remote control by a user. When a work vehicle 100 without a manned driving function is used, travel locus data for route generation is acquired by a manned vehicle other than the work vehicle 100.
- FIG. 10 is a block diagram showing a configuration example of the work vehicle 100 and the implement 300.
- Work vehicle 100 and implement 300 can communicate with each other via a communication cable included in coupling device 108.
- Work vehicle 100 can communicate with terminal device 400 and management device 600 via network 80.
- the work vehicle 100 in the example of FIG. 10 includes, in addition to the GNSS unit 110, the camera 120, the obstacle sensor 130, the LiDAR sensor 140, and the operation terminal 200, a sensor group 150 that detects the operating state of the work vehicle 100, a control system 160, a communication device 190, a group of operation switches 210, a buzzer 220, and a drive device 240. These components are communicatively connected to each other via a bus.
- the GNSS unit 110 includes a GNSS receiver 111, an RTK receiver 112, an inertial measurement unit (IMU) 115, and a processing circuit 116.
- Sensor group 150 includes a steering wheel sensor 152, a turning angle sensor 154, and an axle sensor 156.
- Control system 160 includes a storage device 170 and a control device 180.
- Control device 180 includes a plurality of electronic control units (ECU) 181 to 185.
- the implement 300 includes a drive device 340, a control device 380, and a communication device 390. Note that FIG. 10 shows components that are relatively highly relevant to the automatic driving operation of the work vehicle 100, and illustration of other components is omitted.
- the GNSS receiver 111 in the GNSS unit 110 receives satellite signals transmitted from multiple GNSS satellites, and generates GNSS data based on the satellite signals.
- GNSS data is generated in a predetermined format, such as NMEA-0183 format.
- GNSS data may include, for example, values indicating the identification number, elevation, azimuth, and reception strength of each satellite from which the satellite signal was received.
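As an illustration of the NMEA-0183 format mentioned above, a minimal parser for the GGA sentence (which carries the position fix) might look like the following sketch. It ignores the checksum and most fields; error handling is omitted:

```python
def parse_gga(sentence):
    """Extract latitude/longitude in decimal degrees from a NMEA-0183 GGA
    sentence. Minimal sketch: checksum and most fields are ignored."""
    fields = sentence.split(',')
    if not fields[0].endswith('GGA'):
        raise ValueError('not a GGA sentence')

    def dm_to_deg(value, hemi):
        # NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm
        head = len(value.split('.')[0]) - 2
        deg = float(value[:head]) + float(value[head:]) / 60.0
        return -deg if hemi in ('S', 'W') else deg

    return dm_to_deg(fields[2], fields[3]), dm_to_deg(fields[4], fields[5])
```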
- the GNSS unit 110 shown in FIG. 10 performs positioning of the work vehicle 100 using RTK (Real Time Kinematic)-GNSS.
- FIG. 11 is a conceptual diagram showing an example of a work vehicle 100 that performs positioning using RTK-GNSS.
- a correction signal transmitted from the reference station 60 is used in positioning using RTK-GNSS.
- the reference station 60 may be installed near a field where the work vehicle 100 travels for work (for example, within 10 km from the work vehicle 100).
- the reference station 60 generates, for example, a correction signal in RTCM format based on the satellite signals received from the plurality of GNSS satellites 50, and transmits it to the GNSS unit 110.
- RTK receiver 112 includes an antenna and a modem, and receives the correction signal transmitted from reference station 60.
- the processing circuit 116 of the GNSS unit 110 corrects the positioning result by the GNSS receiver 111 based on the correction signal.
- Using RTK-GNSS, it is possible to perform positioning with an accuracy of a few centimeters, for example.
- Location information including latitude, longitude, and altitude information is obtained through highly accurate positioning using RTK-GNSS.
- GNSS unit 110 calculates the position of work vehicle 100 at a frequency of about 1 to 10 times per second, for example.
- the positioning method is not limited to RTK-GNSS, and any positioning method (interferometric positioning method, relative positioning method, etc.) that can obtain position information with the necessary accuracy can be used.
- positioning may be performed using VRS (Virtual Reference Station) or DGPS (Differential Global Positioning System). If positional information with the necessary accuracy can be obtained without using the correction signal transmitted from the reference station 60, the positional information may be generated without using the correction signal.
- GNSS unit 110 may not include RTK receiver 112.
- Even in that case, the position of work vehicle 100 can be estimated by matching data output from LiDAR sensor 140 and/or camera 120 with a high-precision environmental map.
- the GNSS unit 110 in this embodiment further includes an IMU 115.
- IMU 115 may include a 3-axis acceleration sensor and a 3-axis gyroscope.
- the IMU 115 may include an orientation sensor such as a 3-axis geomagnetic sensor.
- IMU 115 functions as a motion sensor and can output signals indicating various quantities such as acceleration, speed, displacement, and posture of work vehicle 100.
- Processing circuit 116 can estimate the position and orientation of work vehicle 100 with higher accuracy based on the signal output from IMU 115 in addition to the satellite signal and correction signal.
- the signal output from IMU 115 may be used to correct or supplement the position calculated based on the satellite signal and the correction signal.
- IMU 115 outputs signals more frequently than GNSS receiver 111.
- the processing circuit 116 can measure the position and orientation of the work vehicle 100 at a higher frequency (eg, 10 Hz or more).
- a 3-axis acceleration sensor and a 3-axis gyroscope may be provided separately.
- IMU 115 may be provided as a separate device from GNSS unit 110.
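Supplementing lower-rate GNSS fixes with higher-rate IMU-derived motion, as described above, can be sketched as a simple complementary update. The blending weight `alpha` and all names are illustrative assumptions; a real implementation would typically use a Kalman filter:

```python
def fuse(position, velocity, dt, gnss_fix=None, alpha=0.8):
    """One step of a simple complementary update: dead-reckon the (x, y)
    position with an IMU-derived velocity, then blend toward a GNSS fix
    when one arrives. `alpha` weights the GNSS fix (assumed value)."""
    x = position[0] + velocity[0] * dt
    y = position[1] + velocity[1] * dt
    if gnss_fix is not None:
        x = alpha * gnss_fix[0] + (1 - alpha) * x
        y = alpha * gnss_fix[1] + (1 - alpha) * y
    return (x, y)
```

Between GNSS fixes (for example, at the IMU's higher output rate), the function is called with `gnss_fix=None` and simply propagates the position.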
- the camera 120 is an imaging device that photographs the environment around the work vehicle 100.
- the camera 120 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- Camera 120 may also include an optical system including one or more lenses, and signal processing circuitry.
- Camera 120 photographs the environment around work vehicle 100 while work vehicle 100 is traveling, and generates image (for example, video) data.
- the camera 120 can shoot moving images at a frame rate of 3 frames per second (fps) or more, for example.
- the image generated by camera 120 can be used, for example, when a remote monitor uses terminal device 400 to check the environment around work vehicle 100. Images generated by camera 120 may be used for positioning or obstacle detection.
- the images generated by the camera 120 while the work vehicle 100 is traveling to collect the travel trajectory data described above can be used in the process of recognizing an oncoming vehicle and detecting an action taken to avoid it.
- a plurality of cameras 120 may be provided at different positions on the work vehicle 100, or a single camera may be provided.
- a visible camera that generates visible light images and an infrared camera that generates infrared images may be provided separately. Both a visible camera and an infrared camera may be provided as cameras that generate images for surveillance. Infrared cameras can also be used to detect obstacles at night.
- the obstacle sensor 130 detects objects existing around the work vehicle 100.
- Obstacle sensor 130 may include, for example, a laser scanner or an ultrasonic sonar. Obstacle sensor 130 outputs a signal indicating that an obstacle exists when an object exists closer than a predetermined distance from obstacle sensor 130.
- a plurality of obstacle sensors 130 may be provided at different positions of work vehicle 100. For example, multiple laser scanners and multiple ultrasonic sonars may be placed at different positions on work vehicle 100. By providing such a large number of obstacle sensors 130, blind spots in monitoring obstacles around the work vehicle 100 can be reduced.
- the steering wheel sensor 152 measures the rotation angle of the steering wheel of the work vehicle 100.
- the turning angle sensor 154 measures the turning angle of the front wheel 104F, which is a steered wheel. Measured values by the steering wheel sensor 152 and turning angle sensor 154 are used for steering control by the control device 180.
- the axle sensor 156 measures the rotational speed of the axle connected to the wheel 104, that is, the number of rotations per unit time.
- the axle sensor 156 may be a sensor using a magnetoresistive element (MR), a Hall element, or an electromagnetic pickup, for example.
- the axle sensor 156 outputs, for example, a numerical value indicating the number of revolutions per minute (unit: rpm) of the axle.
- Axle sensor 156 is used to measure the speed of work vehicle 100.
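Deriving the vehicle speed from the axle rotational speed measured by the axle sensor 156 is a direct circumference calculation. The no-slip assumption and the function name are illustrative:

```python
import math

def speed_from_rpm(rpm, wheel_diameter_m):
    """Vehicle speed in m/s from axle rotational speed in rpm,
    assuming the wheel rolls without slip."""
    return rpm / 60.0 * math.pi * wheel_diameter_m
```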
- the drive device 240 includes various devices necessary for running the work vehicle 100 and driving the implement 300, such as the above-mentioned prime mover 102, transmission device 103, steering device 106, and coupling device 108.
- Prime mover 102 may include, for example, an internal combustion engine such as a diesel engine.
- the drive device 240 may include an electric motor for traction instead of or in addition to the internal combustion engine.
- the buzzer 220 is an audio output device that emits a warning sound to notify of an abnormality. For example, the buzzer 220 emits a warning sound when an obstacle is detected during automatic driving. Buzzer 220 is controlled by control device 180.
- the storage device 170 includes one or more storage media such as flash memory or magnetic disks.
- the storage device 170 stores various data generated by the GNSS unit 110, camera 120, obstacle sensor 130, LiDAR sensor 140, sensor group 150, and control device 180.
- the data stored in the storage device 170 may include map data of the environment in which the work vehicle 100 travels (an environmental map) and data on an automatic travel route (target route) for automatic driving.
- the environmental map includes information on a plurality of fields where the work vehicle 100 performs agricultural work and roads around the fields.
- the environmental map and target route may be generated by a processor in management device 600.
- the control device 180 in this embodiment has a function of generating or editing an environmental map and a target route.
- Control device 180 can edit the environmental map and target route acquired from management device 600 according to the driving environment of work vehicle 100.
- the storage device 170 also stores computer programs that cause each ECU in the control device 180 to execute various operations described below.
- Such a computer program may be provided to work vehicle 100 via a storage medium (eg, semiconductor memory or optical disk, etc.) or a telecommunications line (eg, the Internet).
- Such computer programs may be sold as commercial software.
- the control device 180 includes multiple ECUs.
- the plurality of ECUs include, for example, an ECU 181 for speed control, an ECU 182 for steering control, an ECU 183 for implement control, an ECU 184 for automatic driving control, and an ECU 185 for route generation.
- ECU 181 controls the speed of work vehicle 100 by controlling prime mover 102, transmission 103, and brakes included in drive device 240.
- the ECU 182 controls the steering of the work vehicle 100 by controlling the hydraulic system or electric motor included in the steering device 106 based on the measured value of the steering wheel sensor 152.
- the ECU 183 controls the operations of the three-point link, PTO axis, etc. included in the coupling device 108 in order to cause the implement 300 to perform a desired operation. ECU 183 also generates a signal to control the operation of implement 300 and transmits the signal from communication device 190 to implement 300.
- the ECU 184 performs calculations and controls to realize automatic driving based on data output from the GNSS unit 110, camera 120, obstacle sensor 130, LiDAR sensor 140, and sensor group 150. For example, ECU 184 identifies the position of work vehicle 100 based on data output from at least one of GNSS unit 110, camera 120, and LiDAR sensor 140. In the field, ECU 184 may determine the position of work vehicle 100 based only on data output from GNSS unit 110. ECU 184 may estimate or correct the position of work vehicle 100 based on data acquired by camera 120 or LiDAR sensor 140. By using the data acquired by the camera 120 or the LiDAR sensor 140, the accuracy of positioning can be further improved.
- ECU 184 estimates the position of work vehicle 100 using data output from LiDAR sensor 140 or camera 120. For example, the ECU 184 may estimate the position of the work vehicle 100 by matching data output from the LiDAR sensor 140 or the camera 120 with an environmental map. During automatic driving, the ECU 184 performs calculations necessary for the work vehicle 100 to travel along the target route or the local route based on the estimated position of the work vehicle 100.
- the ECU 184 sends a speed change command to the ECU 181 and a steering angle change command to the ECU 182.
- ECU 181 changes the speed of work vehicle 100 by controlling prime mover 102, transmission 103, or brake in response to a speed change command.
- the ECU 182 changes the steering angle by controlling the steering device 106 in response to a command to change the steering angle.
- While the work vehicle 100 is traveling along the target route, the ECU 185 sequentially generates local routes that can avoid obstacles. The ECU 185 recognizes obstacles existing around the work vehicle 100 based on data output from the camera 120, the obstacle sensor 130, and the LiDAR sensor 140 while the work vehicle 100 is traveling, and generates a local route that avoids the recognized obstacles. The ECU 185 may have the function of generating the target route instead of the management device 600. In that case, the ECU 185 generates the target route based on data output from the GNSS unit 110, the camera 120, and/or the LiDAR sensor 140 while the work vehicle 100 is traveling for data collection. An example of a method for generating a target route is as described with reference to FIGS. 3 to 7D. Note that the target route may be generated not only by the management device 600 or the ECU 185 but also by another device such as the operation terminal 200 or the terminal device 400.
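Local route generation around a recognized obstacle can be sketched crudely as pushing nearby target-route points out to a clearance radius. This is a geometric toy, not the embodiment's method; a real planner would also respect vehicle kinematics, and all names here are assumptions:

```python
def local_route(target_points, obstacle, clearance=2.0):
    """Shift target-route (x, y) points that pass within `clearance` metres
    of an obstacle out to the clearance radius. Crude illustrative sketch."""
    out = []
    for (x, y) in target_points:
        d = ((x - obstacle[0]) ** 2 + (y - obstacle[1]) ** 2) ** 0.5
        if d < clearance:
            # push the point radially away from the obstacle
            scale = clearance / d if d > 0 else 1.0
            x = obstacle[0] + (x - obstacle[0]) * scale
            y = obstacle[1] + (y - obstacle[1]) * scale
        out.append((x, y))
    return out
```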
- The control device 180 realizes automatic operation by controlling the drive device 240 based on the measured or estimated position of the work vehicle 100 and the target route. Thereby, the control device 180 can cause the work vehicle 100 to travel along the target route.
- a plurality of ECUs included in the control device 180 can communicate with each other, for example, according to a vehicle bus standard such as CAN (Controller Area Network). Instead of CAN, a faster communication method such as in-vehicle Ethernet (registered trademark) may be used.
- In FIG. 10, each of the ECUs 181 to 185 is shown as an individual block, but the functions of each may be realized by a plurality of ECUs.
- An on-vehicle computer that integrates at least some of the functions of the ECUs 181 to 185 may be provided.
- the control device 180 may include ECUs other than the ECUs 181 to 185, and any number of ECUs may be provided depending on the function.
- Each ECU includes processing circuitry including one or more processors.
- the communication device 190 is a device that includes a circuit that communicates with the implement 300, the terminal device 400, and the management device 600.
- the communication device 190 includes a circuit that transmits and receives signals compliant with the ISOBUS standard, such as ISOBUS-TIM, to and from the communication device 390 of the implement 300. Thereby, it is possible to cause the implement 300 to perform a desired operation or to obtain information from the implement 300.
- Communication device 190 may further include an antenna and a communication circuit for transmitting and receiving signals via network 80 with respective communication devices of terminal device 400 and management device 600.
- Network 80 may include, for example, a cellular mobile communications network such as 3G, 4G or 5G and the Internet.
- The communication device 190 may have a function of communicating with a mobile terminal used by a supervisor near the work vehicle 100. Communication with such a mobile terminal may be conducted in accordance with any wireless communication standard, such as Wi-Fi (registered trademark), cellular mobile communication such as 3G, 4G, or 5G, or Bluetooth (registered trademark).
- the operation terminal 200 is a terminal for a user to perform operations related to the traveling of the work vehicle 100 and the operation of the implement 300, and is also referred to as a virtual terminal (VT).
- Operation terminal 200 may include a display device such as a touch screen and/or one or more buttons.
- the display device may be a display such as a liquid crystal or an organic light emitting diode (OLED), for example.
- the operating terminal 200 may be configured to be detachable from the work vehicle 100. A user located away from work vehicle 100 may operate detached operation terminal 200 to control the operation of work vehicle 100. Instead of the operation terminal 200, the user may control the operation of the work vehicle 100 by operating a computer, such as the terminal device 400, in which necessary application software is installed.
- FIG. 12 is a diagram showing an example of the operation terminal 200 and the operation switch group 210 provided inside the cabin 105.
- a switch group 210 including a plurality of switches that can be operated by a user is arranged inside the cabin 105.
- The operation switch group 210 may include, for example, a switch for selecting the main gear or the sub-gear, a switch for switching between the automatic operation mode and the manual operation mode, a switch for switching between forward and reverse, and a switch for raising and lowering the implement 300. Note that if the work vehicle 100 performs only unmanned operation and does not have a manned operation function, it does not need to include the operation switch group 210.
- the drive device 340 in the implement 300 shown in FIG. 10 performs operations necessary for the implement 300 to perform a predetermined work.
- the drive device 340 includes a device depending on the use of the implement 300, such as a hydraulic device, an electric motor, or a pump, for example.
- Control device 380 controls the operation of drive device 340.
- Control device 380 causes drive device 340 to perform various operations in response to signals transmitted from work vehicle 100 via communication device 390. Further, a signal depending on the state of the implement 300 can be transmitted from the communication device 390 to the work vehicle 100.
- FIG. 13 is a block diagram illustrating a schematic hardware configuration of the management device 600 and the terminal device 400.
- the management device 600 includes a storage device 650, a processor 660, a ROM (Read Only Memory) 670, a RAM (Random Access Memory) 680, and a communication device 690. These components are communicatively connected to each other via a bus.
- the management device 600 can function as a cloud server that manages the schedule of agricultural work in the field performed by the work vehicle 100 and supports agriculture by utilizing the managed data.
- a user can input information necessary for creating a work plan using the terminal device 400 and upload the information to the management device 600 via the network 80.
- the management device 600 can create an agricultural work schedule, that is, a work plan, based on the information.
- the management device 600 can also generate or edit an environmental map and generate an automatic travel route for the work vehicle 100. The environmental map may be distributed from a computer external to the management device 600.
- the communication device 690 is a communication module for communicating with the work vehicle 100 and the terminal device 400 via the network 80.
- the communication device 690 can perform wired communication based on a communication standard such as IEEE1394 (registered trademark) or Ethernet (registered trademark), for example.
- the communication device 690 may perform wireless communication based on the Bluetooth (registered trademark) standard or the Wi-Fi standard, or cellular mobile communication such as 3G, 4G, or 5G.
- the processor 660 may be, for example, a semiconductor integrated circuit including a central processing unit (CPU).
- Processor 660 may be implemented by a microprocessor or microcontroller.
- The processor 660 may also be realized by an FPGA (Field Programmable Gate Array) equipped with a CPU, a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), an ASSP (Application Specific Standard Product), or a combination of two or more of these circuits.
- the processor 660 sequentially executes a computer program stored in the ROM 670 that describes a group of instructions for executing at least one process, thereby realizing a desired process.
- The ROM 670 is, for example, a writable memory (e.g., PROM), a rewritable memory (e.g., flash memory), or a read-only memory.
- ROM 670 stores a program that controls the operation of processor 660.
- the ROM 670 does not need to be a single storage medium, and may be a collection of multiple storage media. A portion of the collection of storage media may be removable memory.
- The RAM 680 provides a work area into which the control program stored in the ROM 670 is temporarily loaded at boot time.
- RAM 680 does not need to be a single storage medium, and may be a collection of multiple storage media.
- the storage device 650 mainly functions as database storage.
- Storage device 650 may be, for example, a magnetic storage device or a semiconductor storage device.
- An example of a magnetic storage device is a hard disk drive (HDD).
- An example of a semiconductor storage device is a solid state drive (SSD).
- Storage device 650 may be a device independent of management device 600.
- the storage device 650 may be a storage device connected to the management device 600 via the network 80, such as a cloud storage.
- the terminal device 400 includes an input device 420, a display device 430, a storage device 450, a processor 460, a ROM 470, a RAM 480, and a communication device 490. These components are communicatively connected to each other via a bus.
- the input device 420 is a device for converting instructions from a user into data and inputting the data into the computer.
- Input device 420 may be, for example, a keyboard, mouse, or touch panel.
- The display device 430 may be, for example, a liquid crystal display or an organic EL display. The processor 460, ROM 470, RAM 480, storage device 450, and communication device 490 are as described in the hardware configuration example of the management device 600, and their descriptions are omitted.
- the work vehicle 100 in this embodiment can automatically travel both inside and outside the field.
- the work vehicle 100 drives the implement 300 and performs predetermined agricultural work while traveling along a preset target route.
- When the work vehicle 100 detects an obstacle with the obstacle sensor 130 while traveling in the field, it stops traveling and performs actions such as emitting a warning sound from the buzzer 220 and transmitting a warning signal to the terminal device 400.
- positioning of the work vehicle 100 is performed mainly based on data output from the GNSS unit 110.
- the work vehicle 100 automatically travels along a target route set on a farm road or a general road outside the field.
- the work vehicle 100 travels outside the field while detecting obstacles based on data acquired by the camera 120 or the LiDAR sensor 140.
- When an obstacle is detected outside the field, the work vehicle 100 either avoids the obstacle or stops on the spot.
- the position of work vehicle 100 is estimated based on data output from LiDAR sensor 140 or camera 120 in addition to positioning data output from GNSS unit 110.
- FIG. 14 is a diagram schematically showing an example of a work vehicle 100 that automatically travels along a target route in a field.
- the farm field includes a work area 72 where the work vehicle 100 performs work using the implement 300, and a headland 74 located near the outer periphery of the farm field. Which area of the field on the map corresponds to the work area 72 or the headland 74 can be set in advance by the user.
- the target route in this example includes a plurality of parallel main routes P1 and a plurality of turning routes P2 connecting the plurality of main routes P1.
- the main route P1 is located within the work area 72, and the turning route P2 is located within the headland 74.
- each main route P1 may include a curved portion.
- the broken line in FIG. 14 represents the working width of the implement 300.
- the working width is set in advance and recorded in the storage device 170.
- the working width can be set and recorded by the user operating the operating terminal 200 or the terminal device 400. Alternatively, the working width may be automatically recognized and recorded when the implement 300 is connected to the work vehicle 100.
- the intervals between the plurality of main routes P1 may be set according to the working width.
- the target route may be created based on a user's operation before automatic driving is started.
- the target route may be created to cover the entire working area 72 in the field, for example.
- the work vehicle 100 automatically travels along a target route as shown in FIG. 14 from a work start point to a work end point while repeating back and forth movements. Note that the target route shown in FIG. 14 is only an example, and the target route can be determined in any way.
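A back-and-forth target route of the kind shown in FIG. 14, with parallel main routes P1 spaced by the working width and connected by turning routes P2 in the headland, can be sketched for a rectangular work area. The rectangular field, the segment representation, and the function name are illustrative assumptions; as noted above, main routes may also include curved portions:

```python
def generate_field_route(x_min, x_max, y_min, y_max, working_width):
    """Boustrophedon target route: parallel main routes P1 spaced by the
    working width, connected by turning routes P2. Each segment is a
    (kind, start, end) tuple; this layout is an illustrative sketch.
    """
    route = []
    x = x_min
    going_up = True
    while x <= x_max:
        start_y, end_y = (y_min, y_max) if going_up else (y_max, y_min)
        route.append(("main", (x, start_y), (x, end_y)))  # main route P1
        if x + working_width <= x_max:
            # turning route P2 connecting adjacent main routes in the headland
            route.append(("turn", (x, end_y), (x + working_width, end_y)))
        x += working_width
        going_up = not going_up
    return route
```

For a 6 m wide work area and a 2 m working width this yields four main routes joined by three turns, so the implement's passes tile the work area without gaps.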
- Next, an example of control during automatic operation by the control device 180 will be described.
- FIG. 15 is a flowchart illustrating an example of the operation of steering control during automatic driving executed by the control device 180.
- The control device 180 performs automatic steering by executing the operations of steps S121 to S125 shown in FIG. 15 while the work vehicle 100 is traveling. The speed may, for example, be maintained at a preset value or adjusted depending on the situation.
- the control device 180 acquires data indicating the position of the work vehicle 100 generated by the GNSS unit 110 (step S121).
- control device 180 calculates the deviation between the position of work vehicle 100 and the target route (step S122). The deviation represents the distance between the position of work vehicle 100 at that point and the target route.
- The control device 180 determines whether the calculated positional deviation exceeds a preset threshold (step S123). If the deviation exceeds the threshold, the control device 180 changes the steering angle by changing the control parameters of the steering device included in the drive device 240 so that the deviation becomes smaller (step S124). If the deviation does not exceed the threshold in step S123, the operation of step S124 is omitted. In the subsequent step S125, the control device 180 determines whether it has received an instruction to end the operation.
- the instruction to end the operation may be issued, for example, when a user remotely instructs to stop automatic driving or when work vehicle 100 reaches a destination. If the instruction to end the operation has not been issued, the process returns to step S121, and a similar operation is executed based on the newly measured position of the work vehicle 100.
- the control device 180 repeats the operations from steps S121 to S125 until an instruction to end the operation is issued. The above operations are executed by the ECUs 182 and 184 in the control device 180.
- In the above example, the control device 180 controls the drive device 240 based only on the deviation between the position of the work vehicle 100 identified by the GNSS unit 110 and the target route, but the deviation in heading may also be taken into account. For example, when the azimuth deviation, which is the angular difference between the heading of the work vehicle 100 identified by the GNSS unit 110 and the direction of the target route, exceeds a preset threshold, the control device 180 may change the control parameters (e.g., the steering angle) of the steering device of the drive device 240 according to that deviation.
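The loop of steps S121 to S125 can be sketched as follows, assuming for simplicity a target route parallel to the Y-axis as in FIGS. 16A to 16D. The proportional correction, the gain value, and the callback interface are illustrative assumptions rather than the patent's actual control law:

```python
def lateral_deviation(position, route_x):
    """Signed lateral deviation of the vehicle from a target route that is
    parallel to the Y-axis at x = route_x (the situation of FIGS. 16A-16D)."""
    return position[0] - route_x

def steering_control_loop(measure_pose, set_steering, route_x,
                          threshold=0.1, gain=0.5,
                          stop_requested=lambda: False):
    """Steering loop corresponding to steps S121-S125.

    measure_pose:   returns the current pose (x, y, theta), e.g. from the
                    GNSS unit 110 (step S121)
    set_steering:   applies a steering-angle correction (step S124)
    stop_requested: returns True once an end-of-operation instruction
                    has been issued (step S125)
    """
    while not stop_requested():                    # S125: end instruction?
        x, y, theta = measure_pose()               # S121: acquire position
        dx = lateral_deviation((x, y), route_x)    # S122: deviation from route P
        if abs(dx) > threshold:                    # S123: compare with threshold
            set_steering(-gain * dx)               # S124: steer to reduce deviation
```

In this sketch a deviation below the threshold produces no steering command, mirroring the omission of step S124 in the flowchart.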
- FIG. 16A is a diagram showing an example of the work vehicle 100 traveling along the target route P.
- FIG. 16B is a diagram showing an example of work vehicle 100 in a position shifted to the right from target route P.
- FIG. 16C is a diagram showing an example of work vehicle 100 in a position shifted to the left from target route P.
- FIG. 16D is a diagram showing an example of work vehicle 100 facing in a direction inclined with respect to target route P.
- The pose indicating the position and orientation of the work vehicle 100 measured by the GNSS unit 110 is expressed as r(x, y, θ).
- (x, y) are coordinates representing the position of the reference point of work vehicle 100 in the XY coordinate system, which is a two-dimensional coordinate system fixed to the earth.
- In this example, the reference point of the work vehicle 100 is located where the GNSS antenna is installed on the cabin, but the location of the reference point is arbitrary.
- θ is an angle representing the measured heading of the work vehicle 100.
- the target route P is parallel to the Y-axis, but generally the target route P is not necessarily parallel to the Y-axis.
- When the work vehicle 100 is on the target route P and faces in the direction of the route, as shown in FIG. 16A, the control device 180 maintains the steering angle and speed of the work vehicle 100 unchanged.
- When the position of the work vehicle 100 is shifted to the right from the target route P, as shown in FIG. 16B, the control device 180 changes the steering angle so that the traveling direction of the work vehicle 100 leans to the left and the vehicle approaches the route P. At this time, the speed may be changed in addition to the steering angle.
- the magnitude of the steering angle can be adjusted, for example, depending on the magnitude of the positional deviation ⁇ x.
- When the position of the work vehicle 100 is shifted to the left from the target route P, as shown in FIG. 16C, the control device 180 changes the steering angle so that the traveling direction of the work vehicle 100 leans to the right and the vehicle approaches the route P. In this case as well, the speed may be changed in addition to the steering angle. The amount of change in the steering angle can be adjusted, for example, depending on the magnitude of the positional deviation Δx.
- When the work vehicle 100 faces in a direction inclined with respect to the target route P, as shown in FIG. 16D, the control device 180 changes the steering angle so that the azimuth deviation Δθ becomes smaller.
- the speed may be changed in addition to the steering angle.
- The magnitude of the steering angle change can be adjusted, for example, depending on the magnitudes of the positional deviation Δx and the azimuth deviation Δθ. For example, the smaller the absolute value of the positional deviation Δx, the larger the amount of change in the steering angle according to the azimuth deviation Δθ may be made.
- When the absolute value of the positional deviation Δx is large, the steering angle is changed significantly in order to return to the route P, so the absolute value of the azimuth deviation Δθ inevitably becomes large. Conversely, when the absolute value of the positional deviation Δx is small, the azimuth deviation Δθ needs to be brought close to zero. Therefore, it is appropriate to relatively increase the weight (i.e., the control gain) of the azimuth deviation Δθ in determining the steering angle.
- a control technique such as PID control or MPC control (model predictive control) may be applied to the steering control and speed control of the work vehicle 100. By applying these control techniques, it is possible to smoothly control the work vehicle 100 to approach the target route P.
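The gain weighting described above, where the heading term is weighted more heavily as the positional deviation shrinks, might be sketched as a simple blended control law. The specific blending function and the gain values are illustrative assumptions, not the patent's actual controller (which may instead use PID or MPC as noted):

```python
def steering_command(dx, dtheta, k_pos=0.4, k_head=1.0, x_scale=1.0):
    """Steering-angle command combining positional deviation dx (Δx) and
    azimuth deviation dtheta (Δθ). Following the text, the weight of the
    heading term grows as |Δx| shrinks; this particular weighting
    function is an illustrative assumption.
    """
    # Heading weight: approaches 1 as |dx| -> 0, shrinks as |dx| grows.
    w_head = 1.0 / (1.0 + abs(dx) / x_scale)
    return -k_pos * dx - k_head * w_head * dtheta
```

With this sketch, the same azimuth deviation produces a larger correction when the vehicle is already near the route than when it is far from it.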
- When an obstacle is detected by the obstacle sensor 130 during automatic travel, the control device 180 stops the work vehicle 100. At this time, the buzzer 220 may emit a warning sound, or a warning signal may be transmitted to the terminal device 400. If the obstacle can be avoided, the control device 180 may control the drive device 240 to avoid the obstacle.
- the work vehicle 100 in this embodiment can automatically travel not only inside the field but also outside the field. Outside the field, the control device 180 performs steering control and speed control along the target route (automatic travel route) generated by the method described above.
- The control device 180 can detect objects (for example, other vehicles or pedestrians) located relatively far from the work vehicle 100 based on data output from the camera 120 or the LiDAR sensor 140.
- the control device 180 generates a local route to avoid the detected object, and performs speed control and steering control along the local route, thereby realizing automatic driving on roads outside the field.
- the storage device 170 records an environmental map of an area including a plurality of farm fields and roads around them, and a target route.
- the work vehicle 100 moves along the target route while sensing the surroundings using sensing devices such as the camera 120 and the LiDAR sensor 140 with the implement 300 raised.
- control device 180 sequentially generates local routes and causes work vehicle 100 to travel along the local routes. This allows the vehicle to travel automatically while avoiding obstacles.
- the target route may be changed depending on the situation.
- In the embodiments described above, the actual travel route recorded when the work vehicle 100 is driven manually can be used to generate a route for automatic driving.
- The automatic driving route is generated after excluding the portions of the actual travel route where the work vehicle 100 performed an action to avoid an oncoming vehicle. This prevents inappropriate route segments associated with avoidance operations from being included in the automatic driving route, so that an appropriate automatic driving route can be generated.
- the work vehicle 100 can appropriately execute automatic driving on, for example, roads around a farm field.
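The exclusion of avoidance-related portions can be sketched as filtering time-stamped trajectory points against the periods during which an oncoming vehicle was recognized in the camera images (as one of the described embodiments does). The time-stamped point format and the interval representation are illustrative assumptions:

```python
def remove_avoidance_segments(trajectory, recognition_intervals):
    """Remove from a recorded travel trajectory the points that fall
    within periods when an oncoming vehicle was recognized.

    trajectory:            list of (t, x, y) points, e.g. sequentially
                           output by a GNSS receiver on the vehicle
    recognition_intervals: list of (t_start, t_end) periods during which
                           an oncoming vehicle was recognized
    """
    def in_interval(t):
        return any(t0 <= t <= t1 for t0, t1 in recognition_intervals)
    return [p for p in trajectory if not in_interval(p[0])]
```

The remaining gap in the trajectory can then be complemented, for example with a straight segment, before the automatic travel route is generated.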
- The systems that perform automatic driving route generation or automatic driving control in each of the above embodiments can also be retrofitted to agricultural machines that lack these functions.
- Such systems can be manufactured and sold independently of agricultural machinery.
- the computer programs used in such systems may also be manufactured and sold independently of the agricultural machinery.
- The computer program may be provided, for example, stored on a computer-readable non-transitory storage medium. Computer programs may also be provided by download via telecommunication lines (e.g., the Internet).
- the present disclosure includes the route generation system and route generation method described in the following items.
- A route generation system for automatic travel of an agricultural machine, comprising a processing device that generates an automatic travel route for the agricultural machine,
- The processing device:
obtains data indicating the travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along the route on which the agricultural machine is scheduled to travel automatically;
removes, from the travel trajectory, a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and
generates an automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
Route generation system.
- The processing device:
obtains data of a moving image captured by a camera mounted on the vehicle while the vehicle is traveling; and
detects the avoidance operation based on the moving image, and determines and removes the trajectory related to the avoidance operation from the travel trajectory.
The route generation system described in item 1.
- The processing device:
recognizes an oncoming vehicle approaching the vehicle from the moving image; and
removes, from the travel trajectory, a trajectory corresponding to at least a part of the period from when the oncoming vehicle is recognized until the oncoming vehicle is no longer recognized, as the trajectory related to the avoidance operation.
The route generation system described in item 2.
- The processing device causes a display device to display the travel trajectory from which the trajectory related to the avoidance operation has been removed, and
complements the portion removed from the travel trajectory in response to an operation performed by a user to determine a complementary route;
- the route generation system according to item 8.
- A route generation method for automatic travel of an agricultural machine, comprising:
obtaining data indicating the travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along the route on which the agricultural machine is scheduled to travel automatically;
removing, from the travel trajectory, a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and
generating an automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed;
- a route generation method including.
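One of the described embodiments complements the portion removed from the travel trajectory with a straight complementary route. A minimal sketch of such linear complementation between the last kept point before the gap and the first kept point after it follows; the number of interpolated points is an illustrative assumption:

```python
def linear_complement(before, after, n_points=5):
    """Fill the removed portion of a travel trajectory with a straight
    complementary route between point `before` (last kept point before
    the gap) and point `after` (first kept point after the gap).
    Returns the interpolated intermediate points, excluding endpoints.
    """
    (x0, y0), (x1, y1) = before, after
    return [(x0 + (x1 - x0) * i / (n_points + 1),
             y0 + (y1 - y0) * i / (n_points + 1))
            for i in range(1, n_points + 1)]
```

As also described, a user could instead determine the complementary route interactively on a display showing the trajectory with the avoidance segment removed.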
- The technology of the present disclosure can be applied to systems that generate automatic travel routes for agricultural machines such as tractors, harvesters, rice transplanters, riding management machines, vegetable transplanters, lawn mowers, seeding machines, fertilizer application machines, and agricultural robots.
Abstract
Description
In the present disclosure, an "agricultural machine" means a machine used for agricultural purposes. Examples of agricultural machines include tractors, harvesters, rice transplanters, riding management machines, vegetable transplanters, lawn mowers, seeding machines, fertilizer application machines, and agricultural mobile robots. Not only may a work vehicle such as a tractor function as an "agricultural machine" on its own, but the combination of a work vehicle and an implement attached to or towed by it may also function as a single "agricultural machine". Agricultural machines perform agricultural work such as tilling, seeding, pest control, fertilization, planting of crops, or harvesting on the ground in a field. Such agricultural work is sometimes referred to as "ground work" or simply "work". Traveling performed by a vehicle-type agricultural machine while carrying out agricultural work is sometimes referred to as "work travel".
Embodiments of the present disclosure will be described below. However, unnecessarily detailed descriptions may be omitted; for example, detailed descriptions of well-known matters and redundant descriptions of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily lengthy and to facilitate understanding by those skilled in the art. The inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and do not intend to limit the subject matter recited in the claims by them. In the following description, components having identical or similar functions are given identical reference signs.
FIG. 9 is a side view schematically showing an example of the work vehicle 100 and an implement 300 coupled to the work vehicle 100. The work vehicle 100 in this embodiment can operate in both a manual driving mode and an automatic driving mode. In the automatic driving mode, the work vehicle 100 can travel unmanned. The work vehicle 100 is capable of automatic driving both inside and outside a field.
Next, an example of automatic travel operation by the work vehicle 100 will be described. The work vehicle 100 in this embodiment can travel automatically both inside and outside a field. In the field, the work vehicle 100 drives the implement 300 and performs predetermined agricultural work while traveling along a preset target route. When an obstacle is detected by the obstacle sensor 130 while traveling in the field, the work vehicle 100 stops traveling and performs actions such as emitting a warning sound from the buzzer 220 and transmitting a warning signal to the terminal device 400. In the field, the positioning of the work vehicle 100 is performed mainly based on data output from the GNSS unit 110. Outside the field, the work vehicle 100 travels automatically along a target route set on a farm road or a general road, while detecting obstacles based on data acquired by the camera 120 or the LiDAR sensor 140. When an obstacle is detected outside the field, the work vehicle 100 either avoids the obstacle or stops on the spot. Outside the field, the position of the work vehicle 100 is estimated based on data output from the LiDAR sensor 140 or the camera 120 in addition to positioning data output from the GNSS unit 110.
(Item 1) A route generation system for automatic travel of an agricultural machine, comprising:
a processing device that generates an automatic travel route for the agricultural machine,
wherein the processing device:
obtains data indicating a travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along a route on which the agricultural machine is scheduled to travel automatically;
removes, from the travel trajectory, a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and
generates the automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
(Item 2) The route generation system according to item 1, wherein the processing device:
obtains data of a moving image captured by a camera mounted on the vehicle while the vehicle is traveling; and
detects the avoidance operation based on the moving image, and determines and removes the trajectory related to the avoidance operation from the travel trajectory.
(Item 3) The route generation system according to item 2, wherein the processing device:
recognizes an oncoming vehicle approaching the vehicle from the moving image; and
removes, from the travel trajectory, a trajectory corresponding to at least a part of the period from when the oncoming vehicle is recognized until the oncoming vehicle is no longer recognized, as the trajectory related to the avoidance operation.
(Item 4) The route generation system according to item 1, wherein the processing device detects the avoidance operation based on a temporal change in the position of the vehicle indicated by the travel trajectory, and determines and removes the trajectory related to the avoidance operation from the travel trajectory.
(Item 5) The route generation system according to any one of items 1 to 4, wherein the processing device detects, as the avoidance operation, at least one of reversing, changing direction, accelerating, and decelerating performed by the vehicle to avoid an oncoming vehicle.
(Item 6) The route generation system according to any one of items 1 to 5, wherein the processing device obtains position data sequentially output from a GNSS receiver mounted on the vehicle as the data indicating the travel trajectory.
(Item 7) The route generation system according to any one of items 1 to 6, wherein the processing device generates, as the automatic travel route, a route defined by a plurality of waypoints each including position and speed information.
(Item 8) The route generation system according to any one of items 1 to 7, wherein the processing device generates the automatic travel route by performing a process of complementing the portion removed from the travel trajectory.
(Item 9) The route generation system according to item 8, wherein the processing device generates the automatic travel route by complementing the portion removed from the travel trajectory with a straight complementary route.
(Item 10) The route generation system according to item 8, wherein the processing device causes a display device to display the travel trajectory from which the trajectory related to the avoidance operation has been removed, and complements the portion removed from the travel trajectory in response to an operation performed by a user to determine a complementary route.
(Item 11) The route generation system according to any one of items 1 to 10, wherein the processing device executes the process of generating the automatic travel route when generating a route for the agricultural machine to travel automatically outside a field.
(Item 12) A route generation method for automatic travel of an agricultural machine, comprising:
obtaining data indicating a travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along a route on which the agricultural machine is scheduled to travel automatically;
removing, from the travel trajectory, a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and
generating an automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
Claims (12)
- 1. A route generation system for automatic travel of an agricultural machine, comprising:
a processing device that generates an automatic travel route for the agricultural machine,
wherein the processing device:
obtains data indicating a travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along a route on which the agricultural machine is scheduled to travel automatically;
removes, from the travel trajectory, a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and
generates the automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
- 2. The route generation system according to claim 1, wherein the processing device:
obtains data of a moving image captured by a camera mounted on the vehicle while the vehicle is traveling; and
detects the avoidance operation based on the moving image, and determines and removes the trajectory related to the avoidance operation from the travel trajectory.
- 3. The route generation system according to claim 2, wherein the processing device:
recognizes an oncoming vehicle approaching the vehicle from the moving image; and
removes, from the travel trajectory, a trajectory corresponding to at least a part of the period from when the oncoming vehicle is recognized until the oncoming vehicle is no longer recognized, as the trajectory related to the avoidance operation.
- 4. The route generation system according to claim 1, wherein the processing device detects the avoidance operation based on a temporal change in the position of the vehicle indicated by the travel trajectory, and determines and removes the trajectory related to the avoidance operation from the travel trajectory.
- 5. The route generation system according to any one of claims 1 to 4, wherein the processing device detects, as the avoidance operation, at least one of reversing, changing direction, accelerating, and decelerating performed by the vehicle to avoid an oncoming vehicle.
- 6. The route generation system according to any one of claims 1 to 4, wherein the processing device obtains position data sequentially output from a GNSS receiver mounted on the vehicle as the data indicating the travel trajectory.
- 7. The route generation system according to any one of claims 1 to 4, wherein the processing device generates, as the automatic travel route, a route defined by a plurality of waypoints each including position and speed information.
- 8. The route generation system according to any one of claims 1 to 4, wherein the processing device generates the automatic travel route by performing a process of complementing the portion removed from the travel trajectory.
- 9. The route generation system according to claim 8, wherein the processing device generates the automatic travel route by complementing the portion removed from the travel trajectory with a straight complementary route.
- 10. The route generation system according to claim 8, wherein the processing device causes a display device to display the travel trajectory from which the trajectory related to the avoidance operation has been removed, and complements the portion removed from the travel trajectory in response to an operation performed by a user to determine a complementary route.
- 11. The route generation system according to any one of claims 1 to 4, wherein the processing device executes the process of generating the automatic travel route when generating a route for the agricultural machine to travel automatically outside a field.
- 12. A route generation method for automatic travel of an agricultural machine, comprising:
obtaining data indicating a travel trajectory from a vehicle that manually travels, while recording the travel trajectory, along a route on which the agricultural machine is scheduled to travel automatically;
removing, from the travel trajectory, a trajectory related to an avoidance operation performed to avoid an oncoming vehicle; and
generating an automatic travel route for the agricultural machine based on the travel trajectory from which the trajectory related to the avoidance operation has been removed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024526383A JPWO2023238724A1 (ja) | 2022-06-08 | 2023-05-29 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022093143 | 2022-06-08 | ||
JP2022-093143 | 2022-06-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023238724A1 true WO2023238724A1 (ja) | 2023-12-14 |
Family
ID=89118252
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/019921 WO2023238724A1 (ja) | 2022-06-08 | 2023-05-29 | 農業機械の自動走行のための経路生成システムおよび経路生成方法 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023238724A1 (ja) |
WO (1) | WO2023238724A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017055673A (ja) * | 2015-09-14 | 2017-03-23 | 株式会社クボタ | 作業車支援システム |
JP2021029218A (ja) | 2019-08-29 | 2021-03-01 | 井関農機株式会社 | 農道経路情報格納システムおよび作業車両 |
JP2021073602A (ja) | 2021-01-28 | 2021-05-13 | ヤンマーパワーテクノロジー株式会社 | 自動走行システム及び状況報知装置 |
JP2022032803A (ja) * | 2020-08-14 | 2022-02-25 | 井関農機株式会社 | 作業車両の制御システム |
JP2023079152A (ja) * | 2021-11-26 | 2023-06-07 | ヤンマーホールディングス株式会社 | 経路生成方法、経路生成システム、及び経路生成プログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023238724A1 (ja) | 2023-12-14 |
Legal Events

Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23819704; Country of ref document: EP; Kind code of ref document: A1
WWE | Wipo information: entry into national phase | Ref document number: 2024526383; Country of ref document: JP
WWE | Wipo information: entry into national phase | Ref document number: 2401008070; Country of ref document: TH
WWE | Wipo information: entry into national phase | Ref document number: 202417097814; Country of ref document: IN
WWE | Wipo information: entry into national phase | Ref document number: 2023819704; Country of ref document: EP
NENP | Non-entry into the national phase | Ref country code: DE
ENP | Entry into the national phase | Ref document number: 2023819704; Country of ref document: EP; Effective date: 20250108