WO2023127557A1 - Agricultural machine, sensing system used for agricultural machine, and sensing method
- Publication number
- WO2023127557A1 (PCT/JP2022/046459)
- Authority
- WO
- WIPO (PCT)
Classifications
- G05D1/024: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means (obstacle or wall sensors in combination with a laser)
- G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
- A01B69/001: Steering of agricultural machines or implements by means of optical assistance, e.g. television cameras
- A01B69/008: Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow, automatic
- A01B76/00: Parts, details or accessories of agricultural machines or implements, not provided for in groups A01B51/00 - A01B75/00
- G01S17/89: Lidar systems specially adapted for mapping or imaging
- G01S7/4817: Constructional features, e.g. arrangements of optical elements, relating to scanning
- G05D1/02: Control of position or course in two dimensions
Definitions
- The present disclosure relates to agricultural machines, and to sensing systems and sensing methods used for agricultural machines.
- ICT: Information and Communication Technology
- IoT: Internet of Things
- GNSS: Global Navigation Satellite System
- Patent Literature 1 discloses a technology for detecting obstacles around a tractor capable of automatic operation using a LiDAR (Light Detection and Ranging) sensor.
- The present disclosure provides a technique for searching the environment around an agricultural machine in a manner suited to the area in which the agricultural machine is located.
- A sensing system according to one embodiment of the present disclosure is a sensing system for a mobile agricultural machine. It comprises one or more sensors that are provided on the agricultural machine, sense the environment around the agricultural machine, and output sensing data, and a processing device that detects an object located in a search area around the agricultural machine based on the sensing data. The processing device changes the pattern of the search area used for detecting the object according to the area in which the agricultural machine is located.
- A sensing method according to one embodiment of the present disclosure is a sensing method for a mobile agricultural machine. It includes sensing the environment around the agricultural machine using one or more sensors and outputting sensing data, detecting an object located in a search area around the agricultural machine based on the sensing data, and changing the pattern of the search area used for detecting the object according to the area in which the agricultural machine is located.
- A computer-readable storage medium may include both volatile and non-volatile storage media.
- A device may consist of a plurality of devices. When a device is composed of two or more devices, those devices may be arranged within a single piece of equipment, or divided and arranged across two or more separate pieces of equipment.
- The pattern of the search area used for object detection is changed according to the area in which the agricultural machine is located. This allows a search suited to that area to be performed.
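The disclosure does not specify an implementation, but the idea of switching the search-area pattern by area can be sketched as follows. Everything here is an illustrative assumption: the fan-shaped area model, the pattern names `FIELD_PATTERN` and `ROAD_PATTERN`, and all numeric values are invented for the sketch, not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class SearchArea:
    """A fan-shaped search area ahead of the machine (hypothetical model)."""
    radius_m: float        # how far ahead to search
    half_angle_deg: float  # angular half-width of the fan

# Hypothetical patterns: a wide, short area inside a field (slow travel,
# crops nearby) and a narrow, long area on a road (faster travel).
FIELD_PATTERN = SearchArea(radius_m=5.0, half_angle_deg=60.0)
ROAD_PATTERN = SearchArea(radius_m=15.0, half_angle_deg=25.0)

def select_search_area(area_type: str) -> SearchArea:
    """Change the search-area pattern according to where the machine is."""
    if area_type == "field":
        return FIELD_PATTERN
    if area_type == "road":
        return ROAD_PATTERN
    raise ValueError(f"unknown area type: {area_type}")

def contains(area: SearchArea, x: float, y: float) -> bool:
    """True if a point (x, y), in machine-local coordinates with +x ahead,
    falls inside the fan-shaped search area."""
    if math.hypot(x, y) > area.radius_m:
        return False
    return abs(math.degrees(math.atan2(y, x))) <= area.half_angle_deg
```

A sensed point would then be treated as a detection only if `contains(...)` is true for the currently selected pattern, so the same sensing data yields different detection behavior in a field than on a road.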
- FIG. 1 is a diagram for explaining an overview of an agricultural management system according to an exemplary embodiment of the present disclosure
- FIG. 1 is a side view schematically showing an example of a working vehicle and a working machine connected to the working vehicle
- FIG. 2 is a block diagram showing a configuration example of a working vehicle and a working machine
- FIG. 1 is a conceptual diagram showing an example of a work vehicle that performs positioning by RTK-GNSS
- FIG. 3 is a diagram showing an example of an operation terminal and an operation switch group provided inside a cabin;
- A block diagram illustrating the hardware configuration of a management device and a terminal device;
- A diagram schematically showing an example of a work vehicle that automatically travels along a target route in a field;
- A flowchart showing an example of the steering control operation during automatic driving;
- A diagram showing an example of a work vehicle traveling along a target route P;
- A diagram showing an example of a work vehicle shifted to the right of the target route P;
- A diagram showing an example of a work vehicle shifted to the left of the target route P;
- A diagram showing an example of a work vehicle oriented at an angle inclined with respect to the target route P;
- A diagram schematically showing an example of a situation in which a plurality of work vehicles automatically travel on roads inside and outside a field;
- A flowchart showing an example of processing for changing the search-area pattern according to the area in which the agricultural machine is located;
- FIG. 10 is a diagram showing an example of an area in which the pattern of the search area is changed;
- FIG. 4 is a diagram showing examples of a first search area and a second search area;
- FIG. 3 is a diagram showing the relationship between a sensing area sensed by a LiDAR sensor and a search area for searching for an object;
- FIG. 3 is a diagram showing the relationship between a sensing area sensed by a LiDAR sensor and a search area for searching for an object;
- FIG. 3 is a diagram showing the relationship between a sensing area sensed by a LiDAR sensor and a search area for searching for an object;
- A diagram showing another example of an area in which the pattern of the search area is changed;
- A flowchart illustrating an example of processing when an obstacle is detected;
- A diagram showing another example of the first search area and the second search area;
- A diagram showing still another example of the first search area and the second search area;
- A diagram showing still another example of the first search area and the second search area;
- A diagram showing still another example of an area in which the pattern of the search area is changed;
- A diagram showing still another example of the first search area and the second search area;
- A diagram showing still another example of an area in which the pattern of the search area is changed;
- FIG. 10 is a diagram showing still another example of an area in which the pattern of the search area is changed;
- FIG. 10 is a diagram showing still another example of the first search area and the second search area;
- FIG. 10 is a diagram showing still another example of an area in which the pattern of the search area is changed;
- FIG. 5 is a diagram showing an example of search areas set according to sizes of implements connected to a work vehicle;
- FIG. 5 is a diagram showing an example of search areas set according to the positional relationship between the work vehicle and the implement;
- agricultural machinery means machinery used in agricultural applications.
- the agricultural machine of the present disclosure may be a mobile agricultural machine capable of performing agricultural work while moving.
- Examples of agricultural machinery include tractors, harvesters, rice transplanters, ride-on tenders, vegetable transplanters, lawn mowers, seeders, fertilizer applicators, and agricultural mobile robots.
- A work vehicle such as a tractor may function as an "agricultural machine" on its own, but a work machine (implement) attached to or towed by the work vehicle may also, together with the work vehicle, function as a single "agricultural machine".
- Agricultural machines perform farm work such as plowing, sowing, pest control, fertilization, planting of crops, or harvesting on the ground in fields. These agricultural operations are sometimes referred to as “ground operations” or simply “operations.” Traveling while a vehicle-type agricultural machine performs farm work is sometimes referred to as "working travel.”
- “Automated operation” means that the movement of agricultural machinery is controlled by the operation of the control device, not by manual operation by the driver.
- Agricultural machines that operate automatically are sometimes called “automatic driving farm machines” or “robot farm machines”.
- In automated farm machines, not only the movement of the agricultural machine but also the agricultural work itself (for example, the operation of the implement) may be automatically controlled.
- When the agricultural machine is a vehicle-type machine, its automated operation is called "automatic driving".
- the controller may control at least one of steering, movement speed adjustment, movement start and stop required for movement of the agricultural machine.
- the control device may control operations such as raising and lowering the work implement and starting and stopping the operation of the work implement.
- Movement by automatic operation may include not only movement of the agricultural machine toward a destination along a predetermined route, but also movement following a tracking target.
- An agricultural machine that operates automatically may move partially based on a user's instruction.
- the agricultural machine that automatically operates may operate in a manual operation mode in which the agricultural machine is moved by manual operation by the driver.
- the act of steering an agricultural machine not by manual operation but by the action of a control device is called "automatic steering".
- Part or all of the controller may be external to the agricultural machine. Communication, such as control signals, commands, or data, may occur between a control device external to the agricultural machine and the agricultural machine.
- Agricultural machines that operate automatically may move autonomously while sensing the surrounding environment without human involvement in controlling the movement of the agricultural machines.
- Agricultural machines capable of autonomous movement can run unmanned inside or outside a field (for example, on roads). Obstacle detection and obstacle avoidance operation may be performed during autonomous movement.
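As a rough illustration of the obstacle detection and avoidance mentioned above, the following hypothetical sketch filters sensed points against a rectangular search zone ahead of the machine and decides on an action. The zone dimensions, function names, and the stop-only policy are assumptions for illustration; the disclosure also contemplates detouring.

```python
def detect_obstacle(points, width_m=2.0, length_m=8.0):
    """Return the sensed points that fall inside a rectangular search zone
    ahead of the machine: 0 <= x <= length_m ahead, |y| <= width_m/2
    laterally (machine-local coordinates, +x ahead)."""
    return [(x, y) for (x, y) in points
            if 0.0 <= x <= length_m and abs(y) <= width_m / 2]

def avoidance_action(points):
    """Very simple policy: stop if anything is detected in the zone;
    a real system might instead plan a detour."""
    return "stop" if detect_obstacle(points) else "continue"
```

For example, a point 3 m ahead on the path triggers a stop, while a point 20 m ahead or 5 m off to the side is ignored.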
- A "work plan" is data that defines a schedule of one or more agricultural tasks to be performed by an agricultural machine.
- a work plan may include, for example, information indicating the order of farm work to be performed by the agricultural machine and the field on which each farm work is to be performed.
- the work plan may include information about the days and times each farm work is scheduled to occur.
- the work plan may be created by a processing device that communicates with the agricultural machine to manage farm work, or a processing device mounted on the agricultural machine.
- the processing device can, for example, create a work plan based on information input by a user (a farmer, farm worker, etc.) by operating a terminal device.
- a processing device that communicates with agricultural machines and manages farm work is referred to as a “management device”.
- the management device may manage farm work of a plurality of agricultural machines.
- the management device may create a work plan including information on each farm work performed by each of the plurality of agricultural machines.
- the work plan may be downloaded by each agricultural machine and stored in storage. According to the work plan, each agricultural machine can automatically go to the field and perform the scheduled agricultural work.
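A work plan of the kind described above (an ordered list of tasks, each with a field and a scheduled date, downloadable to each machine) might be modeled as follows. The class and attribute names are hypothetical; the disclosure does not define a data format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FarmTask:
    field_id: str        # which field the task is performed in
    task_type: str       # e.g. "tilling", "sowing", "fertilizing"
    scheduled_date: str  # ISO date on which the task is scheduled

@dataclass
class WorkPlan:
    machine_id: str
    tasks: List[FarmTask] = field(default_factory=list)

    def next_task(self) -> Optional[FarmTask]:
        """Tasks are executed in order; return the first remaining one."""
        return self.tasks[0] if self.tasks else None
```

A machine would download such a plan, store it, and drive to `next_task().field_id` to begin the scheduled work.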
- An "environment map" is data expressing, in a predetermined coordinate system, the position or area of objects existing in the environment in which the agricultural machine moves.
- Environmental maps are sometimes simply referred to as "maps" or “map data”.
- the coordinate system that defines the environment map can be, for example, a world coordinate system, such as a geographic coordinate system fixed with respect to the earth.
- the environment map may include information other than position (for example, attribute information and other information) about objects existing in the environment.
- Environmental maps include various types of maps, such as point cloud maps or grid maps. Local map or partial map data generated or processed in the process of constructing an environment map is also referred to as a "map" or "map data”.
- “Farm road” means a road that is mainly used for agricultural purposes.
- Farm roads are not limited to roads paved with asphalt; they also include unpaved roads covered with soil or gravel.
- Farm roads include roads (including private roads) passable only by vehicle-type agricultural machines (for example, work vehicles such as tractors) as well as roads passable by general vehicles (passenger cars, trucks, buses, etc.). The work vehicle may automatically travel on general roads in addition to farm roads.
- General roads are roads maintained for general vehicle traffic.
- FIG. 1 is a diagram for explaining an overview of an agricultural management system 1 according to an exemplary embodiment of the present disclosure.
- The agricultural management system 1 shown in FIG. 1 includes a work vehicle 100, a terminal device 400, and a management device 600.
- Terminal device 400 is a computer used by a user who remotely monitors work vehicle 100.
- The management device 600 is a computer managed by a business operator who operates the agricultural management system 1.
- Work vehicle 100, terminal device 400, and management device 600 can communicate with each other via the network 80.
- the agricultural management system 1 may include multiple work vehicles or other agricultural machines.
- the work vehicle 100 in this embodiment is a tractor.
- Work vehicle 100 can be equipped with a work implement on one or both of its rear and front portions.
- the work vehicle 100 can travel in a field while performing farm work according to the type of work machine.
- Work vehicle 100 may travel inside or outside a farm without a work implement attached.
- the work vehicle 100 has an automatic driving function.
- the work vehicle 100 can be driven not by manual operation but by the function of the control device.
- The control device in this embodiment is provided inside the work vehicle 100 and can control both the speed and steering of the work vehicle 100.
- the work vehicle 100 can automatically travel not only inside the farm field but also outside the farm field (for example, roads).
- the work vehicle 100 is equipped with devices such as a GNSS receiver and a LiDAR sensor, which are used for positioning or self-position estimation.
- the control device of work vehicle 100 automatically causes work vehicle 100 to travel based on the position of work vehicle 100 and information on the target route.
- the control device also controls the operation of the work implement.
- the work vehicle 100 can perform farm work using the work machine while automatically traveling in the field.
- the work vehicle 100 can automatically travel along the target route on a road outside the field (for example, a farm road or a general road).
- the work vehicle 100 automatically travels along a road outside the field while utilizing data output from sensing devices such as the camera 120, the obstacle sensor 130, and the LiDAR sensor 140.
- The management device 600 is a computer that manages farm work performed by the work vehicle 100.
- the management device 600 may be a server computer that centrally manages information about agricultural fields on the cloud and supports agriculture by utilizing data on the cloud, for example.
- the management device 600 for example, creates a work plan for the work vehicle 100 and causes the work vehicle 100 to perform farm work according to the work plan.
- Management device 600 generates a target route in a field based on, for example, information that a user has input using terminal device 400 or another device.
- Management device 600 may further generate and edit an environment map based on data collected by work vehicle 100 or other moving objects using sensing devices such as LiDAR sensors.
- Management device 600 transmits the generated work plan, target route, and environment map data to work vehicle 100.
- Work vehicle 100 automatically performs movement and farm work based on those data.
- The terminal device 400 is a computer used by a user who is remote from the work vehicle 100.
- the terminal device 400 shown in FIG. 1 is a laptop computer, but is not limited to this.
- the terminal device 400 may be a stationary computer such as a desktop PC (personal computer), or a mobile terminal such as a smart phone or tablet computer.
- The terminal device 400 can be used to remotely monitor the work vehicle 100 or remotely operate the work vehicle 100.
- the terminal device 400 can display images captured by one or more cameras (imaging devices) included in the work vehicle 100 on the display.
- the terminal device 400 can also display on the display a setting screen for the user to input information necessary for creating a work plan (for example, a schedule for each agricultural work) for the work vehicle 100 .
- the terminal device 400 transmits the input information to the management device 600 .
- Management device 600 creates a work plan based on the information.
- the terminal device 400 may further have a function of displaying on the display a setting screen for the user to input information necessary for setting the target route.
- FIG. 2 is a side view schematically showing an example of work vehicle 100 and work machine 300 coupled to work vehicle 100.
- the work vehicle 100 in this embodiment can operate in both manual operation mode and automatic operation mode. In the automatic operation mode, work vehicle 100 can run unmanned. The work vehicle 100 can be automatically driven both inside and outside the field.
- The work vehicle 100 includes a vehicle body 101, a prime mover (engine) 102, and a transmission 103.
- The vehicle body 101 is provided with wheels 104 with tires and a cabin 105.
- Wheels 104 include a pair of front wheels 104F and a pair of rear wheels 104R.
- A driver's seat 107, a steering device 106, an operation terminal 200, and a group of operation switches are provided inside the cabin 105.
- One or both of the front wheels 104F and the rear wheels 104R may be replaced by a plurality of crawler units equipped with tracks instead of wheels with tires.
- the work vehicle 100 can include at least one sensing device that senses the environment around the work vehicle 100 and a processing device that processes sensing data output from the at least one sensing device.
- work vehicle 100 includes a plurality of sensing devices.
- The sensing devices include multiple cameras 120, a LiDAR sensor 140, and multiple obstacle sensors 130.
- The cameras 120 may be provided at the front, rear, left, and right of the work vehicle 100, for example. The cameras 120 capture images of the environment around the work vehicle 100 and generate image data. The images acquired by the cameras 120 can be output to a processing device mounted on the work vehicle 100 and transmitted to the terminal device 400 for remote monitoring. The images can also be used to monitor the work vehicle 100 during unmanned operation, and to recognize surrounding features, obstacles, white lines, signs, displays, and the like when the work vehicle 100 travels on a road outside the field (a farm road or a general road).
- The LiDAR sensor 140 in the example of FIG. 2 is arranged at the lower front portion of the vehicle body 101. The LiDAR sensor 140 may be provided at other locations; for example, it may be provided on top of the cabin 105. The LiDAR sensor 140 may be a 3D-LiDAR sensor, but may also be a 2D-LiDAR sensor.
- the LiDAR sensor 140 senses the environment around the work vehicle 100 and outputs sensing data.
- While the work vehicle 100 is traveling, mainly outside the field, the LiDAR sensor 140 detects the distance and direction to each measurement point on objects present in the surrounding environment, or the two- or three-dimensional coordinate values of each measurement point.
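The two forms of measurement mentioned (distance and direction versus coordinate values) are interchangeable by a standard polar-to-Cartesian conversion. The sketch below shows this for a 2D scan; the function names and the axis convention (+x ahead, +y to the left) are assumptions for illustration.

```python
import math

def to_local_xy(distance_m: float, azimuth_deg: float):
    """Convert one LiDAR measurement (range and horizontal angle) to
    machine-local coordinates, +x ahead and +y to the left."""
    a = math.radians(azimuth_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a))

def scan_to_points(scan):
    """scan: iterable of (distance_m, azimuth_deg) pairs."""
    return [to_local_xy(d, az) for d, az in scan]
```

The resulting point list is the form in which sensing data is typically matched against an environment map or tested against a search area.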
- the sensor data output from LiDAR sensor 140 is processed by the control device of work vehicle 100 .
- the control device can estimate the self-location of the work vehicle 100 by matching the sensor data and the environment map.
- the control device can further detect objects such as obstacles existing around work vehicle 100 based on the sensor data.
- the controller can also generate or edit an environment map using algorithms such as SLAM (Simultaneous Localization and Mapping).
- Work vehicle 100 may include multiple LiDAR sensors arranged at different locations and with different orientations.
- A plurality of obstacle sensors 130 shown in FIG. 2 are provided at the front and rear of the cabin 105. Obstacle sensors 130 may be placed at other locations as well; for example, one or more obstacle sensors 130 may be provided at any position on the sides, front, and rear of the vehicle body 101. Obstacle sensors 130 may include, for example, laser scanners or ultrasonic sonars. The obstacle sensors 130 are used to detect surrounding obstacles and stop or divert the work vehicle 100 during automatic travel.
- The LiDAR sensor 140 may be utilized as one of the obstacle sensors 130.
- the work vehicle 100 further includes a GNSS unit 110.
- GNSS unit 110 includes a GNSS receiver.
- the GNSS receiver may include an antenna that receives signals from GNSS satellites and a processor that calculates the position of work vehicle 100 based on the signals received by the antenna.
- the GNSS unit 110 receives satellite signals transmitted from multiple GNSS satellites and performs positioning based on the satellite signals.
- GNSS is a general term for satellite positioning systems such as GPS (Global Positioning System), QZSS (Quasi-Zenith Satellite System, e.g., Michibiki), GLONASS, Galileo, and BeiDou.
- the GNSS unit 110 may include an inertial measurement unit (IMU). Signals from the IMU can be used to supplement the location data.
- the IMU can measure the tilt and minute movements of work vehicle 100 . Positioning performance can be improved by using data obtained by the IMU to supplement position data based on satellite signals.
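The supplementing of satellite-based position data with IMU data described above is, in practice, usually done with a Kalman-type filter. As a deliberately crude stand-in, the following sketch blends a GNSS fix with an IMU dead-reckoned estimate by a fixed weight; the function name and the weight value are arbitrary assumptions.

```python
def fuse_position(gnss_xy, imu_xy, gnss_weight=0.9):
    """Weighted blend of a GNSS fix and an IMU dead-reckoned position
    estimate (a crude stand-in for a Kalman-filter update).
    Both inputs are (x, y) tuples in the same coordinate frame."""
    gx, gy = gnss_xy
    ix, iy = imu_xy
    w = gnss_weight
    return (w * gx + (1 - w) * ix, w * gy + (1 - w) * iy)
```

Between satellite fixes, the IMU estimate alone (w = 0) would carry the position forward; when a fresh, high-quality fix arrives, a weight near 1 pulls the estimate back toward it.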
- The control device of work vehicle 100 may use, in addition to the positioning result from the GNSS unit 110, sensing data acquired by sensing devices such as the camera 120 and/or the LiDAR sensor 140 for positioning.
- Based on the data acquired by the camera 120 and/or the LiDAR sensor 140 together with an environment map stored in advance in the storage device, the position of the work vehicle 100 can be estimated with high accuracy.
- In that case, the position of the work vehicle 100 can be determined with higher accuracy.
- the prime mover 102 may be, for example, a diesel engine.
- An electric motor may be used instead of the diesel engine.
- the transmission 103 can change the propulsive force and the moving speed of the work vehicle 100 by shifting.
- the transmission 103 can also switch between forward and reverse travel of the work vehicle 100 .
- the steering device 106 includes a steering wheel, a steering shaft connected to the steering wheel, and a power steering device that assists steering by the steering wheel.
- the front wheels 104F are steerable wheels, and the running direction of the work vehicle 100 can be changed by changing the turning angle (also referred to as the "steering angle") of the front wheels 104F.
- the steering angle of the front wheels 104F can be changed by operating the steering wheel.
- the power steering system includes a hydraulic system or an electric motor that supplies an assist force for changing the steering angle of the front wheels 104F. When automatic steering is performed, the steering angle is automatically adjusted by the power of the hydraulic system or the electric motor under the control of the control device arranged in the work vehicle 100 .
- a coupling device 108 is provided at the rear portion of the vehicle body 101 .
- the coupling device 108 includes, for example, a three-point support device (also called a "three-point link" or "three-point hitch"), a PTO (Power Take Off) shaft, a universal joint, and a communication cable.
- Work implement 300 can be attached to and detached from work vehicle 100 by coupling device 108 .
- the coupling device 108 can change the position or attitude of the working machine 300 by elevating the three-point linkage by, for example, a hydraulic device.
- power can be sent from work vehicle 100 to work implement 300 via the universal joint.
- Work vehicle 100 can cause work implement 300 to perform a predetermined work while pulling work implement 300 .
- the coupling device may be provided at the front portion of the vehicle body 101 . In that case, work implement 300 can be connected to the front portion of work vehicle 100 .
- the working machine 300 shown in FIG. 2 is a rotary tiller, but the working machine 300 is not limited to the rotary tiller. Any implement, such as a seeder, spreader, transplanter, mower, rake, baler, harvester, sprayer, or harrow, can be used by connecting it to the work vehicle 100.
- the work vehicle 100 shown in FIG. 2 is capable of manned operation, but may only be compatible with unmanned operation. In that case, components required only for manned operation, such as cabin 105 , steering device 106 and driver's seat 107 , may not be provided in work vehicle 100 .
- the unmanned work vehicle 100 can travel autonomously or remotely controlled by a user.
- FIG. 3 is a block diagram showing a configuration example of work vehicle 100 and work machine 300. Work vehicle 100 and work machine 300 can communicate with each other via a communication cable included in coupling device 108. Work vehicle 100 can communicate with terminal device 400 and management device 600 via network 80.
- the GNSS unit 110 comprises a GNSS receiver 111 , an RTK receiver 112 , an inertial measurement unit (IMU) 115 and processing circuitry 116 .
- the sensor group 150 includes a steering wheel sensor 152 , a steering angle sensor 154 and an axle sensor 156 .
- the control system 160 comprises a processing device 161 , a storage device 170 and a control device 180 .
- the controller 180 includes a plurality of electronic control units (ECUs) 181-185.
- Work machine 300 includes a drive device 340 , a control device 380 , and a communication device 390 .
- FIG. 3 shows constituent elements that are relatively highly relevant to the operation of automatic driving by the work vehicle 100, and illustration of other constituent elements is omitted.
- the GNSS receiver 111 in the GNSS unit 110 receives satellite signals transmitted from multiple GNSS satellites and generates GNSS data based on the satellite signals.
- GNSS data is generated in a predetermined format, eg, NMEA-0183 format.
- GNSS data may include, for example, values indicating the identification number, elevation, azimuth, and received strength of each satellite from which the satellite signal was received.
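As an illustration of the NMEA-0183 format mentioned above, the sketch below extracts latitude, longitude, and altitude from a GGA sentence. The sentence shown is a made-up example, and a production parser would also verify the trailing checksum; this is not code from the patent.

```python
def parse_gga(sentence: str) -> dict:
    """Extract latitude, longitude, and altitude from an NMEA-0183 GGA sentence."""
    fields = sentence.split(",")

    # Latitude/longitude are encoded as ddmm.mmmm / dddmm.mmmm;
    # convert degrees-and-minutes to decimal degrees.
    def dm_to_deg(value: str, deg_digits: int) -> float:
        return int(value[:deg_digits]) + float(value[deg_digits:]) / 60.0

    lat = dm_to_deg(fields[2], 2) * (1 if fields[3] == "N" else -1)
    lon = dm_to_deg(fields[4], 3) * (1 if fields[5] == "E" else -1)
    return {"lat": lat, "lon": lon, "alt_m": float(fields[9])}

# Made-up sentence for illustration only:
gga = "$GPGGA,092750.000,5321.6802,N,00630.3372,W,1,8,1.03,61.7,M,55.2,M,,*76"
pos = parse_gga(gga)   # lat ≈ 53.3613, lon ≈ -6.5056, alt_m = 61.7
```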
- the GNSS unit 110 shown in FIG. 3 performs positioning of the work vehicle 100 using RTK (Real Time Kinematic)-GNSS.
- FIG. 4 is a conceptual diagram showing an example of the work vehicle 100 that performs positioning by RTK-GNSS. Positioning by RTK-GNSS uses correction signals transmitted from the reference station 60 in addition to satellite signals transmitted from a plurality of GNSS satellites 50 .
- the reference station 60 can be installed near the field where the work vehicle 100 travels (for example, within 10 km from the work vehicle 100).
- the reference station 60 generates a correction signal, for example in RTCM format, based on the satellite signals received from the plurality of GNSS satellites 50 and transmits it to the GNSS unit 110 .
- RTK receiver 112 includes an antenna and modem to receive correction signals transmitted from reference station 60 .
- the processing circuit 116 of the GNSS unit 110 corrects the positioning result by the GNSS receiver 111 based on the correction signal.
- with RTK-GNSS, it is possible to perform positioning with an accuracy of, for example, an error of several centimeters.
- position data including latitude, longitude, and altitude information is obtained by the high-precision positioning of RTK-GNSS.
- the GNSS unit 110 calculates the position of the work vehicle 100, for example, at a frequency of about 1 to 10 times per second.
- the positioning method is not limited to RTK-GNSS, and any positioning method (interferometric positioning method, relative positioning method, etc.) that can obtain position data with the required accuracy can be used.
- positioning may be performed using VRS (Virtual Reference Station) or DGPS (Differential Global Positioning System). If position data with the required accuracy can be obtained without using the correction signal transmitted from the reference station 60, the position data may be generated without using the correction signal.
- GNSS unit 110 may not include RTK receiver 112 .
- even in that case, the position of work vehicle 100 is estimated.
- for example, the position of work vehicle 100 may be estimated by matching data output from LiDAR sensor 140 and/or camera 120 with a highly accurate environment map.
- the GNSS unit 110 in this embodiment further includes an IMU 115 .
- IMU 115 may include a 3-axis accelerometer and a 3-axis gyroscope.
- the IMU 115 may include an orientation sensor, such as a 3-axis geomagnetic sensor.
- IMU 115 functions as a motion sensor and can output signals indicating various quantities such as acceleration, speed, displacement, and attitude of work vehicle 100 .
- Processing circuitry 116 may more accurately estimate the position and orientation of work vehicle 100 based on signals output from IMU 115 in addition to satellite signals and correction signals. Signals output from IMU 115 may be used to correct or complement positions calculated based on satellite signals and correction signals.
- IMU 115 outputs signals more frequently than GNSS receiver 111 .
- processing circuitry 116 can measure the position and orientation of work vehicle 100 at a higher frequency (eg, 10 Hz or higher).
- IMU 115 may be provided as a separate device from GNSS unit 110 .
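The complementary roles described above, an IMU that outputs signals far more frequently than the GNSS receiver, can be sketched as follows: the IMU dead-reckons the pose at high rate between fixes, and each (possibly RTK-corrected) GNSS fix corrects the accumulated drift. All class and method names here are hypothetical, and a real system would typically use a Kalman filter rather than the hard reset shown.

```python
import math

class PoseEstimator:
    """Blend low-rate GNSS fixes with high-rate IMU dead reckoning.

    Minimal sketch: speed and yaw rate come from the IMU at 10 Hz or more,
    while GNSS fixes arrive at roughly 1-10 Hz and snap the estimate back.
    """

    def __init__(self, x: float, y: float, heading: float):
        self.x, self.y, self.heading = x, y, heading

    def propagate(self, speed: float, yaw_rate: float, dt: float) -> None:
        """Dead-reckon between GNSS fixes using IMU-derived speed and yaw rate."""
        self.heading += yaw_rate * dt
        self.x += speed * math.cos(self.heading) * dt
        self.y += speed * math.sin(self.heading) * dt

    def gnss_update(self, x: float, y: float) -> None:
        """Correct accumulated drift when a (possibly RTK-corrected) fix arrives."""
        self.x, self.y = x, y

est = PoseEstimator(0.0, 0.0, 0.0)
for _ in range(10):                 # ten 10 ms IMU steps at 2 m/s, straight ahead
    est.propagate(speed=2.0, yaw_rate=0.0, dt=0.01)
# est.x is now about 0.2 m; the next GNSS fix would refine it.
```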
- the camera 120 is an imaging device that captures the surrounding environment of the work vehicle 100 .
- the camera 120 includes an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor).
- Camera 120 may also include an optical system, including one or more lenses, and signal processing circuitry.
- the camera 120 captures an image of the environment around the work vehicle 100 while the work vehicle 100 is running, and generates image (for example, moving image) data.
- the camera 120 can capture moving images at a frame rate of 3 frames per second (fps) or higher, for example.
- the image generated by the camera 120 can be used, for example, when a remote monitor uses the terminal device 400 to check the environment around the work vehicle 100 .
- the images generated by camera 120 may be used for positioning or obstacle detection.
- a plurality of cameras 120 may be provided at different positions on work vehicle 100, or a single camera may be provided. There may be separate visible cameras for generating visible light images and infrared cameras for generating infrared images. Both visible and infrared cameras may be provided as cameras for generating images for surveillance. Infrared cameras can also be used to detect obstacles at night.
- the obstacle sensor 130 detects objects existing around the work vehicle 100 .
- Obstacle sensors 130 may include, for example, laser scanners or ultrasonic sonars. Obstacle sensor 130 outputs a signal indicating the presence of an obstacle when an object is present closer than a predetermined distance from obstacle sensor 130 .
- a plurality of obstacle sensors 130 may be provided at different positions on work vehicle 100. For example, multiple laser scanners and multiple ultrasonic sonars may be placed at different locations on work vehicle 100. By providing many such obstacle sensors 130, blind spots in monitoring obstacles around the work vehicle 100 can be reduced.
- the steering wheel sensor 152 measures the rotation angle of the steering wheel of the work vehicle 100.
- the steering angle sensor 154 measures the steering angle of the front wheels 104F, which are steered wheels. Measured values by the steering wheel sensor 152 and the steering angle sensor 154 are used for steering control by the controller 180 .
- the axle sensor 156 measures the rotational speed of the axle connected to the wheel 104, that is, the number of revolutions per unit time.
- Axle sensor 156 can be, for example, a sensor utilizing a magnetoresistive element (MR), a Hall element, or an electromagnetic pickup.
- the axle sensor 156 outputs, for example, a numerical value indicating the number of rotations per minute (unit: rpm) of the axle.
- Axle sensors 156 are used to measure the speed of work vehicle 100 .
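Since the axle sensor 156 reports revolutions per minute, converting that reading to ground speed only requires the effective rolling radius of the wheel. A minimal sketch, assuming no wheel slip and an illustrative radius value not taken from this document:

```python
import math

def vehicle_speed_mps(axle_rpm: float, wheel_radius_m: float) -> float:
    """Convert axle revolutions per minute to ground speed in m/s.

    Assumes the wheel rolls without slip; wheel_radius_m is the effective
    rolling radius (an assumed parameter, not specified in the patent).
    """
    circumference_m = 2.0 * math.pi * wheel_radius_m
    return axle_rpm * circumference_m / 60.0

# e.g. a 0.65 m rear-wheel rolling radius at 60 rpm:
speed = vehicle_speed_mps(60.0, 0.65)   # about 4.08 m/s
```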
- the drive device 240 includes various devices necessary for running the work vehicle 100 and driving the work implement 300, such as the prime mover 102, the transmission device 103, the steering device 106, and the coupling device 108 described above.
- Prime mover 102 may comprise an internal combustion engine, such as a diesel engine, for example.
- Drive system 240 may include an electric motor for traction instead of or in addition to the internal combustion engine.
- the buzzer 220 is an audio output device that emits a warning sound to notify an abnormality. Buzzer 220 emits a warning sound when an obstacle is detected, for example, during automatic driving. Buzzer 220 is controlled by controller 180 .
- the processing device 161 is, for example, a microprocessor or microcontroller.
- the processing device 161 processes sensing data output from sensing devices such as the camera 120 , the obstacle sensor 130 and the LiDAR sensor 140 .
- the processing device 161 detects objects located around the work vehicle 100 based on data output from the camera 120 , the obstacle sensor 130 and the LiDAR sensor 140 .
- the storage device 170 includes one or more storage media such as flash memory or magnetic disk.
- Storage device 170 stores various data generated by GNSS unit 110 , camera 120 , obstacle sensor 130 , LiDAR sensor 140 , sensor group 150 , and control device 180 .
- the data stored in the storage device 170 may include map data (environmental map) of the environment in which the work vehicle 100 travels and target route data for automatic driving.
- the environment map includes information of a plurality of farm fields where work vehicle 100 performs farm work and roads around the fields.
- the environment map and target route may be generated by a processor in management device 600 .
- the control device 180 may have the function of generating or editing the environment map and the target route. Control device 180 can edit the environment map and target route acquired from management device 600 according to the traveling environment of work vehicle 100 .
- the storage device 170 also stores work plan data received by the communication device 190 from the management device 600 .
- the storage device 170 also stores a computer program that causes each ECU in the processing device 161 and the control device 180 to execute various operations described later.
- a computer program can be provided to work vehicle 100 via a storage medium (such as a semiconductor memory or an optical disk) or an electric communication line (such as the Internet).
- Such computer programs may be sold as commercial software.
- the control device 180 includes multiple ECUs.
- the plurality of ECUs include, for example, an ECU 181 for speed control, an ECU 182 for steering control, an ECU 183 for work machine control, an ECU 184 for automatic operation control, and an ECU 185 for route generation.
- the ECU 181 controls the speed of the work vehicle 100 by controlling the prime mover 102, the transmission 103, and the brakes included in the drive device 240.
- the ECU 182 controls the steering of the work vehicle 100 by controlling the hydraulic system or the electric motor included in the steering system 106 based on the measurement value of the steering wheel sensor 152 .
- the ECU 183 controls the operations of the three-point linkage and the PTO shaft included in the coupling device 108 in order to cause the working machine 300 to perform desired operations. ECU 183 also generates a signal for controlling the operation of work machine 300 and transmits the signal from communication device 190 to work machine 300 .
- the ECU 184 performs calculations and controls for realizing automatic driving based on data output from the GNSS unit 110, camera 120, obstacle sensor 130, LiDAR sensor 140, sensor group 150, and processing device 161. For example, ECU 184 identifies the position of work vehicle 100 based on data output from at least one of GNSS unit 110 , camera 120 , and LiDAR sensor 140 . Within the field, ECU 184 may determine the position of work vehicle 100 based solely on data output from GNSS unit 110 . ECU 184 may estimate or correct the position of work vehicle 100 based on data acquired by camera 120 and/or LiDAR sensor 140 . By using the data acquired by the camera 120 and/or the LiDAR sensor 140, the accuracy of positioning can be further improved.
- ECU 184 estimates the position of work vehicle 100 using data output from LiDAR sensor 140 and/or camera 120 .
- ECU 184 may estimate the position of work vehicle 100 by matching data output from LiDAR sensor 140 and/or camera 120 with an environmental map.
- the ECU 184 performs calculations necessary for the work vehicle 100 to travel along the target route based on the estimated position of the work vehicle 100 .
- the ECU 184 sends a speed change command to the ECU 181 and a steering angle change command to the ECU 182 .
- ECU 181 changes the speed of work vehicle 100 by controlling prime mover 102, transmission 103, or brakes in response to speed change commands.
- the ECU 182 changes the steering angle by controlling the steering device 106 in response to the command to change the steering angle.
- the ECU 185 can determine the destination of the work vehicle 100 based on the work plan stored in the storage device 170, and determine the target route from the start point of movement of the work vehicle 100 to the destination point. ECU 185 may perform processing for detecting objects located around work vehicle 100 based on data output from camera 120 , obstacle sensor 130 and LiDAR sensor 140 .
- control device 180 realizes automatic operation.
- control device 180 controls drive device 240 based on the measured or estimated position of work vehicle 100 and the target route. Thereby, the control device 180 can cause the work vehicle 100 to travel along the target route.
- a plurality of ECUs included in the control device 180 can communicate with each other according to a vehicle bus standard such as CAN (Controller Area Network). Instead of CAN, a higher-speed communication method such as in-vehicle Ethernet (registered trademark) may be used.
- An on-board computer that integrates at least some functions of the ECUs 181 to 185 may be provided.
- the control device 180 may include ECUs other than the ECUs 181 to 185, and an arbitrary number of ECUs may be provided according to functions.
- Each ECU includes processing circuitry that includes one or more processors.
- the processing device 161 may be included in the control device 180.
- Processing device 161 may be integrated with any of the ECUs included in control device 180.
- the communication device 190 is a device including circuits for communicating with the work machine 300 , the terminal device 400 and the management device 600 .
- Communication device 190 includes a circuit for transmitting/receiving signals conforming to the ISOBUS standard such as ISOBUS-TIM to/from communication device 390 of working machine 300 .
- As a result, work machine 300 can be caused to perform a desired operation, or information can be acquired from work machine 300.
- Communication device 190 may further include an antenna and communication circuitry for transmitting and receiving signals over network 80 to and from respective communication devices of terminal device 400 and management device 600 .
- Network 80 may include, for example, cellular mobile communication networks such as 3G, 4G or 5G and the Internet.
- the communication device 190 may have a function of communicating with a mobile terminal used by a supervisor near the work vehicle 100 .
- Communication with such mobile terminals is based on any wireless communication standard, such as Wi-Fi (registered trademark), cellular mobile communication such as 3G, 4G or 5G, or Bluetooth (registered trademark).
- the operation terminal 200 is a terminal for the user to perform operations related to the travel of the work vehicle 100 and the operation of the work machine 300, and is also called a virtual terminal (VT).
- Operation terminal 200 may include a display device such as a touch screen and/or one or more buttons.
- the display device can be a display such as a liquid crystal or an organic light emitting diode (OLED), for example.
- By operating the operation terminal 200, the user can execute various operations such as switching the automatic driving mode on/off, recording or editing an environment map, setting a target route, and switching the working machine 300 on/off. At least part of these operations can also be realized by operating the operation switch group 210.
- Operation terminal 200 may be configured to be removable from work vehicle 100 .
- a user located away from work vehicle 100 may operate operation terminal 200 that has been removed to control the operation of work vehicle 100 .
- the user may control the operation of work vehicle 100 by operating a computer, such as terminal device 400 , in which necessary application software is installed, instead of operating terminal 200 .
- FIG. 5 is a diagram showing an example of the operation terminal 200 and the operation switch group 210 provided inside the cabin 105.
- Inside the cabin 105, an operation switch group 210 including a plurality of switches that can be operated by the user is arranged.
- the operation switch group 210 may include, for example, a switch for selecting the gear stage of the main transmission or the sub-transmission, a switch for switching between the automatic operation mode and the manual operation mode, a switch for switching between forward and reverse, and a switch for raising or lowering the working machine 300.
- Note that if the work vehicle 100 only operates unmanned and does not have a function of manned operation, the work vehicle 100 need not include the operation switch group 210.
- the driving device 340 in the work machine 300 shown in FIG. 3 performs operations necessary for the work machine 300 to perform a predetermined work.
- Drive device 340 includes a device, such as a hydraulic device, an electric motor, or a pump, depending on the application of work machine 300 .
- Controller 380 controls the operation of drive 340 .
- Control device 380 causes drive device 340 to perform various operations in response to signals transmitted from work vehicle 100 via communication device 390 .
- a signal corresponding to the state of work implement 300 can also be transmitted from communication device 390 to work vehicle 100 .
- FIG. 6 is a block diagram illustrating a schematic hardware configuration of the management device 600 and the terminal device 400.
- the management device 600 includes a storage device 650 , a processor 660 , a ROM (Read Only Memory) 670 , a RAM (Random Access Memory) 680 and a communication device 690 . These components are communicatively connected to each other via a bus.
- the management device 600 can function as a cloud server that manages the schedule of agricultural work in the field performed by the work vehicle 100 and utilizes managed data to support agriculture.
- a user can use the terminal device 400 to input information necessary for creating a work plan and upload the information to the management device 600 via the network 80 . Based on the information, the management device 600 can create a farm work schedule, that is, a work plan.
- the management device 600 can also generate or edit an environment map. The environment map may be distributed from a computer outside the management device 600 .
- the communication device 690 is a communication module for communicating with the work vehicle 100 and the terminal device 400 via the network 80.
- the communication device 690 can perform wired communication conforming to a communication standard such as IEEE1394 (registered trademark) or Ethernet (registered trademark), for example.
- the communication device 690 may perform wireless communication conforming to the Bluetooth® standard or Wi-Fi standard, or cellular mobile communication such as 3G, 4G or 5G.
- the processor 660 may be, for example, a semiconductor integrated circuit including a central processing unit (CPU).
- Processor 660 may be implemented by a microprocessor or microcontroller.
- the processor 660 can also be realized by an FPGA (Field Programmable Gate Array) equipped with a CPU, a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), an ASSP (Application Specific Standard Product), or a combination of two or more circuits selected from these circuits.
- the processor 660 sequentially executes a computer program describing a group of instructions for executing at least one process stored in the ROM 670 to achieve desired processes.
- the ROM 670 is, for example, a writable memory (eg PROM), a rewritable memory (eg flash memory), or a read-only memory.
- ROM 670 stores programs that control the operation of processor 660 .
- the ROM 670 does not have to be a single storage medium, and may be a collection of multiple storage media. Part of the collection of multiple storage media may be removable memory.
- the RAM 680 provides a work area into which the control program stored in the ROM 670 is temporarily loaded at boot time.
- the RAM 680 does not have to be a single storage medium, and may be a collection of multiple storage media.
- the storage device 650 mainly functions as database storage.
- Storage device 650 may be, for example, a magnetic storage device or a semiconductor storage device.
- An example of a magnetic storage device is a hard disk drive (HDD).
- An example of a semiconductor memory device is a solid state drive (SSD).
- Storage device 650 may be a device independent of management device 600 .
- the storage device 650 may be a storage device connected to the management device 600 via the network 80, such as a cloud storage.
- the terminal device 400 includes an input device 420 , a display device 430 , a storage device 450 , a processor 460 , a ROM 470 , a RAM 480 and a communication device 490 . These components are communicatively connected to each other via a bus.
- the input device 420 is a device for converting a user's instruction into data and inputting it to the computer.
- Input device 420 may be, for example, a keyboard, mouse, or touch panel.
- Display device 430 may be, for example, a liquid crystal display or an organic EL display. Descriptions of the processor 460, the ROM 470, the RAM 480, the storage device 450, and the communication device 490 are the same as those described in the hardware configuration example of the management device 600, and the description thereof will be omitted.
- the work vehicle 100 in this embodiment can automatically travel both inside and outside the field.
- work vehicle 100 drives work machine 300 to perform predetermined farm work while traveling along a preset target route.
- when an obstacle is detected, work vehicle 100 stops traveling, emits a warning sound from buzzer 220, and performs operations such as transmitting a warning signal to terminal device 400.
- Positioning of the work vehicle 100 in the field is performed mainly based on data output from the GNSS unit 110 .
- the work vehicle 100 automatically travels along a target route set on a farm road or a general road outside the field.
- the work vehicle 100 utilizes data acquired by the camera 120 and/or the LiDAR sensor 140 while traveling outside the field. Outside the field, when an obstacle is detected, work vehicle 100 avoids the obstacle or stops on the spot. Outside the field, the position of work vehicle 100 is estimated based on data output from LiDAR sensor 140 and/or camera 120 in addition to positioning data output from GNSS unit 110 .
- FIG. 7 is a diagram schematically showing an example of the working vehicle 100 that automatically travels along a target route in a field.
- farm field 70 includes a work area 72 where work vehicle 100 performs work using work machine 300 , and a headland 74 located near the outer edge of farm field 70 .
- Which area of the farm field 70 corresponds to the work area 72 or the headland 74 on the map can be set in advance by the user.
- the target paths in this example include a plurality of parallel main paths P1 and a plurality of turning paths P2 connecting the plurality of main paths P1.
- the main path P1 is located within the working area 72 and the turning path P2 is located within the headland 74 .
- each main path P1 may include a curved portion.
- the dashed line in FIG. 7 represents the working width of work implement 300 .
- the working width is preset and recorded in the storage device 170 .
- the working width can be set and recorded by the user operating the operation terminal 200 or the terminal device 400 . Alternatively, the working width may be automatically recognized and recorded when work implement 300 is connected to work vehicle 100 .
- the intervals between the main paths P1 can be set according to the working width.
- a target route can be created based on a user's operation before automatic driving is started.
- the target route may be created to cover the entire work area 72 within the field 70, for example.
- the work vehicle 100 automatically travels along a target route as shown in FIG. 7 from a work start point to a work end point while repeating reciprocation. Note that the target route shown in FIG. 7 is merely an example, and the method of determining the target route is arbitrary.
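The layout described above, parallel main paths P1 spaced by the working width and connected by turning paths P2 in the headland, can be sketched for a rectangular work area as follows. Real fields are polygons and paths may curve, so this is only an illustration; the function name and rectangle parameters are assumptions, not part of the patent.

```python
def generate_main_paths(x_min, x_max, y_min, y_max, working_width):
    """Lay out parallel main paths (P1) across a rectangular work area,
    alternating direction so that turning paths (P2) in the headland can
    connect successive passes.
    """
    paths = []
    x = x_min + working_width / 2.0       # center the first pass on the implement
    forward = True
    while x <= x_max - working_width / 2.0 + 1e-9:
        start, end = (y_min, y_max) if forward else (y_max, y_min)
        paths.append(((x, start), (x, end)))
        forward = not forward             # reverse direction for the next pass
        x += working_width
    return paths

# A 20 m wide, 50 m long work area with a 2.5 m implement needs 8 passes:
paths = generate_main_paths(0.0, 20.0, 0.0, 50.0, 2.5)
```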
- Next, an example of control during automatic operation in a field by the control device 180 will be described.
- FIG. 8 is a flowchart showing an example of the steering control operation during automatic driving executed by the control device 180.
- the control device 180 performs automatic steering by executing the operations of steps S121 to S125 shown in FIG. As for the speed, it is maintained at a preset speed, for example.
- the control device 180 acquires data indicating the position of the work vehicle 100 generated by the GNSS unit 110 while the work vehicle 100 is traveling (step S121).
- control device 180 calculates the deviation between the position of work vehicle 100 and the target route (step S122). The deviation represents the distance between the position of work vehicle 100 at that time and the target route.
- the control device 180 determines whether or not the calculated positional deviation exceeds a preset threshold value (step S123).
- if the deviation exceeds the threshold in step S123, control device 180 changes the steering angle by changing the control parameters of the steering device included in the drive device 240 so that the deviation becomes smaller (step S124). If the deviation does not exceed the threshold in step S123, the operation of step S124 is omitted. In the subsequent step S125, control device 180 determines whether or not an operation end command has been received.
- An operation end command can be issued, for example, when the user instructs to stop the automatic operation by remote control, or when work vehicle 100 reaches the destination. If the command to end the operation has not been issued, the process returns to step S121, and similar operations are executed based on the newly measured position of the work vehicle 100.
- The control device 180 repeats the operations of steps S121 to S125 until an operation end command is issued. The above operations are executed by ECUs 182 and 184 in control device 180.
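The loop of steps S121 to S125 can be summarized in Python. All interfaces here (gnss, path, drive, end_requested) and the proportional gain are hypothetical stand-ins for the ECU's actual APIs, not anything specified in the patent:

```python
def steering_control_loop(gnss, path, drive, threshold, end_requested):
    """Sketch of the S121-S125 steering loop described in the text."""
    while not end_requested():                    # S125: end-of-operation check
        x, y = gnss.position()                    # S121: acquire current position
        deviation = path.lateral_deviation(x, y)  # S122: signed distance to route
        if abs(deviation) > threshold:            # S123: compare with threshold
            # S124: steer back toward the route; -0.5 is an assumed gain
            drive.set_steering_angle(-0.5 * deviation)
```

Speed is held at its preset value throughout, as the text notes; only the steering angle is commanded here.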
- in the above example, control device 180 controls drive device 240 based only on the deviation between the position of work vehicle 100 identified by GNSS unit 110 and the target path, but the heading deviation may also be taken into account. For example, when the azimuth deviation, which is the angular difference between the direction of the work vehicle 100 identified by the GNSS unit 110 and the direction of the target route, exceeds a preset threshold value, the control device 180 can change a control parameter of the drive device 240 (for example, the steering angle) according to the deviation.
- FIG. 9A is a diagram showing an example of the work vehicle 100 traveling along the target route P.
- FIG. 9B is a diagram showing an example of work vehicle 100 at a position shifted to the right from target path P.
- FIG. 9C is a diagram showing an example of work vehicle 100 at a position shifted to the left from target path P.
- FIG. 9D is a diagram showing an example of the work vehicle 100 oriented in a direction inclined with respect to the target path P.
- the pose indicating the position and orientation of work vehicle 100 measured by GNSS unit 110 is expressed as r(x, y, θ).
- (x, y) are coordinates representing the position of the reference point of work vehicle 100 in the XY coordinate system, which is a two-dimensional coordinate system fixed to the earth.
- the reference point of work vehicle 100 is at the position where the GNSS antenna is installed on the cabin, but the position of the reference point is arbitrary.
- θ is an angle representing the measured orientation of work vehicle 100.
- in the examples of FIGS. 9A to 9D, the target path P is parallel to the Y-axis, but in general the target path P is not necessarily parallel to the Y-axis.
- as shown in FIG. 9A, when the position and orientation of work vehicle 100 do not deviate from target path P, control device 180 maintains the steering angle and speed of work vehicle 100 without changing them.
- as shown in FIG. 9B, when the position of work vehicle 100 is shifted to the right from target path P, control device 180 changes the steering angle so that the traveling direction of work vehicle 100 leans leftward and approaches path P.
- the speed may be changed in addition to the steering angle.
- the amount of change in the steering angle can be adjusted, for example, according to the magnitude of the positional deviation Δx.
- as shown in FIG. 9C, when the position of work vehicle 100 is shifted to the left from target path P, control device 180 changes the steering angle so that the traveling direction of work vehicle 100 leans to the right and approaches path P. Also in this case, the speed may be changed in addition to the steering angle. The amount of change in the steering angle can be adjusted, for example, according to the magnitude of the positional deviation Δx.
- as shown in FIG. 9D, when work vehicle 100 does not deviate greatly from target path P but is oriented in a direction different from target path P, control device 180 changes the steering angle so that azimuth deviation Δθ becomes small. Also in this case, the speed may be changed in addition to the steering angle.
- the amount of change in the steering angle can be adjusted, for example, according to the respective magnitudes of the positional deviation Δx and the azimuth deviation Δθ. For example, the smaller the absolute value of the positional deviation Δx, the larger the amount of change in the steering angle corresponding to the azimuth deviation Δθ.
- when the absolute value of the positional deviation Δx is large, the steering angle is changed greatly in order to return to path P, so the absolute value of the azimuth deviation Δθ inevitably becomes large. Conversely, when the absolute value of the positional deviation Δx is small, the azimuth deviation Δθ needs to be brought close to zero. Therefore, it is appropriate to relatively increase the weight (that is, the control gain) of the azimuth deviation Δθ used in determining the steering angle.
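- the gain weighting described above can be sketched in Python as follows (a minimal illustration; the gain law, numeric values, sign conventions, and steering limit are assumptions for illustration, not values from this disclosure):

```python
import math

def steering_angle(dx, dtheta, k_x=0.5, k_theta_max=1.2, dx_ref=1.0,
                   limit=math.radians(35)):
    """Steering-angle command from lateral deviation dx [m] and heading
    deviation dtheta [rad] relative to the target path P.

    The weight (control gain) on the heading deviation is increased as
    |dx| becomes small, as described above.
    """
    # Heading gain grows toward k_theta_max as |dx| shrinks.
    k_theta = k_theta_max / (1.0 + abs(dx) / dx_ref)
    # Steer so that both deviations are driven toward zero.
    cmd = -k_x * dx - k_theta * dtheta
    # Saturate at the mechanical steering limit.
    return max(-limit, min(limit, cmd))
```

under this sign convention, a vehicle shifted to the right of the path (positive dx) receives a leftward (negative) steering command, and vice versa.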
- a control technique such as PID control or MPC (model predictive control) can be applied to the steering control and speed control of work vehicle 100.
- when an obstacle is detected, the control device 180 stops the work vehicle 100. At this time, the buzzer 220 may emit a warning sound, or a warning signal may be transmitted to the terminal device 400. If the obstacle can be avoided, the control device 180 may control the drive 240 to avoid the obstacle.
- the work vehicle 100 in this embodiment can automatically travel not only inside the field but also outside the field. Outside the field, the processing device 161 and/or the control device 180 can detect objects (for example, other vehicles or pedestrians) located relatively far from the work vehicle 100 by using the camera 120 and the LiDAR sensor 140. The control device 180 can realize automatic traveling on roads outside the field by performing speed control and steering control so as to avoid detected objects.
- FIG. 10 is a diagram schematically showing an example of a situation in which a plurality of work vehicles 100 are automatically traveling inside a farm field 70 and on roads 76 outside the farm field 70 .
- the storage device 170 records an environmental map of an area including a plurality of fields 70 and roads around them and a target route.
- Environmental maps and target routes may be generated by management device 600 or ECU 185 .
- work vehicle 100 travels along the target path while sensing its surroundings using sensing devices such as camera 120, obstacle sensor 130, and LiDAR sensor 140, with work implement 300 raised.
- sensing devices such as the camera 120, the obstacle sensor 130, and the LiDAR sensor 140 sense the environment around the work vehicle 100 and output sensing data.
- Processing device 161 (FIG. 3) detects an object located in a search area around work vehicle 100 based on the sensing data.
- the search area is an area in which an object is searched among areas around work vehicle 100 sensed by the sensing device.
- the search area may have the same size as the sensing area sensed by the sensing device, or may be smaller than the sensing area.
- a search region may also be referred to as a Region of Interest (ROI).
- the pattern of the search area in the process of detecting an object using the sensing data output by the LiDAR sensor 140 is changed according to the area in which the work vehicle 100 is located.
- Changing the pattern of the search area means, for example, changing at least one of the shape, size, and relative position of the search area with respect to the work vehicle 100 .
- the work vehicle 100 of this embodiment includes a sensing system 10 ( FIG. 3 ) that detects objects positioned around the work vehicle 100 using sensing data output by the LiDAR sensor 140 .
- Sensing system 10 comprises a processing unit 161 and a LiDAR sensor 140 .
- Sensing system 10 may include GNSS unit 110 and storage device 170 when position data and map data generated by GNSS unit 110 are used to detect the area in which work vehicle 100 is located.
- Sensing system 10 may include camera 120 and storage device 170 when estimating the area where work vehicle 100 is located by matching the data output from camera 120 with an environmental map.
- the LiDAR sensor 140 sequentially emits pulses of a laser beam (hereinafter abbreviated as “laser pulses”) while changing the emission direction, and can measure the distance to each reflection point from the time difference between the emission time and the time at which the reflected light of the laser pulse is received.
- a “reflection point” may be an object located in the environment around work vehicle 100 .
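- the time-of-flight relationship described above can be written out as a short calculation (an illustrative sketch; the function name is not from this disclosure):

```python
SPEED_OF_LIGHT = 299_792_458.0  # [m/s]

def distance_from_time_of_flight(t_emission_s, t_reception_s):
    """One-way distance to the reflection point from the time difference
    between emitting a laser pulse and receiving its reflected light.
    The pulse travels to the reflection point and back, hence the /2."""
    dt = t_reception_s - t_emission_s
    return SPEED_OF_LIGHT * dt / 2.0
```

for example, a round-trip time difference of 200 ns corresponds to a reflection point roughly 30 m away.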
- the LiDAR sensor 140 may measure the distance from the LiDAR sensor 140 to an object by any method. Measurement methods of the LiDAR sensor 140 include, for example, a mechanical rotation method, a MEMS method, and a phased array method. These measurement methods differ in how the laser pulse is emitted (the scanning method).
- a mechanical rotation type LiDAR sensor rotates a cylindrical head that emits a laser pulse and detects the reflected light of the laser pulse to scan the surrounding environment in all directions 360 degrees around the rotation axis.
- a MEMS-type LiDAR sensor uses a MEMS mirror to oscillate the emission direction of a laser pulse, and scans the surrounding environment within a predetermined angular range around the oscillation axis.
- a phased array LiDAR sensor controls the phase of light to oscillate the direction of light emission, and scans the surrounding environment within a predetermined angular range around the oscillation axis.
- FIG. 11 is a flowchart showing an example of processing for changing the search area pattern according to the area in which the agricultural machine is located.
- the control device 180 acquires position data indicating the position of the work vehicle 100 generated by the GNSS unit 110 while the work vehicle 100 is traveling (step S201).
- the position data includes information on the geographical coordinates of the position of work vehicle 100 .
- the storage device 170 stores map data of the area in which the work vehicle 100 moves. Map data includes information on the geographical coordinates of the area indicated by the map.
- the processing device 161 uses the map data to determine the area corresponding to the geographical coordinates indicated by the position data (step S202).
- the area corresponding to the geographical coordinates indicated by the position data corresponds to the area in which work vehicle 100 is located.
- the processing device 161 determines whether the area corresponding to the geographical coordinates indicated by the position data is a predetermined area (step S203). The predetermined area is registered in the map data in advance.
- if the area corresponding to the geographical coordinates indicated by the position data is not the predetermined area, the processing device 161 sets the first search area as the search area (step S205). If the area corresponding to the geographical coordinates indicated by the position data is the predetermined area, the processing device 161 sets the second search area as the search area (step S204).
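- the flow of steps S201 through S205 can be sketched as follows (a hypothetical Python illustration; representing map areas as polygons with a flag, and the pattern dictionaries, are assumptions for illustration):

```python
# Assumed search-area patterns (arbitrary illustrative values).
FIRST_SEARCH_AREA = {"front_rear_length_m": 5.0}    # cf. search area 810
SECOND_SEARCH_AREA = {"front_rear_length_m": 12.0}  # cf. search area 820

def point_in_polygon(pt, polygon):
    """Ray-casting test: is the point inside the polygon (list of (x, y))?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def select_search_area(position, map_data):
    """position: geographic coordinates from the position data (step S201).
    map_data: iterable of (polygon, is_predetermined) entries."""
    for polygon, is_predetermined in map_data:
        if point_in_polygon(position, polygon):   # step S202
            if is_predetermined:                  # step S203
                return SECOND_SEARCH_AREA         # step S204
            return FIRST_SEARCH_AREA              # step S205
    return FIRST_SEARCH_AREA
```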
- FIG. 12 is a diagram showing an example of areas in which the pattern of the search area is changed.
- the predetermined area is area 712 near the outer edge of field 70 .
- the area 712 is indicated by diagonal hatching.
- a ridge 710 is formed along the outer edge of the farm field 70, and in this case, the area 712 can be a ridge edge area.
- Fields in which ridges are formed are not limited to paddy fields.
- FIG. 13 is a diagram showing examples of the first search area and the second search area.
- First search area 810 is a search area that is set when work vehicle 100 is not located in a predetermined area.
- Second search area 820 is a search area that is set when work vehicle 100 is located within a predetermined area.
- the first search area 810 includes a forward search area 810F, a backward search area 810Re, a left side search area 810L, and a right side search area 810R.
- the second search area 820 includes a forward search area 820F, a backward search area 820Re, a left side search area 820L, and a right side search area 820R.
- FIG. 13 shows the search area in a plan view seen from the vertical direction when the work vehicle 100 is positioned on the horizontal ground. In this embodiment, the pattern of the search area in plan view seen from the vertical direction is changed.
- the work vehicle 100 is provided with four LiDAR sensors 140F, 140Re, 140L, and 140R.
- the LiDAR sensor 140F is arranged in the front part of the work vehicle 100 and mainly senses the surrounding environment spreading in front of the work vehicle 100.
- the LiDAR sensor 140Re is arranged at the rear of the work vehicle 100 and mainly senses the surrounding environment spreading behind the work vehicle 100 .
- LiDAR sensor 140L is arranged on the left side of work vehicle 100 and mainly senses the surrounding environment that spreads to the left side of work vehicle 100 .
- LiDAR sensor 140R is arranged on the right side of work vehicle 100 and mainly senses the surrounding environment that spreads to the right side of work vehicle 100 .
- LiDAR sensors 140Re, 140L, 140R may be provided in cabin 105 (FIG. 2) of work vehicle 100, for example.
- the LiDAR sensor 140 Re may be provided in the implement 300 .
- FIGS. 14 and 15 are diagrams showing the relationship between the sensing area sensed by the LiDAR sensor and the search area for searching for objects.
- changing the shape, size, and position of the search area can be realized, for example, by changing the portion of the three-dimensional point cloud data output by the LiDAR sensor that is used for searching for the object.
- the three-dimensional point cloud data output by the LiDAR sensor contains information (attribute information) such as the positions of multiple points and the received light intensity at the photodetector.
- the information about the positions of the points is, for example, the emission direction of the laser pulse corresponding to the points and the distance between the LiDAR sensor and the points.
- alternatively, the information about the positions of the points may be information about the coordinates of the points in the local coordinate system.
- the local coordinate system is a coordinate system that moves together with work vehicle 100, and is also called a sensor coordinate system. The coordinates of each point can be calculated from the emission direction of the laser pulse corresponding to the point and the distance between the LiDAR sensor and the point.
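- the calculation of point coordinates from the emission direction and the measured distance can be sketched as follows (an illustrative Python sketch under an assumed sensor-coordinate convention: x forward, y left, z up; the convention is not specified in this disclosure):

```python
import math

def point_coordinates(azimuth_deg, elevation_deg, distance_m):
    """Coordinates of a reflection point in the local (sensor) coordinate
    system, computed from the laser-pulse emission direction and the
    measured distance."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)  # forward
    y = distance_m * math.cos(el) * math.sin(az)  # left
    z = distance_m * math.sin(el)                 # up
    return (x, y, z)
```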
- the search area can be set based on the coordinates of each point.
- by appropriately selecting the points, a search area having the desired shape can be set.
- FIG. 14 shows a sensing area 830L sensed by the LiDAR sensor 140L and a sensing area 830R sensed by the LiDAR sensor 140R.
- a search area 810L can be set by selecting points positioned within a predetermined shape in the local coordinate system from among the plurality of points indicated by the three-dimensional point cloud data output by the LiDAR sensor 140L.
- a search region 820L can be established by selecting points that lie within another shape in the local coordinate system.
- the search area 810R can be set by selecting points positioned within a predetermined shape in the local coordinate system from among the plurality of points indicated by the three-dimensional point cloud data output by the LiDAR sensor 140R.
- a search region 820R can be set by selecting a point located within another shape in the local coordinate system.
- the shape of the search areas 810L, 810R, 820L, and 820R is substantially rectangular, but it is not limited thereto and may be another shape.
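- selecting, from the point cloud, the points located within a substantially rectangular shape in the local coordinate system can be sketched as follows (illustrative Python; representing the point cloud as (x, y, z) tuples is an assumption):

```python
def points_in_rectangle(points, x_min, x_max, y_min, y_max):
    """Subset of the point cloud lying inside an axis-aligned rectangle in
    the local coordinate system. Enlarging the rectangle (e.g. from the
    first search area 810L to the second search area 820L) enlarges the
    search area without changing the sensing area itself."""
    return [(x, y, z) for (x, y, z) in points
            if x_min <= x <= x_max and y_min <= y <= y_max]
```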
- FIG. 15 shows a sensing area 830F sensed by the LiDAR sensor 140F and a sensing area 830Re sensed by the LiDAR sensor 140Re.
- a search area 810F can be set by selecting points located within a predetermined shape in the local coordinate system from among the plurality of points indicated by the three-dimensional point cloud data output by the LiDAR sensor 140F.
- a search area 820F can be established by selecting a point located within another shape in the local coordinate system.
- the search area 810Re can be set by selecting points positioned within a predetermined shape in the local coordinate system from among the plurality of points indicated by the three-dimensional point cloud data output by the LiDAR sensor 140Re.
- a search region 820Re can be set by selecting a point located within another shape in the local coordinate system.
- the shape of the search areas 810F, 810Re, 820F, and 820Re is substantially fan-shaped, but it is not limited to this and may be another shape.
- the search area 810F and the search area 820F can be the same as each other, but they can also be different.
- the search area 810Re and the search area 820Re can be the same, but they can also be different.
- the longitudinal length L2 of the lateral search regions 820L and 820R is greater than the longitudinal length L1 of the lateral search regions 810L and 810R.
- when traveling in the ridge-edge area 712, the front-rear length of the side search areas 820L and 820R is made longer than when traveling in an area other than the ridge-edge area 712 (for example, an area in the field 70 relatively far from the outer periphery), so that the lateral state of the work vehicle 100 can be detected earlier.
- points indicated by the three-dimensional point cloud data output by the LiDAR sensors 140F and/or 140Re may be included as points to be searched within the lateral search area.
- FIG. 16 is a diagram showing another example of areas in which the pattern of the search area is changed.
- the predetermined area for changing the pattern of the search area is an area 722 adjacent to the waterway 720 on the road 76 (farm road or general road) outside the field.
- the area 722 is indicated by diagonal hatching.
- the processing device 161 detects objects around the work vehicle 100 using the output data of the LiDAR sensor 140 corresponding to the set search area (step S206). The processing device 161 repeats the operations from steps S201 to S206 until a command to end the operation is issued (step S207).
- FIG. 17 is a flowchart showing an example of processing when an obstacle is detected.
- when an object is detected in the search area, the processing device 161 determines whether there is an obstacle (step S301). For example, when an object that is not included in the pre-generated environmental map and has at least a predetermined height is detected on the target route, it is determined that there is an obstacle.
- the ECU 185 determines whether or not a detour route that avoids the obstacle can be generated (step S302). For example, if there is enough space on the road 76 to allow a detour, it is determined that a detour route can be generated. In the field 70, for example, if a detour route that does not affect farm work and crops can be generated, it is determined that a detour route can be generated. Conversely, if generating a detour is prohibited for the farm work, or if it is determined that the detour would cause the work vehicle 100 to come into contact with crops, it is determined that a detour route cannot be generated. Further, for example, when a detour route that does not enter the already-worked area in the field 70 can be generated, it is determined that a detour route can be generated.
- if it is determined that a detour route can be generated, the ECU 185 generates the detour route, and the control device 180 controls the work vehicle 100 to travel along the detour route (step S303). After traveling on the detour route, the control device 180 returns the work vehicle 100 to the target route and returns to the process of step S207 shown in FIG. 11.
- when it is determined that a detour route cannot be generated, the control device 180 performs control to stop the work vehicle 100 (step S304). In parallel, operations such as issuing a warning sound from the buzzer 220 and transmitting a warning signal to the terminal device 400 are performed.
- when it is determined that the object detected as an obstacle has moved, or that the obstacle has been removed by the operator, the control device 180 restarts traveling of the work vehicle 100 (steps S305 and S306) and returns to the process of step S207 shown in FIG. 11.
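- the branching of steps S301 through S306 can be summarized schematically as follows (the return labels are illustrative, not identifiers from this disclosure):

```python
def obstacle_response(detour_possible, obstacle_cleared):
    """Schematic decision logic after an obstacle is detected (step S301).
    detour_possible: result of the detour-route check (step S302).
    obstacle_cleared: the obstacle has moved or been removed by the operator.
    """
    if detour_possible:
        return "travel_detour_route"   # step S303
    if obstacle_cleared:
        return "restart_travel"        # steps S305, S306
    return "stop_and_warn"             # step S304
```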
- in the above example, the position data generated by the GNSS unit 110 was used to detect the area in which the work vehicle 100 is located, but the method is not limited to this.
- the area in which work vehicle 100 is located may be estimated by matching the data output from LiDAR sensor 140 and/or camera 120 with an environmental map.
- FIG. 18 is a diagram showing another example of the first search area 810 and the second search area 820.
- FIG. 19 is a diagram showing still another example of the first search area 810 and the second search area 820.
- the lateral search areas 810L, 810R, 820L, and 820R have substantially rectangular shapes, but they may have other shapes.
- the lateral search regions 810L, 810R, 820L, and 820R may be substantially fan-shaped.
- the LiDAR sensor can sense the surrounding environment within a predetermined angular range around the rocking axis.
- when the search area is substantially fan-shaped, the points used for searching for the object from among the plurality of points indicated by the three-dimensional point cloud data output by the LiDAR sensor may be selected based on the emission angle of the corresponding laser pulse.
- the pattern of the search area can be changed by changing the angle range that serves as the reference for the selection.
- the points used for searching for the object from among the plurality of points indicated by the three-dimensional point cloud data may also be selected based on the distance between the LiDAR sensor and the points.
- the pattern of the search area can be changed by changing the size of the distance that serves as the criterion for selection.
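- selecting points based on the emission angle and the distance, as described above, can be sketched as follows (illustrative Python; the azimuth convention and units are assumptions):

```python
import math

def points_in_sector(points, min_angle_deg, max_angle_deg, max_range_m):
    """Subset of the point cloud whose azimuth (in the horizontal plane of
    the local coordinate system) lies within the given angle range and
    whose horizontal range is within max_range_m. Widening the angle range
    or the range limit changes the pattern of a fan-shaped search area."""
    selected = []
    for (x, y, z) in points:
        azimuth = math.degrees(math.atan2(y, x))
        horizontal_range = math.hypot(x, y)
        if min_angle_deg <= azimuth <= max_angle_deg and \
                horizontal_range <= max_range_m:
            selected.append((x, y, z))
    return selected
```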
- the pattern of the search area may be changed by changing the sensing range of the LiDAR sensor.
- the size of the search area may be changed by changing the power of the laser pulse emitted from the LiDAR sensor.
- the angular range of the search area may be changed by changing the angular range in which the LiDAR sensor emits laser pulses.
- the angular range of the search area may be changed by changing the angular range for swinging the emission direction of the laser pulse.
- FIG. 20 is a diagram showing yet another example of the first search area 810 and the second search area 820.
- the lateral search areas 820L, 820R have a shape that includes areas closer to the front wheel 104F and the rear wheel 104R compared to the lateral search areas 810L, 810R.
- the lateral search areas 820L, 820R are set specifically to include areas near the front outer ends of the front wheels 104F.
- FIG. 21 is a diagram showing still another example of areas for changing the pattern of the search area.
- FIG. 21 shows a bridge area 732 that modifies the pattern of search regions.
- Bridge area 732 may include the area over bridge 730 over river or waterway 734 and the area of road 76 near bridge 730 . It is desirable to be able to detect the state of the vicinity of the wheels of work vehicle 100, particularly the vicinity of the front wheels, when traveling in bridge area 732 .
- a search suitable for the bridge area 732 can be performed by setting the pattern of the side search areas 820L and 820R shown in FIG.
- FIG. 22 is a diagram showing still another example of the first search area 810 and the second search area 820.
- the side search areas 810L, 810R, 820L, and 820R have substantially rectangular shapes, but may have other shapes.
- the lateral search areas 810L, 810R, 820L, and 820R may be substantially fan-shaped.
- FIG. 21 further shows a barn doorway area 742 that changes the pattern of the search area.
- when traveling in the barn doorway area 742, the search area 820 shown in FIGS. 13, 18, and 19, for example, is set.
- by making the front-rear length of the lateral search areas 820L and 820R longer than when traveling in areas other than the doorway area 742, the lateral state of the work vehicle 100 can be detected earlier.
- FIG. 23 is a diagram showing still another example of areas for changing the pattern of the search area.
- the predetermined area in which the pattern of the search area is changed is the area 772 of the work place 78 where the work vehicle 100 is loaded onto the transport vehicle 770.
- Area 772 may be located on road 76 .
- the work vehicle 100 is loaded onto the transport vehicle 770 by running the work vehicle 100 along the ladder rails 771 provided on the transport vehicle 770 .
- the loading work area 772 includes the area near the transport vehicle 770 and the ladder rails 771.
- a search suitable for the loading operation can be performed.
- FIG. 24 is a diagram showing still another example of areas for changing the pattern of the search area.
- FIG. 25 is a diagram showing still another example of the first search area 810 and the second search area 820.
- the predetermined area in which the pattern of the search area is changed is the ridge area 752 in which the ridge 750 in the agricultural field 70 is provided. In ridge area 752, it is desirable to be able to detect the condition of a wider area around work vehicle 100.
- the search area 820 is larger in size than the search area 810.
- a search area 820 shown in FIG. 25 is set.
- the state of a wider area around the work vehicle 100 can be detected.
- FIG. 26 is a diagram showing still another example of areas for changing the pattern of the search area.
- the predetermined area in which the search area pattern is changed is the crop row area 762 where the crop row 760 in the field 70 is located.
- a search area 820 shown in FIG. 25 is set in the crop row area 762 .
- by making the size of the search area 820 larger than when traveling in areas of the field 70 other than the crop row area 762, the state of a wider area around the work vehicle 100 can be detected.
- FIG. 27 is a diagram showing an example of the search area 810 set according to the size of the implement 300 connected to the work vehicle 100.
- a plurality of types of implements 300 having different sizes can be connected to the work vehicle 100, and the processing device 161 changes the pattern of the search area 810 according to the size of the implement 300 connected to the work vehicle 100.
- the storage device 170 pre-stores information on the sizes of multiple types of implements 300 .
- the processing device 161 communicates with the implement 300 and acquires unique information that can identify the model of the implement 300 .
- the unique information includes the model number of the implement 300, for example.
- the processing device 161 can determine the size of the implement 300 connected to the work vehicle 100 by reading the size information corresponding to the unique information of the implement 300 from the storage device 170. Information on the size of the implement 300 may instead be input to the work vehicle 100 by the worker.
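- the lookup of the implement size from the unique information and the sizing of the rear search area can be sketched as follows (hypothetical Python; the model numbers, dimensions, and margin are invented for illustration and do not appear in this disclosure):

```python
# Hypothetical size table; in this embodiment, such information would be
# pre-stored in the storage device 170.
IMPLEMENT_SIZES = {
    "ROTARY-1600": {"width_m": 1.6, "length_m": 1.2},
    "ROTARY-2400": {"width_m": 2.4, "length_m": 1.4},
}

def rear_search_area_size(model_number, margin_m=1.0):
    """Rear search-area dimensions derived from the size of the connected
    implement, with an assumed fixed margin added on each side."""
    size = IMPLEMENT_SIZES[model_number]
    return {
        "width_m": size["width_m"] + 2.0 * margin_m,
        "length_m": size["length_m"] + 2.0 * margin_m,
    }
```

a larger implement thus yields a larger rear search area, as in the right side of FIG. 27.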
- the implement 300b connected to the work vehicle 100 on the right side of FIG. 27 is larger in size than the implement 300a connected to the work vehicle 100 on the left side.
- the implement 300b is larger in at least one of the longitudinal direction and the lateral direction than the implement 300a.
- the side search areas 810L and 810R shown on the right side of FIG. 27 are larger in the left-right direction than the side search areas 810L and 810R shown on the left side.
- the rear search area 810Re shown on the right side also has larger sizes in the front-rear direction and the left-right direction than that shown on the left side.
- FIG. 28 is a diagram showing an example of search area 810 set according to the positional relationship between work vehicle 100 and implement 300 .
- the implement 300c shown in FIG. 28 is an implement whose positional relationship with the work vehicle 100 can be changed; for example, its position can be offset laterally with respect to the work vehicle 100. Such implements may also be referred to as offset implements.
- the processing device 161 can determine the position of the implement 300c, for example, based on a control signal output from the work vehicle 100 to the implement 300c.
- the implement 300c shown on the right side extends further to the right with respect to the work vehicle 100 than the implement 300c shown on the left side.
- the side search area 810R shown on the right side has larger sizes in the front-rear direction and the left-right direction than the side search area 810R shown on the left side.
- the rear search area 810Re shown on the right side also has larger sizes in the front-rear direction and the left-right direction than that shown on the left side.
- for the implement 300c shown on the left side of FIG. 28 as well, a search suitable for the offset implement can be performed by increasing the size of the lateral search area 810R shown on the left side of FIG. 28.
- in the embodiment described above, the pattern of the search area in the process of detecting an object using the sensing data output by the LiDAR sensor was changed. Another sensing device (for example, the camera 120 or the ultrasonic sonar) may be used instead or in addition, and the pattern of the search area in the process of detecting an object using the sensing data output by that device may be changed.
- the pattern of the search area may be changed by changing the portion of the image data captured by the camera that is used for object detection.
- the pattern of the search area may be changed by changing the output of the ultrasonic sonar or by changing the angle range for sensing.
- the sensing system 10 of this embodiment can also be retrofitted to agricultural machines that do not have those functions.
- Such systems can be manufactured and sold independently of agricultural machinery.
- Computer programs used in such systems may also be manufactured and sold independently of agricultural machinery.
- the computer program may be provided by being stored in a non-transitory computer-readable storage medium, for example. Computer programs may also be provided by download via telecommunications lines (eg, the Internet).
- a part or all of the processing executed by the processing device 161 in the sensing system 10 may be executed by another device.
- Such other device may be at least one of processor 660 of management device 600 , processor 460 of terminal device 400 and operation terminal 200 .
- in the former case, such other device and the processing device 161 together function as the processing device of the sensing system 10; alternatively, such other device alone functions as the processing device of the sensing system 10.
- the processing device 161 and the processor 660 function as processing devices of the sensing system 10 .
- a part or all of the processing executed by the processing device 161 may be executed by the control device 180 .
- the control device 180 and the processing device 161 function as the processing device of the sensing system 10 or the control device 180 functions as the processing device of the sensing system 10 .
- the present disclosure includes the following agricultural machines, sensing systems and sensing methods used in agricultural machines.
- a sensing system 10 according to the present disclosure is a sensing system for a mobile agricultural machine 100, comprising one or more sensors 140 that are provided in the agricultural machine 100, sense the environment around the agricultural machine 100, and output sensing data, and a processing device 161 that detects objects located in search areas 810 and 820 around the agricultural machine 100 based on the sensing data. The processing device 161 changes the pattern of the search areas 810, 820 for object detection according to the area in which the agricultural machine 100 is located.
- the processing device 161 may cause the patterns of the search areas 810 and 820 to differ between when the agricultural machine 100 is located in a first area in the field 70 and when it is located in a second area in the field 70 closer to the outer edge of the field 70 than the first area.
- the second area may be the ridge-edge area 712.
- the search areas 810, 820 may include lateral search areas 810L, 810R, 820L, 820R that include areas lateral to the agricultural machine 100, and the processing device 161 may make the front-rear length of the lateral search areas greater when the agricultural machine 100 is located in the second area than when it is located in the first area.
- the lateral state of the agricultural machine 100 can be detected early.
- the processing device 161 may cause the patterns of the search areas 810, 820 to differ between when the agricultural machine 100 is located on the bridge 730 and when it is located on a predetermined road 76 different from the bridge 730.
- the agricultural machine 100 may include a work vehicle 100 having front wheels 104F, and the processing device 161 may set the search area 820 so that, when the agricultural machine 100 is located on the bridge 730, it includes an area closer to the front outer ends of the front wheels 104F than when the agricultural machine 100 is located on the predetermined road 76.
- the processing device 161 may cause the patterns of the search areas 810, 820 to differ between when the agricultural machine 100 is located on a road 76 adjacent to the waterway 720 and when it is located on a road 76 not adjacent to the waterway 720.
- the search areas 810, 820 may include lateral search areas 810L, 810R, 820L, 820R that include areas lateral to the agricultural machine 100, and the processing device 161 may make the front-rear length of the lateral search areas greater when the agricultural machine 100 is located on the road 76 adjacent to the waterway 720 than when it is located on a road 76 not adjacent to the waterway 720.
- the processing device 161 may cause the patterns of the search areas 810, 820 to differ between when the agricultural machine 100 is located at the doorway 741 of the barn 740 and when it is located in a third area different from the doorway 741 of the barn 740.
- the search areas 810, 820 may include lateral search areas 810L, 810R, 820L, 820R that include areas lateral to the agricultural machine 100, and the processing device 161 may make the front-rear length of the lateral search areas greater when the agricultural machine 100 is located at the doorway 741 of the barn 740 than when it is located in the third area.
- the processing device 161 may cause the patterns of the search areas 810 and 820 to differ between when the agricultural machine 100 is located in a fourth area in the field 70 and when it is located in a fifth area closer to the ridge 750 or the crop planting area 760 of the field 70 than the fourth area.
- the processing device 161 may cause the patterns of the search areas 810 and 820 to differ between when the agricultural machine 100 is in a position to be loaded onto the truck 770 and when it is not.
- the agricultural machine 100 may include a work vehicle 100 and an implement 300 connected to the work vehicle 100, and the processing device 161 may change the pattern of the search areas 810, 820 according to the implement 300 connected to the work vehicle 100.
- multiple types of implements 300 having different sizes can be connected to the work vehicle 100, and the processing device 161 may change the pattern of the search areas 810, 820 according to the size of the implement 300 connected to the work vehicle 100.
- the positional relationship between the work vehicle 100 and the implement 300 connected to the work vehicle 100 may be changeable, and the processing device 161 may change the pattern of the search areas 810, 820 according to the change in the positional relationship between the work vehicle 100 and the implement 300.
- the sensing system 10 further includes a positioning device 110 that detects the position of the agricultural machine 100 and outputs position data, and a storage device 170 that stores map data of the area in which the agricultural machine 100 moves.
- the processing device 161 may determine the area in which the agricultural machine 100 is located based on the location data and the map data.
- by using the positioning device 110, the area in which the agricultural machine 100 is located can be determined.
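Determining the area from position data and map data can be done, for example, with a point-in-polygon lookup. The following is a minimal illustrative sketch, not taken from the disclosure: the names `Area` and `locate_area`, and the idea of storing each map area as a labeled polygon, are assumptions for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch: map data stores each area as a polygon with a label,
# and the current GNSS position is matched to an area by a ray-casting
# point-in-polygon test.

@dataclass
class Area:
    name: str       # e.g. "field_interior", "ridge_edge", "farm_road"
    polygon: list   # [(x, y), ...] vertices in map coordinates

def point_in_polygon(pt, polygon):
    """Ray-casting test: count crossings of a ray cast toward +x."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal line y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def locate_area(position, areas):
    """Return the name of the first area containing the position, else None."""
    for area in areas:
        if point_in_polygon(position, area.polygon):
            return area.name
    return None
```

In practice a production system would use a geospatial library rather than a hand-rolled test, but the lookup principle — position data plus polygonal map data yields an area label — is the same.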
- the agricultural machine 100 may be equipped with the sensing system 10 described above. Thereby, a search suitable for the area in which the agricultural machine 100 is located can be performed.
- the agricultural machine 100 may further include a traveling device 240 that causes the agricultural machine 100 to travel, and a control device 160 that controls the operation of the traveling device 240 and causes the agricultural machine 100 to operate automatically. This makes it possible to perform a search suitable for the area where the automatically traveling agricultural machine 100 is located.
- a sensing method for a mobile agricultural machine 100 includes: sensing the environment around the agricultural machine 100 using one or more sensors 140 and outputting sensing data; detecting, based on the sensing data, objects located in search areas 810 and 820 around the agricultural machine 100; and changing the pattern of the search areas 810 and 820 in which the object detection is performed according to the area in which the agricultural machine 100 is located.
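The claimed method can be sketched as a simple filter whose bounds depend on the current area. This is an illustrative sketch only: the pattern table, dimensions, and function names below are assumptions, not values from the disclosure. Here the search area is modeled as an axis-aligned box around the machine, and sensed points outside the box are ignored.

```python
# Longitudinal/lateral extents (meters) of the search box per area type.
# Values are purely illustrative.
SEARCH_PATTERNS = {
    "field_interior": {"front": 5.0, "rear": 2.0, "side": 1.5},
    "ridge_edge":     {"front": 5.0, "rear": 4.0, "side": 3.0},  # wider lateral zone
    "barn_doorway":   {"front": 3.0, "rear": 3.0, "side": 3.5},
}

def in_search_area(point, pattern):
    """point = (x, y) in the machine frame: +x forward, +y to the left."""
    x, y = point
    return -pattern["rear"] <= x <= pattern["front"] and abs(y) <= pattern["side"]

def detect_objects(sensed_points, area_name):
    """Keep only sensed points inside the search area for the current area."""
    pattern = SEARCH_PATTERNS.get(area_name, SEARCH_PATTERNS["field_interior"])
    return [p for p in sensed_points if in_search_area(p, pattern)]
```

The same sensed point can thus be reported as an obstacle in one area (e.g. near a ridge, with its enlarged lateral zone) and ignored in another, which is the effect of changing the search-area pattern per area.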
- the technology of the present disclosure is particularly useful in the field of agricultural machinery such as tractors, harvesters, rice transplanters, ride-on management machines, vegetable transplanters, mowers, seeders, fertilizer applicators, and agricultural robots.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Mechanical Engineering (AREA)
- Soil Sciences (AREA)
- Environmental Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Optics & Photonics (AREA)
- Guiding Agricultural Machines (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Geophysics And Detection Of Objects (AREA)
- Testing Or Calibration Of Command Recording Devices (AREA)
Abstract
Description
In the present disclosure, "agricultural machine" means a machine used for agricultural applications. The agricultural machine of the present disclosure may be a mobile agricultural machine capable of performing agricultural work while moving. Examples of agricultural machines include tractors, harvesters, rice transplanters, ride-on management machines, vegetable transplanters, mowers, seeders, fertilizer applicators, and mobile agricultural robots. Not only may a work vehicle such as a tractor function as an "agricultural machine" by itself; the combination of a work vehicle and an implement attached to or towed by the work vehicle may also function as a single "agricultural machine". Agricultural machines perform agricultural work on the ground in a field, such as tilling, seeding, pest control, fertilizing, planting crops, or harvesting. Such agricultural work is sometimes referred to as "ground work" or simply "work". Travel performed by a vehicle-type agricultural machine while carrying out agricultural work is sometimes referred to as "work travel".
Embodiments of the present disclosure are described below. However, unnecessarily detailed descriptions may be omitted; for example, detailed descriptions of well-known matters and duplicate descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy in the following description and to facilitate understanding by those skilled in the art. The inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and do not intend them to limit the subject matter recited in the claims. In the following description, components having identical or similar functions are denoted by the same reference numerals.
FIG. 2 is a side view schematically showing an example of the work vehicle 100 and an implement 300 coupled to the work vehicle 100. The work vehicle 100 in this embodiment can operate in both a manual driving mode and an automatic driving mode. In the automatic driving mode, the work vehicle 100 can travel unmanned. The work vehicle 100 is capable of automatic operation both inside and outside a field.
Next, operations of the work vehicle 100, the terminal device 400, and the management device 600 will be described.
First, an example of automatic travel by the work vehicle 100 will be described. The work vehicle 100 in this embodiment can travel automatically both inside and outside a field. Inside the field, the work vehicle 100 drives the implement 300 to perform predetermined agricultural work while traveling along a preset target path. If an obstacle is detected while the work vehicle 100 is traveling in the field, it stops traveling and performs operations such as issuing a warning sound from the buzzer 220 and transmitting a warning signal to the terminal device 400. Inside the field, positioning of the work vehicle 100 is performed mainly based on data output from the GNSS unit 110. Outside the field, the work vehicle 100 travels automatically along a target path set on a farm road or a general road, making use of data acquired by the camera 120 and/or the LiDAR sensor 140. When an obstacle is detected outside the field, the work vehicle 100 either avoids the obstacle or stops on the spot. Outside the field, the position of the work vehicle 100 is estimated based on data output from the LiDAR sensor 140 and/or the camera 120 in addition to the positioning data output from the GNSS unit 110.
Next, a process of setting the search area according to the area in which the agricultural machine is located will be described.
Next, a process of setting the search area according to the implement 300 connected to the work vehicle 100 will be described.
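The implement-dependent adjustment can be sketched as enlarging the search box to cover the implement's footprint. This is a hedged, illustrative sketch: the function name, the pattern dictionary keys, and the specific geometry (extending the rear extent by the implement length and the lateral extent by the implement's overhang beyond the vehicle) are assumptions for the example, not details from the disclosure.

```python
def adjust_pattern_for_implement(base, implement_length, implement_width, vehicle_width):
    """Return a new search pattern enlarged for the attached implement.

    base: dict with "front", "rear", "side" extents in meters.
    A rear-mounted implement extends the rear search zone by its length;
    an implement wider than the vehicle widens the lateral zone by half
    the overhang on each side.
    """
    pattern = dict(base)  # do not mutate the caller's base pattern
    pattern["rear"] = base["rear"] + implement_length
    extra_side = max(0.0, (implement_width - vehicle_width) / 2.0)
    pattern["side"] = base["side"] + extra_side
    return pattern
```

Swapping implements (or changing the vehicle-implement positional relationship, e.g. raising or offsetting the implement) would simply feed different dimensions into this adjustment, producing a different search-area pattern as the claims describe.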
Claims (19)
- 1. A sensing system for a mobile agricultural machine, comprising: one or more sensors provided on the agricultural machine, the one or more sensors sensing the environment around the agricultural machine and outputting sensing data; and a processing device that detects, based on the sensing data, objects located in a search area around the agricultural machine, wherein the processing device changes a pattern of the search area in which the object detection is performed according to an area in which the agricultural machine is located.
- 2. The sensing system according to claim 1, wherein the processing device makes the patterns of the search area differ between when the agricultural machine is located in a first area in a field and when it is located in a second area in the field that is closer to an outer periphery of the field than the first area.
- 3. The sensing system according to claim 2, wherein the second area is an area along a ridge (levee).
- 4. The sensing system according to claim 2 or 3, wherein the search area includes a lateral search area covering an area to a side of the agricultural machine, and the processing device makes a longitudinal (front-rear) length of the lateral search area greater when the agricultural machine is located in the second area than when it is located in the first area.
- 5. The sensing system according to any one of claims 1 to 4, wherein the processing device makes the patterns of the search area differ between when the agricultural machine is located on a bridge and when it is located on a predetermined road different from the bridge.
- 6. The sensing system according to claim 5, wherein the agricultural machine includes a work vehicle provided with front wheels, and when the agricultural machine is located on the bridge, the processing device sets the search area so as to include a region closer to front outer ends of the front wheels than when it is located on the predetermined road.
- 7. The sensing system according to any one of claims 1 to 6, wherein the processing device makes the patterns of the search area differ between when the agricultural machine is located on a road adjacent to a waterway and when it is located on a road not adjacent to a waterway.
- 8. The sensing system according to claim 7, wherein the search area includes a lateral search area covering an area to a side of the agricultural machine, and the processing device makes a longitudinal length of the lateral search area greater when the agricultural machine is located on the road adjacent to the waterway than when it is located on the road not adjacent to the waterway.
- 9. The sensing system according to any one of claims 1 to 8, wherein the processing device makes the patterns of the search area differ between when the agricultural machine is located at a doorway of a barn and when it is located in a third area different from the doorway of the barn.
- 10. The sensing system according to claim 9, wherein the search area includes a lateral search area covering an area to a side of the agricultural machine, and the processing device makes a longitudinal length of the lateral search area greater when the agricultural machine is located at the doorway of the barn than when it is located in the third area.
- 11. The sensing system according to any one of claims 1 to 10, wherein the processing device makes the patterns of the search area differ between when the agricultural machine is located in a fourth area in a field and when it is located in a fifth area in the field that is closer to a ridge or a crop planting area of the field than the fourth area.
- 12. The sensing system according to any one of claims 1 to 11, wherein the processing device makes the patterns of the search area differ between when the agricultural machine is in a position to be loaded onto a transport vehicle and when it is not.
- 13. The sensing system according to any one of claims 1 to 12, wherein the agricultural machine includes a work vehicle and an implement connected to the work vehicle, and the processing device changes the pattern of the search area according to the implement connected to the work vehicle.
- 14. The sensing system according to claim 13, wherein multiple types of implements having different sizes are connectable to the work vehicle, and the processing device changes the pattern of the search area according to the size of the implement connected to the work vehicle.
- 15. The sensing system according to claim 13 or 14, wherein a positional relationship between the work vehicle and the implement connected to the work vehicle is changeable, and the processing device changes the pattern of the search area in response to a change in the positional relationship between the work vehicle and the implement.
- 16. The sensing system according to any one of claims 1 to 15, further comprising: a positioning device that detects a position of the agricultural machine and outputs position data; and a storage device that stores map data of areas in which the agricultural machine moves, wherein the processing device determines the area in which the agricultural machine is located based on the position data and the map data.
- 17. An agricultural machine comprising the sensing system according to any one of claims 1 to 16.
- 18. The agricultural machine according to claim 17, further comprising: a traveling device that causes the agricultural machine to travel; and a control device that controls operation of the traveling device to cause the agricultural machine to operate automatically.
- 19. A sensing method for a mobile agricultural machine, comprising: sensing the environment around the agricultural machine using one or more sensors and outputting sensing data; detecting, based on the sensing data, objects located in a search area around the agricultural machine; and changing a pattern of the search area in which the object detection is performed according to an area in which the agricultural machine is located.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023570862A JPWO2023127557A1 (ja) | 2021-12-27 | 2022-12-16 | |
EP22915781.3A EP4434313A1 (en) | 2021-12-27 | 2022-12-16 | Agricultural machine, sensing system used in agricultural machine, and sensing method |
US18/749,601 US20240345253A1 (en) | 2021-12-27 | 2024-06-20 | Agricultural machine, sensing system used in agricultural machine, and sensing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-212577 | 2021-12-27 | ||
JP2021212577 | 2021-12-27 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/749,601 Continuation US20240345253A1 (en) | 2021-12-27 | 2024-06-20 | Agricultural machine, sensing system used in agricultural machine, and sensing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023127557A1 true WO2023127557A1 (ja) | 2023-07-06 |
Family
ID=86998822
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/046459 WO2023127557A1 (ja) | 2021-12-27 | 2022-12-16 | 農業機械、農業機械に用いるセンシングシステムおよびセンシング方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240345253A1 (ja) |
EP (1) | EP4434313A1 (ja) |
JP (1) | JPWO2023127557A1 (ja) |
WO (1) | WO2023127557A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015191592A (ja) * | 2014-03-28 | 2015-11-02 | ヤンマー株式会社 | 自律走行作業車両 |
JP2017015409A (ja) * | 2015-06-26 | 2017-01-19 | シャープ株式会社 | 路面検知装置、移動体、路面検知方法、および路面検知プログラム |
JP2018113937A (ja) * | 2017-01-20 | 2018-07-26 | 株式会社クボタ | 自動走行作業車 |
JP2019175059A (ja) | 2018-03-28 | 2019-10-10 | ヤンマー株式会社 | 作業車両の走行制御システム |
WO2020240983A1 (ja) * | 2019-05-27 | 2020-12-03 | ヤンマー株式会社 | 障害物判定システム及び自律走行システム |
JP2021108621A (ja) * | 2020-01-14 | 2021-08-02 | 株式会社クボタ | 走行経路管理システム |
2022
- 2022-12-16 WO PCT/JP2022/046459 patent/WO2023127557A1/ja active Application Filing
- 2022-12-16 JP JP2023570862A patent/JPWO2023127557A1/ja active Pending
- 2022-12-16 EP EP22915781.3A patent/EP4434313A1/en active Pending
2024
- 2024-06-20 US US18/749,601 patent/US20240345253A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015191592A (ja) * | 2014-03-28 | 2015-11-02 | ヤンマー株式会社 | 自律走行作業車両 |
JP2017015409A (ja) * | 2015-06-26 | 2017-01-19 | シャープ株式会社 | 路面検知装置、移動体、路面検知方法、および路面検知プログラム |
JP2018113937A (ja) * | 2017-01-20 | 2018-07-26 | 株式会社クボタ | 自動走行作業車 |
JP2019175059A (ja) | 2018-03-28 | 2019-10-10 | ヤンマー株式会社 | 作業車両の走行制御システム |
WO2020240983A1 (ja) * | 2019-05-27 | 2020-12-03 | ヤンマー株式会社 | 障害物判定システム及び自律走行システム |
JP2021108621A (ja) * | 2020-01-14 | 2021-08-02 | 株式会社クボタ | 走行経路管理システム |
Also Published As
Publication number | Publication date |
---|---|
US20240345253A1 (en) | 2024-10-17 |
JPWO2023127557A1 (ja) | 2023-07-06 |
EP4434313A1 (en) | 2024-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240338037A1 (en) | Path planning system and path planning method for agricultural machine performing self-traveling | |
US20240341216A1 (en) | Travel control system for agricultural machine capable of performing remotely-manipulated traveling | |
US20240172577A1 (en) | Control system for agricultural machine and agriculture management system | |
US20240188475A1 (en) | Agricultural assistance system and agricultural assistance method | |
WO2023127557A1 (ja) | 農業機械、農業機械に用いるセンシングシステムおよびセンシング方法 | |
WO2023127556A1 (ja) | 農業機械、農業機械に用いるセンシングシステムおよびセンシング方法 | |
WO2024004463A1 (ja) | 走行制御システム、走行制御方法およびコンピュータプログラム | |
WO2023119996A1 (ja) | 障害物検出システム、農業機械および障害物検出方法 | |
WO2024004486A1 (ja) | 作業車両、制御方法および制御システム | |
WO2023248909A1 (ja) | 走行制御システム、農業機械および走行制御方法 | |
US20240317238A1 (en) | Agricultural road identification system, control system, and agricultural machine | |
WO2023238827A1 (ja) | 農業管理システム | |
WO2023218688A1 (ja) | 地図作成システムおよび経路計画システム | |
WO2023238724A1 (ja) | 農業機械の自動走行のための経路生成システムおよび経路生成方法 | |
WO2023112515A1 (ja) | 地図生成システムおよび地図生成方法 | |
WO2023234255A1 (ja) | センシングシステム、農業機械、およびセンシング装置 | |
WO2024004881A1 (ja) | 制御システム、制御方法および運搬車 | |
WO2023095856A1 (ja) | 自動運転を行う農業機械のための経路計画システム | |
WO2023007835A1 (ja) | 管理システム、および農業機械の圃場へのアクセスを管理するための方法 | |
US20240341215A1 (en) | Agricultural machine, sensing system, sensing method, remote operation system, and control method | |
JP7584654B2 (ja) | 農業機械のための管理システム | |
JP2023183840A (ja) | 農業機械の自動走行のための経路生成システムおよび経路生成方法 | |
WO2023119986A1 (ja) | 農業機械、および、農業機械に用いるジェスチャ認識システム | |
US20240345603A1 (en) | Travel control system for agricultural machine capable of performing remotely-manipulated traveling | |
JP2024003645A (ja) | 経路生成システム、農業機械および経路生成方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22915781 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023570862 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022915781 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022915781 Country of ref document: EP Effective date: 20240620 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |